US20090144661A1 - Computer implemented display, graphical user interface, design and method including scrolling features - Google Patents


Info

Publication number
US20090144661A1
US20090144661A1 (application Ser. No. 12/242,279; granted as US8245155B2)
Authority
US
United States
Prior art keywords
information
display
widget
region
processing apparatus
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US12/242,279
Other versions
US8245155B2 (en)
Inventor
Takeshi Nakajima
Kenichi Nirei
Yasuhiro Habara
Makoto Imamura
Shinichi Iriya
Takuo Ikeda
Daisuke Sato
Ryutaro Sakai
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Electronics Inc
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corp
Priority to US 12/242,279 (US8245155B2)
Priority to EP08169529.8A (EP2068236B1)
Priority to CN200910009742XA (CN101692194B)
Priority to JP2009024947A (JP5259444B2)
Assigned to SONY CORPORATION. Assignors: NIREI, KENICHI; SATO, DAISUKE; IMAMURA, MAKOTO; IRIYA, SHINICHI; NAKAJIMA, TAKESHI; HABARA, YASUHIRO; IKEDA, TAKUO; SAKAI, RYUTARO
Assigned to SONY ELECTRONICS INC. (50%). Assignor: SONY CORPORATION
Publication of US20090144661A1
Application granted; publication of US8245155B2
Legal status: Active; expiration adjusted

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883: Interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G06F 3/0481: Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817: Interaction techniques using icons
    • G06F 3/0484: Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0485: Scrolling or panning
    • G06F 3/04886: Interaction techniques using a touch-screen or digitiser by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G06F 2203/00: Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/048: Indexing scheme relating to G06F3/048
    • G06F 2203/04803: Split screen, i.e. subdividing the display area or the window area into separate subareas

Definitions

  • the present invention relates to fixed or portable computing devices including graphical user interfaces. More particularly, the present invention relates to graphical user interfaces involving sub-dividable display regions that enable a user to navigate independently within the sub-dividable display regions.
  • GUI: graphical user interface
  • Example applications include word processing programs, web browsers, etc.
  • the various icons may be selected by a user who selects particular icons through input from various computer input devices.
  • Some graphical user interfaces include touch panel displays that allow for the execution of certain applications, as a substitute for using a peripheral or built-in pointing device.
  • the present inventors recognize that conventional GUIs do not allow for the divisibility of the display region into separately “scrollable” display regions.
  • the present inventors recognized that there is a logical connection of how information is presented to a user in a spatial context.
  • conventional graphical user interfaces fail to take advantage of a human's memory to maintain a mental persistence of previously viewed material as it relates to presently viewed material.
  • the present invention allows for the division of a display area into multiple subdisplay regions by use of a launcher bar.
  • additional information is displayed in logically arranged categories.
  • Some categories of information, such as weather information, are presented in the subdisplay region; if the user scrolls the subdisplay region to one side or the other by “flicking” the screen, the weather forecast for the next day, or the weather from past days, is subsequently presented to the user.
  • the graphical user interface of the present invention may be implemented in any one of a number of different computer-based devices.
  • the graphical user interface is implemented on a SmartPhone that includes wireless connectivity features.
  • the invention is not so limited, as it may also be implemented in a number of other wired and wireless computer-based applications, including desktop or mobile computers, navigation systems, menu panels and touchscreens installed in automobiles, etc.
  • FIG. 1 is a perspective view of a SmartPhone embodiment of a computing device that uses a GUI according to the present invention.
  • FIG. 2 shows another perspective view of the information processing apparatus of FIG. 1.
  • FIG. 3 is a graphical user interface that helps illustrate the different terminology used herein.
  • FIG. 4 shows a succession of different views for how scrolling may be achieved through the “flick” of one's finger across a screen in a vertical direction.
  • FIG. 5 shows another succession of views showing how the screen may be scrolled horizontally through a flicking of the screen.
  • FIG. 6 is a flowchart showing a process flow of how either a vertical or horizontal scrolling operation is performed according to the present invention.
  • FIG. 7 is a flowchart showing the process of scrolling the widget region and information area.
  • FIG. 8 is a flowchart showing the process of displaying start up information.
  • FIG. 9 is a screenshot showing a launcher bar and a widget display area of the present invention.
  • FIG. 10 is a screenshot showing horizontally scrolled widgets of the present invention.
  • FIG. 11 is a screenshot showing a map display.
  • FIG. 12 is a screenshot showing a display of multiple contacts.
  • FIG. 13 is a screenshot showing detailed information on a single contact.
  • FIG. 14 is a screenshot showing a display of multiple e-mails.
  • FIG. 15 is a screenshot showing a single e-mail.
  • FIG. 16 is a screenshot showing an appointment schedule.
  • FIG. 17 is a screenshot showing detailed information on a single appointment.
  • FIG. 18 is a screenshot showing multiple webpage thumbnails.
  • FIG. 19 is a screenshot showing detailed information on a single webpage.
  • FIG. 20 is a screenshot showing a plurality of applications.
  • FIG. 21 is a screenshot showing the launcher bar near the middle of the display.
  • FIG. 22 is a screenshot showing horizontal scrolling of the widgets.
  • FIG. 23 is a screenshot showing horizontal scrolling of the information area.
  • FIG. 24 is a block diagram showing a typical hardware configuration of the information-processing apparatus 11 .
  • FIG. 25 is a block diagram showing different hardware and/or software components that may be used to implement the present invention.
  • FIG. 1 shows a perspective view of a smart phone 1 according to one embodiment of the present invention.
  • the smart phone 1 (or information processing apparatus) includes a base unit 2 that hosts thereon an intermediate body 3 that is covered by a cover body 4. Hosted on the cover body 4 is a display unit 21A that displays the GUI shown in FIG. 3. On the intermediate body 3 are included a set of input keys 20 and 20A for inputting data into the information processing apparatus 1.
  • the information processing apparatus 1 includes a GPS receiver, as well as wireless communication capability for communicating over mobile networks.
  • electrical interfaces are included that allow for the exchange of information from the information processing apparatus 1 to auxiliary devices and/or a network. Example interfaces include USB, HDMI, IEEE 1394, etc.
  • the display unit 21A includes a touch screen that enables a user to have his or her selection recognized by the display unit 21A when touching it with his or her finger or other pointing instrument.
  • the information display apparatus 1 may allow for a user to use a remote pointing device, either wired or wirelessly connected to the information processing apparatus 1 .
  • the remote pointing device enables the user to perform scrolling operations and execution of applications by pointing at and selecting different widgets or information items that would otherwise be selected or scrolled by touching one's finger to the touch panel of display unit 21A.
  • the wireless connection can be made with infrared, or RF remote capability, such as using a Bluetooth interface.
  • a wired remote control head may also be used that would allow the user to hold the remote pointing device in the user's hand when used.
  • the device could be built into a convenient location such as the steering wheel of an automobile.
  • the remote pointing device would allow the user to visually observe the GUI on a display screen, such as a screen mounted on a car's dashboard, but would avoid the need for the user to reach over and make physical contact with the screen in order to input a selection or navigate in the GUI. Because the user can operate the GUI without having to extend his arm and reach the screen on the dashboard, it is much safer to operate the car during normal operations.
  • the information processing apparatus 1 includes a wireless capability, such as Bluetooth, that allows for the pairing of the information processing apparatus 1 to other accessible monitors.
  • the apparatus 1 is capable of providing a Bluetooth connectivity to the display panel such that the GUI of FIG. 3 is visible on the dashboard display panel.
  • Information contained in the information processing apparatus 1, or other network information accessible through it, is made available for display on the dashboard display panel by way of the Bluetooth connection.
  • the present inventors have recognized that this can be a much safer way to operate navigation systems or other computer-based display and input systems when driving one's automobile.
  • FIG. 2 shows another perspective view of the information processing apparatus 1, although the cover body 4 is made to cover the base unit 2. The information processing apparatus 1 in FIG. 2 is shown in a closed position. Because the apparatus 1 has mobile telephone capability, the user is able to use the apparatus as a cellular telephone, as well as for other functions, such as an MP3 player, camera, web browser, etc.
  • FIG. 3 shows a main display region of a GUI displayed on a display, such as an LCD, plasma, CRT, LED or other type of unit capable of visually presenting computer-produced images of text and/or graphics.
  • the main display region may be implemented as a touch panel that allows for human interaction with the main display region.
  • a touch panel is not required and the interaction with the GUI may be implemented through a pointing device for remote operation that does not require touching of the main display region itself.
  • Example pointing devices include a mouse, trackball, jog-dial, touchpad, sensor pad, etc., each adapted to allow a human to interact with the pointing device so as to select and/or scroll/“flick” content displayed on the display.
  • the pointing device need not require tactile input, but rather may also allow for pointing/scrolling/selection to be done by way of eye movement detection by having an integrated or peripheral eye movement detection device.
  • the eye movement detection device is particularly helpful in the context of using the display in an automotive setting, where it is safer for the driver to keep both of his hands on the steering wheel, while visually interacting with the GUI displayed on the display device.
  • the main display region is displayed on a touch panel display and the main display region is separated into a widget display region 20 (sometimes referred to as “sub-display region”), launcher bar 10 (sometimes referred to as “dividing region”), and information area 30 (sometimes referred to as “sub-display region”).
  • the launcher bar 10 separates the widget display area 20 from the information area 30 .
  • the launcher bar is movable through a dragging operation (in this embodiment the user drags his finger from on top of the launcher bar to another location) so as to change the ratio of the widget display region 20 and the information area 30 .
  • the launcher bar region 10 may be moved upwards by the user touching one of the icons 11, 12, 13, 14 or 15, and then dragging the user's finger in an upwards direction.
  • the main display region will then show the launcher bar 10 following the user's finger in an upwards direction so as to reduce the size of the widget display region 20 , and increase the size of the information area 30 .
  • each of the icons 11-15 is described in a different section that follows. However, each of the icons 11-15 represents a different function such that when touched, it displays information associated with the icon, or launches an application associated with that particular icon, such as an e-mail operation.
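The launcher-bar behavior described above (dragging the bar vertically to re-divide the main display between the widget display region 20 and the information area 30) can be sketched roughly as follows. This is an illustrative model only, not code from the patent; the class and method names are hypothetical.

```python
# Hypothetical sketch of a draggable launcher bar that divides one display
# into two sub-display regions. Coordinates are measured from the top.

class SplitDisplay:
    def __init__(self, height_px: int, bar_y: int):
        self.height = height_px   # total height of the main display region
        self.bar_y = bar_y        # current vertical position of the launcher bar

    def drag_launcher_bar(self, new_y: int) -> None:
        # Follow the user's finger, clamped so the bar stays on screen.
        self.bar_y = max(0, min(self.height, new_y))

    @property
    def widget_region_height(self) -> int:
        # The widget display region occupies the area above the bar.
        return self.bar_y

    @property
    def information_area_height(self) -> int:
        # The information area occupies the area below the bar.
        return self.height - self.bar_y
```

Dragging the bar upward shrinks the widget region and enlarges the information area; dragging past either edge pins the bar there, matching the "widget 0/4" bottom-most view described later.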
  • A particular widget 22, which is shown to be a weather graphic, is displayed in combination with other widgets in the widget display region 20.
  • the user has the option for scrolling either vertically or horizontally.
  • the scrolling operation is performed by “flicking” the screen, which is executed by the user dragging his finger across the screen (see e.g., FIGS. 4 and 5 ).
  • the newly displayed information is a same kind of information (e.g., same category of information).
  • the same kind of information may be the next day in a schedule, or the next contact person in a list of contact persons.
  • when the display is flicked either to the right side or to the left side, the display is changed to show a different kind of information.
  • the operator recognizes with the operator's own memory (i.e., what the user remembers or what the user can intuitively imagine will appear) the relationship between scrolling up or down so as to stay within a particular category, or scrolling to the right or left in order to change the type of information.
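The directional semantics described in the preceding bullets (vertical flicks stay within a category; horizontal flicks change the kind of information) might be modeled as below. The categories and items are made-up sample data; the function name is hypothetical.

```python
# Illustrative model of flick navigation: a (category index, item index)
# cursor over categories of information. Vertical flicks step through items
# of the same kind; horizontal flicks switch to a different kind.

categories = {
    "weather": ["yesterday", "today", "tomorrow"],
    "schedule": ["Mon 9:00 standup", "Tue 14:00 review", "Wed 11:00 demo"],
    "contacts": ["Alice", "Bob"],
}
order = list(categories)  # left-to-right order of the categories

def flick(state, direction):
    cat_i, item_i = state
    if direction in ("up", "down"):
        # Same kind of information: next/previous item in this category.
        items = categories[order[cat_i]]
        step = 1 if direction == "up" else -1
        item_i = (item_i + step) % len(items)
    else:
        # Left/right: change to a different kind of information.
        step = 1 if direction == "left" else -1
        cat_i = (cat_i + step) % len(order)
        item_i = 0  # start at the first item of the new category
    return (cat_i, item_i)
```

This mirrors the spatial mental model the patent relies on: the user remembers that up/down stays "inside" a topic while left/right moves "across" topics.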
  • a region 22 is one of a plurality of different widgets that is displayed.
  • a “small” application that does not require a significant amount of memory (such as a listing of movies, music, weather information, stock market information, travel information, navigation information) is displayed in the widget display region 20 .
  • the display information is governed by setting information, such as the number, kind, size, and position of the widgets.
  • a webpage which includes detailed information regarding the widget is displayed.
  • the apparatus that displays the GUI of FIG. 3 is able to read the time setting information (e.g., a previously saved time, set by a user, or set as a default value based on a time zone that the unit can recognize based on position information from a built-in, or remote, GPS function) and compare it with the present time and/or position. The apparatus is then able to decide which widget should be displayed automatically. As an example, early in the morning, the user may choose to have weather information, time information, and traffic information displayed.
  • the information may switch to something that is more useful for that particular user at that particular time of day, such as access to a word processing program, spreadsheet program, e-mail, etc.
  • the user may choose to have displayed topical information such as sports information, etc., that is of particular interest to that user.
  • the underlying device that hosts the GUI includes a GPS data reception function.
  • the apparatus having the display may include the GPS receiver or the location information may be provided remotely, or input via user entry into the device. Nevertheless, by understanding its present position, the device may connect to a network, either wired or wirelessly, to obtain information related to that particular location. Such information may be traffic information, weather information, time information including time zone, etc.
  • the information area 30 which in FIG. 3 is displayed underneath the launcher bar, includes various information such as contact list, mailbox, the individual's schedule, or an RSS reader, or a menu selection display for selecting other types of displays.
  • the selection of different icons is made by clicking or touching one of the icons (or tabs) displayed on the launcher bar or flicking the display to the right or left so as to obtain other icons for execution.
  • the display information is scrolled in units of the widget. Therefore, if in the scrolling operation, only a portion of the widget is displayed, the system will continue to move that widget until either all of it or none of it is displayed. Further, the user may slide (or drag) the launcher bar so as to display information that is scrolled in units of the widget.
  • the launcher bar 10 moves up automatically and shows a greater portion of the information area 30 so that the mail may be displayed. The user will then be able to recognize the abstract or subject line of the mail, thus allowing the user to recognize the existence of such mail as soon as possible.
  • the user may also stop the changing of the display information in the middle of scrolling to the left or right, in which case the apparatus automatically displays the larger (more fully shown) information.
  • FIG. 4 shows a related series of five figures.
  • the brightly colored portion of each figure shows the actual display area of the GUI.
  • the display shows the uppermost portion of the widget display region 20, and none of the sub-display region (information area 30).
  • As the user touches the launcher bar 10 and drags his or her finger in a downward direction, a portion of the widgets of the widget display region 20 is reduced in size, while the portion of the display that is allocated for the information area 30 is increased.
  • the uppermost widget showing the traffic information is then removed from sight, and a greater portion of the information area 30 is shown.
  • Widget 1/4 view shows the situation where the user continues to drag his or her finger in a downward direction so that another of the widgets in the widget display region 20 is removed.
  • the widget 0/4 view shows a bottom-most position of the GUI, such that none of the widget display region 20 is shown above the launcher bar 10 .
  • the user nevertheless will be able to remember the previous locations of the widgets in the widget display region 20. Therefore, it is relatively straightforward and easy for the user to remember how to navigate back to the widget display region 20, and to the particular widgets that were of interest to that user.
  • FIG. 5 is similar to FIG. 4 , although it shows how scrolling can be performed horizontally by flicking ones finger across the display region of the GUI.
  • In FIG. 5, five different panels are present, with only the middle panel being shown in bright colors.
  • An indicator is shown to illustrate which of pages 1-5 is presented while flicking-scrolling horizontally. The indicator highlights the current widget panels so the user has an intuitive feel for where the present panel is located relative to the other panels.
  • In the second figure in the series, the page 5 panel begins to be shown as the user moves his or her finger towards the right portion of FIG. 5.
  • the full portion of panel 5 is shown.
  • the indicator turns off and is no longer displayed.
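The page indicator described above (highlighting which of the five horizontally scrollable panels is currently shown) could be rendered with something as simple as the following sketch; the function name and the dot glyphs are illustrative choices, not from the patent.

```python
# Hypothetical rendering of a paging indicator: a filled circle marks the
# currently displayed panel, hollow circles mark the other panels.

def page_indicator(current: int, total: int) -> str:
    return "".join("●" if i == current else "○" for i in range(total))
```

During flick-scrolling the UI would redraw this string for the new current panel, and hide it once scrolling stops, as the patent describes.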
  • the panel includes a plurality of widgets relating to a single category.
  • FIG. 6 is a flowchart showing how a user interacts with the GUI of FIG. 3 .
  • the process starts in step S1, where an inquiry is made regarding whether the device receives user input to slide the launcher bar. If the response to the inquiry in step S1 is negative, the process proceeds to step S3. If affirmative, the process proceeds to step S2, where, based on the user input, the launcher bar is moved so as to change the display ratio of the widget display region 20 and the information area 30. The process then proceeds to step S3, where another inquiry is made regarding whether user input is received for an icon displayed on the launcher bar. If negative, the process returns to step S1. If affirmative, the process proceeds to step S4, where the displayed information in the information area 30 is changed based on the input received from the user. The process then returns to step S1, unless power is turned off, at which point the process ends.
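The FIG. 6 loop (poll for a launcher-bar slide, then for an icon tap, until power-off) can be sketched as a simple event loop. This is one possible reading of the flowchart; the event dictionaries and field names are hypothetical.

```python
# Illustrative event loop for the FIG. 6 flow: S1 (slide launcher bar?),
# S2 (move bar / change region ratio), S3 (icon tapped?), S4 (change info
# area), repeating until power-off.

def run_ui_loop(events, display):
    for ev in events:                              # one pass per input event
        if ev["type"] == "power_off":
            break                                  # process ends
        if ev["type"] == "slide_launcher_bar":     # S1 affirmative -> S2
            display["ratio"] = ev["ratio"]         # new widget/info split
        elif ev["type"] == "tap_launcher_icon":    # S3 affirmative -> S4
            display["info_area"] = ev["icon"]      # show icon's information
    return display
```

Events after power-off are ignored, matching the flowchart's termination condition.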
  • FIG. 7 shows a flowchart illustrating the process of scrolling widget region 20 and information area 30 and selecting applications for apparatus 11 .
  • apparatus 11 determines if an icon on the launcher bar 10 is selected. If so, apparatus 11 displays the corresponding display information in information area 30 in step S5, and then proceeds to step S6. If not, apparatus 11 skips to step S6, where it determines if the user wishes to vertically scroll one of widget region 20 or information area 30. If so, apparatus 11 changes the display corresponding to widget region 20 or information area 30 in step S7 and then proceeds to step S8. If not, apparatus 11 skips to step S8, where it determines if the user wishes to horizontally scroll one of widget region 20 or information area 30.
  • If so, apparatus 11 changes the display corresponding to widget region 20 or information area 30 in step S9 and then proceeds to step S10. If not, apparatus 11 skips to step S10, where it determines if the user has selected a displayed icon. If so, apparatus 11 launches the corresponding application in step S11 and ends the process. If not, apparatus 11 returns to step S4.
  • FIG. 8 shows a process for displaying start up information for apparatus 11 .
  • apparatus 11 determines if it is recovering from sleep mode or has been turned on. If not, it returns to step S12. If so, apparatus 11 proceeds to step S13, where it obtains the current time and compares it with a time in the setting information.
  • Apparatus 11 determines in step S14 if the current time is within the setting time range. If so, apparatus 11 displays corresponding display information in widget area 20 in step S15, and ends the process. If not, apparatus 11 obtains current position information and compares it with the position in the setting information in step S16.
  • Apparatus 11 determines in step S17 if the current position is within the setting position range. If so, apparatus 11 displays corresponding display information in widget area 20 in step S18, and ends the process. If not, apparatus 11 displays a default widget in step S19 and ends the process.
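The FIG. 8 start-up flow (check the time ranges in the setting information first, then the position ranges, then fall back to a default) might look like the sketch below. The settings structure, the representation of a position "region" as a set of place names, and the function name are all illustrative assumptions.

```python
# Hypothetical implementation of the FIG. 8 flow: choose which widgets to
# display at start-up / wake-up by comparing current time and position
# against saved setting information.

def startup_widgets(now_hour, position, settings):
    # S13/S14: is the current time within a configured time range?
    for lo, hi, widgets in settings.get("time_ranges", []):
        if lo <= now_hour < hi:
            return widgets                         # S15: time-based widgets
    # S16/S17: is the current position within a configured region?
    for region, widgets in settings.get("positions", []):
        if position in region:
            return widgets                         # S18: position-based widgets
    return settings["default"]                     # S19: default widget set
```

For example, a setting of 6:00-10:00 could map to the morning set of weather, clock, and traffic widgets mentioned earlier in the description.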
  • FIG. 9 illustrates an embodiment of the present invention including where a main display area is divided into a launcher bar 10 and a widget display area 20 including a plurality of widgets 22 .
  • the content of widget display area 20 can be scrolled horizontally, as shown in FIG. 10 to display additional widgets 22 . Further, the content of widget display area 20 can also be scrolled horizontally, as shown in FIG. 11 , to display a map 24 .
  • Map 24 includes icons 25 which allow zooming, rotating, and vertically or horizontally dragging the map 24 .
  • Launcher bar 10 can be scrolled vertically to reveal information display area 30 .
  • Information display area 30 includes information related to the highlighted icon of launcher bar 10 .
  • FIG. 12 shows information display area 30 containing a plurality of contact information 32 when contact icon 11 of launcher bar 10 is highlighted.
  • a contact's phone number can be dialed when the icon including the phone number is clicked on. Further, detailed information regarding a particular contact is displayed when that contact is clicked on, as shown in FIG. 13 .
  • FIG. 14 shows a plurality of the e-mails 34 displayed in the information display area 30 when the e-mail icon 12 of launcher bar 10 is highlighted. The content of each e-mail message is displayed when the e-mail is clicked on, as shown in FIG. 15 .
  • FIG. 16 shows an appointment schedule 36 when clock icon 13 of launcher bar 10 is highlighted. Detailed information regarding a particular appointment is displayed when that appointment is clicked on, as shown in FIG. 17 .
  • FIG. 18 shows a plurality of webpage thumbnails 38 when globe icon 14 of launcher bar 10 is highlighted.
  • the corresponding webpage is displayed when one of the thumbnails 38 is clicked on, as shown in FIG. 19 .
  • FIG. 20 shows a plurality of applications 40 displayed in information display area 30 when application icon 15 of launcher bar 10 is highlighted. Any of the applications 40 can be launched when the corresponding icon is clicked on.
  • FIG. 9 shows launcher bar 10 at the bottom of the screen such that widget display area 20 takes up most of the main display area.
  • FIG. 12 shows launcher bar 10 at the top of the screen such that information display area 30 takes up most of the main display area.
  • launcher bar 10 may be located at any vertical position in the main display area.
  • FIG. 21 shows launcher bar 10 near the middle of the main display area with a widget display area 20 above launcher bar 10 and an information display area 30 below launcher bar 10 .
  • information display area 30 and widget display area 20 may be independently scrolled.
  • FIG. 22 shows widget display area 20 being horizontally scrolled independent of launcher bar 10 and information display area 30 .
  • FIG. 23 shows information display area 30 being horizontally scrolled independent of launcher bar 10 and widget display area 20.
  • It is possible to show a half-displayed widget, as shown in FIG. 22.
  • One embodiment moves launcher bar 10 automatically to show an entire widget when the user stops dragging at the middle of the widget. The display control unit therefore determines whether or not the half-displayed widget should be displayed and, based on the displayed ratio of the half-displayed widget, moves launcher bar 10 automatically to show the entire widget.
  • FIGS. 9 and 23 do not show half-displayed widgets.
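The snap behavior described above can be sketched as follows. This is an illustrative reconstruction, not code from the patent; names such as `WIDGET_HEIGHT` and `SNAP_THRESHOLD` are assumptions, and the "displayed ratio" test mirrors the decision attributed to the display control unit.

```python
# Hypothetical sketch: snap the launcher bar so that no widget row is left
# half-displayed after the user stops dragging. Assumed values, not from the
# patent text.

WIDGET_HEIGHT = 120      # height of one widget row, in pixels (assumed)
SNAP_THRESHOLD = 0.5     # show the whole widget if at least half is visible

def snap_launcher_bar(bar_y: int) -> int:
    """Return an adjusted launcher-bar position that avoids half-displayed widgets.

    bar_y is the dragged position, measured in pixels from the top of the
    widget display area; widgets are stacked in rows of WIDGET_HEIGHT.
    """
    visible = bar_y % WIDGET_HEIGHT      # visible portion of the partial row
    ratio = visible / WIDGET_HEIGHT      # displayed ratio of the half widget
    if ratio == 0:
        return bar_y                     # already on a widget boundary
    if ratio >= SNAP_THRESHOLD:
        # Mostly visible: move the bar down to reveal the entire row.
        return bar_y + (WIDGET_HEIGHT - visible)
    # Mostly hidden: move the bar up to hide the partial row entirely.
    return bar_y - visible
```

With these assumed values, a drag that ends with half a widget row showing expands to the full row, while a drag that leaves only a sliver showing hides the row.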
  • FIG. 24 illustrates a computer system 1201 upon which an embodiment of the present invention may be implemented.
  • the computer system 1201 includes a bus 1202 or other communication mechanism for communicating information, and a processor 1203 coupled with the bus 1202 for processing the information.
  • the computer system 1201 also includes a main memory 1204 , such as a random access memory (RAM) or other dynamic storage device (e.g., dynamic RAM (DRAM), static RAM (SRAM), and synchronous DRAM (SDRAM)), coupled to the bus 1202 for storing information and instructions to be executed by processor 1203 .
  • the main memory 1204 may be used for storing temporary variables or other intermediate information during the execution of instructions by the processor 1203 .
  • the computer system 1201 further includes a read only memory (ROM) 1205 or other static storage device (e.g., programmable ROM (PROM), erasable PROM (EPROM), and electrically erasable PROM (EEPROM)) coupled to the bus 1202 for storing static information and instructions for the processor 1203 .
  • the computer system 1201 also includes a disk controller 1206 coupled to the bus 1202 to control one or more storage devices for storing information and instructions, such as a magnetic hard disk 1207 , and a removable media drive 1208 (e.g., floppy disk drive, read-only compact disc drive, read/write compact disc drive, compact disc jukebox, tape drive, and removable magneto-optical drive).
  • the storage devices may be added to the computer system 1201 using an appropriate device interface (e.g., small computer system interface (SCSI), integrated device electronics (IDE), enhanced-IDE (E-IDE), direct memory access (DMA), or ultra-DMA).
  • the computer system 1201 may also include special purpose logic devices (e.g., application specific integrated circuits (ASICs)) or configurable logic devices (e.g., simple programmable logic devices (SPLDs), complex programmable logic devices (CPLDs), and field programmable gate arrays (FPGAs)).
  • The computer system 1201 may also include a display controller 1209 coupled to the bus 1202 to control a display 1210, such as an LCD or plasma display, for displaying information to a computer user.
  • the computer system includes input devices, such as a keyboard 1211 and a pointing device 1212 , for interacting with a computer user and providing information to the processor 1203 .
  • the pointing device 1212 for example, may be a mouse, a trackball, or a pointing stick for communicating direction information and command selections to the processor 1203 and for controlling cursor movement on the display 1210 .
  • When the remote pointing device is used, the apparatus 1 generates a pointer overlaid on the GUI, so the user knows the location of the pointer when choosing either to select an item or to “flick” the display to cause a scrolling operation.
  • a printer may provide printed listings of data stored and/or generated by the computer system 1201 .
  • the computer system 1201 performs a portion or all of the processing steps of the invention in response to the processor 1203 executing one or more sequences of one or more instructions contained in a memory, such as the main memory 1204 .
  • Such instructions may be read into the main memory 1204 from another computer readable medium, such as a hard disk 1207 or a removable media drive 1208 .
  • processors in a multi-processing arrangement may also be employed to execute the sequences of instructions contained in main memory 1204 .
  • hard-wired circuitry may be used in place of or in combination with software instructions. Thus, embodiments are not limited to any specific combination of hardware circuitry and software.
  • the computer system 1201 includes at least one computer readable medium or memory for holding instructions programmed according to the teachings of the invention and for containing data structures, tables, records, or other data described herein.
  • Examples of computer readable media are hard disks, floppy disks, tape, magneto-optical disks, or any other magnetic medium; PROMs (EPROM, EEPROM, flash EPROM), DRAM, SRAM, and SDRAM; compact discs (e.g., CD-ROM) or any other optical medium; punch cards, paper tape, or other physical medium with patterns of holes; a carrier wave (described below); or any other medium from which a computer can read.
  • the present invention includes software for controlling the computer system 1201 , for driving a device or devices for implementing the invention, and for enabling the computer system 1201 to interact with a human user (e.g., print production personnel).
  • software may include, but is not limited to, device drivers, operating systems, development tools, and applications software.
  • Such computer readable media further includes the computer program product of the present invention for performing all or a portion (if processing is distributed) of the processing performed in implementing the invention.
  • the computer code devices of the present invention may be any interpretable or executable code mechanism, including but not limited to scripts, interpretable programs, dynamic link libraries (DLLs), Java classes, and complete executable programs. Moreover, parts of the processing of the present invention may be distributed for better performance, reliability, and/or cost.
  • Non-volatile media includes, for example, optical disks, magnetic disks, and magneto-optical disks, such as the hard disk 1207 or the removable media drive 1208.
  • Volatile media includes dynamic memory, such as the main memory 1204 .
  • Transmission media includes coaxial cables, copper wire and fiber optics, including the wires that make up the bus 1202. Transmission media may also take the form of acoustic or light waves, such as those generated during radio wave and infrared data communications.
  • Various forms of computer readable media may be involved in carrying out one or more sequences of one or more instructions to processor 1203 for execution.
  • the instructions may initially be carried on a magnetic disk of a remote computer.
  • the remote computer can load the instructions for implementing all or a portion of the present invention remotely into a dynamic memory and send the instructions over a telephone line using a modem.
  • a modem local to the computer system 1201 may receive the data on the telephone line and use an infrared transmitter to convert the data to an infrared signal.
  • An infrared detector coupled to the bus 1202 can receive the data carried in the infrared signal and place the data on the bus 1202 .
  • the bus 1202 carries the data to the main memory 1204 , from which the processor 1203 retrieves and executes the instructions.
  • the instructions received by the main memory 1204 may optionally be stored on storage device 1207 or 1208 either before or after execution by processor 1203 .
  • the computer system 1201 also includes a communication interface 1213 coupled to the bus 1202 .
  • the communication interface 1213 provides a two-way data communication coupling to a network link 1214 that is connected to, for example, a local area network (LAN) 1215 , or to another communications network 1216 such as the Internet.
  • the communication interface 1213 may be a network interface card to attach to any packet switched LAN.
  • the communication interface 1213 may be an asymmetrical digital subscriber line (ADSL) card, an integrated services digital network (ISDN) card or a modem to provide a data communication connection to a corresponding type of communications line.
  • Wireless links may also be implemented.
  • the communication interface 1213 sends and receives electrical, electromagnetic or optical signals that carry digital data streams representing various types of information.
  • the network link 1214 typically provides data communication through one or more networks to other data devices.
  • the network link 1214 may provide a connection to another computer through a local network 1215 (e.g., a LAN) or through equipment operated by a service provider, which provides communication services through a communications network 1216 .
  • The local network 1215 and the communications network 1216 use, for example, electrical, electromagnetic, or optical signals that carry digital data streams, and the associated physical layer (e.g., CAT 5 cable, coaxial cable, optical fiber, etc.).
  • The signals through the various networks and the signals on the network link 1214 and through the communication interface 1213, which carry the digital data to and from the computer system 1201, may be implemented in baseband signals or carrier-wave-based signals.
  • the baseband signals convey the digital data as unmodulated electrical pulses that are descriptive of a stream of digital data bits, where the term “bits” is to be construed broadly to mean symbol, where each symbol conveys at least one or more information bits.
  • the digital data may also be used to modulate a carrier wave, such as with amplitude, phase and/or frequency shift keyed signals that are propagated over a conductive media, or transmitted as electromagnetic waves through a propagation medium.
  • the digital data may be sent as unmodulated baseband data through a “wired” communication channel and/or sent within a predetermined frequency band, different than baseband, by modulating a carrier wave.
  • the computer system 1201 can transmit and receive data, including program code, through the network(s) 1215 and 1216 , the network link 1214 and the communication interface 1213 .
  • The network link 1214 may provide a connection through a LAN 1215 to a mobile device 1217 such as a personal digital assistant (PDA), laptop computer, or cellular telephone.
  • FIG. 25 is a block diagram showing a typical hardware configuration of the information-processing apparatus 11 .
  • a CPU 101 serving as a control nucleus is connected to a control unit 102 through an FSB (Front Side Bus).
  • The control unit 102, other control units, and other devices form the processing unit 3 described above.
  • the other control units and other devices will be described later.
  • the control unit 102 is a component for executing control of a main memory 103 and graphics functions.
  • The control unit 102 mainly plays a role of processing a large amount of data at a high speed. In AT-compatible architecture, the control unit 102 is referred to as a north bridge.
  • the control unit 102 is connected to the CPU 101 , the main memory 103 , a control unit 104 and a graphic display unit 105 such as a liquid-crystal display device.
  • The control unit 104 is a component mainly for controlling elements such as control devices provided for a user interface and for controlling bus links of devices. In AT-compatible architecture, the control unit 104 is referred to as a south bridge. The control unit 104 plays the role of a bridge between a PCI (Peripheral Component Interconnect) bus and a low-speed bus such as an ISA (Industry Standard Architecture) bus.
  • the control unit 104 has the functions of controllers such as an ISA controller and an IDE (Integrated Drive Electronics) controller.
  • the PCI bus is connected to a radio LAN (or a W-LAN) serving as a radio communication device 106 and a device 107 for connection with and controlling an external memory and an external apparatus.
  • As the external memory, a semiconductor memory device can be employed.
  • the device 107 is provided with a control device 108 for reading out and writing data from and into a stick-shaped storage medium and a control device 109 for reading out and writing data from and into a card-shaped storage medium.
  • the device 107 has the function of a connection interface with an external apparatus.
  • An example of the connection interface is an interface conforming to IEEE 1394 defining specifications of hardware for adding a serial device to a computer.
  • The control unit 104 is connected to a LAN (Local Area Network) connection device 110 and a USB (Universal Serial Bus) port connected to the touch panel 111 to detect user operation.
  • The CPU 101 receives a signal of a user operation from the touch panel 111 and determines, for example, whether the user operation is to move the launcher bar or to press an icon on it. If the CPU 101 determines that the user operation is to move the launcher bar, then the CPU 101 changes the display, such as the ratio of widget display region 20 to information display region 30, or displays corresponding information in the information display area based on the user operation. Furthermore, the CPU 101 determines whether vertical scrolling or horizontal scrolling is selected, or compares the current time or current position information with time or position information in the setting information, based on the program stored in storage unit 116. These processes are described hereafter.
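The dispatch logic attributed to CPU 101 above might be sketched as follows. The event kinds, class names, and pixel values here are illustrative assumptions, not the patent's actual implementation: moving the launcher bar re-divides the main display region, while pressing an icon changes what the information area shows.

```python
# Hedged sketch (assumed names) of touch-event dispatch: a bar drag changes
# the widget/information region ratio; an icon press selects the category
# shown in the information area.

from dataclasses import dataclass

@dataclass
class TouchEvent:
    kind: str      # "drag_bar" or "press_icon" (assumed event kinds)
    y: int = 0     # drag target position, pixels from the top of the display
    icon: str = "" # which launcher-bar icon was pressed

@dataclass
class Display:
    height: int = 480          # assumed main display height, in pixels
    bar_y: int = 360           # launcher bar position
    info_content: str = "none" # category shown in the information area

    def handle(self, ev: TouchEvent) -> None:
        if ev.kind == "drag_bar":
            # Moving the bar re-divides the main display region; clamp to screen.
            self.bar_y = max(0, min(self.height, ev.y))
        elif ev.kind == "press_icon":
            # Pressing an icon shows the matching category in the info area.
            self.info_content = ev.icon

    @property
    def widget_ratio(self) -> float:
        """Fraction of the main display occupied by the widget region."""
        return self.bar_y / self.height
```

For example, dragging the bar to mid-screen leaves half the display for widgets, and a subsequent icon press switches the information area without moving the bar.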
  • An auxiliary storage unit 112 is a drive for driving a disk such as a magnetic or optical disk.
  • the auxiliary storage unit 112 is a drive for driving a large-capacity storage medium such as a hard disk.
  • the auxiliary storage unit 112 is connected to the control unit 104 , which serves as an internal IDE controller.
  • An audio codec 113 connected to the control unit 104 is a component for outputting an audio signal obtained as a result of a digital-analog conversion process to a component such as a speaker 114 or headphones 115.
  • the audio signal represents a voice or a sound.
  • The audio codec 113 also carries out a process to convert audio input data into digital form.
  • a storage unit 116 is a memory for storing a control program for driving a computer.
  • the storage unit 116 is connected to the control unit 104 and a control unit 117 by using an LPC (Low Pin Count) bus or the like.
  • the control unit 117 is a general-purpose unit for controlling a variety of signals. As the control unit 117 , for example, an EC (Embedded Controller) is employed. The control unit 117 also controls the power supply of the information-processing apparatus 11 and additional functions of the information-processing apparatus 11 . In the case of a portable information-processing apparatus, the control unit 117 is a microcomputer. It is to be noted that, by modifying a control program stored in the storage unit 116 , the method for controlling the computer can be changed.
  • An operation section 118 including the operation element 17 provided on the main body of the information-processing apparatus 11 outputs a signal to the control unit 117 .
  • As a connection section 119 for connecting an external apparatus to the information-processing apparatus 11, a USB connector is provided on the main body of the information-processing apparatus 11.
  • the USB connector 119 is also connected to the control unit 104 .
  • a power-supply section not shown in the figure receives a commercial power-supply voltage from an AC adaptor.
  • the information-processing apparatus 11 may be powered by a battery pack serving as DC power supply.
  • the battery pack includes secondary batteries or fuel batteries.

Abstract

An information processing apparatus includes a display unit and a control unit. The display unit is configured to display a dividing region dividing a main display region into two sub regions. The control unit is configured to control the display unit to display a plurality of icons in the dividing region, to change position of the dividing region in the main display region based on user input, and to display, when an icon of the plurality of icons is selected, information corresponding to the icon in at least one of the sub regions.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application claims priority under 35 U.S.C. §119(e) to U.S. Provisional Application No. 60/991,013, filed Nov. 29, 2007, the entire contents of which are incorporated herein by reference.
  • FIELD OF THE INVENTION
  • The present invention relates to fixed or portable computing devices including graphical user interfaces. More particularly, the present invention relates to graphical user interfaces involving sub-dividable display regions that enable a user to navigate independently within the sub-dividable display regions.
  • BACKGROUND OF THE INVENTION
  • Computer systems conventionally use a graphical user interface (GUI) that allows for the display of various types of information. Some systems, such as MICROSOFT WINDOWS, use a desktop metaphor on which different icons are arranged, allowing a user to “point and click” on different icons to execute an application.
  • Example applications include word processing programs, web browsers, etc. The various particular icons may be selected by a user through input from various computer input devices.
  • Some graphical user interfaces include touch panel displays that allow for the execution of certain applications, as a substitute for using a peripheral or built-in pointing device.
  • SUMMARY OF THE INVENTION
  • The present inventors recognized that conventional GUIs do not allow for the division of the display region into separately “scrollable” display regions. The present inventors recognized that there is a logical connection in how information is presented to a user in a spatial context. However, conventional graphical user interfaces fail to take advantage of a human's memory to maintain a mental persistence of previously viewed material as it relates to presently viewed material.
  • In a non-limiting example, the present invention allows for the division of a display area into multiple subdisplay regions by use of a launcher bar. Within one of the subregions, either above or below the launcher bar, additional information is displayed in logically arranged categories. Some categories of information, such as weather information, are presented in the subdisplay region; when the user scrolls the subdisplay region to one side or the other by “flicking” the screen, the user is presented with the weather forecast for the next day, or the weather from past days.
  • The graphical user interface of the present invention may be implemented in any one of a number of different computer-based devices. In one example, the graphical user interface is implemented on a SmartPhone that includes wireless connectivity features. However, the invention is not so limited, as it may also be implemented in a number of other wired and wireless computer-based applications, including desktop or mobile computers, navigation systems, menu panels and touchscreens installed in automobiles, etc.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a perspective view of a SmartPhone embodiment of a computing device that uses a GUI according to the present invention.
  • FIG. 2 shows another perspective view of the information processing apparatus of FIG. 1.
  • FIG. 3 is a graphical user interface that helps illustrate the different terminology used herein.
  • FIG. 4 shows a succession of different views for how scrolling may be achieved through the “flick” of one's finger across a screen in a vertical direction.
  • FIG. 5 shows another succession of views showing how the screen may be scrolled horizontally through a flicking of the screen.
  • FIG. 6 is a flowchart showing a process flow of how either a vertical or horizontal scrolling operation is performed according to the present invention.
  • FIG. 7 is a flow chart showing the process of scrolling widget region and information area.
  • FIG. 8 is a flowchart showing the process of displaying start up information.
  • FIG. 9 is a screenshot showing a launcher bar and a widget display area of the present invention.
  • FIG. 10 is a screenshot showing horizontally scrolled widgets of the present invention.
  • FIG. 11 is a screenshot showing a map display.
  • FIG. 12 is a screenshot showing a display of multiple contacts.
  • FIG. 13 is a screenshot showing detailed information on a single contact.
  • FIG. 14 is a screenshot showing a display of multiple e-mails.
  • FIG. 15 is a screenshot showing a single e-mail.
  • FIG. 16 is a screenshot showing an appointment schedule.
  • FIG. 17 is a screenshot showing detailed information on a single appointment.
  • FIG. 18 is a screenshot showing multiple webpage thumbnails.
  • FIG. 19 is a screenshot showing detailed information on a single webpage.
  • FIG. 20 is a screenshot showing a plurality of applications.
  • FIG. 21 is a screenshot showing the launcher bar near the middle of the display.
  • FIG. 22 is a screenshot showing horizontal scrolling of the widgets.
  • FIG. 23 is a screenshot showing horizontal scrolling of the information area.
  • FIG. 24 is a block diagram showing different hardware and/or software components that may be used to implement the present invention.
  • FIG. 25 is a block diagram showing a typical hardware configuration of the information-processing apparatus 11.
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • Referring now to the drawings, wherein like reference numerals designate identical or corresponding parts throughout the several views.
  • FIG. 1 shows a perspective view of a smart phone 1 according to one embodiment of the present invention. The smart phone 1 (or information processing apparatus) includes a base unit 2 that hosts thereon an intermediate body 3 that is covered by a cover body 4. Hosted on the cover body 4 is a display unit 21A that displays the GUI shown in FIG. 3. On the intermediate body 3 are included a set of input keys 20 and 20A for inputting data into the information processing apparatus 1. The information processing apparatus 1 includes a GPS receiver, as well as wireless communication capability for communicating over mobile networks. Although not shown, electrical interfaces are included that allow for the exchange of information from the information processing apparatus 1 to auxiliary devices and/or a network. Example interfaces include USB, HDMI, IEEE 1394, etc.
  • The display unit 21A includes a touch screen that enables a user to have his or her selection recognized by the display unit 21A when touching the display unit 21A with his or her finger or other pointing instrument. Alternatively, the information display apparatus 1 may allow a user to use a remote pointing device, either wired or wirelessly connected to the information processing apparatus 1. The remote pointing device enables the user to perform scrolling operations and execution of applications by pointing at and selecting different widgets or information items that would otherwise be selected or scrolled by touching one's finger on the touch panel of display unit 21A.
  • The wireless connection can be made with infrared or RF remote capability, such as using a Bluetooth interface. A wired remote control head may also be used that would allow the user to hold the remote pointing device in the user's hand when used. Alternatively, the device could be built into a convenient location such as the steering wheel of an automobile. The remote pointing device would allow the user to visually observe the GUI on a display screen, such as a screen mounted on a car's dashboard, but would avoid the need for the user to reach over and make physical contact with the screen in order to input a selection or navigate in the GUI. Because the user can operate the GUI without having to extend his or her arm and reach the screen on the dashboard, it is much safer to operate the car during normal operations.
  • The information processing apparatus 1 includes a wireless capability, such as Bluetooth, that allows for the pairing of the information processing apparatus 1 to other accessible monitors. For example, in the context of an automobile, the apparatus 1 is capable of providing Bluetooth connectivity to the display panel such that the GUI of FIG. 3 is visible on the dashboard display panel. Information contained in the information processing apparatus 1, or the accessibility to other network information through the information processing apparatus 1, is made available for display on the dashboard display panel by way of the Bluetooth connection. The present inventors have recognized that this can be a much safer way to operate different navigation systems or other computer-based display and input systems when driving one's automobile.
  • FIG. 2 shows another perspective view of the information processing apparatus 1, although the cover body 4 is made to cover the base unit 2. Moreover, the information processing apparatus 1 in FIG. 2 is shown to be in a closed position. Because the apparatus 1 has a mobile telephone capability, the user is able to use the apparatus as a cellular telephone, as well as for other functions, such as an MP3 player, camera, web browser, etc.
  • FIG. 3 shows a main display region of a GUI displayed on a display, such as an LCD, plasma, CRT, LED or other type of unit capable of visually presenting computer-produced images of text and/or graphics. The main display region may be implemented as a touch panel that allows for human interaction with the main display region. However, in another embodiment, a touch panel is not required and the interaction with the GUI may be implemented through a pointing device for remote operation that does not require touching of the main display region itself. Example pointing devices include a mouse, trackball, jog-dial, touchpad, sensor pad, etc., each adapted to allow a human to interact with the pointing device so as to select and/or scroll/“flick” content displayed on the display. The pointing device need not require tactile input, but rather may also allow for pointing/scrolling/selection to be done by way of eye movement detection by having an integrated or peripheral eye movement detection device. The eye movement detection device is particularly helpful in the context of using the display in an automotive setting, where it is safer for the driver to keep both hands on the steering wheel while visually interacting with the GUI displayed on the display device.
  • However, in the present example of FIG. 3, the main display region is displayed on a touch panel display and is separated into a widget display region 20 (sometimes referred to as “sub-display region”), launcher bar 10 (sometimes referred to as “dividing region”), and information area 30 (sometimes referred to as “sub-display region”). Within the main display region, the launcher bar 10 separates the widget display area 20 from the information area 30. The launcher bar is movable through a dragging operation (in this embodiment, the user drags his or her finger from on top of the launcher bar to another location) so as to change the ratio of the widget display region 20 and the information area 30. In particular, in the present touch screen embodiment, the launcher bar region 10 may be moved upwards by the user touching one of the icons 10, 11, 12, 13, 14 or 15, and then dragging the user's finger in an upwards direction. The main display region will then show the launcher bar 10 following the user's finger in an upwards direction so as to reduce the size of the widget display region 20 and increase the size of the information area 30.
  • Each of the icons 10-15 is described in a different section that follows. However, each of the icons 10-15 represents a different function such that, when touched, it displays information associated with the icon or launches an application associated with that particular icon, such as an e-mail operation. In the widget display region 20, a particular widget 22, which is shown to be a weather graphic, is displayed in combination with other widgets in the widget display region 20.
  • Within the widget display region 20 and/or the information area 30, the user has the option of scrolling either vertically or horizontally. The scrolling operation is performed by “flicking” the screen, which is executed by the user dragging his or her finger across the screen (see, e.g., FIGS. 4 and 5). For example, when the user places his or her finger on the widget 22 and then flicks in an upward or downward direction within either the widget display region 20 or the information area 30, the newly displayed information is a same kind of information (e.g., same category of information). In one example, the same kind of information is a next day in a schedule, or a next contact person of a list of contact persons. On the other hand, when the display is flicked either to the right side or to the left side, the display is changed to show a different kind of information. In this way, the operator recognizes with the operator's own memory (i.e., what the user remembers or what the user can intuitively imagine will appear) the relationship between scrolling up or down so as to stay within a particular category, and scrolling to the right or left in order to change the type of information.
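The directional semantics described above — vertical flicks stay within a category, horizontal flicks switch categories — can be modeled as navigation over a two-dimensional arrangement of content. The following is a minimal sketch under assumed data (the category names and items are illustrative, not from the patent):

```python
# Hypothetical model: each category is a column of same-kind items; a vertical
# flick moves within the column, a horizontal flick moves between columns.

categories = {
    "weather": ["today", "tomorrow", "day after"],  # same kind of information
    "contacts": ["Alice", "Bob"],                   # a different kind
}
order = list(categories)  # left-to-right category order (assumed)

def flick(cat: str, idx: int, direction: str):
    """Return the (category, index) shown after a flick in the given direction."""
    if direction in ("up", "down"):
        # Vertical flick: next/previous item of the SAME category, wrapping.
        step = 1 if direction == "up" else -1
        idx = (idx + step) % len(categories[cat])
    else:
        # Horizontal flick: switch to a DIFFERENT category, reset to its top.
        step = 1 if direction == "left" else -1
        cat = order[(order.index(cat) + step) % len(order)]
        idx = 0
    return cat, idx
```

Flicking up on today's weather shows tomorrow's weather, while flicking left leaves the weather category entirely and shows the first contact.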
  • For example, with respect to the widget display region 20, a region 22 is one of a plurality of different widgets that is displayed. When displayed as a widget, typically a “small” application that does not require a significant amount of memory (such as a listing of movies, music, weather information, stock market information, travel information, or navigation information) is displayed in the widget display region 20. The information is displayed according to setting information such as the number, kind, size, and position of the widget. Further, in one embodiment, when a widget is selected, a webpage which includes detailed information regarding the widget is displayed.
  • There are several types of setting information, mostly related to time and/or position. The apparatus that displays the GUI of FIG. 3 is able to read the time setting information (e.g., a previously saved time, set by a user, or set as a default value based on a time zone that the unit recognizes from position information provided by a built-in or remote GPS function) and compare that setting information with the present time and/or position. The apparatus is then able to decide automatically which widget should be displayed. As an example, early in the morning the user may choose to have weather information, time information, and traffic information displayed. However, at a later time during the day, the information may switch to something more useful for that particular user at that particular time of day, such as access to a word processing program, spreadsheet program, e-mail, etc. In the evening, the user may choose to have topical information displayed, such as sports information, that is of particular interest to that user.
  • In one embodiment, the underlying device that hosts the GUI includes a GPS data reception function. As discussed above, the apparatus having the display may include the GPS receiver, or the location information may be provided remotely or input via user entry into the device. In any case, by knowing its present position, the device may connect to a network, either wired or wirelessly, to obtain information related to that particular location. Such information may be traffic information, weather information, time information including time zone, etc.
  • The information area 30, which in FIG. 3 is displayed underneath the launcher bar, includes various information such as a contact list, a mailbox, the individual's schedule, an RSS reader, or a menu selection display for selecting other types of displays. The selection of different icons is made by clicking or touching one of the icons (or tabs) displayed on the launcher bar, or by flicking the display to the right or left so as to obtain other icons for execution.
  • When the user scrolls either up or down inside the upper region, the display information is scrolled in units of the widget. Therefore, if only a portion of a widget is displayed during the scrolling operation, the system will continue to move that widget until either all of it or none of it is displayed. Further, the user may slide (or drag) the launcher bar so as to display information that is scrolled in units of the widget.
  • In one example, when a user gets new mail, the launcher bar 10 moves up automatically and shows a greater portion of the information area 30 so that the mail may be displayed. The user will then be able to recognize the abstract or subject line of the mail, thus allowing the user to recognize the existence of such mail as soon as possible.
  • The user may also stop the changing of the display information in the middle of scrolling to the left or right, in which case the apparatus automatically displays the larger portion of the information.
  • FIG. 4 shows a related series of five figures. The brightly colored portion of each figure shows the actual display area of the GUI. In the widget 4/4 view, the display shows the uppermost portion of the widget display region 20 and none of the sub-display region (information area 30). However, as the user touches the launcher bar 10 and drags his or her finger in a downward direction, the portion of the display occupied by the widgets of the widget display region 20 is reduced, while the portion allocated for the information area 30 is increased. As shown in the widget 2/4 view, the uppermost widget showing the traffic information is now removed from sight, and a greater portion of the information area 30 is shown. The widget 1/4 view shows the situation where the user continues to drag his or her finger in a downward direction so that another of the widgets in the widget display region 20 is removed. Finally, the widget 0/4 view shows a bottom-most position of the GUI, such that none of the widget display region 20 is shown above the launcher bar 10.
  • Nevertheless, as can be imagined based on the organization of the series of figures shown in FIG. 4, the user will be able to remember the previous locations of the widgets in the widget display region 20. Therefore, it is relatively straightforward for the user to navigate back to the widget display region 20 and to the particular widgets that were of interest to that user.
  • FIG. 5 is similar to FIG. 4, although it shows how scrolling can be performed horizontally by flicking one's finger across the display region of the GUI. As shown, five different panels are present, with only the middle panel shown in bright colors. An indicator illustrates which of pages 1-5 is presented while flick-scrolling horizontally. The indicator highlights the current widget panel so the user has an intuitive feel for where the present panel is located relative to the other panels. As seen, the page 5 panel in the second figure of the series begins to be shown as the user moves his or her finger toward the right portion of FIG. 5. Subsequently, in the last portion of FIG. 5, the full panel 5 is shown. After a predetermined interval (e.g., 3 seconds) the indicator turns off and is no longer displayed. In one embodiment, the panel includes a plurality of widgets relating to a single category.
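The indicator behavior can be sketched as a small timer-driven object: it shows the current page while flicking and hides itself after the predetermined interval. This is an illustrative sketch only; the `PageIndicator` class, its method names, and the use of `threading.Timer` are assumptions, not the patent's implementation.

```python
# Illustrative sketch of the page indicator: highlights the current panel
# during horizontal flick-scrolling and turns off after a timeout.
import threading

class PageIndicator:
    def __init__(self, pages=5, timeout=3.0):
        self.pages = pages
        self.current = 0
        self.visible = False
        self.timeout = timeout      # predetermined interval (3 s in the text)
        self._timer = None

    def on_flick(self, new_page):
        self.current = new_page % self.pages
        self.visible = True                       # show while flicking
        if self._timer:
            self._timer.cancel()                  # restart the hide countdown
        self._timer = threading.Timer(self.timeout, self._hide)
        self._timer.start()

    def _hide(self):
        self.visible = False                      # indicator is no longer displayed
```

Restarting the timer on every flick keeps the indicator visible for as long as the user is actively scrolling between panels.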
  • FIG. 6 is a flowchart showing how a user interacts with the GUI of FIG. 3. The process starts in step S1, where an inquiry is made regarding whether the device receives user input to slide the launcher bar. If the response to the inquiry in step S1 is negative, the process proceeds to step S3. However, if the response to the inquiry in step S1 is affirmative, the process proceeds to step S2, where, based on the user input, the launcher bar is moved so as to change the display ratio of the widget display region 20 and the information area 30. Subsequently, the process proceeds to step S3, where another inquiry is made regarding whether user input is received for an icon displayed on the launcher bar. If the response to the inquiry is negative, the process returns to step S1. However, if the response to the inquiry in step S3 is affirmative, the process proceeds to step S4, where displayed information in the information area 30 is changed based on the input received from the user. The process then returns to step S1 unless power is turned off, at which point the process ends.
  • FIG. 7 shows a flowchart illustrating the process of scrolling widget region 20 and information area 30 and selecting applications for apparatus 11. In step S4, apparatus 11 determines if an icon on the launcher bar 10 is selected. If so, apparatus 11 displays the corresponding display information in information area 30 in step S5, and then proceeds to step S6. If not, apparatus 11 skips to step S6 where it determines if the user wishes to vertically scroll one of widget region 20 or information area 30. If so, apparatus 11 changes the display corresponding to widget region 20 or information area 30 in step S7 and then proceeds to step S8. If not, apparatus 11 skips to step S8 where it determines if the user wishes to horizontally scroll one of widget region 20 or information area 30. If so, apparatus 11 changes the display corresponding to widget region 20 or information area 30 in step S9 and then proceeds to step S10. If not, apparatus 11 skips to step S10 where it determines if the user has selected a displayed icon. If so, apparatus 11 launches the corresponding application in step S11 and ends the process. If not, apparatus 11 returns to step S4.
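The FIG. 7 flowchart can be read as a dispatch loop: each pass checks, in order, launcher-bar icon selection (S4/S5), vertical scrolling (S6/S7), horizontal scrolling (S8/S9), and icon launch (S10/S11). The sketch below is one possible reading; the event dictionary format and field names are illustrative assumptions.

```python
# Illustrative sketch of the FIG. 7 dispatch loop. Each call handles one
# user event; returning True corresponds to looping back to step S4,
# returning False to ending the process after launching an application (S11).
def handle_event(event, display):
    if event["type"] == "launcher_icon":           # S4 -> S5: show info for icon
        display["info_area"] = event["icon"]
    elif event["type"] == "vertical_scroll":       # S6 -> S7: change the display
        display["v_offset"] = display.get("v_offset", 0) + event["delta"]
    elif event["type"] == "horizontal_scroll":     # S8 -> S9: change the display
        display["h_offset"] = display.get("h_offset", 0) + event["delta"]
    elif event["type"] == "icon_tap":              # S10 -> S11: launch application
        display["launched"] = event["app"]
        return False                               # launching ends the process
    return True                                    # otherwise return to S4
```

A driver loop would simply keep calling `handle_event` with incoming touch events until it returns `False`.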
  • FIG. 8 shows a process for displaying start-up information for apparatus 11. In step S12, apparatus 11 determines if it is recovering from sleep mode or has been turned on. If not, it returns to step S12. If so, apparatus 11 proceeds to step S13, where it obtains the current time and compares it with a time in the setting information. Apparatus 11 determines in step S14 if the current time is within the setting time range. If so, apparatus 11 displays corresponding display information in widget area 20 in step S15, and ends the process. If not, apparatus 11 obtains current position information and compares it with the position in the setting information in step S16. Apparatus 11 determines in step S17 if the current position is within the setting position range. If so, apparatus 11 displays corresponding display information in widget area 20 in step S18, and ends the process. If not, apparatus 11 displays a default widget in step S19 and ends the process.
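The FIG. 8 decision can be sketched as a lookup: first try to match the current time against the time ranges in the setting information (S13/S14), then fall back to position ranges (S16/S17), then to a default widget (S19). The structure of the setting information below (a list of dictionaries with `time_range`/`position_range` keys) is an illustrative assumption.

```python
# Illustrative sketch of the FIG. 8 start-up widget selection.
def choose_widget(now, position, settings, default="default-widget"):
    # S13/S14: compare the current time with the setting time ranges.
    for entry in settings:
        start, end = entry.get("time_range", (None, None))
        if start is not None and start <= now < end:
            return entry["widget"]                 # S15: time range matched
    # S16/S17: compare current position with the setting position ranges.
    for entry in settings:
        region = entry.get("position_range")
        if region and region[0] <= position <= region[1]:
            return entry["widget"]                 # S18: position range matched
    return default                                 # S19: default widget

settings = [
    {"time_range": (6, 10), "widget": "weather+traffic"},
    {"time_range": (18, 23), "widget": "sports"},
    {"position_range": (100, 200), "widget": "local-map"},
]
choose_widget(7, 0, settings)    # "weather+traffic" — morning setting matches
choose_widget(12, 150, settings) # "local-map" — no time match, position matches
choose_widget(12, 0, settings)   # "default-widget" — nothing matches
```

This mirrors the example in the text: weather and traffic widgets in the morning, different widgets at other times, and a default when no setting applies.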
  • FIG. 9 illustrates an embodiment of the present invention in which a main display area is divided into a launcher bar 10 and a widget display area 20 including a plurality of widgets 22. The content of widget display area 20 can be scrolled horizontally, as shown in FIG. 10, to display additional widgets 22. Further, the content of widget display area 20 can also be scrolled horizontally, as shown in FIG. 11, to display a map 24. Map 24 includes icons 25 which allow zooming, rotating, and vertically or horizontally dragging the map 24.
  • Launcher bar 10 can be scrolled vertically to reveal information display area 30. Information display area 30 includes information related to the highlighted icon of launcher bar 10. For example, FIG. 12 shows information display area 30 containing a plurality of contact information 32 when contact icon 11 of launcher bar 10 is highlighted. In an embodiment where the apparatus includes a phone function, a contact's phone number can be dialed when the icon including the phone number is clicked on. Further, detailed information regarding a particular contact is displayed when that contact is clicked on, as shown in FIG. 13.
  • FIG. 14 shows a plurality of the e-mails 34 displayed in the information display area 30 when the e-mail icon 12 of launcher bar 10 is highlighted. The content of each e-mail message is displayed when the e-mail is clicked on, as shown in FIG. 15.
  • FIG. 16 shows an appointment schedule 36 when clock icon 13 of launcher bar 10 is highlighted. Detailed information regarding a particular appointment is displayed when that appointment is clicked on, as shown in FIG. 17.
  • FIG. 18 shows a plurality of webpage thumbnails 38 when globe icon 14 of launcher bar 10 is highlighted. The corresponding webpage is displayed when one of the thumbnails 38 is clicked on, as shown in FIG. 19.
  • FIG. 20 shows a plurality of applications 40 displayed in information display area 30 when application icon 15 of launcher bar 10 is highlighted. Any of the applications 40 can be launched when the corresponding icon is clicked on.
  • When moving launcher bar 10 vertically, the relative sizes of the widget display area 20 and the information display area 30 are changed. For example, FIG. 9 shows launcher bar 10 at the bottom of the screen such that widget display area 20 takes up most of the main display area. FIG. 12 shows launcher bar 10 at the top of the screen such that information display area 30 takes up most of the main display area. However, launcher bar 10 may be located at any vertical position in the main display area. For example, FIG. 21 shows launcher bar 10 near the middle of the main display area with a widget display area 20 above launcher bar 10 and an information display area 30 below launcher bar 10. Further, information display area 30 and widget display area 20 may be independently scrolled. For example, FIG. 22 shows widget display area 20 being horizontally scrolled independent of launcher bar 10 and information display area 30. FIG. 23 shows information display area 30 being horizontally scrolled independent of launcher bar 10 and widget display area 20.
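The launcher-bar split can be sketched arithmetically: the bar's vertical position divides the main display area between the widget area above it and the information area below it. The coordinate convention and function name below are illustrative assumptions.

```python
# Illustrative sketch of the launcher-bar split: the bar's vertical
# position determines the relative sizes of the two regions.
def region_heights(bar_y, screen_height, bar_height):
    """Return (widget_area_height, info_area_height) for a bar at bar_y."""
    widget_area = bar_y                               # everything above the bar
    info_area = screen_height - bar_y - bar_height    # everything below the bar
    return widget_area, info_area

region_heights(0, 800, 40)    # (0, 760): bar at top, information area dominates
region_heights(760, 800, 40)  # (760, 0): bar at bottom, widget area dominates
```

Dragging the bar to any intermediate position, as in FIG. 21, simply trades height between the two regions.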
  • In one embodiment, it is possible to show a half-displayed widget, as shown in FIG. 22. However, one embodiment moves launcher bar 10 automatically to show an entire widget when the user stops dragging at the middle of the widget. In that case, the display control unit determines whether or not the half-displayed widget should be displayed, and moves launcher bar 10 automatically to show the entire widget based on the displayed ratio of the half-displayed widget. Thus, FIGS. 9 and 23 do not show half-displayed widgets.
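The snapping rule above can be sketched as rounding on the displayed ratio: when the user stops dragging mid-widget, the bar moves to either fully reveal or fully hide the partial widget. The 0.5 threshold and uniform widget height are illustrative assumptions; the patent only says the decision is "based on the displayed ratio."

```python
# Illustrative sketch of avoiding half-displayed widgets: snap the launcher
# bar to the nearest widget boundary when the drag ends.
def settle_launcher_bar(bar_y, widget_height):
    """Snap the launcher bar position so no widget is left half-displayed."""
    partial = bar_y % widget_height
    displayed_ratio = partial / widget_height
    if displayed_ratio >= 0.5:
        return bar_y + (widget_height - partial)   # reveal the whole widget
    return bar_y - partial                         # hide the partial widget

settle_launcher_bar(130, 100)  # 100: mostly-hidden widget is hidden fully
settle_launcher_bar(170, 100)  # 200: mostly-shown widget is shown fully
```

This matches the behavior in which FIGS. 9 and 23 never show half-displayed widgets, even though intermediate drag states (FIG. 22) can.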
  • FIG. 24 illustrates a computer system 1201 upon which an embodiment of the present invention may be implemented. The computer system 1201 includes a bus 1202 or other communication mechanism for communicating information, and a processor 1203 coupled with the bus 1202 for processing the information. The computer system 1201 also includes a main memory 1204, such as a random access memory (RAM) or other dynamic storage device (e.g., dynamic RAM (DRAM), static RAM (SRAM), and synchronous DRAM (SDRAM)), coupled to the bus 1202 for storing information and instructions to be executed by processor 1203. In addition, the main memory 1204 may be used for storing temporary variables or other intermediate information during the execution of instructions by the processor 1203. The computer system 1201 further includes a read only memory (ROM) 1205 or other static storage device (e.g., programmable ROM (PROM), erasable PROM (EPROM), and electrically erasable PROM (EEPROM)) coupled to the bus 1202 for storing static information and instructions for the processor 1203.
  • The computer system 1201 also includes a disk controller 1206 coupled to the bus 1202 to control one or more storage devices for storing information and instructions, such as a magnetic hard disk 1207, and a removable media drive 1208 (e.g., floppy disk drive, read-only compact disc drive, read/write compact disc drive, compact disc jukebox, tape drive, and removable magneto-optical drive). The storage devices may be added to the computer system 1201 using an appropriate device interface (e.g., small computer system interface (SCSI), integrated device electronics (IDE), enhanced-IDE (E-IDE), direct memory access (DMA), or ultra-DMA).
  • The computer system 1201 may also include special purpose logic devices (e.g., application specific integrated circuits (ASICs)) or configurable logic devices (e.g., simple programmable logic devices (SPLDs), complex programmable logic devices (CPLDs), and field programmable gate arrays (FPGAs)).
  • The computer system 1201 may also include a display controller 1209 coupled to the bus 1202 to control a display 1210, such as an LCD or plasma display, for displaying information to a computer user. The computer system includes input devices, such as a keyboard 1211 and a pointing device 1212, for interacting with a computer user and providing information to the processor 1203. The pointing device 1212, for example, may be a mouse, a trackball, or a pointing stick for communicating direction information and command selections to the processor 1203 and for controlling cursor movement on the display 1210. When a remote pointing device is used, the apparatus 11 generates a pointer overlaid on the GUI, so the user knows the location of the pointer when choosing either to select an item or to “flick” the display to cause a scrolling operation. In addition, a printer may provide printed listings of data stored and/or generated by the computer system 1201.
  • The computer system 1201 performs a portion or all of the processing steps of the invention in response to the processor 1203 executing one or more sequences of one or more instructions contained in a memory, such as the main memory 1204. Such instructions may be read into the main memory 1204 from another computer readable medium, such as a hard disk 1207 or a removable media drive 1208. One or more processors in a multi-processing arrangement may also be employed to execute the sequences of instructions contained in main memory 1204. In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions. Thus, embodiments are not limited to any specific combination of hardware circuitry and software.
  • As stated above, the computer system 1201 includes at least one computer readable medium or memory for holding instructions programmed according to the teachings of the invention and for containing data structures, tables, records, or other data described herein. Examples of computer readable media are hard disks, floppy disks, tape, magneto-optical disks, PROMs (EPROM, EEPROM, flash EPROM), DRAM, SRAM, SDRAM, or any other magnetic medium, compact discs (e.g., CD-ROM), or any other optical medium, punch cards, paper tape, or other physical medium with patterns of holes, a carrier wave (described below), or any other medium from which a computer can read.
  • Stored on any one or on a combination of computer readable media, the present invention includes software for controlling the computer system 1201, for driving a device or devices for implementing the invention, and for enabling the computer system 1201 to interact with a human user (e.g., print production personnel). Such software may include, but is not limited to, device drivers, operating systems, development tools, and applications software. Such computer readable media further includes the computer program product of the present invention for performing all or a portion (if processing is distributed) of the processing performed in implementing the invention.
  • The computer code devices of the present invention may be any interpretable or executable code mechanism, including but not limited to scripts, interpretable programs, dynamic link libraries (DLLs), Java classes, and complete executable programs. Moreover, parts of the processing of the present invention may be distributed for better performance, reliability, and/or cost.
  • The term “computer readable medium” as used herein refers to any medium that participates in providing instructions to the processor 1203 for execution. A computer readable medium may take many forms, including but not limited to, non-volatile media, volatile media, and transmission media. Non-volatile media includes, for example, optical disks, magnetic disks, and magneto-optical disks, such as the hard disk 1207 or the removable media drive 1208. Volatile media includes dynamic memory, such as the main memory 1204. Transmission media includes coaxial cables, copper wire and fiber optics, including the wires that make up the bus 1202. Transmission media may also take the form of acoustic or light waves, such as those generated during radio wave and infrared data communications.
  • Various forms of computer readable media may be involved in carrying out one or more sequences of one or more instructions to processor 1203 for execution. For example, the instructions may initially be carried on a magnetic disk of a remote computer. The remote computer can load the instructions for implementing all or a portion of the present invention remotely into a dynamic memory and send the instructions over a telephone line using a modem. A modem local to the computer system 1201 may receive the data on the telephone line and use an infrared transmitter to convert the data to an infrared signal. An infrared detector coupled to the bus 1202 can receive the data carried in the infrared signal and place the data on the bus 1202. The bus 1202 carries the data to the main memory 1204, from which the processor 1203 retrieves and executes the instructions. The instructions received by the main memory 1204 may optionally be stored on storage device 1207 or 1208 either before or after execution by processor 1203.
  • The computer system 1201 also includes a communication interface 1213 coupled to the bus 1202. The communication interface 1213 provides a two-way data communication coupling to a network link 1214 that is connected to, for example, a local area network (LAN) 1215, or to another communications network 1216 such as the Internet. For example, the communication interface 1213 may be a network interface card to attach to any packet switched LAN. As another example, the communication interface 1213 may be an asymmetrical digital subscriber line (ADSL) card, an integrated services digital network (ISDN) card or a modem to provide a data communication connection to a corresponding type of communications line. Wireless links may also be implemented. In any such implementation, the communication interface 1213 sends and receives electrical, electromagnetic or optical signals that carry digital data streams representing various types of information.
  • The network link 1214 typically provides data communication through one or more networks to other data devices. For example, the network link 1214 may provide a connection to another computer through a local network 1215 (e.g., a LAN) or through equipment operated by a service provider, which provides communication services through a communications network 1216. The local network 1215 and the communications network 1216 use, for example, electrical, electromagnetic, or optical signals that carry digital data streams, and the associated physical layer (e.g., CAT 5 cable, coaxial cable, optical fiber, etc.). The signals through the various networks and the signals on the network link 1214 and through the communication interface 1213, which carry the digital data to and from the computer system 1201, may be implemented in baseband signals or carrier-wave-based signals. The baseband signals convey the digital data as unmodulated electrical pulses that are descriptive of a stream of digital data bits, where the term “bits” is to be construed broadly to mean symbol, where each symbol conveys at least one or more information bits. The digital data may also be used to modulate a carrier wave, such as with amplitude, phase and/or frequency shift keyed signals that are propagated over conductive media, or transmitted as electromagnetic waves through a propagation medium. Thus, the digital data may be sent as unmodulated baseband data through a “wired” communication channel and/or sent within a predetermined frequency band, different than baseband, by modulating a carrier wave. The computer system 1201 can transmit and receive data, including program code, through the network(s) 1215 and 1216, the network link 1214 and the communication interface 1213. Moreover, the network link 1214 may provide a connection through a LAN 1215 to a mobile device 1217 such as a personal digital assistant (PDA), laptop computer, or cellular telephone.
  • FIG. 25 is a block diagram showing a typical hardware configuration of the information-processing apparatus 11.
  • As shown in the figure, a CPU 101 serving as the control nucleus is connected to a control unit 102 through an FSB (Front Side Bus). The control unit 102, other control units and other devices form the processing unit 3 described above. The other control units and other devices will be described later. The control unit 102 is a component for executing control of a main memory 103 and graphics functions, and mainly plays the role of processing a large amount of data at high speed. In AT-compatible architectures, the control unit 102 is referred to as a north bridge. In this embodiment, the control unit 102 is connected to the CPU 101, the main memory 103, a control unit 104 and a graphic display unit 105 such as a liquid-crystal display device.
  • The control unit 104 is a component mainly for controlling elements such as control devices provided for a user interface and for controlling bus links of devices. In AT-compatible architectures, the control unit 104 is referred to as a south bridge. Acting as an ISA-to-PCI bridge, the control unit 104 plays the role of a bridge between a PCI (Peripheral Component Interconnect) bus and a low-speed bus such as an ISA (Industry Standard Architecture) bus. The control unit 104 has the functions of controllers such as an ISA controller and an IDE (Integrated Drive Electronics) controller.
  • The PCI bus is connected to a wireless LAN (W-LAN) serving as a radio communication device 106 and to a device 107 for connecting with and controlling an external memory and an external apparatus. As the external memory, a semiconductor memory device can be employed. The device 107 is provided with a control device 108 for reading data from and writing data into a stick-shaped storage medium and a control device 109 for reading data from and writing data into a card-shaped storage medium. In addition, the device 107 has the function of a connection interface with an external apparatus. An example of the connection interface is an interface conforming to IEEE 1394, which defines specifications of hardware for adding a serial device to a computer.
  • The control unit 104 is connected to a LAN (Local Area Network) connection device 110 and a USB (Universal Serial Bus) port connected to the touch panel 111 to detect user operation. The CPU 101 receives a signal representing the user operation from touch panel 111 and determines, for example, whether the user operation is to move the launcher bar or to press an icon on it. If CPU 101 determines that the user operation is to move the launcher bar, then CPU 101 changes the display, such as the ratio of widget display region 20 and information display region 30, or displays corresponding information in the information display area based on the user operation. Furthermore, CPU 101 determines whether vertical scrolling or horizontal scrolling is selected, or compares the current time or current position information with the time or position information in the setting information, based on the program stored in storage unit 116. These processes are described hereafter.
  • An auxiliary storage unit 112 is a drive for driving a disk such as a magnetic or optical disk. In this embodiment, the auxiliary storage unit 112 is a drive for driving a large-capacity storage medium such as a hard disk. The auxiliary storage unit 112 is connected to the control unit 104, which serves as an internal IDE controller.
  • An audio codec 113 connected to the control unit 104 is a component for outputting an audio signal, obtained as a result of a digital-to-analog conversion process, to a component such as a speaker 114 or headphones 115. The audio signal represents a voice or a sound. In an apparatus configuration including a microphone, the audio codec 113 carries out a process to convert audio input data into digital data.
  • A storage unit 116 is a memory for storing a control program for driving a computer. The storage unit 116 is connected to the control unit 104 and a control unit 117 by using an LPC (Low Pin Count) bus or the like.
  • The control unit 117 is a general-purpose unit for controlling a variety of signals. As the control unit 117, for example, an EC (Embedded Controller) is employed. The control unit 117 also controls the power supply of the information-processing apparatus 11 and additional functions of the information-processing apparatus 11. In the case of a portable information-processing apparatus, the control unit 117 is a microcomputer. It is to be noted that, by modifying a control program stored in the storage unit 116, the method for controlling the computer can be changed.
  • An operation section 118 including the operation element 17 provided on the main body of the information-processing apparatus 11 outputs a signal to the control unit 117. As a connection section 119 for connecting an external apparatus to the information-processing apparatus 11, a USB connector is provided on the main body of the information-processing apparatus 11. The USB connector 119 is also connected to the control unit 104.
  • It is to be noted that a power-supply section not shown in the figure receives a commercial power-supply voltage from an AC adaptor. As an alternative, the information-processing apparatus 11 may be powered by a battery pack serving as DC power supply. Typically, the battery pack includes secondary batteries or fuel batteries.
  • Obviously, numerous modifications and variations of the present invention are possible in light of the above teachings. It is therefore to be understood that within the scope of the appended claims, the invention may be practiced otherwise than as specifically described herein.

Claims (13)

1. An information processing apparatus comprising:
a display unit configured to display a dividing region dividing a main display region into two sub regions; and
a control unit configured to control said display unit
to display a plurality of icons in said dividing region,
to change position of said dividing region in said main display region based on user input,
to display, when an icon of said plurality of icons is selected, information corresponding to said icon in at least one of the sub regions.
2. An information processing apparatus according to claim 1 further comprising:
a detector configured to detect a gesture of a user with respect to said main display region,
wherein said control unit is configured to control said display unit to change information in said sub region based on said gesture.
3. An information processing apparatus according to claim 1, wherein said control unit is configured to control said display unit to display widget information including a plurality of widgets in one sub region of said two sub regions.
4. An information processing apparatus according to claim 2, wherein said control unit is configured to control the display unit to display the plurality of widgets selected based on user input.
5. An information processing apparatus according to claim 2, wherein said control unit is configured to define a plurality of said widget information and said display unit is configured to display one widget information of said plurality of widget information in one sub region.
6. An information processing apparatus according to claim 5, wherein said control unit is configured to control the display unit to display other widget information as a substitute for said one widget information when said detector is configured to detect a flick gesture by the user with respect to said one sub region.
7. An information processing apparatus according to claim 5, wherein said display unit is configured to display a plurality of buttons, each button corresponding to one widget information, and said control unit is configured to control, when said detector detects that one button is selected, the display unit to display other widget information corresponding to said selected button, as a substitute for said one widget information.
8. An information processing apparatus according to claim 5, further comprising;
an acquisition unit configured to acquire current parameter information,
wherein said control unit is configured to select said one widget information based on comparison result of said current parameter information and a plurality of predetermined parameter information which each correspond to a plurality of predetermined widget information.
9. An information processing apparatus according to claim 8, wherein said parameter information is time information and/or location information.
10. An information processing apparatus according to claim 3, wherein said control unit is configured to control the display unit to display one fixed information of a plurality of setting disable fixed information in an other sub region.
11. An information processing apparatus according to claim 10, wherein said plurality of icons represents a plurality of fixed information displayed in said sub region.
12. An information processing apparatus according to claim 10, wherein said control unit is configured to move said dividing region to the top of said main display region and to display said one fixed information based on the detection of selection of the icon when said display unit displays said dividing region in the bottom of said main display region.
13. An information processing apparatus according to claim 10, wherein said control unit is configured to control the display unit to display other fixed information as a substitute for said one fixed information based on the detection of a flick gesture by said user in said other sub region detected by said detector.
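Claims 8 and 9 describe selecting one widget information item by comparing acquired current parameter information (time and/or location) against a set of predetermined parameters, each paired with predetermined widget information. The following is a minimal illustrative sketch of that selection logic, not the patented implementation; all names, preset values, and the matching rule are hypothetical:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class WidgetInfo:
    """Hypothetical stand-in for one 'widget information' item."""
    name: str

# Each predetermined widget information is paired with predetermined
# parameter information (here: an hour range and a location tag),
# in the style of claims 8 and 9.
PRESETS = [
    ({"hours": range(6, 12), "location": "home"}, WidgetInfo("weather")),
    ({"hours": range(12, 18), "location": "office"}, WidgetInfo("calendar")),
    ({"hours": range(18, 24), "location": "home"}, WidgetInfo("tv_guide")),
]

def select_widget(current_hour: int, current_location: str) -> Optional[WidgetInfo]:
    """Pick the widget whose predetermined parameter information
    matches the acquired current parameter information."""
    for params, widget in PRESETS:
        if current_hour in params["hours"] and current_location == params["location"]:
            return widget
    # No preset matched the current parameters.
    return None
```

Under this sketch, the control unit would call `select_widget` with the acquired time and location and display the returned widget information in the sub region.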
US12/242,279 2007-11-29 2008-09-30 Computer implemented display, graphical user interface, design and method including scrolling features Active 2030-01-18 US8245155B2 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
US12/242,279 US8245155B2 (en) 2007-11-29 2008-09-30 Computer implemented display, graphical user interface, design and method including scrolling features
EP08169529.8A EP2068236B1 (en) 2007-11-29 2008-11-20 Computer implemented display, graphical user interface, design and method including scrolling features
CN200910009742XA CN101692194B (en) 2007-11-29 2008-11-28 Graphical user interface, design and method including scrolling features
JP2009024947A JP5259444B2 (en) 2007-11-29 2009-02-05 Computer-implemented display, graphical user interface, design and method characterized by scrolling

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US99101307P 2007-11-29 2007-11-29
US12/242,279 US8245155B2 (en) 2007-11-29 2008-09-30 Computer implemented display, graphical user interface, design and method including scrolling features

Publications (2)

Publication Number Publication Date
US20090144661A1 true US20090144661A1 (en) 2009-06-04
US8245155B2 US8245155B2 (en) 2012-08-14

Family

ID=40560254

Family Applications (1)

Application Number Title Priority Date Filing Date
US12/242,279 Active 2030-01-18 US8245155B2 (en) 2007-11-29 2008-09-30 Computer implemented display, graphical user interface, design and method including scrolling features

Country Status (4)

Country Link
US (1) US8245155B2 (en)
EP (1) EP2068236B1 (en)
JP (1) JP5259444B2 (en)
CN (1) CN101692194B (en)

Cited By (89)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090293013A1 (en) * 2008-05-20 2009-11-26 Palm, Inc. System and method for providing content on an electronic device
US20100013780A1 (en) * 2008-07-17 2010-01-21 Sony Corporation Information processing device, information processing method, and information processing program
US20100064207A1 (en) * 2008-09-10 2010-03-11 Chi Mei Communication Systems, Inc. System and method for displaying widget contents using a mobile device
US20100099457A1 (en) * 2008-10-16 2010-04-22 Lg Electronics Inc. Mobile communication terminal and power saving method thereof
US20100099464A1 (en) * 2008-10-22 2010-04-22 Jong Hwan Kim Mobile communication terminal and screen scrolling method thereof
US20100159992A1 (en) * 2008-12-18 2010-06-24 Nokia Corporation Mobile communication device with a sliding display screen and screen-dividing member
US20100167788A1 (en) * 2008-12-29 2010-07-01 Choi Hye-Jin Mobile terminal and control method thereof
US20100164895A1 (en) * 2008-12-31 2010-07-01 Samsung Electronics Co., Ltd. Apparatus and method for performing scroll function in portable terminal
US20100204914A1 (en) * 2009-02-11 2010-08-12 Telmap Ltd Active widgets for mobile navigation systems
US20110047492A1 (en) * 2009-02-16 2011-02-24 Nokia Corporation Method and apparatus for displaying favorite contacts
US20110066931A1 (en) * 2009-09-11 2011-03-17 Samsung Electronics Co., Ltd. Method for providing widget and apparatus for providing and displaying the same
US20110082619A1 (en) * 2009-10-05 2011-04-07 Tesla Motors, Inc. Adaptive Soft Buttons for a Vehicle User Interface
US20110082616A1 (en) * 2009-10-05 2011-04-07 Tesla Motors, Inc. Vehicle User Interface with Proximity Activation
US20110082627A1 (en) * 2009-10-05 2011-04-07 Tesla Motors, Inc. Morphing Vehicle User Interface
US20110106365A1 (en) * 2009-10-30 2011-05-05 Denso Corporation In-vehicle device for storing gadget
US20110115820A1 (en) * 2009-11-16 2011-05-19 Shunichi Kasahara Information processing apparatus, information processing method, and program
US20110151936A1 (en) * 2009-12-21 2011-06-23 Samsung Electronics Co. Ltd. Input key output method and apparatus of projector-enabled mobile terminal
CN102279694A (en) * 2010-06-08 2011-12-14 联想(北京)有限公司 Electronic device and display method of application software window thereof
US20120017176A1 (en) * 2010-07-16 2012-01-19 Samsung Electronics Co., Ltd. Method and apparatus for displaying a menu
US20120030619A1 (en) * 2010-07-30 2012-02-02 Samsung Electronics Co., Ltd. Method for providing user interface and display apparatus applying the same
US20120057794A1 (en) * 2010-09-06 2012-03-08 Shingo Tsurumi Image processing device, program, and image processing method
US20120151400A1 (en) * 2010-12-08 2012-06-14 Hong Yeonchul Mobile terminal and controlling method thereof
US8276072B2 (en) 2001-11-09 2012-09-25 Sony Corporation Information processing apparatus and information processing method
US20120331424A1 (en) * 2011-02-28 2012-12-27 Research In Motion Limited Electronic device and method of displaying information in response to input
US20130085854A1 (en) * 2010-06-30 2013-04-04 Rakuten, Inc. Information processing device, information processing method, information processing program and recording medium
US20130111405A1 (en) * 2011-10-28 2013-05-02 Samsung Electronics Co., Ltd. Controlling method for basic screen and portable device supporting the same
AU2011201514B2 (en) * 2010-04-09 2013-05-23 Sony Interactive Entertainment Inc. Information processing apparatus
WO2013081676A1 (en) * 2011-12-01 2013-06-06 Research In Motion Limited Electronic device and method of providing visual notification of a received communication
US20130151987A1 (en) * 2011-12-13 2013-06-13 William Joseph Flynn, III Tactile Interface for Social Networking System
US20130187863A1 (en) * 2012-01-23 2013-07-25 Research In Motion Limited Electronic device and method of controlling a display
US20130222431A1 (en) * 2012-02-24 2013-08-29 Samsung Electronics Co., Ltd. Method and apparatus for content view display in a mobile device
US20130346850A1 (en) * 2012-06-26 2013-12-26 Samsung Electronics Co., Ltd Apparatus and method for displaying a web page in a portable terminal
US20140063053A1 (en) * 2012-08-28 2014-03-06 Hayang Jung Mobile terminal and control method thereof
CN103647998A (en) * 2013-11-14 2014-03-19 四川长虹电器股份有限公司 Control method based on intelligent television
US8726198B2 (en) 2012-01-23 2014-05-13 Blackberry Limited Electronic device and method of controlling a display
US8749690B2 (en) 2011-12-13 2014-06-10 Facebook, Inc. In-context content capture
US8799778B2 (en) 2011-12-13 2014-08-05 Facebook, Inc. Scrolling velocity modulation in a tactile interface for a social networking system
US20140325433A1 (en) * 2013-04-24 2014-10-30 Canon Kabushiki Kaisha Information processing device, display control method, and computer program recording medium
US20140351749A1 (en) * 2011-12-14 2014-11-27 Nokia Corporation Methods, apparatuses and computer program products for merging areas in views of user interfaces
US9015641B2 (en) 2011-01-06 2015-04-21 Blackberry Limited Electronic device and method of providing visual notification of a received communication
US20150212700A1 (en) * 2014-01-28 2015-07-30 Microsoft Technology Licensing, Llc Dashboard with panoramic display of ordered content
US20150242065A1 (en) * 2014-02-21 2015-08-27 Samsung Electronics Co., Ltd. Method and apparatus for displaying screen on electronic device
US9213421B2 (en) 2011-02-28 2015-12-15 Blackberry Limited Electronic device and method of displaying information in response to detecting a gesture
US20160021047A1 (en) * 2014-07-17 2016-01-21 Honda Motor Co., Ltd. Method and electronic device for performing exchange of messages
US20160021155A1 (en) * 2014-07-17 2016-01-21 Honda Motor Co., Ltd. Method and electronic device for performing exchange of messages
USD755226S1 (en) * 2014-08-25 2016-05-03 Samsung Electronics Co., Ltd. Display screen or portion thereof with transitional graphical user interface
US20160139752A1 (en) * 2013-06-18 2016-05-19 Samsung Electronics Co., Ltd. User terminal apparatus and management method of home network thereof
USD763914S1 (en) * 2014-09-02 2016-08-16 Apple Inc. Display screen or portion thereof with animated graphical user interface
USD763880S1 (en) * 2013-10-30 2016-08-16 GreatCall, Inc. Display screen or portion thereof with graphical user interface
US20160264070A1 (en) * 2015-03-13 2016-09-15 Yazaki Corporation Vehicle operation system
US9465440B2 (en) 2011-01-06 2016-10-11 Blackberry Limited Electronic device and method of displaying information in response to a gesture
CN106020655A (en) * 2016-05-18 2016-10-12 北京金山安全软件有限公司 Method and device for switching interface screen and electronic equipment
US9471145B2 (en) 2011-01-06 2016-10-18 Blackberry Limited Electronic device and method of displaying information in response to a gesture
WO2016167612A1 (en) * 2015-04-16 2016-10-20 삼성전자 주식회사 Electronic device for providing notification information, and notification information provision method therefor
US9477311B2 (en) 2011-01-06 2016-10-25 Blackberry Limited Electronic device and method of displaying information in response to a gesture
US9477336B2 (en) 2011-12-26 2016-10-25 Brother Kogyo Kabushiki Kaisha Image forming apparatus having display displaying images, non-transitory storage medium storing program to be executed by the same, method of controlling the same, and terminal device having the same
US9507495B2 (en) 2013-04-03 2016-11-29 Blackberry Limited Electronic device and method of displaying information in response to a gesture
USD772887S1 (en) * 2013-11-08 2016-11-29 Microsoft Corporation Display screen with graphical user interface
US9519397B2 (en) 2012-08-31 2016-12-13 Samsung Electronics Co., Ltd. Data display method and apparatus
US20170003753A1 (en) * 2015-07-01 2017-01-05 Samsung Electronics Co., Ltd. Method for providing feedback and an electronic device thereof
US20170031452A1 (en) * 2014-01-15 2017-02-02 Juice Design Co., Ltd. Manipulation determination apparatus, manipulation determination method, and, program
US9659261B2 (en) 2013-10-30 2017-05-23 GreatCall, Inc. User interface for portable device
US9665244B2 (en) 2010-02-12 2017-05-30 Samsung Electronics Co., Ltd. Menu executing method and apparatus in portable terminal
US9690476B2 (en) 2013-03-14 2017-06-27 Blackberry Limited Electronic device and method of displaying information in response to a gesture
US20170243564A1 (en) * 2016-02-24 2017-08-24 Naver Corporation Image displaying apparatus, image generating apparatus, image providing server, image displaying method, image generating method, and computer programs for executing the image displaying method and the image generating method
US9798139B2 (en) 2013-01-28 2017-10-24 Beijing Lenovo Software Ltd. Wearable electronic device and display method
US10228728B2 (en) 2011-02-10 2019-03-12 Samsung Electronics Co., Ltd Apparatus including multiple touch screens and method of changing screens therein
US10257670B2 (en) 2015-04-16 2019-04-09 Samsung Electronics Co., Ltd. Portable device and method for providing notice information thereof
USD848458S1 (en) * 2015-08-03 2019-05-14 Google Llc Display screen with animated graphical user interface
USD849027S1 (en) * 2015-08-03 2019-05-21 Google Llc Display screen with animated graphical user interface
USD851671S1 (en) * 2017-11-06 2019-06-18 Whatsapp Inc. Display screen or portion thereof with graphical user interface
US10324612B2 (en) * 2007-12-14 2019-06-18 Apple Inc. Scroll bar with video region in a media system
US10397639B1 (en) 2010-01-29 2019-08-27 Sitting Man, Llc Hot key systems and methods
USD857740S1 (en) * 2017-08-22 2019-08-27 Samsung Electronics Co., Ltd. Display screen or portion thereof with transitional graphical user interface
US10437454B2 (en) 2012-07-09 2019-10-08 Facebook, Inc. Dynamically scaled navigation system for social network data
US10536411B2 (en) 2017-11-06 2020-01-14 Whatsapp Inc. Providing group messaging thread highlights
USRE47966E1 (en) * 2012-04-02 2020-04-28 Samsung Electronics Co., Ltd. Method and apparatus for displaying first content alone or first and second content simultaneously based on movement
US10664150B2 (en) 2017-11-06 2020-05-26 Whatsapp Inc. Providing group messaging thread highlights
US10678401B2 (en) 2017-11-06 2020-06-09 Whatsapp Inc. Providing group messaging thread highlights
US10685074B2 (en) 2017-11-06 2020-06-16 Whatsapp Inc. Providing group messaging thread highlights
USD897365S1 (en) * 2014-09-01 2020-09-29 Apple Inc. Display screen or portion thereof with graphical user interface
US11093132B2 (en) 2011-02-10 2021-08-17 Samsung Electronics Co., Ltd. Portable device comprising a touch-screen display, and method for controlling same
US11163425B2 (en) 2013-06-18 2021-11-02 Samsung Electronics Co., Ltd. User terminal apparatus and management method of home network thereof
US11270250B2 (en) * 2020-02-14 2022-03-08 International Business Machines Corporation Intelligent service and customer matching using an information processing system
US11592968B2 (en) 2013-06-18 2023-02-28 Samsung Electronics Co., Ltd. User terminal apparatus and management method of home network thereof
USD990505S1 (en) * 2020-06-21 2023-06-27 Apple Inc. Display screen or portion thereof with graphical user interface
US11714520B2 (en) 2012-09-24 2023-08-01 Samsung Electronics Co., Ltd. Method and apparatus for providing multi-window in touch device
US11726631B2 (en) * 2013-08-24 2023-08-15 Tara Chand Singhal Apparatus and method for a simplified menu screen in handheld mobile wireless devices
US11836340B2 (en) 2014-10-30 2023-12-05 Google Llc Systems and methods for presenting scrolling online content on mobile devices

Families Citing this family (76)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8296684B2 (en) 2008-05-23 2012-10-23 Hewlett-Packard Development Company, L.P. Navigating among activities in a computing device
US8683362B2 (en) 2008-05-23 2014-03-25 Qualcomm Incorporated Card metaphor for activities in a computing device
US8159469B2 (en) * 2008-05-06 2012-04-17 Hewlett-Packard Development Company, L.P. User interface for initiating activities in an electronic device
JP5402650B2 (en) * 2009-06-09 2014-01-29 株式会社リコー Display control apparatus, information processing system, and display control method
JP5127792B2 (en) 2009-08-18 2013-01-23 キヤノン株式会社 Information processing apparatus, control method therefor, program, and recording medium
JP5749435B2 (en) * 2009-12-28 2015-07-15 ソニー株式会社 Information processing apparatus, information processing method, program, control target device, and information processing system
KR101600549B1 (en) * 2010-02-11 2016-03-07 삼성전자주식회사 Method and apparatus for providing history of information associated to time information
US9417787B2 (en) 2010-02-12 2016-08-16 Microsoft Technology Licensing, Llc Distortion effects to indicate location in a movable data collection
US8473860B2 (en) * 2010-02-12 2013-06-25 Microsoft Corporation Multi-layer user interface with flexible parallel and orthogonal movement
US8756522B2 (en) 2010-03-19 2014-06-17 Blackberry Limited Portable electronic device and method of controlling same
EP2367097B1 (en) 2010-03-19 2017-11-22 BlackBerry Limited Portable electronic device and method of controlling same
KR20110113844A (en) * 2010-04-12 2011-10-19 엘지전자 주식회사 Mobile terminal and method for controlling thereof
GB2479756B (en) 2010-04-21 2013-06-05 Realvnc Ltd Virtual interface devices
US9310990B2 (en) 2010-04-26 2016-04-12 Blackberry Limited Portable electronic device and method of controlling same
KR101317401B1 (en) 2010-08-25 2013-10-10 주식회사 팬택 Terminal device and method for object storing
JP2012058857A (en) 2010-09-06 2012-03-22 Sony Corp Information processor, operation method and information processing program
JP2012058856A (en) 2010-09-06 2012-03-22 Sony Corp Information processor, information processing method and information processing program
US20120117492A1 (en) * 2010-11-08 2012-05-10 Ankur Aggarwal Method, system and apparatus for processing context data at a communication device
JP5691464B2 (en) * 2010-12-09 2015-04-01 ソニー株式会社 Information processing device
EP2492789A1 (en) * 2011-02-28 2012-08-29 Research In Motion Limited Electronic device and method of displaying information in response to input
CA2823302C (en) * 2011-01-06 2017-11-28 Research In Motion Limited Electronic device and method of displaying information in response to a gesture
US20120226979A1 (en) * 2011-03-04 2012-09-06 Leica Camera Ag Navigation of a Graphical User Interface Using Multi-Dimensional Menus and Modes
KR101809950B1 (en) 2011-03-25 2017-12-18 엘지전자 주식회사 Mobile terminal and method for controlling thereof
US20130019195A1 (en) * 2011-07-12 2013-01-17 Oracle International Corporation Aggregating multiple information sources (dashboard4life)
JP6091829B2 (en) * 2011-09-28 2017-03-08 京セラ株式会社 Apparatus, method, and program
US10083247B2 (en) 2011-10-01 2018-09-25 Oracle International Corporation Generating state-driven role-based landing pages
KR101799408B1 (en) 2011-11-03 2017-11-20 삼성전자주식회사 Apparatus and method for controlling controllable device in portable terminal
JP6002012B2 (en) * 2011-11-28 2016-10-05 京セラ株式会社 Apparatus, method, and program
KR101872862B1 (en) * 2011-12-13 2018-06-29 엘지전자 주식회사 Mobile terminal and method for controlling of the same
CA2868807A1 (en) * 2012-02-24 2013-08-29 Blackberry Limited Electronic device and method of displaying information in response to a gesture
JP5970937B2 (en) 2012-04-25 2016-08-17 ソニー株式会社 Display control apparatus and display control method
WO2013169843A1 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for manipulating framed graphical objects
EP2847659B1 (en) 2012-05-09 2019-09-04 Apple Inc. Device, method, and graphical user interface for transitioning between display states in response to a gesture
WO2013169865A2 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for moving a user interface object based on an intensity of a press input
KR101806350B1 (en) 2012-05-09 2017-12-07 애플 인크. Device, method, and graphical user interface for selecting user interface objects
WO2013169842A2 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for selecting object within a group of objects
AU2013259613B2 (en) 2012-05-09 2016-07-21 Apple Inc. Device, method, and graphical user interface for providing tactile feedback for operations performed in a user interface
WO2013169849A2 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for displaying user interface objects corresponding to an application
CN109298789B (en) 2012-05-09 2021-12-31 苹果公司 Device, method and graphical user interface for providing feedback on activation status
WO2013169846A1 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for displaying additional information in response to a user contact
EP2682850A1 (en) * 2012-07-05 2014-01-08 BlackBerry Limited Prioritization of multitasking applications in a mobile device interface
DE102012107552A1 (en) * 2012-08-17 2014-05-15 Claas Selbstfahrende Erntemaschinen Gmbh Display device for agricultural machinery
JP5916573B2 (en) * 2012-09-19 2016-05-11 シャープ株式会社 Display device, control method, control program, and recording medium
US10649607B2 (en) 2012-12-28 2020-05-12 Facebook, Inc. Re-ranking story content
US9781223B2 (en) 2012-12-28 2017-10-03 Facebook, Inc. Conserving battery and data usage
US10761672B2 (en) 2012-12-28 2020-09-01 Facebook, Inc. Socialized dash
CA2895536A1 (en) * 2012-12-28 2014-07-03 Facebook, Inc. Social cover feed interface
US10249007B2 (en) 2012-12-28 2019-04-02 Facebook, Inc. Social cover feed interface
CN107831991B (en) * 2012-12-29 2020-11-27 苹果公司 Device, method and graphical user interface for determining whether to scroll or select content
WO2014106274A1 (en) * 2012-12-31 2014-07-03 Perinote LLC Methods and systems for organizing information
USD744508S1 (en) * 2013-01-25 2015-12-01 Htc Corporation Display screen with a graphical user interface
EP2972634B1 (en) * 2013-03-13 2024-02-21 Flow Control LLC. Methodology to define optimal sun position using the capability provided by smart phone technology
USD735232S1 (en) * 2013-03-14 2015-07-28 Microsoft Corporation Display screen with graphical user interface
USD733739S1 (en) * 2013-03-14 2015-07-07 Microsoft Corporation Display screen with graphical user interface
USD745876S1 (en) * 2013-03-14 2015-12-22 Microsoft Corporation Display screen with graphical user interface
US9256358B2 (en) * 2013-06-10 2016-02-09 Adtile Technologies Inc. Multiple panel touch user interface navigation
JP2015196495A (en) * 2014-04-03 2015-11-09 株式会社デンソー Input device for vehicle
JP2016024555A (en) * 2014-07-17 2016-02-08 本田技研工業株式会社 Program and method for exchanging messages, and electronic apparatus
JP2016024556A (en) * 2014-07-17 2016-02-08 本田技研工業株式会社 Program and method for exchanging messages, and electronic apparatus
US10095396B2 (en) 2015-03-08 2018-10-09 Apple Inc. Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object
US9639184B2 (en) 2015-03-19 2017-05-02 Apple Inc. Touch input cursor manipulation
US20170045981A1 (en) 2015-08-10 2017-02-16 Apple Inc. Devices and Methods for Processing Touch Inputs Based on Their Intensities
JP2015167033A (en) * 2015-04-30 2015-09-24 コニカミノルタ株式会社 Display area control device, method, and program
US9830048B2 (en) 2015-06-07 2017-11-28 Apple Inc. Devices and methods for processing touch inputs with instructions in a web page
US9860451B2 (en) 2015-06-07 2018-01-02 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US9880735B2 (en) 2015-08-10 2018-01-30 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10235035B2 (en) 2015-08-10 2019-03-19 Apple Inc. Devices, methods, and graphical user interfaces for content navigation and manipulation
US10222941B2 (en) 2015-10-27 2019-03-05 Cnh Industrial America Llc Bottom bar display area for an agricultural system
US20170322696A1 (en) 2016-05-07 2017-11-09 Perinote LLC Selecting and performing contextual actions via user interface objects
JP6213622B2 (en) * 2016-07-13 2017-10-18 ソニー株式会社 IMAGING DEVICE, IMAGING DEVICE CONTROL METHOD, AND PROGRAM
JP2017102971A (en) * 2017-03-01 2017-06-08 コニカミノルタ株式会社 Display region control device, method and program
USD847198S1 (en) * 2017-10-16 2019-04-30 Pelmorex Corp. Display screen or portion thereof with animated graphical user interface
JP6498799B1 (en) * 2018-01-17 2019-04-10 株式会社ぐるなび Information providing apparatus, information providing method, information providing program, and user terminal control program
KR102057797B1 (en) * 2019-02-11 2019-12-19 최현준 Controlling electronic document scrolling apparatus, method and computer readable medium
CN111488192B (en) * 2020-04-16 2023-07-04 北京雷石天地电子技术有限公司 Method, device, terminal and non-transitory computer readable storage medium for implementing graphic user interface
CN112181573A (en) * 2020-09-28 2021-01-05 北京达佳互联信息技术有限公司 Media resource display method, device, terminal, server and storage medium

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6337698B1 (en) * 1998-11-20 2002-01-08 Microsoft Corporation Pen-based interface for a notepad computer
US6792615B1 (en) * 1999-05-19 2004-09-14 New Horizons Telecasting, Inc. Encapsulated, streaming media automation and distribution system
US20060005131A1 (en) * 2004-07-01 2006-01-05 Di Tao Touch display PDA phone with slide keypad
US20060143574A1 (en) * 2004-12-28 2006-06-29 Yuichi Ito Display method, portable terminal device, and display program
US20070146347A1 (en) * 2005-04-22 2007-06-28 Outland Research, Llc Flick-gesture interface for handheld computing devices
US20070157105A1 (en) * 2006-01-04 2007-07-05 Stephen Owens Network user database for a sidebar
US20070252822A1 (en) * 2006-05-01 2007-11-01 Samsung Electronics Co., Ltd. Apparatus, method, and medium for providing area division unit having touch function
US20080276200A1 (en) * 2007-05-02 2008-11-06 Drew Bamford Method for disposing a menu layout and related device

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8175921B1 (en) 2000-05-30 2012-05-08 Nokia Corporation Location aware product placement and advertising
JP2003005892A (en) * 2001-06-19 2003-01-08 Casio Comput Co Ltd Electronic equipment and input processing program for the same
JP3879980B2 (en) * 2002-01-15 2007-02-14 ソニー株式会社 Portable information processing apparatus and operation screen display control method
US20050182856A1 (en) 2003-12-22 2005-08-18 Mcknett Charles L. Systems and methods for creating time aware networks using independent absolute time values in network devices
JP4599898B2 (en) * 2004-06-11 2010-12-15 富士ゼロックス株式会社 Program, method and portable information device for screen display control
JP2006162815A (en) * 2004-12-03 2006-06-22 Sony Computer Entertainment Inc Multimedia reproducing apparatus
JP4606158B2 (en) 2004-12-28 2011-01-05 ソニー株式会社 Display method, portable terminal device, and display program
JP4360496B2 (en) 2004-12-28 2009-11-11 ソニー・エリクソン・モバイルコミュニケーションズ株式会社 Display method, portable terminal device, and display program
KR100667337B1 (en) * 2005-03-15 2007-01-12 삼성전자주식회사 On screen display apparatus and method for displaying menu
JP2007041641A (en) 2005-07-29 2007-02-15 Sony Ericsson Mobilecommunications Japan Inc Personal digital assistant, menu display method, and program
JP4347303B2 (en) * 2006-01-23 2009-10-21 シャープ株式会社 Information processing apparatus, program, and recording medium

Cited By (138)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8276072B2 (en) 2001-11-09 2012-09-25 Sony Corporation Information processing apparatus and information processing method
US10324612B2 (en) * 2007-12-14 2019-06-18 Apple Inc. Scroll bar with video region in a media system
US20090293013A1 (en) * 2008-05-20 2009-11-26 Palm, Inc. System and method for providing content on an electronic device
US8392847B2 (en) * 2008-05-20 2013-03-05 Hewlett-Packard Development Company, L.P. System and method for providing content on an electronic device
US9411503B2 (en) 2008-07-17 2016-08-09 Sony Corporation Information processing device, information processing method, and information processing program
US20100013780A1 (en) * 2008-07-17 2010-01-21 Sony Corporation Information processing device, information processing method, and information processing program
US20100064207A1 (en) * 2008-09-10 2010-03-11 Chi Mei Communication Systems, Inc. System and method for displaying widget contents using a mobile device
US20100099457A1 (en) * 2008-10-16 2010-04-22 Lg Electronics Inc. Mobile communication terminal and power saving method thereof
US8666446B2 (en) * 2008-10-16 2014-03-04 Lg Electronics Inc. Mobile communication terminal with power-saving motion sensor and method for the same
US20100099464A1 (en) * 2008-10-22 2010-04-22 Jong Hwan Kim Mobile communication terminal and screen scrolling method thereof
US8260364B2 (en) * 2008-10-22 2012-09-04 Lg Electronics Inc. Mobile communication terminal and screen scrolling method thereof for projecting display information
US20100159992A1 (en) * 2008-12-18 2010-06-24 Nokia Corporation Mobile communication device with a sliding display screen and screen-dividing member
US20100167788A1 (en) * 2008-12-29 2010-07-01 Choi Hye-Jin Mobile terminal and control method thereof
US8140126B2 (en) * 2008-12-29 2012-03-20 Lg Electronics Inc. Mobile terminal and control method thereof
US20100164895A1 (en) * 2008-12-31 2010-07-01 Samsung Electronics Co., Ltd. Apparatus and method for performing scroll function in portable terminal
US8860670B2 (en) * 2008-12-31 2014-10-14 Samsung Electronics Co., Ltd Apparatus and method for performing scroll function in portable terminal
US20100204914A1 (en) * 2009-02-11 2010-08-12 Telmap Ltd Active widgets for mobile navigation systems
US20110047492A1 (en) * 2009-02-16 2011-02-24 Nokia Corporation Method and apparatus for displaying favorite contacts
US20110066931A1 (en) * 2009-09-11 2011-03-17 Samsung Electronics Co., Ltd. Method for providing widget and apparatus for providing and displaying the same
US20110082627A1 (en) * 2009-10-05 2011-04-07 Tesla Motors, Inc. Morphing Vehicle User Interface
US8078359B2 (en) * 2009-10-05 2011-12-13 Tesla Motors, Inc. User configurable vehicle user interface
US8818624B2 (en) 2009-10-05 2014-08-26 Tesla Motors, Inc. Adaptive soft buttons for a vehicle user interface
US8892299B2 (en) * 2009-10-05 2014-11-18 Tesla Motors, Inc. Vehicle user interface with proximity activation
US20110082619A1 (en) * 2009-10-05 2011-04-07 Tesla Motors, Inc. Adaptive Soft Buttons for a Vehicle User Interface
US20110082615A1 (en) * 2009-10-05 2011-04-07 Tesla Motors, Inc. User Configurable Vehicle User Interface
US20110082616A1 (en) * 2009-10-05 2011-04-07 Tesla Motors, Inc. Vehicle User Interface with Proximity Activation
US9079498B2 (en) 2009-10-05 2015-07-14 Tesla Motors, Inc. Morphing vehicle user interface
US8626374B2 (en) * 2009-10-30 2014-01-07 Denso Corporation In-vehicle device for storing gadget
US20110106365A1 (en) * 2009-10-30 2011-05-05 Denso Corporation In-vehicle device for storing gadget
US20110115820A1 (en) * 2009-11-16 2011-05-19 Shunichi Kasahara Information processing apparatus, information processing method, and program
US20110151936A1 (en) * 2009-12-21 2011-06-23 Samsung Electronics Co. Ltd. Input key output method and apparatus of projector-enabled mobile terminal
US10397639B1 (en) 2010-01-29 2019-08-27 Sitting Man, Llc Hot key systems and methods
US11089353B1 (en) 2010-01-29 2021-08-10 American Inventor Tech, Llc Hot key systems and methods
US9665244B2 (en) 2010-02-12 2017-05-30 Samsung Electronics Co., Ltd. Menu executing method and apparatus in portable terminal
AU2011201514B2 (en) * 2010-04-09 2013-05-23 Sony Interactive Entertainment Inc. Information processing apparatus
US9098181B2 (en) 2010-04-09 2015-08-04 Sony Corporation Information processing apparatus
CN102279694A (en) * 2010-06-08 2011-12-14 联想(北京)有限公司 Electronic device and display method of application software window thereof
US20130085854A1 (en) * 2010-06-30 2013-04-04 Rakuten, Inc. Information processing device, information processing method, information processing program and recording medium
WO2012008791A3 (en) * 2010-07-16 2012-05-24 Samsung Electronics Co., Ltd. Method and apparatus for displaying a menu
WO2012008791A2 (en) * 2010-07-16 2012-01-19 Samsung Electronics Co., Ltd. Method and apparatus for displaying a menu
US20120017176A1 (en) * 2010-07-16 2012-01-19 Samsung Electronics Co., Ltd. Method and apparatus for displaying a menu
US20120030619A1 (en) * 2010-07-30 2012-02-02 Samsung Electronics Co., Ltd. Method for providing user interface and display apparatus applying the same
US9865068B2 (en) * 2010-09-06 2018-01-09 Sony Corporation Image processing device, and image processing method
US20120057794A1 (en) * 2010-09-06 2012-03-08 Shingo Tsurumi Image processing device, program, and image processing method
US20120151400A1 (en) * 2010-12-08 2012-06-14 Hong Yeonchul Mobile terminal and controlling method thereof
US9690471B2 (en) * 2010-12-08 2017-06-27 Lg Electronics Inc. Mobile terminal and controlling method thereof
US10649538B2 (en) 2011-01-06 2020-05-12 Blackberry Limited Electronic device and method of displaying information in response to a gesture
US11698723B2 (en) 2011-01-06 2023-07-11 Blackberry Limited Electronic device and method of providing visual notification of a received communication
US9423878B2 (en) 2011-01-06 2016-08-23 Blackberry Limited Electronic device and method of displaying information in response to a gesture
US10481788B2 (en) 2011-01-06 2019-11-19 Blackberry Limited Electronic device and method of providing visual notification of a received communication
US9471145B2 (en) 2011-01-06 2016-10-18 Blackberry Limited Electronic device and method of displaying information in response to a gesture
US9465440B2 (en) 2011-01-06 2016-10-11 Blackberry Limited Electronic device and method of displaying information in response to a gesture
US9015641B2 (en) 2011-01-06 2015-04-21 Blackberry Limited Electronic device and method of providing visual notification of a received communication
US10884618B2 (en) 2011-01-06 2021-01-05 Blackberry Limited Electronic device and method of providing visual notification of a received communication
US9766802B2 (en) 2011-01-06 2017-09-19 Blackberry Limited Electronic device and method of providing visual notification of a received communication
US10191556B2 (en) 2011-01-06 2019-01-29 Blackberry Limited Electronic device and method of displaying information in response to a gesture
US11379115B2 (en) 2011-01-06 2022-07-05 Blackberry Limited Electronic device and method of providing visual notification of a received communication
US9684378B2 (en) 2011-01-06 2017-06-20 Blackberry Limited Electronic device and method of displaying information in response to a gesture
US9477311B2 (en) 2011-01-06 2016-10-25 Blackberry Limited Electronic device and method of displaying information in response to a gesture
US11640238B2 (en) 2011-02-10 2023-05-02 Samsung Electronics Co., Ltd. Portable device comprising a touch-screen display, and method for controlling same
US10228728B2 (en) 2011-02-10 2019-03-12 Samsung Electronics Co., Ltd Apparatus including multiple touch screens and method of changing screens therein
US11237723B2 (en) 2011-02-10 2022-02-01 Samsung Electronics Co., Ltd. Portable device comprising a touch-screen display, and method for controlling same
US11093132B2 (en) 2011-02-10 2021-08-17 Samsung Electronics Co., Ltd. Portable device comprising a touch-screen display, and method for controlling same
US11132025B2 (en) 2011-02-10 2021-09-28 Samsung Electronics Co., Ltd. Apparatus including multiple touch screens and method of changing screens therein
US9213421B2 (en) 2011-02-28 2015-12-15 Blackberry Limited Electronic device and method of displaying information in response to detecting a gesture
US9766718B2 (en) 2011-02-28 2017-09-19 Blackberry Limited Electronic device and method of displaying information in response to input
US20120331424A1 (en) * 2011-02-28 2012-12-27 Research In Motion Limited Electronic device and method of displaying information in response to input
US8689146B2 (en) * 2011-02-28 2014-04-01 Blackberry Limited Electronic device and method of displaying information in response to input
US20130111405A1 (en) * 2011-10-28 2013-05-02 Samsung Electronics Co., Ltd. Controlling method for basic screen and portable device supporting the same
WO2013081676A1 (en) * 2011-12-01 2013-06-06 Research In Motion Limited Electronic device and method of providing visual notification of a received communication
US9477391B2 (en) * 2011-12-13 2016-10-25 Facebook, Inc. Tactile interface for social networking system
US20130151987A1 (en) * 2011-12-13 2013-06-13 William Joseph Flynn, III Tactile Interface for Social Networking System
US8749690B2 (en) 2011-12-13 2014-06-10 Facebook, Inc. In-context content capture
US8799778B2 (en) 2011-12-13 2014-08-05 Facebook, Inc. Scrolling velocity modulation in a tactile interface for a social networking system
US20140351749A1 (en) * 2011-12-14 2014-11-27 Nokia Corporation Methods, apparatuses and computer program products for merging areas in views of user interfaces
US10218861B2 (en) 2011-12-26 2019-02-26 Brother Kogyo Kabushiki Kaisha Image forming apparatus, non-transitory storage medium storing program to be executed by the same, method of controlling the same, and terminal device
US10015331B2 (en) 2011-12-26 2018-07-03 Brother Kogyo Kabushiki Kaisha Image forming apparatus, non-transitory storage medium storing program to be executed by the same, method of controlling the same, and terminal device
US9477336B2 (en) 2011-12-26 2016-10-25 Brother Kogyo Kabushiki Kaisha Image forming apparatus having display displaying images, non-transitory storage medium storing program to be executed by the same, method of controlling the same, and terminal device having the same
US9619038B2 (en) 2012-01-23 2017-04-11 Blackberry Limited Electronic device and method of displaying a cover image and an application image from a low power condition
US9058168B2 (en) * 2012-01-23 2015-06-16 Blackberry Limited Electronic device and method of controlling a display
US20130187863A1 (en) * 2012-01-23 2013-07-25 Research In Motion Limited Electronic device and method of controlling a display
US8726198B2 (en) 2012-01-23 2014-05-13 Blackberry Limited Electronic device and method of controlling a display
US20130222431A1 (en) * 2012-02-24 2013-08-29 Samsung Electronics Co., Ltd. Method and apparatus for content view display in a mobile device
USRE47966E1 (en) * 2012-04-02 2020-04-28 Samsung Electronics Co., Ltd. Method and apparatus for displaying first content alone or first and second content simultaneously based on movement
USRE49212E1 (en) 2012-04-02 2022-09-13 Samsung Electronics Co., Ltd. Method and apparatus for displaying first content alone or first and second content simultaneously based on movement
US20130346850A1 (en) * 2012-06-26 2013-12-26 Samsung Electronics Co., Ltd Apparatus and method for displaying a web page in a portable terminal
US10437454B2 (en) 2012-07-09 2019-10-08 Facebook, Inc. Dynamically scaled navigation system for social network data
US20140063053A1 (en) * 2012-08-28 2014-03-06 Hayang Jung Mobile terminal and control method thereof
US9569067B2 (en) * 2012-08-28 2017-02-14 Lg Electronics Inc. Mobile terminal and control method thereof
US9519397B2 (en) 2012-08-31 2016-12-13 Samsung Electronics Co., Ltd. Data display method and apparatus
US11714520B2 (en) 2012-09-24 2023-08-01 Samsung Electronics Co., Ltd. Method and apparatus for providing multi-window in touch device
US9798139B2 (en) 2013-01-28 2017-10-24 Beijing Lenovo Software Ltd. Wearable electronic device and display method
US9690476B2 (en) 2013-03-14 2017-06-27 Blackberry Limited Electronic device and method of displaying information in response to a gesture
US9507495B2 (en) 2013-04-03 2016-11-29 Blackberry Limited Electronic device and method of displaying information in response to a gesture
US10126914B2 (en) * 2013-04-24 2018-11-13 Canon Kabushiki Kaisha Information processing device, display control method, and computer program recording medium
US20140325433A1 (en) * 2013-04-24 2014-10-30 Canon Kabushiki Kaisha Information processing device, display control method, and computer program recording medium
US11163425B2 (en) 2013-06-18 2021-11-02 Samsung Electronics Co., Ltd. User terminal apparatus and management method of home network thereof
US11592968B2 (en) 2013-06-18 2023-02-28 Samsung Electronics Co., Ltd. User terminal apparatus and management method of home network thereof
US10564813B2 (en) * 2013-06-18 2020-02-18 Samsung Electronics Co., Ltd. User terminal apparatus and management method of home network thereof
US20160139752A1 (en) * 2013-06-18 2016-05-19 Samsung Electronics Co., Ltd. User terminal apparatus and management method of home network thereof
US11726631B2 (en) * 2013-08-24 2023-08-15 Tara Chand Singhal Apparatus and method for a simplified menu screen in handheld mobile wireless devices
USD763880S1 (en) * 2013-10-30 2016-08-16 GreatCall, Inc. Display screen or portion thereof with graphical user interface
US9659261B2 (en) 2013-10-30 2017-05-23 GreatCall, Inc. User interface for portable device
USD772887S1 (en) * 2013-11-08 2016-11-29 Microsoft Corporation Display screen with graphical user interface
CN103647998A (en) * 2013-11-14 2014-03-19 四川长虹电器股份有限公司 Control method based on intelligent television
US20170031452A1 (en) * 2014-01-15 2017-02-02 Juice Design Co., Ltd. Manipulation determination apparatus, manipulation determination method, and, program
US20150212700A1 (en) * 2014-01-28 2015-07-30 Microsoft Technology Licensing, Llc Dashboard with panoramic display of ordered content
US20150242065A1 (en) * 2014-02-21 2015-08-27 Samsung Electronics Co., Ltd. Method and apparatus for displaying screen on electronic device
CN105282708A (en) * 2014-07-17 2016-01-27 本田技研工业株式会社 Method and electronic device for performing exchange of messages
US9608956B2 (en) * 2014-07-17 2017-03-28 Honda Motor Co., Ltd. Method and electronic device for performing exchange of messages
US9571539B2 (en) * 2014-07-17 2017-02-14 Honda Motor Co., Ltd. Method and electronic device for performing exchange of messages
US20160021155A1 (en) * 2014-07-17 2016-01-21 Honda Motor Co., Ltd. Method and electronic device for performing exchange of messages
CN105278806A (en) * 2014-07-17 2016-01-27 本田技研工业株式会社 Method and electronic device for performing exchange of messages
US20160021047A1 (en) * 2014-07-17 2016-01-21 Honda Motor Co., Ltd. Method and electronic device for performing exchange of messages
USD755226S1 (en) * 2014-08-25 2016-05-03 Samsung Electronics Co., Ltd. Display screen or portion thereof with transitional graphical user interface
USD897365S1 (en) * 2014-09-01 2020-09-29 Apple Inc. Display screen or portion thereof with graphical user interface
USD763914S1 (en) * 2014-09-02 2016-08-16 Apple Inc. Display screen or portion thereof with animated graphical user interface
US11836340B2 (en) 2014-10-30 2023-12-05 Google Llc Systems and methods for presenting scrolling online content on mobile devices
US20160264070A1 (en) * 2015-03-13 2016-09-15 Yazaki Corporation Vehicle operation system
US10257670B2 (en) 2015-04-16 2019-04-09 Samsung Electronics Co., Ltd. Portable device and method for providing notice information thereof
WO2016167612A1 (en) * 2015-04-16 2016-10-20 삼성전자 주식회사 Electronic device for providing notification information, and notification information provision method therefor
US20170003753A1 (en) * 2015-07-01 2017-01-05 Samsung Electronics Co., Ltd. Method for providing feedback and an electronic device thereof
US10664052B2 (en) * 2015-07-01 2020-05-26 Samsung Electronics Co., Ltd. Method for providing feedback and an electronic device thereof
USD848458S1 (en) * 2015-08-03 2019-05-14 Google Llc Display screen with animated graphical user interface
USD849027S1 (en) * 2015-08-03 2019-05-21 Google Llc Display screen with animated graphical user interface
US20170243564A1 (en) * 2016-02-24 2017-08-24 Naver Corporation Image displaying apparatus, image generating apparatus, image providing server, image displaying method, image generating method, and computer programs for executing the image displaying method and the image generating method
US10311625B2 (en) * 2016-02-24 2019-06-04 Naver Corporation Image displaying apparatus, image generating apparatus, image providing server, image displaying method, image generating method, and computer programs for executing the image displaying method and the image generating method
CN106020655A (en) * 2016-05-18 2016-10-12 北京金山安全软件有限公司 Method and device for switching interface screen and electronic equipment
USD857740S1 (en) * 2017-08-22 2019-08-27 Samsung Electronics Co., Ltd. Display screen or portion thereof with transitional graphical user interface
US10536411B2 (en) 2017-11-06 2020-01-14 Whatsapp Inc. Providing group messaging thread highlights
US11604561B2 (en) 2017-11-06 2023-03-14 Whatsapp Llc Providing group messaging thread highlights
USD904435S1 (en) 2017-11-06 2020-12-08 Whatsapp Inc. Display screen or portion thereof with graphical user interface
US10685074B2 (en) 2017-11-06 2020-06-16 Whatsapp Inc. Providing group messaging thread highlights
US10678401B2 (en) 2017-11-06 2020-06-09 Whatsapp Inc. Providing group messaging thread highlights
US10664150B2 (en) 2017-11-06 2020-05-26 Whatsapp Inc. Providing group messaging thread highlights
USD851671S1 (en) * 2017-11-06 2019-06-18 Whatsapp Inc. Display screen or portion thereof with graphical user interface
US11270250B2 (en) * 2020-02-14 2022-03-08 International Business Machines Corporation Intelligent service and customer matching using an information processing system
USD990505S1 (en) * 2020-06-21 2023-06-27 Apple Inc. Display screen or portion thereof with graphical user interface

Also Published As

Publication number Publication date
EP2068236B1 (en) 2017-08-16
CN101692194A (en) 2010-04-07
US8245155B2 (en) 2012-08-14
JP5259444B2 (en) 2013-08-07
EP2068236A1 (en) 2009-06-10
CN101692194B (en) 2013-02-27
JP2009205675A (en) 2009-09-10

Similar Documents

Publication Publication Date Title
US8245155B2 (en) Computer implemented display, graphical user interface, design and method including scrolling features
EP3005069B1 (en) Electronic device and method for controlling applications in the electronic device
US10102010B2 (en) Layer-based user interface
EP2717145B1 (en) Apparatus and method for switching split view in portable terminal
EP2940566B1 (en) Method and apparatus for displaying a list of items in ribbon format
KR100770936B1 (en) Method for inputting characters and mobile communication terminal therefor
EP2450781B1 (en) Mobile terminal and screen change control method based on input signals for the same
US20100138776A1 (en) Flick-scrolling
KR20100043371A (en) Apparatus and method for composing idle screen in a portable terminal
EP2735960A2 (en) Electronic device and page navigation method
US20110265039A1 (en) Category-based list navigation on touch sensitive screen
TW201229874A (en) Method and apparatus for gesture recognition
KR20090096149A (en) User interface apparatus of mobile station having touch screen and method thereof
US10474346B2 (en) Method of selection of a portion of a graphical user interface
WO2010061052A1 (en) Item and view specific options
CN105122176A (en) Systems and methods for managing displayed content on electronic devices
US20050223341A1 (en) Method of indicating loading status of application views, electronic device and computer program product
EP2849045A2 (en) Method and apparatus for controlling application using key inputs or combination thereof
EP3671418A1 (en) Method for displaying multiple content cards, and terminal device
WO2010060502A1 (en) Item and view specific options
US20130169555A1 (en) Display apparatus and image representation method using the same
US20100138765A1 (en) Indicator Pop-Up
WO2022143198A1 (en) Processing method for application interface, and related device
CN116601586A (en) Virtual keyboard processing method and related equipment
EP4261660A1 (en) Feedback method and related device

Legal Events

Date Code Title Description
AS Assignment

Owner name: SONY CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:NAKAJIMA, TAKESHI;NIREI, KENICHI;HABARA, YASUHIRO;AND OTHERS;REEL/FRAME:022026/0878;SIGNING DATES FROM 20081107 TO 20081209

Owner name: SONY CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:NAKAJIMA, TAKESHI;NIREI, KENICHI;HABARA, YASUHIRO;AND OTHERS;SIGNING DATES FROM 20081107 TO 20081209;REEL/FRAME:022026/0878

AS Assignment

Owner name: SONY ELECTRONICS INC. (50%), NEW JERSEY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SONY CORPORATION;REEL/FRAME:022226/0242

Effective date: 20090120

STCF Information on status: patent grant

Free format text: PATENTED CASE

FPAY Fee payment

Year of fee payment: 4

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8

FEPP Fee payment procedure

Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY