US8624931B2 - Information processing apparatus, information processing method, and program - Google Patents


Info

Publication number
US8624931B2
Authority
US
United States
Prior art keywords
information
movement
target object
display screen
classification
Prior art date
Legal status
Expired - Fee Related, expires
Application number
US12/274,619
Other versions
US20090147027A1 (en)
Inventor
Ken Miyashita
Kouichi Matsuda
Current Assignee
Sony Corp
Original Assignee
Sony Corp
Priority date
Filing date
Publication date
Application filed by Sony Corp
Assigned to SONY CORPORATION. Assignors: MATSUDA, KOUICHI; MIYASHITA, KEN
Publication of US20090147027A1
Application granted
Publication of US8624931B2

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on GUIs, based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0483: Interaction with page-structured environments, e.g. book metaphor
    • G06F 3/04812: Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects

Definitions

  • the present invention contains subject matter related to Japanese Patent Application JP 2007-317721 filed in the Japan Patent Office on Dec. 7, 2007, the entire contents of which being incorporated herein by reference.
  • the present invention relates to an information processing apparatus, an information processing method, and a program.
  • GUI: graphical user interface
  • the typical user finds it difficult to work out such an abstracted condition expression as given above. Further, in the case of utilizing an information processing apparatus having many functions, the user sometimes also finds it difficult to remember the operation procedure in accordance with which the user would instruct the information processing apparatus to perform classification processing.
  • an information processing apparatus that classifies objects having the respective attribute information which are disposed on a display screen of a graphical user interface, the apparatus including an operation information acquisition unit that acquires operation information containing movement information which indicates a position of a movement destination of the object on the display screen, an instructive request estimate unit that estimates an instructive request for classification processing on the object based on the attribute information and the movement information of the object, a classification processing unit that classifies the object based on the estimated instructive request, and a display control unit that controls display of the object on the display screen.
  • an instructive request for classification processing on an object is estimated based on the attribute information and movement information of the object, so that the object is classified based on the estimated instructive request.
  • the user can easily classify the objects without explicitly specifying classification conditions or instructing the performing of the classification processing.
  • a plurality of objects having attribute information common to them are set as an object group that has classification conditions which correspond to the attribute information, while an object group targeted by classification processing is set as a target object group.
  • an unclassified target object having the attribute information that matches the classification conditions of the target object group undergoes the classification processing. Accordingly, an unclassified target object is classified based on the classification conditions of a target object group, so that the user can easily classify the objects without explicitly specifying the classification conditions or instructing the performing of the classification processing.
  • the operation information acquisition unit acquires first movement information about a first object and then acquires second movement information about a second object different from the first object, and if having determined that the first and second objects are disposed close to each other from the first and second movement information and also that the first and second objects have attribute information common to them, the instructive request estimate unit may estimate an instructive request for start of classification processing which uses the common attribute information as the classification conditions.
  • an instructive request is estimated for the classification conditions to be employed in classification processing and the start of the classification processing. Therefore, the user can start classification processing on the objects without explicitly specifying the classification conditions or instructing the performing of the classification processing.
  • the classification processing unit can cause the object group setting unit to set first and second objects as a target object group, thus starting classification processing based on the classification conditions of the target object group.
  • a plurality of objects moved by the user can be set as a target object group, and then the classification processing based on the classification conditions of the target object group starts. Therefore, the user can start classification processing on the objects without explicitly setting a target object group or instructing the performing of the classification processing.
  • the operation information acquisition unit acquires movement information about objects, and if having determined that the objects and an object group have been disposed close to each other from position information of the object group and the movement information of the objects and also the attribute information of the objects matches classification conditions of the object group, the instructive request estimate unit may estimate an instructive request for the start of classification processing.
  • the instructive request estimate unit may estimate an instructive request for the start of classification processing.
  • the classification processing unit can cause the object group setting unit to set an object group including objects as a target object group, thus starting classification processing based on the classification conditions of the target object group.
  • an object group including an object moved by the user is set as a target object group, and then the classification processing based on the classification conditions of the target object group starts. Therefore, the user can start classification processing on the objects without explicitly setting a target object group or instructing the performing of the classification processing.
  • the classification processing unit may move a target object toward a target object group at a constant movement speed, while the display control unit may control display so that the target object moves toward the target object group at the constant movement speed. Accordingly, display is provided so that the target object moves toward a target object group at a constant movement speed. Therefore, the user can confirm an instructive request estimated by the information processing apparatus.
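The constant-speed movement described above can be sketched as a per-frame position update. This is an illustration only; the step length, coordinate scale, and function names are assumptions, not taken from the patent.

```python
import math

def step_toward(position, target, speed=5.0):
    """Advance a target object one display frame toward the target object
    group at a constant movement speed; stop exactly on arrival."""
    dx, dy = target[0] - position[0], target[1] - position[1]
    dist = math.hypot(dx, dy)
    if dist <= speed:
        return target
    return (position[0] + dx / dist * speed,
            position[1] + dy / dist * speed)

pos = (0.0, 0.0)
for _ in range(3):                      # three animation frames
    pos = step_toward(pos, (30.0, 40.0))
# after three frames the object has covered 15 units of the 50-unit path
```

Increasing `speed` would model the sped-up movement described later for a continuation request, and leaving `position` unchanged would model a stoppage.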
  • the classification processing unit may cause the object group setting unit to set the target object group as a target object group that includes the target object.
  • the target object group is set as a target object group that includes the target object. Therefore, the user can easily classify objects without explicitly specifying classification conditions or instructing the performing of the classification processing.
  • the operation information acquisition unit may acquire movement information which indicates the position of a movement destination of a target object on the display screen, while the instructive request estimate unit may estimate an instructive request for classification processing based on the movement information of the target object.
  • an instructive request for classification processing is estimated based on the movement information of a target object. Therefore, the user can easily transmit to the information processing apparatus an instructive request for the classification processing of the object.
  • the instructive request estimate unit may estimate an instructive request for the continuation of classification processing. In such a manner, if a target object is prompted to move, an instructive request for the continuation of classification processing is estimated. Therefore, the user can easily transmit to the information processing apparatus an instructive request for the continuation of the classification processing.
  • the classification processing unit may increase the movement speed of a target object, while the display control unit may control display so that the target object may move toward the target object group at an increased movement speed. Accordingly, if an instructive request for the continuation of classification processing is estimated, display is provided so that the target object may move at an increased movement speed. Therefore, the user can confirm that an instructive request for the continuation of the classification processing has been transmitted to the information processing apparatus and, further, speed up the progress of the classification processing.
  • the instructive request estimate unit may estimate an instructive request for the stoppage of classification processing. In such a manner, if a target object is inhibited from moving, an instructive request for the stoppage of classification processing is estimated. Therefore, the user can easily transmit to the information processing apparatus an instructive request for the stoppage of the classification processing.
  • the classification processing unit may stop the movement of a target object, while the display control unit may control display so that the movement of the target object may be stopped. Accordingly, if an instructive request for the stoppage of classification processing is estimated, display is provided so that the target object may stop its movement. Therefore, the user can confirm that an instructive request for the stoppage of the classification processing has been transmitted to the information processing apparatus.
  • the classification processing unit may cause the object group setting unit to set the target object group as an object group. Accordingly, if an instructive request for the stoppage of classification processing is estimated, the target object group is set as an object group. Therefore, the user can stop the classification processing by the information processing apparatus without explicitly instructing the stoppage of the classification processing.
  • the operation information acquisition unit may acquire setting cancellation information which indicates an instruction to cancel the setting of a specific object group, while the classification processing unit may cause the object group setting unit to set cancellation of the setting of the specific object group based on the setting cancellation information. In such a manner, the setting of a specific object group is canceled based on setting cancellation information. Therefore, the user can change and modify the results of classification processing by the information processing apparatus.
  • the display control unit may control display so that the target object group can be distinguished from the other object groups. Accordingly, display is provided so that the target object group may be distinguished from the other object groups. Therefore, the user can easily confirm the progress status of object classification processing.
  • the display control unit may control display so that a target object can be distinguished from the other objects. Accordingly, display is provided so that a target object may be distinguished from the other objects. Therefore, the user can easily confirm which target object undergoes classification processing.
  • an information processing method of classifying objects having the respective attribute information which are disposed on a display screen of a graphical user interface including the steps of: acquiring operation information containing movement information which indicates a position of a movement destination of the object on the display screen; estimating an instructive request for classification processing on the object based on the attribute information and the movement information of that object; classifying the object based on the estimated instructive request; and controlling display of the object on the display screen.
  • a program that causes a computer to perform the information processing method according to the second embodiment of the present invention described above.
  • an information processing apparatus, an information processing method, and a program that can easily classify objects disposed on the display screen of a graphical user interface.
  • FIG. 1 is a block diagram showing main functional components of an information processing apparatus according to one embodiment of the present invention
  • FIG. 2 is a flowchart showing a flow of classification processing by the information processing apparatus
  • FIG. 3 is a flowchart showing a flow of classification processing by the information processing apparatus
  • FIG. 4A is a schematic diagram explaining the classification processing by the information processing apparatus
  • FIG. 4B is a schematic diagram explaining the classification processing by the information processing apparatus
  • FIG. 4C is a schematic diagram explaining the classification processing by the information processing apparatus
  • FIG. 4D is a schematic diagram explaining the classification processing by the information processing apparatus
  • FIG. 5A is a schematic diagram explaining the classification processing by the information processing apparatus
  • FIG. 5B is a schematic diagram explaining the classification processing by the information processing apparatus
  • FIG. 5C is a schematic diagram explaining the classification processing by the information processing apparatus
  • FIG. 5D is a schematic diagram explaining the classification processing by the information processing apparatus
  • FIG. 6A is a schematic diagram explaining the classification processing by the information processing apparatus
  • FIG. 6B is a schematic diagram explaining the classification processing by the information processing apparatus
  • FIG. 6C is a schematic diagram explaining the classification processing by the information processing apparatus
  • FIG. 6D is a schematic diagram explaining the classification processing by the information processing apparatus
  • FIG. 7A is a schematic diagram explaining the classification processing by the information processing apparatus
  • FIG. 7B is a schematic diagram explaining the classification processing by the information processing apparatus
  • FIG. 7C is a schematic diagram explaining the classification processing by the information processing apparatus
  • FIG. 7D is a schematic diagram explaining the classification processing by the information processing apparatus
  • FIG. 1 is a block diagram showing main functional components of an information processing apparatus 100 according to one embodiment of the present invention.
  • the information processing apparatus 100 may be, for example, a personal computer, a personal digital assistant (PDA), or a cellular phone and has a display screen integrally provided on it or a connection to a display screen provided separately from it. It is to be noted that although the following will describe a case where a display screen is provided integrally on the information processing apparatus 100, the present invention is similarly applicable also to a case where it is provided separately.
  • PDA: personal digital assistant
  • the information processing apparatus 100 includes an operation information acquisition unit 102 , a storage unit 104 , a display unit 106 , and a control processing unit 110 .
  • the operation information acquisition unit 102 acquires operation information entered by the user through a keyboard, a pointing device, and the like.
  • the operation information contains movement information which indicates the position of a movement destination of an object on the display screen. It is to be noted that although the following will describe a case where operation information is entered through a mouse, the operation information may be entered through any other input device such as a touch panel.
  • the storage unit 104 is constituted of a storage memory such as a RAM or a ROM and stores information about programs used to make the information processing apparatus 100 operative, and objects.
  • the display unit 106 is constituted of a display, a monitor, or the like which has a display screen and displays a plurality of pieces of information expressed as an object or the like on the display screen.
  • the control processing unit 110 includes such function units as a user instruction estimate unit (instructive request estimate unit) 112 , a classification processing unit 114 , an object group setting unit 116 , and a display control unit 118 and manages overall control processing on the information processing apparatus 100 , including control processing by use of these function units.
  • the user instruction estimate unit 112 estimates a user's instruction for classification processing on an object based on attribute information and movement information of the object.
  • the classification processing unit 114 classifies the object based on the user's instruction estimated by the user instruction estimate unit 112 .
  • the object group setting unit 116 sets an object group which includes a plurality of objects having attribute information common to them and which has classification conditions corresponding to the common attribute information and position information indicating a position of the object group on the display screen.
  • the display control unit 118 controls the display unit 106 , which displays objects on the display screen.
  • the operation information acquisition unit 102 can be realized in the information processing apparatus 100 as hardware and/or software such as a dedicated electronic circuit or a program which is executed by a CPU.
  • An object may be various kinds of information that can be a target of classification processing by the information processing apparatus 100, such as content data including images, video, and audio, or an icon in a graphical user interface (GUI).
  • An object has identification information that can identify the object, at least one piece of attribute information, and position information.
  • the attribute information of an object contains information assigned by the user and information automatically assigned by the information processing apparatus 100 etc.
  • the attribute information contains information of, for example, a date and time when the photo was taken, a place where the photo was taken, a person who took the photo, etc.
  • the position information of an object is set in order to specify a position and a range where the object is displayed on the display screen.
  • the position information is set as two-dimensional coordinate information in a case where an object is disposed in a two-dimensional coordinate plane and as three-dimensional coordinate information in a case where an object is disposed in a three-dimensional coordinate space.
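As an illustration only (the patent specifies no implementation), the object record described above, with identification information, attribute information, and two-dimensional position information, might be modeled as follows; all names are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class DisplayObject:
    """Sketch of an object on the GUI display screen.

    Per the description above: identification information, at least one
    piece of attribute information, and position information specifying
    where the object is displayed.
    """
    object_id: str                      # identification information
    attributes: dict                    # attribute information, e.g. date/place of a photo
    position: tuple = (0.0, 0.0)        # 2-D coordinate information on the display screen

# e.g. a photo object with automatically assigned attribute information
photo = DisplayObject("obj-1", {"date": "2007-12-07", "place": "Tokyo"}, (10.0, 20.0))
```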
  • An object group has identification information that can identify the object group, classification conditions, position information, group information, and flag information.
  • the classification conditions for an object group correspond to attribute information common to a plurality of objects that make up an object group.
  • Classification conditions are set as at least one piece of attribute information in order to classify the objects and used as composite conditions if set as a plurality of pieces of attribute information.
  • the position information of an object group is set in order to specify a position and a range where the object group is displayed on the display screen.
  • the group information of an object group is a set of the identification information of a plurality of objects that make up the object group and set in order to specify the objects classified in the object group.
  • the flag information of an object group is set in order to specify whether the object group corresponds to the target object group or an ordinary object group.
  • the target object group means an object group that is handled as a target of classification processing by the information processing apparatus 100 .
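Similarly, the object group record, with identification information, classification conditions, position information, group information, and flag information, could be sketched as follows; the names and the dictionary encoding are assumptions, not from the patent:

```python
from dataclasses import dataclass, field

@dataclass
class ObjectGroup:
    """Sketch of an object group as described above."""
    group_id: str                                # identification information
    conditions: dict                             # classification conditions = shared attribute information
    position: tuple = (0.0, 0.0)                 # position/range on the display screen
    members: set = field(default_factory=set)    # group information: ids of member objects
    is_target: bool = False                      # flag information: target object group or ordinary group

g = ObjectGroup("grp-1", {"place": "Tokyo"}, (50.0, 50.0), {"obj-1", "obj-2"}, is_target=True)
```

Multiple entries in `conditions` act as the composite conditions mentioned above.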
  • the information processing apparatus 100 is operative to classify objects having the respective attribute information which are disposed on the display screen of a GUI. The following will describe classification processing of the object by the information processing apparatus 100 .
  • FIGS. 2 and 3 are a flowchart showing the flow of classification processing by the information processing apparatus 100 .
  • FIGS. 4A to 4D , FIGS. 5A to 5D , FIGS. 6A to 6D , and FIGS. 7A to 7D are schematic diagrams explaining the classification processing by the information processing apparatus 100 . It is to be noted that in FIGS. 4A to 4D , FIGS. 5A to 5D , FIGS. 6A to 6D , and FIGS. 7A to 7D , an object, an object group, and a target object group are indicated as a square icon, a pile of square icons, and a hatched pile respectively.
  • the user moves a pointer 200 on the display screen through moving operations of a mouse and clicks the mouse in a condition where the pointer 200 is disposed on a specific object, thereby specifying the specific object.
  • the operation information acquisition unit 102 acquires position information that indicates the position of the pointer 200 on the display screen, so that the specified object is identified based on the position information of the pointer 200 and the position information of the object.
  • the user moves the pointer 200 on the display screen through press-and-drag operations in a condition where the object is specified, thereby moving the object in a condition where its movement is interlocked with that of the pointer 200 .
  • the operation information acquisition unit 102 acquires movement information that indicates a position of the pointer 200 on the display screen so that the movement information of the object may be changed as needed.
  • the display control unit 118 controls display of the object on the display screen.
  • the user releases the press-and-drag operations in a condition where the object 211 is moved to an arbitrary position on the display screen, thereby completing the movement of the object 211 .
  • the operation information acquisition unit 102 acquires movement information that indicates a position to which the pointer 200 is moved on the display screen so that the position information of the object 211 may be updated based on this movement information.
  • the display control unit 118 controls display of the object 211 on the display screen.
  • the operation information acquisition unit 102 acquires the first movement information about the first object 211 and the second movement information about the second object 212 (step S 102 ). Then, based on the first and second movement information, the display control unit 118 controls display of the first and second objects 211 and 212 .
  • the user instruction estimate unit 112 determines whether the first and second objects 211 and 212 are disposed close to each other based on the movement information of the first and second objects 211 and 212 (S 104).
  • If the first and second objects 211 and 212 are disposed within a predetermined distance, the user instruction estimate unit 112 determines that they are disposed close to each other. If having determined that the objects 211 and 212 are disposed close to each other, the user instruction estimate unit 112 determines whether the first and second objects 211 and 212 have common attribute information based on the attribute information of the first and second objects 211 and 212 (S 106).
  • If having determined that the objects 211 and 212 have common attribute information, the user instruction estimate unit 112 estimates that the user has given an instruction to the effect that classification processing using the common attribute information as classification conditions should be started. This enables the user to start object classification processing without explicitly specifying classification conditions or instructing the performing of the classification processing.
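The two determinations above, proximity (S 104) and common attribute information (S 106), can be sketched as plain functions. The distance threshold and the dictionary encoding of attribute information are assumptions for illustration:

```python
import math

PREDETERMINED_DISTANCE = 30.0   # assumed threshold, in display-screen units

def are_close(pos_a, pos_b, threshold=PREDETERMINED_DISTANCE):
    """S 104: objects are disposed close to each other if their
    positions lie within the predetermined distance."""
    return math.dist(pos_a, pos_b) <= threshold

def common_attributes(attrs_a, attrs_b):
    """S 106: attribute information common to both objects; a non-empty
    result is taken as an estimated instruction to start classification
    processing, with the result as the classification conditions."""
    return {k: v for k, v in attrs_a.items() if attrs_b.get(k) == v}
```

For two photos taken in the same place on different dates, `common_attributes` would yield only the shared place, which then serves as the classification condition.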
  • If having determined that the objects 211 and 212 do not have common attribute information, the user instruction estimate unit 112 estimates that the user has not given an instruction to the effect that the classification processing should be started. In such a case, the information processing apparatus 100 may notify the user with an alarm, for example, to the effect that objects having different attribute information are about to be classified.
  • If the condition determination at step S 102 or S 106 comes up with “NO”, the processing by the information processing apparatus 100 returns to step S 102 to wait for acquisition of movement information from the user.
  • the classification processing unit 114 causes the object group setting unit 116 to set an object group 231 made up by the first and second objects 211 and 212 (S 108 ).
  • the object group setting unit 116 assigns unique identification information to the object group 231 .
  • the object group setting unit 116 sets as classification conditions the attribute information common to the first and second objects 211 and 212 and sets as group information a set of the identification information of the first and second objects 211 and 212.
  • the object group setting unit 116 sets position information of the object group 231 based on the position information of the first and second objects 211 and 212 and flag information by which the object group 231 is defined as the target object group 241 to be classified (S 110 ).
  • the classification processing unit 114 starts classification processing based on the classification conditions.
  • the user can start classification processing on objects without explicitly setting a target object group or instructing the performing of the classification processing.
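Steps S 108 and S 110 above, setting the two objects as an object group flagged as the target object group, might look like this sketch; the midpoint position, the id scheme, and the dictionary encoding are assumptions, not from the patent:

```python
def form_target_group(obj_a, obj_b):
    """S 108/S 110 sketch: build an object group from two objects whose
    common attribute information becomes the classification conditions;
    the flag marks it as the target object group."""
    conditions = {k: v for k, v in obj_a["attributes"].items()
                  if obj_b["attributes"].get(k) == v}
    position = tuple((p + q) / 2 for p, q in zip(obj_a["position"], obj_b["position"]))
    return {
        "group_id": f"grp-{obj_a['id']}-{obj_b['id']}",  # assumed unique-id scheme
        "conditions": conditions,
        "position": position,                            # assumed: midpoint of the two objects
        "members": {obj_a["id"], obj_b["id"]},           # group information
        "is_target": True,                               # flag information
    }

first = {"id": "211", "attributes": {"place": "Tokyo", "date": "2007-12-07"}, "position": (0.0, 0.0)}
second = {"id": "212", "attributes": {"place": "Tokyo", "date": "2007-12-08"}, "position": (10.0, 10.0)}
group = form_target_group(first, second)
# the shared attribute {"place": "Tokyo"} becomes the classification conditions
```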
  • the display control unit 118 controls display so that the target object group 241 may be distinguished from the other object groups. It is to be noted that the target object group 241 may be displayed as distinguished from the other object groups by utilizing, for example, rotation, uplifting, spotlighting, or blinking. Thus, the user can easily confirm the progress status of the object classification processing.
  • the operation information acquisition unit 102 acquires movement information about the object 213 (S 102 ). Then, based on the movement information, the display control unit 118 controls display of the object 213 .
  • the user instruction estimate unit 112 determines whether the object 213 and the object groups 232 and 233 are disposed close to each other based on the movement information of the object 213 and the position information of the object groups 232 and 233 (S 112 ). In this case, if the object 213 and the object groups 232 and 233 are disposed within a predetermined distance, the user instruction estimate unit 112 determines that the object 213 and the object groups 232 and 233 are disposed close to each other.
  • the user instruction estimate unit 112 determines whether attribute information of the object 213 matches classification conditions of the object group 233 based on the attribute information and the classification conditions (S 114 ). If having determined that the attribute information matches the classification conditions, the user instruction estimate unit 112 estimates that the user has given an instruction to the effect that classification processing should be started. This enables the user to start object classification processing without explicitly instructing the performing of the classification processing.
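The match test at S 114 amounts to checking that every classification condition is satisfied by the object's attribute information, so composite conditions fall out naturally. A sketch, with the dictionary encoding assumed:

```python
def matches_conditions(attributes, conditions):
    """S 114: the object's attribute information matches the group's
    classification conditions when every condition key/value holds;
    multiple conditions act as composite conditions."""
    return all(attributes.get(k) == v for k, v in conditions.items())

# a Tokyo photo matches a group whose classification condition is "place: Tokyo"
assert matches_conditions({"place": "Tokyo", "date": "2007-12-07"}, {"place": "Tokyo"})
assert not matches_conditions({"place": "Osaka"}, {"place": "Tokyo"})
```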
  • the display control unit 118 controls display so that the object 213 may be taken into the object group 233 as shown in FIG. 5C .
  • the object 213 may be uplifted over the object group 233 once and then displayed as taken into the object group 233 .
  • If having determined that the attribute information does not match the classification conditions, the user instruction estimate unit 112 estimates that the user has not given an instruction to the effect that the classification processing should be started. In such a case, the information processing apparatus 100 may notify the user with an alarm, for example, to the effect that objects that do not match the classification conditions are about to be classified.
  • if the condition determination at step S 112 or S 114 comes up with “NO”, the processing by the information processing apparatus 100 returns to step S 102 , to wait for acquisition of movement information from the user.
  • the classification processing unit 114 causes the object group setting unit 116 to set an object group 233 as the object group 233 including the object 213 (S 116 ).
  • the object group setting unit 116 adds the identification information of the object 213 to the group information, and sets the flag information by which the object group 233 is defined as a target object group 242 (S 110 ).
  • the classification processing unit 114 starts classification processing based on the classification conditions.
  • the user can start classification processing on objects without explicitly setting a target object group or instructing the performing of the classification processing.
  • the display control unit 118 controls display so that the target object group 242 may be distinguished from the other object groups.
  • the user can easily confirm the progress status of the object classification processing.
  • the classification processing unit 114 starts classification processing based on classification conditions of the target object group 243 .
  • the classification processing unit 114 extracts objects that have attribute information matching the classification conditions and that are not included in the target object group 243 or any other object groups, as target objects 221 and 222 of classification processing (S 122 ). It is to be noted that if there are no target objects on the display screen, the classification processing unit 114 stops the classification processing on the target object group 243 .
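The extraction in step S 122 can be illustrated as below. The field names ("id", "attributes") and the set of grouped identifiers are assumptions introduced for this sketch.

```python
# Sketch of step S 122: extract, as targets of classification processing,
# objects whose attribute information matches the classification
# conditions and that belong to no object group yet.
def extract_target_objects(all_objects, grouped_ids, conditions):
    def matches(attributes):
        return all(attributes.get(k) == v for k, v in conditions.items())
    return [obj for obj in all_objects
            if obj["id"] not in grouped_ids and matches(obj["attributes"])]
```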
  • the display control unit 118 controls display so that the target objects 221 and 222 may be distinguished from the other objects.
  • the target objects 221 and 222 may be displayed as distinguished from the other objects by utilizing, for example, rotation, uplifting, spotlighting, or blinking.
  • the user can easily confirm a target object of classification processing.
  • the classification processing unit 114 moves the target objects 221 and 222 toward the target object group 243 at a constant movement speed (S 124 ).
  • the display control unit 118 controls display so that the target objects 221 and 222 move toward the target object group 243 at the constant movement speed.
  • the movement of the target objects 221 and 222 is expressed by changing the position information of the target objects 221 and 222 so that it nears the position information of the target object group 243 at a constant rate. Therefore, the user can confirm an instruction of the user estimated by the information processing apparatus 100 .
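The constant-rate position update of step S 124 amounts to stepping the target object's position toward the group's position by a fixed amount per update. The sketch below is illustrative; all names and the snap-to-target behavior are assumptions.

```python
import math

# Sketch of step S 124: change the position information so that it nears
# the target object group's position at a constant rate per update.
def step_toward(pos, target, speed):
    dx, dy = target[0] - pos[0], target[1] - pos[1]
    distance = math.hypot(dx, dy)
    if distance <= speed:  # within one step: arrive at the group
        return target
    scale = speed / distance
    return (pos[0] + dx * scale, pos[1] + dy * scale)
```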
  • the classification processing unit 114 determines whether the target objects 221 and 222 have moved in such a manner as to be disposed close to the target object group 243 , based on the position information of the target objects 221 and 222 and the position information of the target object group 243 (S 126 ). In this case, if the target objects 221 and 222 and the target object group 243 are disposed within a predetermined distance, the classification processing unit 114 determines that both of them are disposed close to each other.
  • the classification processing unit 114 causes the object group setting unit 116 to set the target object group 243 as the target object group 243 including the target object 221 (S 128 ).
  • the object group setting unit 116 adds the identification information of the target object 221 to the group information.
  • the display control unit 118 controls display so that the target object 221 may be taken into the target object group 243 as shown in FIG. 6C .
  • the classification processing unit 114 determines whether there are any other target objects yet to be classified (S 130 ). If there are no other target objects, the classification processing unit 114 stops the classification processing on the target object group 243 , and the processing by the information processing apparatus 100 returns to step S 102 , to wait for acquisition of movement information from the user. On the other hand, if there are any other target objects, the classification processing unit 114 continues with the following processing.
  • the operation information acquisition unit 102 acquires movement information about the target objects 221 and 222 , based on which movement information the display control unit 118 controls display of the target objects 221 and 222 .
  • the user instruction estimate unit 112 estimates a user's instruction for classification processing based on the movement information of the target objects 221 and 222 as described below. Thus, the user can easily transmit an instruction for object classification processing to the information processing apparatus 100 .
  • the user instruction estimate unit 112 determines whether the target objects 221 and 222 are prompted to move or inhibited from moving, based on the movement information of the target objects 221 and 222 (S 132 , S 136 ). In this case, the user instruction estimate unit 112 determines that the target objects 221 and 222 are prompted to move if the target object 222 is moved toward the target object group 243 as shown in FIG. 7A , and determines that they are inhibited from moving if the target object 222 is moved away from the target object group 243 or stopped in motion as shown in FIG. 7C .
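One way to read the determinations of steps S 132 and S 136 is as a comparison of distances to the group before and after the user's drag, sketched below. The function and label names are assumptions for illustration.

```python
import math

# Illustrative reading of steps S 132 / S 136: a drag that brings the
# target object nearer to the target object group reads as "prompted to
# move" (FIG. 7A); a drag away from the group, or holding the object in
# place, reads as "inhibited from moving" (FIG. 7C).
def estimate_movement_instruction(old_pos, new_pos, group_pos):
    before = math.hypot(group_pos[0] - old_pos[0], group_pos[1] - old_pos[1])
    after = math.hypot(group_pos[0] - new_pos[0], group_pos[1] - new_pos[1])
    if after < before:
        return "prompted"   # moved toward the group
    return "inhibited"      # moved away or stopped in motion
```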
  • if the condition determination at step S 132 and S 136 comes up with “NO”, the processing by the classification processing unit 114 returns to step S 124 , to continue with the movement of the target objects 221 and 222 .
  • the user instruction estimate unit 112 estimates that the user has given an instruction to the effect that the classification processing under way should be continued. Thus, the user can easily transmit a continuation instruction for classification processing to the information processing apparatus 100 .
  • the user instruction estimate unit 112 estimates that the user has given an instruction to the effect that the classification processing under way should be stopped. Thus, the user can easily transmit a stoppage instruction for classification processing to the information processing apparatus 100 .
  • the classification processing unit 114 increases the movement speed of the target objects 221 and 222 (S 134 ). Then, the display control unit 118 controls display so that the target objects 221 and 222 may move toward the target object group 243 at an increased movement speed as shown in FIG. 7D . Thus, the user can confirm that an instruction for the continuation of classification processing is transmitted to the information processing apparatus 100 and, further, speed up the progress of the classification processing.
  • the classification processing unit 114 stops the movement of the target objects 221 and 222 (S 138 ).
  • the object group setting unit 116 sets flag information in which the target object group 243 is defined as an ordinary object group 233 (S 140 ).
  • the display control unit 118 controls display so that the target objects 221 and 222 are stopped in motion and displayed as ordinary objects 214 and 215 respectively as shown in FIG. 7D .
  • the user can confirm that an instruction for the stoppage of classification processing is transmitted to the information processing apparatus 100 .
  • the processing by the information processing apparatus 100 returns to step S 124 to continue with the movement of the target objects 221 and 222 .
  • the processing by the information processing apparatus 100 returns to step S 102 to wait for acquisition of movement information from the user.
  • the user can specify a specific object group, for example, through moving operations of the mouse and cancel the setting of the object group through menu operations.
  • the classification processing unit 114 causes the object group setting unit 116 to establish such setting as to cancel the setting of the object group.
  • the object group setting unit 116 specifies an object included in an object group based on identification information of the object contained in classification information of the object group and transmits information of the object to the display control unit 118 .
  • the display control unit 118 controls display so that the objects displayed as the object group may be displayed as ordinary objects, based on the identification information of the objects. Further, the object group setting unit 116 invalidates the identification information, the position information, the group information, and the flag information of the object group.
  • the user can change and modify the results of classification processing by the information processing apparatus 100 by, for example, reclassifying objects once classified, in accordance with the different classification conditions.
  • An object group has, as group information, identification information of its objects classified in accordance with classification conditions. This enables the user to move an object group on the display screen as in the case of objects, thereby permitting the information processing apparatus 100 to perform various kinds of processing on a plurality of objects as a target. For example, in a case where processing icons are disposed on the display screen for the performing of processing items such as saving, deleting, and converting objects, those items of processing can be performed on a plurality of the objects as a target by moving a group of the objects so that the group is disposed close to the processing icons.
  • a user's instruction for classification processing on an object is estimated based on attribute information and movement information of the object, and the object is classified based on the estimated user's instruction.
  • the user can easily classify the objects without explicitly specifying classification conditions or instructing the performing of the classification processing.
  • objects having the common attribute information are set as an object group having classification conditions that correspond to the attribute information, while an object group to be targeted by classification processing is set as a target object group.
  • classification processing will be performed on target objects which are yet to be classified and have the attribute information matching the classification conditions of the target object group.
  • Target objects yet to be classified will be classified based on the classification conditions of a target object group, so that the user can easily classify the objects without explicitly specifying classification conditions or instructing the performing of the classification processing.

Abstract

There is provided an information processing apparatus that classifies objects having the respective attribute information which are disposed on a display screen of a graphical user interface, the apparatus including an operation information acquisition unit that acquires operation information containing movement information which indicates a position of a movement destination of the object on the display screen, an instructive request estimate unit that estimates an instructive request for classification processing on the object based on the attribute information and the movement information of the object, a classification processing unit that classifies the object based on the estimated instructive request, and a display control unit that controls display of the object on the display screen. Thus, the user can easily classify the objects without explicitly specifying classification conditions or instructing the performing of the classification processing.

Description

CROSS REFERENCES TO RELATED APPLICATIONS
The present invention contains subject matter related to Japanese Patent Application JP 2007-317721 filed in the Japan Patent Office on Dec. 7, 2007, the entire contents of which are incorporated herein by reference.
BACKGROUND OF THE INVENTION Field of the Invention
The present invention relates to an information processing apparatus, an information processing method, and a program.
In the case of classifying information which is expressed as an object or the like in a graphical user interface (GUI) by using an information processing apparatus such as a computer, typically, the user specifies classification conditions explicitly and instructs the information processing apparatus to perform classification processing so that this information processing apparatus may perform the classification processing based on the classification conditions specified by the user.
In this case, the user needs to abstract desired classification conditions and explicitly specify the classification conditions so that the information processing apparatus can perform the classification processing. For example, in order to extract an object of photos taken in summer of 2007 from among a plurality of photo objects, the user needs to specify an abstracted condition expression such as “(‘2007-07’<=$year) && ($year<=‘2007-09’).” Further, the user needs to explicitly instruct the performing of classification processing in accordance with an operation procedure predetermined by the information processing apparatus or an application which is executed in it.
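For illustration only: applied programmatically, the abstracted condition expression above amounts to a date-range filter over photo objects, as in the following sketch. The object representation and field names are assumptions, not part of the patent.

```python
from datetime import date

# What the abstracted condition expression amounts to when evaluated
# against a hypothetical collection of photo objects.
photos = [
    {"name": "beach.jpg", "taken": date(2007, 8, 3)},
    {"name": "snow.jpg",  "taken": date(2007, 1, 15)},
]
summer_2007 = [p["name"] for p in photos
               if date(2007, 7, 1) <= p["taken"] <= date(2007, 9, 30)]
```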
SUMMARY OF THE INVENTION
However, in some cases, the typical user finds it difficult to work out such an abstracted condition expression as given above. Further, in the case of utilizing an information processing apparatus having a lot of functions, the user sometimes finds it difficult also to remember an operation procedure in accordance with which the user would instruct the information processing apparatus to perform classification processing.
It is desirable to provide an information processing apparatus, an information processing method, and a program that can easily classify objects disposed on the display screen of a graphical user interface.
According to a first embodiment of the present invention, there is provided an information processing apparatus that classifies objects having the respective attribute information which are disposed on a display screen of a graphical user interface, the apparatus including an operation information acquisition unit that acquires operation information containing movement information which indicates a position of a movement destination of the object on the display screen, an instructive request estimate unit that estimates an instructive request for classification processing on the object based on the attribute information and the movement information of the object, a classification processing unit that classifies the object based on the estimated instructive request, and a display control unit that controls display of the object on the display screen.
According to this configuration, an instructive request for classification processing on an object is estimated based on the attribute information and movement information of the object, so that the object is classified based on the estimated instructive request. Thus, the user can easily classify the objects without explicitly specifying classification conditions or instructing the performing of the classification processing.
It further includes an object group setting unit that sets an object group which includes a plurality of objects having attribute information common to them and which has classification conditions corresponding to the common attribute information and position information indicating a position on the display screen, in which the classification processing unit causes the object group setting unit to set the object group targeted by the classification processing as a target object group and classifies the unclassified target object having the attribute information that matches the classification conditions of the target object group based on the classification conditions of the target object group. Thus, a plurality of objects having attribute information common to them are set as an object group that has classification conditions which correspond to the attribute information, while an object group targeted by classification processing is set as a target object group. Then, an unclassified target object having the attribute information that matches the classification conditions of the target object group undergoes the classification processing. Accordingly, an unclassified target object is classified based on the classification conditions of a target object group, so that the user can easily classify the objects without explicitly specifying the classification conditions or instructing the performing of the classification processing.
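A hedged sketch of the object-group record implied above is given below: group information listing the members' identification information, classification conditions corresponding to the common attribute information, position information on the display screen, and flag information marking a target object group. All field names and values are assumptions for illustration.

```python
# Hypothetical object-group record, following the description:
object_group = {
    "id": "group-233",
    "members": ["obj-211", "obj-212"],   # group information
    "conditions": {"place": "Tokyo"},    # classification conditions
    "position": (400, 300),              # position information
    "is_target": False,                  # flag information
}
```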
Further, the operation information acquisition unit acquires first movement information about a first object and then acquires second movement information about a second object different from the first object, and if having determined that the first and second objects are disposed close to each other from the first and second movement information and also that the first and second objects have attribute information common to them, the instructive request estimate unit may estimate an instructive request for the start of classification processing which uses the common attribute information as the classification conditions. In such a manner, if a first object and a second object have been moved by the user so as to be disposed close to each other and also have attribute information common to them, an instructive request is estimated for the classification conditions to be employed in classification processing and the start of the classification processing. Therefore, the user can start classification processing on the objects without explicitly specifying the classification conditions or instructing the performing of the classification processing.
Further, the classification processing unit can cause the object group setting unit to set the first and second objects as a target object group, thus starting classification processing based on the classification conditions of the target object group. In such a manner, a plurality of objects moved by the user can be set as a target object group, and then the classification processing based on the classification conditions of the target object group starts. Therefore, the user can start classification processing on the objects without explicitly setting a target object group or instructing the performing of the classification processing.
Further, the operation information acquisition unit acquires movement information about objects, and if having determined, from position information of the object group and the movement information of the objects, that the objects and an object group have been disposed close to each other and also that the attribute information of the objects matches classification conditions of the object group, the instructive request estimate unit may estimate an instructive request for the start of classification processing. In such a manner, if objects and an object group have been moved by the user so as to be disposed close to each other and the attribute information of the objects also matches the classification conditions of the object group, an instructive request is estimated for the start of the classification processing. Therefore, the user can start classification processing on the objects without explicitly instructing the performing of the classification processing.
Further, the classification processing unit can cause the object group setting unit to set an object group including objects as a target object group, thus starting classification processing based on the classification conditions of the target object group. In such a manner, an object group including objects moved by the user is set as a target object group, and then the classification processing based on the classification conditions of the target object group starts. Therefore, the user can start classification processing on the objects without explicitly setting a target object group or instructing the performing of the classification processing.
Further, the classification processing unit may move a target object toward a target object group at a constant movement speed, while the display control unit may control display so that the target object may move toward the target object group at the constant movement speed. Accordingly, display is provided so that the target object may move toward a target object group at a constant movement speed. Therefore, the user can confirm an instructive request estimated by the information processing apparatus.
Further, if a target object has moved close to a target object group, the classification processing unit may cause the object group setting unit to set the target object group as a target object group that includes the target object. In such a manner, if a target object has moved close to a target object group, the target object group is set as a target object group that includes the target object. Therefore, the user can easily classify objects without explicitly specifying classification conditions or instructing the performing of the classification processing.
Further, the operation information acquisition unit may acquire movement information which indicates the position of a movement destination of a target object on the display screen, while the instructive request estimate unit may estimate an instructive request for classification processing based on the movement information of the target object. In such a manner, an instructive request for classification processing is estimated based on the movement information of a target object. Therefore, the user can easily transmit to the information processing apparatus an instructive request for the classification processing of the object.
Further, if having determined that a target object is prompted to move from the position information of the target object group and the movement information of the target object, the instructive request estimate unit may estimate an instructive request for the continuation of classification processing. In such a manner, if a target object is prompted to move, an instructive request for the continuation of classification processing is estimated. Therefore, the user can easily transmit to the information processing apparatus an instructive request for the continuation of the classification processing.
Further, if an instructive request for the continuation of classification processing is estimated by the instructive request estimate unit, the classification processing unit may increase the movement speed of a target object, while the display control unit may control display so that the target object may move toward the target object group at an increased movement speed. Accordingly, if an instructive request for the continuation of classification processing is estimated, display is provided so that the target object may move at an increased movement speed. Therefore, the user can confirm that an instructive request for the continuation of the classification processing has been transmitted to the information processing apparatus and, further, speed up the progress of the classification processing.
Further, if having determined that a target object is inhibited from moving from the position information of the target object group and the movement information of the target object, the instructive request estimate unit may estimate an instructive request for the stoppage of classification processing. In such a manner, if a target object is inhibited from moving, an instructive request for the stoppage of classification processing is estimated. Therefore, the user can easily transmit to the information processing apparatus an instructive request for the stoppage of the classification processing.
Further, if an instructive request for the stoppage of classification processing is estimated by the instructive request estimate unit, the classification processing unit may stop the movement of a target object, while the display control unit may control display so that the movement of the target object may be stopped. Accordingly, if an instructive request for the stoppage of classification processing is estimated, display is provided so that the target object may stop its movement. Therefore, the user can confirm that an instructive request for the stoppage of the classification processing has been transmitted to the information processing apparatus.
Further, if an instructive request for the stoppage of classification processing is estimated by the instructive request estimate unit, the classification processing unit may cause the object group setting unit to set the target object group as an object group. Accordingly, if an instructive request for the stoppage of classification processing is estimated, the target object group is set as an object group. Therefore, the user can stop the classification processing by the information processing apparatus without explicitly instructing the stoppage of the classification processing.
Further, the operation information acquisition unit may acquire setting cancellation information which indicates an instruction to cancel the setting of a specific object group, while the classification processing unit may cause the object group setting unit to set cancellation of the setting of the specific object group based on the setting cancellation information. In such a manner, the setting of a specific object group is canceled based on setting cancellation information. Therefore, the user can change and modify the results of classification processing by the information processing apparatus.
Further, the display control unit may control display so that the target object group can be distinguished from the other object groups than itself. Accordingly, display is provided so that the target object group may be distinguished from the other object groups than itself. Therefore, the user can easily confirm the progress status of object classification processing.
Further, the display control unit may control display so that a target object can be distinguished from the other objects than itself. Accordingly, display is provided so that a target object may be distinguished from the other objects than itself. Therefore, the user can easily confirm a target object which undergoes classification processing.
According to a second embodiment of the present invention, there is provided an information processing method of classifying objects having the respective attribute information which are disposed on a display screen of a graphical user interface, the method including the steps of: acquiring operation information containing movement information which indicates a position of a movement destination of the object on the display screen; estimating an instructive request for classification processing on the object based on the attribute information and the movement information of that object; classifying the object based on the estimated instructive request; and controlling display of the object on the display screen.
According to a third embodiment of the present invention, there is provided a program that causes a computer to perform the information processing method according to the second embodiment of the present invention described above.
According to the embodiments of the present invention described above, there can be provided an information processing apparatus, an information processing method, and a program that can easily classify objects disposed on the display screen of a graphical user interface.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a block diagram showing main functional components of an information processing apparatus according to one embodiment of the present invention;
FIG. 2 is a flowchart showing a flow of classification processing by the information processing apparatus;
FIG. 3 is a flowchart showing a flow of classification processing by the information processing apparatus;
FIG. 4A is a schematic diagram explaining the classification processing by the information processing apparatus;
FIG. 4B is a schematic diagram explaining the classification processing by the information processing apparatus;
FIG. 4C is a schematic diagram explaining the classification processing by the information processing apparatus;
FIG. 4D is a schematic diagram explaining the classification processing by the information processing apparatus;
FIG. 5A is a schematic diagram explaining the classification processing by the information processing apparatus;
FIG. 5B is a schematic diagram explaining the classification processing by the information processing apparatus;
FIG. 5C is a schematic diagram explaining the classification processing by the information processing apparatus;
FIG. 5D is a schematic diagram explaining the classification processing by the information processing apparatus;
FIG. 6A is a schematic diagram explaining the classification processing by the information processing apparatus;
FIG. 6B is a schematic diagram explaining the classification processing by the information processing apparatus;
FIG. 6C is a schematic diagram explaining the classification processing by the information processing apparatus;
FIG. 6D is a schematic diagram explaining the classification processing by the information processing apparatus;
FIG. 7A is a schematic diagram explaining the classification processing by the information processing apparatus;
FIG. 7B is a schematic diagram explaining the classification processing by the information processing apparatus;
FIG. 7C is a schematic diagram explaining the classification processing by the information processing apparatus; and
FIG. 7D is a schematic diagram explaining the classification processing by the information processing apparatus.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the appended drawings. Note that, in the specification and the appended drawings, structural elements that have substantially the same function and structure are denoted with the same reference numerals, and repeated explanation of these structural elements is omitted.
(Functional Components of Information Processing Apparatus)
FIG. 1 is a block diagram showing main functional components of an information processing apparatus 100 according to one embodiment of the present invention.
The information processing apparatus 100 may be, for example, a personal computer, a personal digital assistant (PDA), or a cellular phone and has a display screen integrally provided on it or a connection to a display screen provided separately from it. It is to be noted that although the following will describe a case where a display screen is provided integrally on the information processing apparatus 100, the present invention is similarly applicable also to a case where it is provided separately.
As shown in FIG. 1, the information processing apparatus 100 includes an operation information acquisition unit 102, a storage unit 104, a display unit 106, and a control processing unit 110.
The operation information acquisition unit 102 acquires operation information entered by the user through a keyboard, a pointing device, and the like. The operation information contains movement information which indicates the position of a movement destination of an object on the display screen. It is to be noted that although the following will describe a case where operation information is entered through a mouse, the operation information may be entered through any other input device such as a touch panel.
The storage unit 104 is constituted of a storage memory such as a RAM or a ROM and stores information about programs used to make the information processing apparatus 100 operative, and objects. The display unit 106 is constituted of a display, a monitor, or the like which has a display screen and displays a plurality of pieces of information expressed as an object or the like on the display screen.
The control processing unit 110 includes such function units as a user instruction estimate unit (instructive request estimate unit) 112, a classification processing unit 114, an object group setting unit 116, and a display control unit 118 and manages overall control processing on the information processing apparatus 100, including control processing by use of these function units.
The user instruction estimate unit 112 estimates a user's instruction for classification processing on an object based on attribute information and movement information of the object. The classification processing unit 114 classifies the object based on the user's instruction estimated by the user instruction estimate unit 112. The object group setting unit 116 sets an object group which includes a plurality of objects having attribute information common to them and which has classification conditions corresponding to the common attribute information and position information indicating a position of the object group on the display screen. The display control unit 118 controls the display unit 106, which displays objects on the display screen.
Of these function components, the operation information acquisition unit 102, the user instruction estimate unit 112, the classification processing unit 114, the object group setting unit 116, and the display control unit 118 can be realized in the information processing apparatus 100 as hardware and/or software such as a dedicated electronic circuit or a program which is executed by a CPU.
An object may be any kind of information that can be a target of classification processing by the information processing apparatus 100, such as content data including images, video, and audio, or an icon in a graphical user interface (GUI). An object has identification information that can identify the object, at least one piece of attribute information, and position information.
The attribute information of an object contains information assigned by the user and information assigned automatically by the information processing apparatus 100 or the like. In a case where the object is a photo content, the attribute information contains, for example, the date and time when the photo was taken, the place where it was taken, and the person who took it.
The position information of an object is set in order to specify a position and a range where the object is displayed on the display screen. The position information is set as two-dimensional coordinate information in a case where an object is disposed in a two-dimensional coordinate plane and as three-dimensional coordinate information in a case where an object is disposed in a three-dimensional coordinate space.
A plurality of objects classified based on attribute information common to them can make up an object group. An object group has identification information that can identify the object group, classification conditions, position information, group information, and flag information.
The classification conditions for an object group correspond to attribute information common to a plurality of objects that make up an object group. Classification conditions are set as at least one piece of attribute information in order to classify the objects and used as composite conditions if set as a plurality of pieces of attribute information.
Similar to the position information of objects, the position information of an object group is set in order to specify a position and a range where the object group is displayed on the display screen.
The group information of an object group is a set of the identification information of a plurality of objects that make up the object group and set in order to specify the objects classified in the object group.
The flag information of an object group is set in order to specify whether the object group corresponds to the target object group or an ordinary object group. Here, although described in detail later, the target object group means an object group that is handled as a target of classification processing by the information processing apparatus 100.
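The object and object-group records described above can be sketched as simple data structures. This is an illustrative sketch only, not part of the patent disclosure; the Python field names are assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class Obj:
    """An object: identification information, attribute information, and position information."""
    obj_id: str
    attributes: dict     # e.g. {"date": "2007-12-01", "place": "Tokyo"}
    position: tuple      # 2-D (x, y) or 3-D (x, y, z) coordinates on the display screen

@dataclass
class ObjectGroup:
    """An object group: classification conditions, position information,
    group information (member IDs), and flag information (target or ordinary)."""
    group_id: str
    conditions: dict     # attribute information common to the member objects
    position: tuple
    members: set = field(default_factory=set)   # identification info of member objects
    is_target: bool = False                     # flag info: target object group or not
```

The flag information is modeled here as a simple boolean; the specification leaves its encoding open.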
The information processing apparatus 100 according to the present embodiment is operative to classify objects, each having attribute information, that are disposed on the display screen of a GUI. The following will describe the classification processing of objects by the information processing apparatus 100.
(Classification Processing by Information Processing Apparatus)
FIGS. 2 and 3 are flowcharts showing the flow of classification processing by the information processing apparatus 100. FIGS. 4A to 4D, FIGS. 5A to 5D, FIGS. 6A to 6D, and FIGS. 7A to 7D are schematic diagrams explaining the classification processing by the information processing apparatus 100. It is to be noted that in these figures, an object, an object group, and a target object group are indicated as a square icon, a pile of square icons, and a hatched pile, respectively.
First, assume a case in which a plurality of objects 211 and 212 are disposed on a display screen of the information processing apparatus 100 in a condition where they are not classified as shown in FIG. 4A.
The user moves a pointer 200 on the display screen through moving operations of a mouse and clicks the mouse in a condition where the pointer 200 is disposed on a specific object, thereby specifying the specific object. If the object is specified, the operation information acquisition unit 102 acquires position information that indicates a position of the pointer 200 on the display screen so that the specified object is specified based on the position information of the pointer 200 and the position information of the object.
The user moves the pointer 200 on the display screen through press-and-drag operations in a condition where the object is specified, thereby moving the object in a condition where its movement is interlocked with that of the pointer 200. If the object is moved, the operation information acquisition unit 102 acquires movement information that indicates a position of the pointer 200 on the display screen so that the movement information of the object may be changed as needed. Then, based on the movement information changed as needed, the display control unit 118 controls display of the object on the display screen.
Next, as shown in FIG. 4B, the user releases the press-and-drag operations in a condition where the object 211 is moved to an arbitrary position on the display screen, thereby completing the movement of the object 211. If the movement of the object 211 is completed, the operation information acquisition unit 102 acquires movement information that indicates a position to which the pointer 200 is moved on the display screen so that the position information of the object 211 may be updated based on this movement information. Then, based on the updated position information, the display control unit 118 controls display of the object 211 on the display screen.
Here, assume a case where the user has moved the first object 211 and then moved the second object 212 different from the first object 211 so that it is disposed close to the first object 211 on the display screen as shown in FIG. 4C. In this case, as described above, the operation information acquisition unit 102 acquires the first movement information about the first object 211 and the second movement information about the second object 212 (step S102). Then, based on the first and second movement information, the display control unit 118 controls display of the first and second objects 211 and 212.
If the movement information is acquired, the user instruction estimate unit 112 determines whether the first and second objects 211 and 212 are disposed close to each other based on the movement information of the first and second objects 211 and 212 (S104).
In this case, if a plurality of objects are overlapped with each other or otherwise disposed within a predetermined distance, the user instruction estimate unit 112 determines that they are disposed close to each other. If having determined that the objects 211 and 212 are disposed close to each other, the user instruction estimate unit 112 determines whether the first and second objects 211 and 212 have common attribute information based on the attribute information of the first and second objects 211 and 212 (S106).
If having determined that the first and second objects 211 and 212 have common attribute information, the user instruction estimate unit 112 estimates that the user has given an instruction to the effect that classification processing using the common attribute information as classification conditions should be started. This enables the user to start object classification processing without explicitly specifying classification conditions or instructing the performing of the classification processing.
On the other hand, if having determined that the first and second objects 211 and 212 have no common attribute information, the user instruction estimate unit 112 estimates that the user has not given an instruction to the effect that the classification processing should be started. In such a case, the information processing apparatus 100 may notify the user with an alarm, for example, to the effect that objects having different attribute information are going to be classified.
It is to be noted that if the condition determination at step S102 or S106 comes up with “NO”, the processing by the information processing apparatus 100 returns to step S102, to wait for acquisition of movement information from the user.
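The determinations at steps S104 and S106 can be expressed as follows. This is an illustrative sketch, not part of the disclosure; the Euclidean-distance metric, the threshold value, and the function names are assumptions.

```python
def are_close(pos_a, pos_b, threshold=40.0):
    """Step S104: two items count as 'close' when overlapped or otherwise
    within a predetermined distance of each other."""
    dx, dy = pos_a[0] - pos_b[0], pos_a[1] - pos_b[1]
    return (dx * dx + dy * dy) ** 0.5 <= threshold

def common_attributes(attrs_a, attrs_b):
    """Step S106: the attribute information shared by two objects --
    the pairs whose key and value both match."""
    return {k: v for k, v in attrs_a.items() if attrs_b.get(k) == v}
```

A non-empty result from `common_attributes` corresponds to the "YES" branch of step S106, from which the classification conditions are taken.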
If it is estimated that the user has instructed the start of classification processing, the classification processing unit 114 causes the object group setting unit 116 to set an object group 231 made up by the first and second objects 211 and 212 (S108). In this case, the object group setting unit 116 assigns unique identification information to the object group 231. The object group setting unit 116 sets as classification conditions the attribute information common to the first and second objects 211 and 212 and sets as a group information a set of the identification information of the first and second objects 211 and 212.
Further, the object group setting unit 116 sets position information of the object group 231 based on the position information of the first and second objects 211 and 212 and flag information by which the object group 231 is defined as the target object group 241 to be classified (S110).
Once the object group is thus completely set by the object group setting unit 116, the classification processing unit 114 starts classification processing based on the classification conditions. Thus, the user can start classification processing on objects without explicitly setting a target object group or instructing the performing of the classification processing.
On the other hand, as shown in FIG. 4D, the display control unit 118 controls display so that the target object group 241 may be distinguished from the other object groups. It is to be noted that the target object group 241 may be displayed as distinguished from the other object groups by utilizing, for example, rotation, uplifting, spotlighting, or blinking. Thus, the user can easily confirm the progress status of the object classification processing.
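The group formation of steps S108 and S110 can be sketched as below. The field names and the ID scheme are illustrative assumptions, not part of the disclosure.

```python
import itertools

_group_ids = itertools.count(1)

def make_target_group(obj_a, obj_b, shared_attributes):
    """Steps S108/S110: form an object group from two close objects. The shared
    attribute information becomes the classification conditions, the pair's
    identification information becomes the group information, and the flag
    information marks the group as the target object group."""
    return {
        "group_id": "group-%d" % next(_group_ids),      # unique identification info
        "conditions": dict(shared_attributes),          # classification conditions
        "members": {obj_a["obj_id"], obj_b["obj_id"]},  # group information
        "position": obj_b["position"],                  # e.g. where the pile is displayed
        "is_target": True,                              # flag information
    }
```

Marking `is_target` as true corresponds to the flag information set at step S110, by which the group becomes the target of the ensuing classification processing.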
Second, assume a case where there are object groups 232 and 233 on the display screen other than the target object group as shown in FIG. 5A. In this case, if the user has moved the object 213 so that it is disposed close to the object group 233 as shown in FIG. 5B, the operation information acquisition unit 102 acquires movement information about the object 213 (S102). Then, based on the movement information, the display control unit 118 controls display of the object 213.
If the movement information is acquired, the user instruction estimate unit 112 determines whether the object 213 and the object groups 232 and 233 are disposed close to each other based on the movement information of the object 213 and the position information of the object groups 232 and 233 (S112). In this case, if the object 213 and the object groups 232 and 233 are disposed within a predetermined distance, the user instruction estimate unit 112 determines that the object 213 and the object groups 232 and 233 are disposed close to each other.
If having determined that the object 213 and the object group 233 are disposed close to each other, the user instruction estimate unit 112 determines whether attribute information of the object 213 matches classification conditions of the object group 233 based on the attribute information and the classification conditions (S114). If having determined that the attribute information matches the classification conditions, the user instruction estimate unit 112 estimates that the user has given an instruction to the effect that classification processing should be started. This enables the user to start object classification processing without explicitly instructing the performing of the classification processing.
Then, the display control unit 118 controls display so that the object 213 may be taken into the object group 233 as shown in FIG. 5C. For example, the object 213 may be uplifted over the object group 233 once and then displayed as taken into the object group 233.
On the other hand, if having determined that the attribute information does not match the classification conditions, the user instruction estimate unit 112 estimates that the user has not given an instruction to the effect that the classification processing should be started. In such a case, the information processing apparatus 100 may notify the user with an alarm, for example, to the effect that objects that do not match the classification conditions are going to be classified.
It is to be noted that if the condition determination at step S112 or S114 comes up with “NO”, the processing by the information processing apparatus 100 returns to step S102, to wait for acquisition of movement information from the user.
If it is estimated that the user has instructed the start of classification processing, the classification processing unit 114 causes the object group setting unit 116 to update the object group 233 so that it includes the object 213 (S116). In this case, the object group setting unit 116 adds the identification information of the object 213 to the group information and sets the flag information by which the object group 233 is defined as a target object group 242 (S110).
Once the object group is thus completely set by the object group setting unit 116, the classification processing unit 114 starts classification processing based on the classification conditions. Thus, the user can start classification processing on objects without explicitly setting a target object group or instructing the performing of the classification processing.
On the other hand, as shown in FIG. 5D, the display control unit 118 controls display so that the target object group 242 may be distinguished from the other object groups. Thus, the user can easily confirm the progress status of the object classification processing.
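Taking a moved object into an existing group (steps S112 through S116) can be sketched as follows. The proximity threshold and the dictionary layout are illustrative assumptions.

```python
def matches_conditions(attributes, conditions):
    """Step S114: attribute information matches classification conditions when
    every condition key/value pair is present in the attributes."""
    return all(attributes.get(k) == v for k, v in conditions.items())

def try_take_into_group(obj, group, threshold=40.0):
    """Steps S112-S116: if a moved object lands close to an existing group and
    matches its classification conditions, take the object into the group and
    flag the group as the target object group."""
    dx = obj["position"][0] - group["position"][0]
    dy = obj["position"][1] - group["position"][1]
    if (dx * dx + dy * dy) ** 0.5 > threshold:
        return False                                  # not disposed close (S112: NO)
    if not matches_conditions(obj["attributes"], group["conditions"]):
        return False                                  # conditions not met (S114: NO)
    group["members"].add(obj["obj_id"])               # update group information (S116)
    group["is_target"] = True                         # set flag information (S110)
    return True
```

A `False` return corresponds to the processing returning to step S102 to wait for further movement information from the user.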
Third, if a target object group 243 is set as shown in FIG. 6A, the classification processing unit 114 starts classification processing based on the classification conditions of the target object group 243. First, the classification processing unit 114 extracts objects that have attribute information matching the classification conditions and that are not included in the target object group 243 or any other object group, as target objects 221 and 222 of the classification processing (S122). It is to be noted that if there are no target objects on the display screen, the classification processing unit 114 stops the classification processing on the target object group 243.
On the other hand, the display control unit 118 controls display so that the target objects 221 and 222 may be distinguished from the other objects. The target objects 221 and 222 may be displayed as distinguished from the other objects by utilizing, for example, rotation, uplifting, spotlighting, or blinking. Thus, the user can easily confirm a target object of classification processing.
If the target objects 221 and 222 are extracted, the classification processing unit 114 moves the target objects 221 and 222 toward the target object group 243 at a constant movement speed (S124). The display control unit 118 controls display so that the target objects 221 and 222 move toward the target object group 243 at the constant movement speed. In this case, the movement of the target objects 221 and 222 is expressed by changing the position information of the target objects 221 and 222 so that it nears the position information of the target object group 243 at a constant rate. Thus, the user can confirm the instruction estimated by the information processing apparatus 100.
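The constant-rate movement of step S124 amounts to advancing a position a fixed amount per animation tick along the straight line to the group. This sketch is illustrative; the units (pixels per tick) are an assumption.

```python
def step_toward(pos, target, speed):
    """Step S124: advance a target object's position toward the target object
    group by a constant amount per tick."""
    dx, dy = target[0] - pos[0], target[1] - pos[1]
    dist = (dx * dx + dy * dy) ** 0.5
    if dist <= speed:
        return target         # arrived: now disposed close to the group (S126)
    return (pos[0] + dx / dist * speed, pos[1] + dy / dist * speed)
```

Calling `step_toward` repeatedly with a larger `speed` models the speed-up of step S134 described later.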
If the target objects 221 and 222 start moving, the classification processing unit 114 determines whether the target objects 221 and 222 have moved in such a manner as to be disposed close to the target object group 243, based on the position information of the target objects 221 and 222 and the position information of the target object group 243 (S126). In this case, if the target objects 221 and 222 and the target object group 243 are disposed within a predetermined distance, the classification processing unit 114 determines that both of them are disposed close to each other.
If having determined that both are disposed close to each other as shown in FIG. 6B, the classification processing unit 114 causes the object group setting unit 116 to set the target object group 243 as the target object group 243 including the target object 221 (S128). In this case, the object group setting unit 116 adds the identification information of the target object 221 to the group information. Thus, the user can easily classify objects without explicitly specifying classification conditions or instructing the performing of classification processing. Then, the display control unit 118 controls display so that the target object 221 may be taken into the target object group 243 as shown in FIG. 6C.
If the target object 221 is set as part of the target object group 243, the classification processing unit 114 determines whether there are any other target objects yet to be classified (S130). If there are no other target objects, the classification processing unit 114 stops the classification processing on the target object group 243, and the processing by the information processing apparatus 100 returns to step S102, to wait for acquisition of movement information from the user. On the other hand, if there are any other target objects, the classification processing unit 114 continues with the following processing.
If the user manipulates the target objects 221 and 222 while they are in motion on the screen, the operation information acquisition unit 102 acquires movement information about the target objects 221 and 222, based on which the display control unit 118 controls their display. The user instruction estimate unit 112 estimates a user's instruction for classification processing based on the movement information of the target objects 221 and 222 as described below. Thus, the user can easily transmit an instruction for object classification processing to the information processing apparatus 100.
The user instruction estimate unit 112 determines whether the target objects 221 and 222 are prompted to move or inhibited from moving, based on the movement information of the target objects 221 and 222 (S132, S136). In this case, the user instruction estimate unit 112 determines that the target objects 221 and 222 are prompted to move if the target object 222 is moved toward the target object group 243 as shown in FIG. 7A, and determines that they are inhibited from moving if the target object 222 is moved away from the target object group 243 or stopped in motion as shown in FIG. 7C.
It is to be noted that if the condition determination at steps S132 and S136 comes up with “NO”, the processing by the classification processing unit 114 returns to step S124, to continue with the movement of the target objects 221 and 222.
If having determined that the target object 222 is prompted to move, the user instruction estimate unit 112 estimates that the user has given an instruction to the effect that the classification processing under way should be continued. Thus, the user can easily transmit a continuation instruction for classification processing to the information processing apparatus 100. On the other hand, if having determined that the target object 222 is inhibited from moving, the user instruction estimate unit 112 estimates that the user has given an instruction to the effect that the classification processing under way should be stopped. Thus, the user can easily transmit a stoppage instruction for classification processing to the information processing apparatus 100.
If the user's instruction is estimated for continuation of the classification processing, the classification processing unit 114 increases the movement speed of the target objects 221 and 222 (S134). Then, the display control unit 118 controls display so that the target objects 221 and 222 may move toward the target object group 243 at an increased movement speed as shown in FIG. 7B. Thus, the user can confirm that an instruction for the continuation of classification processing is transmitted to the information processing apparatus 100 and, further, speed up the progress of the classification processing.
On the other hand, if the user's instruction is estimated for stoppage of the classification processing, the classification processing unit 114 stops the movement of the target objects 221 and 222 (S138). In this case, the object group setting unit 116 sets flag information in which the target object group 243 is defined as an ordinary object group 233 (S140). Thus, the user can stop classification processing by the information processing apparatus 100 without explicitly instructing stoppage of the classification processing. Then, the display control unit 118 controls display so that the target objects 221 and 222 are stopped in motion and displayed as ordinary objects 214 and 215 respectively as shown in FIG. 7D. Thus, the user can confirm that an instruction for the stoppage of classification processing is transmitted to the information processing apparatus 100.
If the user's instruction is estimated for the continuation of classification processing, the processing by the information processing apparatus 100 returns to step S124 to continue with the movement of the target objects 221 and 222. On the other hand, if the user's instruction is estimated for the stoppage of the classification processing, the processing by the information processing apparatus 100 returns to step S102 to wait for acquisition of movement information from the user.
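The estimation of steps S132 and S136 can be sketched as a heuristic over two successive positions of a dragged target object. This is illustrative only; the epsilon for detecting a "stopped" drag is an assumption.

```python
def estimate_instruction(prev_pos, new_pos, group_pos, eps=1.0):
    """Steps S132/S136: infer the user's instruction from a drag on a moving
    target object -- toward the group means continue the classification
    processing; away from the group or held still means stop it."""
    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
    if dist(prev_pos, new_pos) < eps:
        return "stop"                       # stopped in motion: inhibited (S136)
    if dist(new_pos, group_pos) < dist(prev_pos, group_pos):
        return "continue"                   # prompted toward the group (S132)
    return "stop"                           # moved away from the group (S136)
```

A "continue" result corresponds to the speed increase of step S134, and a "stop" result to halting the movement (S138) and clearing the target flag (S140).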
In order to, for example, change or modify the results of classification processing by the information processing apparatus 100, the user can specify a specific object group through, for example, moving operations of the mouse and cancel the setting of the object group through menu operations. In this case, if the operation information acquisition unit 102 has acquired setting cancellation information for the object group, the classification processing unit 114 causes the object group setting unit 116 to cancel the setting of the object group.
The object group setting unit 116 specifies the objects included in the object group based on the identification information of the objects contained in the group information of the object group and transmits the information of those objects to the display control unit 118. The display control unit 118 in turn controls display so that the objects displayed as the object group may be displayed as ordinary objects, based on the identification information of the objects. Further, the object group setting unit 116 invalidates the identification information, the position information, the group information, and the flag information of the object group. Thus, the user can change and modify the results of classification processing by the information processing apparatus 100 by, for example, reclassifying objects once classified in accordance with different classification conditions.
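Setting cancellation can be sketched as removing the group's record and handing the member identification information back for redisplay. The dictionary layout is an illustrative assumption.

```python
def cancel_group_setting(groups, group_id):
    """Setting cancellation: remove the object group's record, which invalidates
    its identification, position, group, and flag information, and return the
    identification information of its members so that they can be displayed
    again as ordinary objects."""
    group = groups.pop(group_id)
    return sorted(group["members"])
```

The returned identifiers are what the display control unit would use to render each former member as an ordinary, ungrouped object.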
An object group has, as group information, the identification information of its objects classified in accordance with its classification conditions. This enables the user to move an object group on the display screen as in the case of objects, thereby permitting the information processing apparatus 100 to perform various kinds of processing on a plurality of objects as a target. For example, in a case where processing icons for items such as saving, deleting, or converting objects are disposed on the display screen, those items of processing can be performed on a plurality of objects at once by moving an object group so that it is disposed close to a processing icon.
As described above, by the information processing apparatus 100 according to the present embodiment, a user's instruction for classification processing on an object is estimated based on attribute information and movement information of the object, and the object is classified based on the estimated user's instruction. Thus, the user can easily classify the objects without explicitly specifying classification conditions or instructing the performing of the classification processing.
In particular, objects having common attribute information are set as an object group having classification conditions that correspond to the attribute information, while an object group to be targeted by classification processing is set as a target object group. Then, classification processing is performed on target objects which are yet to be classified and whose attribute information matches the classification conditions of the target object group, so that the user can easily classify the objects without explicitly specifying classification conditions or instructing the performing of the classification processing.
Although a preferred embodiment of the present invention is described in the foregoing with reference to the drawings, the present invention is not limited thereto. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and alterations may occur depending on design requirements and other factors insofar as they are within the scope of the appended claims or the equivalents thereof.

Claims (21)

What is claimed is:
1. An information processing apparatus having a central processing unit that classifies objects associated with the respective attribute information and associated with position information representing a position where the objects are disposed on a display screen, the apparatus comprising:
a display control unit that controls display of the object on the display screen, wherein the display control unit further comprises an object position control unit that acquires a movement instruction provided by a user and performs movement of the object on the display screen based on the movement instruction;
an operation information acquisition unit that acquires operation information of an object containing movement information, which indicates a movement destination position of the object on the display screen resulting from the movement of the object on the display screen by the object position control unit;
an instructive request estimate unit implemented by the central processing unit that determines, based on whether a compatibility level between the attribute information of the object and attribute information of a resultant proximate object is greater than a predetermined threshold, whether or not the movement of the object on the display screen by the object position control unit is a classification processing request for the object;
a classification processing unit that classifies the object based on a result of the determination by the instructive request estimate unit,
wherein the attribute information of the object is distinct from the operation information and a position of the object on the display screen resulting from movement of the object on the display screen by the object position control unit.
2. The information processing apparatus according to claim 1, wherein the instructive request estimate unit further compares the attribute information of the object with classification conditions of a target object group and, in response to obtaining a positive result from the comparison with regard to compatibility, concludes that the movement of the object is a classification processing request for the object, the target object group including a plurality of objects having a common attribute information and having the classification conditions corresponding to the common attribute information and position information indicating a position on the display screen.
3. The information processing apparatus according to claim 2, wherein
the operation information acquisition unit acquires the movement information about a plurality of objects, and
in response to having determined that the objects and a first object group are disposed close to each other from the position information of the first object group and the movement information of the objects and having determined that the attribute information of the objects matches the classification conditions of the first object group, the instructive request estimate unit determines that the movement of the objects is a classification processing request for adding the objects to the first object group.
4. The information processing apparatus according to claim 3, wherein the classification processing unit sets the first object group including the objects as the target object group, thus starting classification processing based on the classification conditions of the first object group.
5. The information processing apparatus according to claim 2, wherein
the classification processing unit initiates movement of a target object toward the target object group at a constant movement speed, and
the display control unit controls display so that the target object moves toward the target object group at the constant movement speed.
6. The information processing apparatus according to claim 5, wherein in response to the target object being permitted to move close to the target object group, the classification processing unit includes the target object in the target object group.
7. The information processing apparatus according to claim 5, wherein
the operation information acquisition unit acquires movement information of the target object, which indicates a movement destination position of the target object on the display screen resulting from movement of the target object, and
the instructive request estimate unit determines, based on the movement information of the target object, whether or not the movement of the target object is a classification processing request for the target object.
8. The information processing apparatus according to claim 7, wherein in response to having determined that the target object is prompted to move from the position information of the target object group and from the movement information of the target object, the instructive request estimate unit determines whether or not the movement of the target object resulting from the prompt is a request for continuation of the classification processing.
9. The information processing apparatus according to claim 8, wherein
in response to a determination by the instructive request estimate unit that the movement of the target object is the request for the continuation of the classification processing, the classification processing unit initiates an increase in the movement speed of the target object, and
the display control unit controls display so that the target object moves toward the target object group at the increased movement speed.
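A hedged sketch of the movement behavior described in claims 5 through 11: a target object drifts toward the target object group at a constant speed; a further user prompt toward the group is treated as a request to continue classification and raises the speed, while a prompt away from the group stops the movement. All names and numeric values here are illustrative assumptions, not details taken from the patent.

```python
BASE_SPEED = 1.0   # assumed constant movement speed (pixels per frame)
BOOST = 2.0        # assumed multiplier applied on a continuation request

class TargetObjectMotion:
    """Minimal model of the target object's motion toward its group."""

    def __init__(self):
        self.speed = BASE_SPEED
        self.moving = True

    def on_user_prompt(self, toward_group: bool):
        """A prompt toward the group continues classification at a higher
        speed (claims 8-9); a prompt away from it stops the movement
        (claims 10-11)."""
        if toward_group:
            self.speed *= BOOST
        else:
            self.moving = False
            self.speed = 0.0

    def step(self, pos: float, group_pos: float) -> float:
        """Advance one frame toward the group at the current speed,
        without overshooting the group position."""
        if not self.moving:
            return pos
        direction = 1.0 if group_pos > pos else -1.0
        return pos + direction * min(self.speed, abs(group_pos - pos))
```

Modeling the speed as mutable state keeps the display-control side simple: each frame it only needs to call `step` and draw the returned position.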
10. The information processing apparatus according to claim 7, wherein in response to having determined that the target object is inhibited from moving from the position information of the target object group and the movement information of the target object, the instructive request estimate unit determines that the movement of the target object is not a request for the classification processing.
11. The information processing apparatus according to claim 10, wherein
in response to the instructive request estimate unit determining that the movement is not the request for the classification processing, the classification processing unit initiates cessation of movement of the target object, and
the display control unit controls display so that movement of the target object is stopped.
12. The information processing apparatus according to claim 10, wherein in response to the instructive request estimate unit determining that the movement is not the request for the classification processing, the classification processing unit sets the target object group as a non-targeted object group.
13. The information processing apparatus according to claim 5, wherein the display control unit controls display in such a way that the target object is distinguishable from objects other than the target object.
14. The information processing apparatus according to claim 2, wherein
the operation information acquisition unit acquires setting cancellation information which indicates an instruction to cancel targeting of a specific object group, and
the classification processing unit cancels targeting of the specific object group based on the setting cancellation information.
15. The information processing apparatus according to claim 2, wherein the display control unit controls display in such a way that the target object group is distinguishable from object groups other than the target object group.
16. The information processing apparatus according to claim 1, wherein
the operation information acquisition unit acquires the movement information about the movement of the object and then acquires second movement information about a movement of a second object different from the object, and
in response to having determined that the object and second object are disposed close to each other from the movement information and the second movement information and having determined that the object and the second object have common attribute information, the instructive request estimate unit determines that the movement of the object and the movement of the second object is a classification processing request for classifying the object and the second object into a common new object group.
17. The information processing apparatus according to claim 16, wherein the classification processing unit groups the object and the second object into the new object group, thus creating the new object group based on classification conditions of the common attribute information of the object and the second object.
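An illustrative sketch of claims 16 and 17: when two objects are moved close to each other and share attribute information, the two movements are read together as one classification request that forms a new common object group keyed on the shared attributes. Every name and the distance radius below are assumptions for illustration only.

```python
def maybe_form_new_group(obj_attrs: set, second_attrs: set,
                         distance: float, radius: float = 50.0):
    """Return the common attributes, to serve as the new group's
    classification conditions, or None when no new group should form.

    A new group forms only if the objects ended up close together
    (distance <= radius) AND they share at least one attribute.
    """
    common = obj_attrs & second_attrs
    if distance <= radius and common:
        return common  # claim 17: new group keyed on the shared attributes
    return None        # plain movement; not a classification request
```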
18. An information processing method, implemented by an information processing apparatus having a central processing unit, of classifying objects associated with the respective attribute information and associated with position information representing a position where the objects are disposed on a display screen, the method comprising the steps of:
controlling display of the object on the display screen, the controlling further comprising
acquiring a movement instruction provided by a user, and
performing movement of the object on the display screen based on the movement instruction;
acquiring operation information containing movement information of an object, which indicates a movement destination position of the object on the display screen resulting from the movement of the object on the display screen by the performing;
determining, based on whether a compatibility level between the attribute information of the object and attribute information of a resultant proximate object is greater than a predetermined threshold using the central processing unit, whether or not the movement of the object on the display screen by the performing is a classification processing request for the object, the attribute information of the object being distinct from the operation information and a position of the object on the display screen resulting from movement of the object on the display screen by the performing; and
classifying the object based on a result of the determination by the determining.
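The method of claim 18 can be sketched as follows, under loudly stated assumptions: the class `Obj`, the Jaccard-style `compatibility` function, and the constants `CLASSIFY_THRESHOLD` and `PROXIMITY_RADIUS` are all hypothetical stand-ins for the claimed "compatibility level", "predetermined threshold", and proximity determination, not the patent's actual implementation.

```python
from dataclasses import dataclass, field
from typing import Optional

CLASSIFY_THRESHOLD = 0.5  # assumed predetermined threshold
PROXIMITY_RADIUS = 50.0   # assumed distance counted as "proximate"

@dataclass
class Obj:
    name: str
    pos: tuple                               # (x, y) on the display screen
    attrs: set = field(default_factory=set)  # attribute information
    group: Optional[str] = None

def compatibility(a: Obj, b: Obj) -> float:
    """Attribute-set overlap (Jaccard index) as the compatibility level."""
    if not a.attrs and not b.attrs:
        return 0.0
    return len(a.attrs & b.attrs) / len(a.attrs | b.attrs)

def move_and_classify(obj: Obj, dest: tuple, others: list) -> bool:
    """Move obj to dest; if a resulting proximate object is attribute-
    compatible above the threshold, treat the movement as a classification
    request and place both objects in a common group."""
    obj.pos = dest
    for other in others:
        dist = ((obj.pos[0] - other.pos[0]) ** 2 +
                (obj.pos[1] - other.pos[1]) ** 2) ** 0.5
        if dist <= PROXIMITY_RADIUS and \
           compatibility(obj, other) > CLASSIFY_THRESHOLD:
            group = other.group or f"group_{other.name}"
            obj.group = other.group = group
            return True   # movement interpreted as a classification request
    return False          # plain movement; no classification performed
```

Note the determination uses only attribute information and the resulting positions, mirroring the claim's point that the attribute information is distinct from the operation information and the on-screen position.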
19. A non-transitory computer readable medium having stored thereon a program that when executed by a computer causes the computer to perform an information processing method of classifying objects associated with the respective attribute information and associated with position information representing a position where the objects are disposed on a display screen, the information processing method comprising the steps of:
controlling display of the object on the display screen, the controlling further comprising
acquiring a movement instruction provided by a user, and
performing movement of the object on the display screen based on the movement instruction;
acquiring operation information containing movement information of an object, which indicates a movement destination position of the object on the display screen resulting from movement of the object on the display screen by the performing;
determining, based on whether a compatibility level between the attribute information of the object and attribute information of a resultant proximate object is greater than a predetermined threshold, whether or not the movement of the object on the display screen by the performing is a classification processing request for the object, the attribute information of the object being distinct from the operation information and a position of the object on the display screen resulting from movement of the object on the display screen by the performing;
classifying the object based on a result of the determination by the determining; and
controlling display of the object on the display screen.
20. The information processing apparatus according to claim 1, wherein the attribute information of the object is distinct from the position of the object on the display screen resulting from movement of the object.
21. The information processing apparatus according to claim 1, wherein the attribute information is distinct from an appearance of the object on the display screen.
US12/274,619 2007-12-07 2008-11-20 Information processing apparatus, information processing method, and program Expired - Fee Related US8624931B2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2007-317721 2007-12-07
JP2007317721A JP5079480B2 (en) 2007-12-07 2007-12-07 Information processing apparatus, information processing method, and program

Publications (2)

Publication Number Publication Date
US20090147027A1 US20090147027A1 (en) 2009-06-11
US8624931B2 true US8624931B2 (en) 2014-01-07

Family

ID=40721171

Family Applications (1)

Application Number Title Priority Date Filing Date
US12/274,619 Expired - Fee Related US8624931B2 (en) 2007-12-07 2008-11-20 Information processing apparatus, information processing method, and program

Country Status (2)

Country Link
US (1) US8624931B2 (en)
JP (1) JP5079480B2 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6575077B2 (en) * 2015-02-23 2019-09-18 富士ゼロックス株式会社 Display control apparatus and display control program

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10207907A (en) 1997-01-27 1998-08-07 Mitsubishi Electric Corp Object arrangement, display and operation method and device using three-dimensional space
US5801699A (en) * 1996-01-26 1998-09-01 International Business Machines Corporation Icon aggregation on a graphical user interface
JPH11219369A (en) 1998-02-03 1999-08-10 Fujitsu Ltd Information presentation device
JP2002021644A (en) 2000-07-12 2002-01-23 Chuo Motor Wheel Co Ltd Liquefied gas fuel feeder for engine
JP2003288352A (en) 2002-01-23 2003-10-10 Matsushita Electric Ind Co Ltd Information analytic display device and information analytic display program
US20040143627A1 (en) * 2002-10-29 2004-07-22 Josef Dietl Selecting a renderer
US20050073585A1 (en) * 2003-09-19 2005-04-07 Alphatech, Inc. Tracking systems and methods
JP2006146729A (en) 2004-11-22 2006-06-08 National Institute Of Advanced Industrial & Technology Content retrieval display device and method, and program
US7274379B2 (en) * 2003-03-27 2007-09-25 Canon Kabushiki Kaisha Graphical object group management system
JP2007310890A (en) 2006-05-19 2007-11-29 Fuji Xerox Co Ltd Object organization method, system, and program
US7589750B1 (en) * 2006-03-15 2009-09-15 Adobe Systems, Inc. Methods and apparatus for arranging graphical objects
US7589749B1 (en) * 2005-08-16 2009-09-15 Adobe Systems Incorporated Methods and apparatus for graphical object interaction and negotiation

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6791711B1 (en) * 1998-06-24 2004-09-14 Canon Kabushiki Kaisha Image processing method, image processing apparatus, and recording medium


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Office Action issued Nov. 15, 2011, in Japanese Patent Application No. 2007-317721.

Also Published As

Publication number Publication date
US20090147027A1 (en) 2009-06-11
JP2009140367A (en) 2009-06-25
JP5079480B2 (en) 2012-11-21

Similar Documents

Publication Publication Date Title
US9990107B2 (en) Devices, methods, and graphical user interfaces for displaying and using menus
US9645732B2 (en) Devices, methods, and graphical user interfaces for displaying and using menus
KR102375794B1 (en) Structured suggestions
EP3495933B1 (en) Method and mobile device for displaying image
TWI545496B (en) Device, method, and graphical user interface for adjusting the appearance of a control
CN111324266B (en) Device, method and graphical user interface for sharing content objects in a document
US9712577B2 (en) Device, method, and graphical user interface for sharing content from a respective application
CN105335048B (en) Electronic equipment with hidden application icon and method for hiding application icon
US9690441B2 (en) Method and apparatus for managing message
CN107678644B (en) Image processing method and mobile terminal
US20140002387A1 (en) Electronic apparatus and control method
CN111339032A (en) Apparatus, method and graphical user interface for managing a folder having multiple pages
EP3103034A1 (en) User interface for searching
US20150193135A1 (en) Method for providing glance information, machine-readable storage medium, and electronic device
WO2014100958A1 (en) Method, apparatus and computer program product for providing a recommendation for an application
KR20210089799A (en) Systems and methods for interacting with multiple applications that are simultaneously displayed on an electronic device with a touch-sensitive display
WO2016195785A1 (en) 2016-12-08 Linking multiple windows in a user interface display
CN112947923A (en) Object editing method and device and electronic equipment
CN113268182A (en) Application icon management method and electronic equipment
US8624931B2 (en) Information processing apparatus, information processing method, and program
DK201500581A1 (en) Devices, Methods, and Graphical User Interfaces for Displaying and Using Menus
US20170322723A1 (en) Method and apparatus for executing function on a plurality of items on list
CN111796736B (en) Application sharing method and device and electronic equipment
CN114489414A (en) File processing method and device
CN113360062A (en) Display control method and device, electronic equipment and readable storage medium

Legal Events

Date Code Title Description
AS Assignment

Owner name: SONY CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MIYASHITA, KEN;MATSUDA, KOUICHI;REEL/FRAME:021868/0947;SIGNING DATES FROM 20080929 TO 20081007

Owner name: SONY CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MIYASHITA, KEN;MATSUDA, KOUICHI;SIGNING DATES FROM 20080929 TO 20081007;REEL/FRAME:021868/0947

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCF Information on status: patent grant

Free format text: PATENTED CASE

FPAY Fee payment

Year of fee payment: 4

FEPP Fee payment procedure

Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

LAPS Lapse for failure to pay maintenance fees

Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20220107