US20150097920A1 - Information processing apparatus and information processing method - Google Patents
- Publication number: US20150097920A1
- Application number: US14/571,473
- Authority: US (United States)
- Prior art keywords
- display
- image
- information processing
- display region
- captured
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/14—Systems for two-way working
- H04N7/15—Conference systems
-
- G06K9/00268—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/168—Feature extraction; Face representation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L12/00—Data switching networks
- H04L12/02—Details
- H04L12/16—Arrangements for providing special services to substations
- H04L12/18—Arrangements for providing special services to substations for broadcast or conference, e.g. multicast
- H04L12/1813—Arrangements for providing special services to substations for broadcast or conference, e.g. multicast for computer conferences, e.g. chat rooms
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L12/00—Data switching networks
- H04L12/02—Details
- H04L12/16—Arrangements for providing special services to substations
- H04L12/18—Arrangements for providing special services to substations for broadcast or conference, e.g. multicast
- H04L12/1813—Arrangements for providing special services to substations for broadcast or conference, e.g. multicast for computer conferences, e.g. chat rooms
- H04L12/1827—Network arrangements for conference optimisation or adaptation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/14—Systems for two-way working
- H04N7/141—Systems for two-way working between two video terminals, e.g. videophone
- H04N7/147—Communication arrangements, e.g. identifying the communication as a video-communication, intermediate storage of the signals
Definitions
- the present disclosure relates to an information processing apparatus and an information processing method.
- a television receiver includes a network communication function which enables not only reception and display of video and audio content of a program from a broadcasting station but also transmission and reception of various types of information to and from another receiver.
- Japanese Unexamined Patent Application Publication No. 2006-50370 discloses a technique of displaying, when a user views program content of television broadcasting using a television receiver, information on registered other users (such as thumbnail images of the other users, and names, channels, and video images of content viewed by the other users) together with the program content.
- in a display device such as a television receiver, a PIP (Picture in Picture) display method or a POP (Picture on Picture) display method is generally used to display two or more images in parallel.
- an information processing apparatus may include an obtaining unit to obtain a number of users from information on detection of a face region including a face in a captured image provided at the apparatus.
- the apparatus also may include a setting unit to set a display region for content and a display region for a captured image in a display screen.
- the apparatus may include a display image generation unit to generate a display image to be displayed in the display region for a captured image, in accordance with the information on the detection, the number of users, and the display region set for a captured image.
- a method may include obtaining a number of users from information on detection of a face region including a face in a captured image; setting a display region for content and a display region for a captured image in a display screen; and generating a display image to be displayed in the display region for a captured image, in accordance with the information on the detection, the number of users, and the display region set for a captured image.
- at least one of the obtaining, the setting and the generating may be by a processor.
- a non-transitory recording medium may be recorded with a computer-readable program having instructions executable by a processor.
- the program may include obtaining a number of users from information on detection of a face region including a face in a captured image; setting a display region for content and a display region for a captured image in a display screen; and generating a display image to be displayed in the display region for a captured image, in accordance with the information on the detection, the number of users, and the display region set for a captured image.
- display of content and display of a communication image may be appropriately performed in parallel.
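The obtaining, setting, and generating steps summarized above can be sketched as follows. This is only an illustrative model of the claimed pipeline: every function name, parameter, and screen dimension here is an assumption, not anything defined in the disclosure.

```python
# Hypothetical sketch of the claimed processing steps: obtain a head count
# from face-region detections, set the two display regions, and generate
# the display image layout. All names and numbers are illustrative.

def obtain_user_count(face_regions):
    """Obtain the number of users from face-region detection results."""
    return len(face_regions)

def set_display_regions(screen_w, screen_h, content_ratio):
    """Split the screen into a content region and a captured-image region
    according to a user-set display ratio. Regions are (x, y, w, h)."""
    content_w = int(screen_w * content_ratio)
    content_region = (0, 0, content_w, screen_h)
    captured_region = (content_w, 0, screen_w - content_w, screen_h)
    return content_region, captured_region

def generate_display_image(face_regions, captured_region):
    """Allocate an equal slice of the captured-image region per detected
    user, arranged in the horizontal direction."""
    n = obtain_user_count(face_regions)
    x, y, w, h = captured_region
    slice_w = w // max(n, 1)
    return [(x + i * slice_w, y, slice_w, h) for i in range(n)]
```

With a 16:9 screen and a 75% content ratio, three detected faces would each receive one third of the remaining width.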
- FIG. 1 is a diagram illustrating a display system according to an embodiment of the present disclosure
- FIG. 2 is a diagram illustrating a functional configuration of an information processing apparatus according to a first embodiment of the present disclosure
- FIG. 3 is a diagram illustrating a hardware configuration of the information processing apparatus shown in FIG. 2 ;
- FIG. 4 is a flowchart of a display process executed by the information processing apparatus shown in FIG. 2 ;
- FIG. 5 is a flowchart of a parallel display process included in the display process shown in FIG. 4 ;
- FIG. 6 is a diagram illustrating an image captured in another display system
- FIGS. 7A to 7D are diagrams illustrating content and captured images supplied from another information processing apparatus which are displayed in parallel in a display screen of a display device;
- FIG. 8A is a diagram illustrating an image captured by another display system when two users are using the display system
- FIG. 8B is a diagram illustrating images captured by other display systems when each of the display systems is used by a single user
- FIGS. 9A to 9D are diagrams illustrating content and a captured image supplied from another information processing apparatus which are displayed in parallel in the display screen of the display device;
- FIG. 10A is a diagram illustrating an image captured by another display system when three users are using the display system
- FIG. 10B is a diagram illustrating images captured by other display systems when one of the display systems is used by two users and the other is used by a single user;
- FIG. 10C is a diagram illustrating images captured by other display systems when each of the display systems is used by a single user
- FIGS. 11A to 11D are diagrams illustrating content and a captured image supplied from another information processing apparatus which are displayed in parallel in the display screen of the display device;
- FIGS. 12A to 12D are diagrams illustrating content and a captured image supplied from another information processing apparatus which are displayed in parallel in the display screen of the display device;
- FIG. 13 is a diagram illustrating a functional configuration of an information processing apparatus according to a second embodiment of the present disclosure.
- FIGS. 14A to 14C are diagrams illustrating a fourth embodiment of the present disclosure.
- FIGS. 15A to 15D are diagrams illustrating a fifth embodiment of the present disclosure.
- FIGS. 16A to 16F are diagrams illustrating a sixth embodiment of the present disclosure.
- FIG. 1 is a diagram illustrating a display system according to an embodiment.
- FIG. 1 is a front view of the display system.
- a display system 100 includes a display device 102 and an image pickup device 104 , for example.
- the display device 102 is an example of a display device of the present disclosure, and displays still images or moving images in accordance with driving signals.
- the display device 102 displays a still image or a moving image using liquid crystal.
- the display device 102 may display a still image or a moving image using a self-luminous display device such as an organic EL (electroluminescence) display.
- the image pickup device 104 which is an example of an image pickup device according to the present disclosure is disposed in an upper center portion of the display device 102 and captures a subject located in a display direction of the display device 102 .
- the image pickup device 104 may take still images or moving images using a CCD (Charge Coupled Device) image sensor or may take still images or moving images using a CMOS (Complementary Metal Oxide Semiconductor) image sensor.
- although the image pickup device 104 is disposed in the upper center portion of the display device 102 in this embodiment, the location where the image pickup device 104 is disposed is not limited to the upper center portion of the display device 102.
- the image pickup device 104 may be disposed in a lower center portion of the display device 102 .
- although the single image pickup device 104 is disposed in this embodiment, the number of image pickup devices 104 is not limited to one.
- two or more image pickup devices 104 may be disposed.
- although the display device 102 and the image pickup device 104 are integrally configured in this embodiment, the display device 102 and the image pickup device 104 may be separately configured.
- the display system 100 may include a sensor (not shown) which detects presence or absence of a user positioned in front of the display device 102 and a signal reception unit (not shown) capable of receiving a control signal through an infrared communication or a wireless communication from a remote controller (not shown). Furthermore, the sensor may detect a distance between the display device 102 and the user positioned in front of the display device 102 .
- the display device 102 of this embodiment may display, in parallel, content corresponding to a still image or a moving image and images captured by the other information processing apparatuses 500 and 700 shown in FIG. 2, as will be described hereinafter.
- the display device 102 may display content corresponding to a still image or a moving image in a content display region in a display screen and display images captured by the other information processing apparatuses 500 and 700 in a display region for displaying images captured by the information processing apparatuses 500 and 700 in the display screen.
- the content display region is an example of a first display region according to the present disclosure.
- the display region for displaying images captured by the other information processing apparatuses 500 and 700 is an example of a second display region according to the present disclosure.
- the image pickup device 104 of this embodiment may capture a still image and a moving image regarding a user A who is watching the display screen of the display device 102 shown in FIG. 2 .
- FIG. 2 is a diagram illustrating a functional configuration of the information processing apparatus according to the first embodiment of the present disclosure.
- FIG. 2 includes a display system 100 which transmits a captured image to an information processing apparatus 200 serving as the information processing apparatus according to this embodiment and which receives a signal for driving the display device 102 from the information processing apparatus 200 and a user A who uses the display system 100 and the information processing apparatus 200 .
- FIG. 2 also includes a communication network 800 to which the information processing apparatus 200 is connectable, a communication server 300 and other information processing apparatuses 500 and 700 which are connectable to the communication network 800, a display system 400 which transmits a captured image to the information processing apparatus 500 and which receives a signal from the information processing apparatus 500, users B and C who use the display system 400 and the information processing apparatus 500, a display system 600 which transmits a captured image to the information processing apparatus 700 and which receives a signal from the information processing apparatus 700, and a user D who uses the display system 600 and the information processing apparatus 700.
- the display systems 400 and 600 have the same configurations as the display system 100 , and therefore, detailed descriptions thereof are omitted.
- the information processing apparatuses 500 and 700 have the same configurations as the information processing apparatus 200 , and therefore, detailed descriptions thereof are omitted.
- the information processing apparatuses 500 and 700 are examples of communication object apparatuses according to the present disclosure.
- the information processing apparatus 200 includes an input unit 202 , a detection unit 204 , a head-count obtaining unit 206 , a setting unit 208 , a display image generation unit 210 , a communication unit 212 , an output unit 214 , and a display controller 216 .
- the input unit 202 receives a captured image which is generated by the image pickup device 104 through image capturing.
- the captured image generated by the image pickup device 104 through the image capturing is an example of another captured image according to the present disclosure.
- the input unit 202 transmits the received (input) captured image to the communication unit 212 .
- the input unit 202 may transmit the received captured image to the detection unit 204 and the display image generation unit 210 .
- the input unit 202 accepts an input for a setting of a display ratio of a display region for content to a display region for captured images supplied from the information processing apparatuses 500 and 700 in a display screen of the display device 102 performed by the user A, for example.
- the input unit 202 may accept an input for a setting of a display ratio of a display region for content to a display region for captured images supplied from the information processing apparatus 200 , 500 , and 700 in the display screen of the display device 102 performed by the user A. Then, the input unit 202 transmits information on the received input for a setting of a display ratio to the setting unit 208 .
- the display region for captured images supplied from the information processing apparatuses 200 , 500 , and 700 is an example of a second display region according to the present disclosure.
- the detection unit 204 which is an example of a detection unit according to the present disclosure receives captured images which are supplied from the information processing apparatuses 500 and 700 and which are received by the communication unit 212 and detects face regions including faces of the users B, C, and D in the received captured images. Then, the detection unit 204 transmits information on the face regions as results of the detection to the head-count obtaining unit 206 and the display image generation unit 210 . Note that the detection unit 204 may receive a captured image supplied from the input unit 202 and detect a face region including a face of the user A in the received captured image.
- the detection unit 204 may transmit information on the face region including the face of the user A as a result of the detection to the head-count obtaining unit 206 , the display image generation unit 210 , and the communication unit 212 .
- a technique disclosed in Japanese Unexamined Patent Application Publication No. 2007-65766 and a technique disclosed in Japanese Unexamined Patent Application Publication No. 2005-44330 may be used for the detection of a face region performed on a captured image by the detection unit 204 .
- the detection of a face region will be briefly described.
- a face position, a face size, and a face direction are individually detected in the received captured image.
- a portion corresponding to a face image may be extracted from the image.
- then, characteristic portions of the face (face feature positions) are detected in the extracted face image.
- a method referred to as an AAM (Active Appearance Models) method may be used for the detection of the face feature positions.
- a face is identified in the image captured by the image pickup device 104 .
- a technique disclosed in Japanese Unexamined Patent Application Publication No. 2007-65766 or a technique disclosed in Japanese Unexamined Patent Application Publication No. 2005-44330 may be used as a method for identifying a face, and therefore, a detailed description thereof is omitted here.
- a gender and an age of a face included in the received captured image may be determined.
- the face of the user included in the received captured image may be matched against the stored faces so as to specify the user.
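The detection results described above (a face position, a face size, and a face direction per face) can be modeled with a simple data structure. The field names below, and the margin used to keep the face feature portions inside an extracted portion, are assumptions for illustration only; the patent itself defers the detection method to the cited Japanese publications.

```python
# Illustrative model of a face-region detection result; field names and
# the margin value are assumptions, not taken from the disclosure.
from dataclasses import dataclass

@dataclass
class FaceRegion:
    x: int          # face position (left edge) in the captured image
    y: int          # face position (top edge)
    size: int       # face size (width of a square face box)
    yaw_deg: float  # face direction relative to the camera

def expand_to_feature_region(face, margin=0.2):
    """Expand a face box by a margin so that at least the face feature
    portions (eyes, nose, mouth) stay inside the extracted portion."""
    pad = int(face.size * margin)
    return (face.x - pad, face.y - pad,
            face.size + 2 * pad, face.size + 2 * pad)
```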
- the head-count obtaining unit 206 which is an example of an obtaining unit according to the present disclosure receives information on the face regions including the faces of the users B, C, and D detected by the detection unit 204 . Then, the head-count obtaining unit 206 obtains the number of users who use the information processing apparatuses 500 and 700 in accordance with the received information on the face regions. Thereafter, the head-count obtaining unit 206 transmits a result of the obtainment of the number of users who use the information processing apparatuses 500 and 700 to the display image generation unit 210 .
- the head-count obtaining unit 206 may receive information on a face region including the face of the user A detected by the detection unit 204 and obtain the number of users who use the information processing apparatus 200 in accordance with received information on the face region including the face of the user A. Then, the head-count obtaining unit 206 may transmit a result of the obtainment of the number of users who use the information processing apparatus 200 to the display image generation unit 210 and the display controller 216 .
- the head-count obtaining unit 206 may receive the information on the face regions and obtain the number of users who use the information processing apparatuses 500 and 700 in accordance with the received information on the face regions.
- in this case, the detection unit 204 described above need not detect the face regions including the faces of the users B, C, and D in the captured images supplied from the information processing apparatuses 500 and 700.
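The head-count obtaining step reduces to counting face regions per communication partner and summing them, as in the minimal sketch below; the function name and the dictionary keying are assumptions for illustration.

```python
# Hypothetical head-count obtaining step: one list of face regions per
# communication partner apparatus, counted per apparatus and in total.

def obtain_head_count(face_regions_by_apparatus):
    """Return (per-apparatus user counts, total number of users)."""
    counts = {name: len(regions)
              for name, regions in face_regions_by_apparatus.items()}
    return counts, sum(counts.values())
```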
- the setting unit 208 which is an example of a setting unit according to the present disclosure receives information on the input for the setting of the display ratio from the input unit 202 and sets the display region for content and a display region for captured images supplied from the information processing apparatuses 500 and 700 in the display screen of the display device 102 in accordance with the received information on the input for the setting of the display ratio. Furthermore, the setting unit 208 may set a display region for content and a display region for captured images supplied from the information processing apparatuses 200 , 500 , and 700 in the display screen of the display device 102 in accordance with the received information on the input for the setting of the display ratio.
- the setting unit 208 transmits information on the set display region for captured images to the display image generation unit 210 and the display controller 216 and transmits information on the set display region for content to the display controller 216 .
- the setting unit 208 sets a size of the display region for content and a size of the display region for captured images supplied from the information processing apparatuses 500 and 700 in the display screen of the display device 102 and transmits information on the set size of the display region for content and information on the set size of the display region for captured images.
- the display image generation unit 210 which is an example of a generation unit according to the present disclosure receives information on the face regions including the faces of the users B, C, and D from the detection unit 204 , receives the result of the obtainment of the number of users who use the information processing apparatuses 500 and 700 from the head-count obtaining unit 206 , receives the information on the display region for captured images supplied from the information processing apparatuses 500 and 700 from the setting unit 208 , and receives the image captured using the display system 400 and the image captured using the display system 600 which are received by the communication unit 212 .
- the display image generation unit 210 generates a display image to be displayed in the display region for captured images supplied from the information processing apparatuses 500 and 700 included in the display screen of the display device 102 using the image captured by the display system 400 and the image captured by the display system 600 . Thereafter, the display image generation unit 210 transmits the generated display image to the display controller 216 .
- the display image generation unit 210 extracts a portion of the image captured by the display system 400 corresponding to one third of the display region for captured images such that at least face feature portions of the user B are included in the portion in accordance with the information on the face region including the face of the user B, the result of the obtainment of the number of users who use the information processing apparatuses 500 and 700 , that is, information representing that the number of users is three, and the information on the display region for captured images supplied from the information processing apparatuses 500 and 700 such as information on the size of the display region.
- the display image generation unit 210 extracts a portion of the image captured by the display system 400 corresponding to one third of the display region for captured images such that at least face feature portions of the user C are included in the portion in accordance with the information on the face region including the face of the user C, the information representing that the number of users who use the information processing apparatuses 500 and 700 is three, and the information on the size of the display region for captured images supplied from the information processing apparatuses 500 and 700 .
- the display image generation unit 210 extracts a portion of the image captured by the display system 600 corresponding to one third of the display region for captured images such that at least face feature portions of the user D are included in the portion in accordance with the information on the face region including the face of the user D, the information representing that the number of users who use the information processing apparatuses 500 and 700 is three, and the information on the size of the display region for captured images supplied from the information processing apparatuses 500 and 700 .
- the display image generation unit 210 arranges the extracted captured images in the display region for captured images supplied from the information processing apparatuses 500 and 700 included in the display screen of the display device 102 such that the faces of the users who use the information processing apparatuses 500 and 700 are arranged in a horizontal direction, in a vertical direction, or in a matrix so that the display image described above is generated.
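The extraction-and-arrangement behavior described above (each of n users receiving a 1/n-wide portion centred on the face) can be sketched as follows, assuming captured images and the display region are described only by their widths; all names and numbers are illustrative.

```python
# Sketch of the 1/n extraction and horizontal arrangement step.
# Windows are (left, right) pixel intervals in the captured image.

def extract_portion(face_center_x, image_w, portion_w):
    """Pick a horizontal window of width portion_w centred on the face,
    clamped so it stays inside the captured image."""
    left = face_center_x - portion_w // 2
    left = max(0, min(left, image_w - portion_w))
    return (left, left + portion_w)

def arrange_faces(face_centers, image_w, region_w):
    """Give each of the n detected faces a 1/n-wide slot in the display
    region, arranged in the horizontal direction."""
    n = len(face_centers)
    portion_w = region_w // n
    return [extract_portion(cx, image_w, portion_w) for cx in face_centers]
```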
- the display image generation unit 210 may generate a display image to be displayed in the display region for captured images supplied from the information processing apparatuses 200, 500, and 700 included in the display screen of the display device 102 using the image captured by the display system 100, the image captured by the display system 400, and the image captured by the display system 600.
- the display image generation unit 210 extracts a portion of the image captured by the display system 100 corresponding to a quarter of the display region for captured images such that at least face feature portions of the user A are included in the portion in accordance with the information on the face region including the face of the user A, the result of the obtainment of the number of users who use the information processing apparatuses 200 , 500 , and 700 , that is, information representing that the number of users is four, and the information on the display region for captured images supplied from the information processing apparatuses 200 , 500 , and 700 such as the information on the size of the display region.
- the display image generation unit 210 extracts a portion of the image captured by the display system 400 corresponding to a quarter of the display region for captured images such that at least face feature portions of the user B are included in the portion in accordance with the information on the face region including the face of the user B, the information representing that the number of users who use the information processing apparatuses 200, 500, and 700 is four, and the information on the size of the display region for captured images supplied from the information processing apparatuses 200, 500, and 700.
- the display image generation unit 210 extracts a portion of the image captured by the display system 400 corresponding to a quarter of the display region for captured images such that at least face feature portions of the user C are included in the portion in accordance with the information on the face region including the face of the user C, the information representing that the number of users who use the information processing apparatuses 200 , 500 , and 700 is four, and the information on the size of the display region for captured images supplied from the information processing apparatuses 200 , 500 , and 700 .
- the display image generation unit 210 extracts a portion of the image captured by the display system 600 corresponding to a quarter of the display region for captured images such that at least face feature portions of the user D are included in the portion in accordance with the information on the face region including the face of the user D, the information representing that the number of users who use the information processing apparatuses 200 , 500 , and 700 is four, and the information on the size of the display region for captured images supplied from the information processing apparatuses 200 , 500 , and 700 .
- the display image generation unit 210 arranges the extracted captured images in the display region for captured images supplied from the information processing apparatuses 200 , 500 , and 700 included in the display screen of the display device 102 such that the faces of the users who use the information processing apparatuses 200 , 500 , and 700 are arranged in the horizontal direction, in the vertical direction, or in a matrix to thereby generate the display image described above.
- the display image generation unit 210 may receive the information on the face regions received by the communication unit 212 and generate a display image in accordance with that information instead of the information on the face regions detected by the detection unit 204.
- the display image generation unit 210 may extract portions of the captured images such that invalid (wasted) regions are not included in the display image, in accordance with the received information on the face regions, to thereby generate the display image. Moreover, the display image generation unit 210 may extract portions of the captured images such that overlapping portions displaying the same image are not included in the display image. In addition, the display image generation unit 210 may generate a display image that treats a plurality of users as a single user when the users are positioned near one another in a captured image, in accordance with the received information on face regions.
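The last behavior above, treating users who are positioned near one another as a single group, amounts to merging nearby face boxes. A minimal one-dimensional sketch, with an assumed distance threshold and boxes reduced to (left, right) intervals:

```python
# Hypothetical grouping of nearby face boxes; the gap threshold and the
# interval representation are illustrative assumptions.

def group_nearby_faces(face_boxes, max_gap):
    """Merge (left, right) intervals whose horizontal gap is at most
    max_gap pixels, so each resulting group can share one slot."""
    groups = []
    for left, right in sorted(face_boxes):
        if groups and left - groups[-1][1] <= max_gap:
            # Close enough to the previous group: extend it.
            groups[-1] = (groups[-1][0], max(groups[-1][1], right))
        else:
            groups.append((left, right))
    return groups
```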
- the communication unit 212 which is an example of a reception unit according to the present disclosure receives an image captured using the display system 400 from the communication server 300 through the communication network 800 . Furthermore, the communication unit 212 receives an image captured by the display system 600 from the communication server 300 through the communication network 800 . Note that the communication unit 212 may directly receive an image captured by the display system 400 from the information processing apparatus 500 through the communication network 800 . Similarly, the communication unit 212 may directly receive an image captured by the display system 600 from the information processing apparatus 700 through the communication network 800 .
- the communication unit 212 may receive a captured image supplied from the input unit 202 and transmit the received captured image to the communication server 300 through the communication network 800 . Furthermore, the communication unit 212 may receive the information on the face region including the user A detected in the image captured using the display system 100 by the detection unit 204 and transmit the received information on the face region to the communication server 300 through the communication network 800 . Note that the communication unit 212 may directly transmit the received captured image and the information on the face region to the information processing apparatuses 500 and 700 through the communication network 800 .
- the communication unit 212 may receive the information on the face regions including the faces of the users B and C detected in the image captured using the display system 400 and the information on the face region including the face of the user D detected in the image captured using the display system 600 from the communication server 300 through the communication network 800 .
- the communication unit 212 may directly receive the information on the face regions including the faces of the users B and C detected in the image captured using the display system 400 from the information processing apparatus 500 through the communication network 800 .
- the communication unit 212 may directly receive the information on the face region including the face of the user D detected in the image captured using the display system 600 from the information processing apparatus 700 through the communication network 800 .
- the output unit 214 receives a signal used to drive the display device 102 from the display controller 216 and transmits the received signal to the display device 102 .
- the display controller 216 which is an example of a controller according to the present disclosure receives information on the display region for content from the setting unit 208 and the information on the display region for captured images supplied from the information processing apparatuses 500 and 700 . Furthermore, the display controller 216 receives content corresponding to a still image or a moving image. Then, the display controller 216 transmits a signal for displaying the display image generated by the display image generation unit 210 in the display region for captured images supplied from the information processing apparatuses 500 and 700 included in the display screen of the display device 102 to the output unit 214 . Moreover, the display controller 216 transmits a signal for displaying content having a reduced image size in the display region for content in the display screen of the display device 102 to the output unit 214 .
- the display controller 216 may transmit to the output unit 214 a signal for displaying the display image generated by the display image generation unit 210 in the display region for captured images supplied from the information processing apparatuses 200 , 500 , and 700 included in the display screen of the display device 102 .
- FIG. 3 is a diagram illustrating a hardware configuration of the information processing apparatus 200 shown in FIG. 2 .
- the information processing apparatus 200 includes an MPU 230 , a ROM 232 , a RAM 234 , a recording medium 236 , an input/output interface 238 , an operation input device 240 , a display device 242 , and a communication interface 244 . Furthermore, in the information processing apparatus 200 , the components are connected to one another through a bus 246 serving as a data transmission path.
- the MPU 230 includes an MPU (Micro Processing Unit) or an integrated circuit in which a plurality of circuits are integrated to realize various functions such as image processing, and functions as a controller (not shown) which controls the entire information processing apparatus 200. Furthermore, in the information processing apparatus 200, the MPU 230 functions as the detection unit 204, the head-count obtaining unit 206, the setting unit 208, the display image generation unit 210, and the display controller 216.
- the ROM 232 stores programs and control data such as calculation parameters used by the MPU 230 .
- the RAM 234 temporarily stores programs executed by the MPU 230 , for example.
- the recording medium 236 stores applications, for example.
- examples of the recording medium 236 include a magnetic recording medium such as a hard disk and a nonvolatile memory such as an EEPROM (Electrically Erasable and Programmable Read Only Memory), a flash memory, an MRAM (Magnetoresistive Random Access Memory), an FeRAM (Ferroelectric Random Access Memory), or a PRAM (Phase change Random Access Memory).
- the information processing apparatus 200 may include a recording medium 236 which is detachable from the information processing apparatus 200 .
- the input/output interface 238 is connected to the operation input device 240 and the display device 242 , for example. Furthermore, the input/output interface 238 functions as the input unit 202 and the output unit 214 .
- the operation input device 240 functions as an operation unit (not shown), and the display device 242 functions as a display unit 254 which will be described with reference to FIG. 13 .
- examples of the input/output interface 238 include a USB (Universal Serial Bus) terminal, a DVI (Digital Visual Interface) terminal, an HDMI (High-Definition Multimedia Interface) terminal, and various processing circuits.
- the operation input device 240 is disposed on the information processing apparatus 200 , for example, and is connected to the input/output interface 238 inside the information processing apparatus 200 .
- Examples of the operation input device 240 include a button, a direction key, a rotatable selector such as a jog dial, and a combination thereof.
- the display device 242 is disposed on the information processing apparatus 200 , for example, and is connected to the input/output interface 238 inside the information processing apparatus 200 .
- Examples of the display device 242 include an LCD (Liquid Crystal Display) and an organic EL (ElectroLuminescence) display (which is also referred to as an OLED (Organic Light Emitting Diode) display).
- the input/output interface 238 may be connected to an operation input device (for example, a keyboard and a mouse) serving as an external apparatus of the information processing apparatus 200 and an external device such as a display device (for example, an external display device such as the display device 102 ) and an image pickup device (for example, the image pickup device 104 ).
- the display device 242 may be a device which is capable of performing display and which allows a user's operation, such as a touch screen.
- the communication interface 244, which is a communication unit included in the information processing apparatus 200, functions as the communication unit 212, which performs communication with external apparatuses including the communication server 300 and the information processing apparatuses 500 and 700 through the communication network 800 (or directly) in a wireless/wired manner.
- examples of the communication interface 244 include a combination of a communication antenna and an RF circuit (wireless communication), a combination of an IEEE802.15.1 port and a transmission/reception circuit (wireless communication), a combination of an IEEE802.11b port and a transmission/reception circuit (wireless communication), and a combination of a LAN terminal and a transmission/reception circuit (wired communication).
- the hardware configuration of the information processing apparatus 200 is not limited to the configuration shown in FIG. 3 .
- the information processing apparatus 200 may include an audio output device which serves as an audio output unit (not shown) and which includes a DSP (Digital Signal Processor), an amplifier, and a speaker.
- the information processing apparatus 200 may include an image pickup device which serves as an image pickup unit 252 shown in FIG. 13 and which includes a lens-and-image pickup element, and a signal processing circuit.
- the information processing apparatus 200 may process a captured image generated by itself.
- the lens-and-image pickup element includes an optical lens and an image sensor including a plurality of image pickup elements such as CCD (Charge Coupled Device) sensors or CMOS (Complementary Metal Oxide Semiconductor) sensors.
- the signal processing circuit, which includes an AGC (Automatic Gain Control) circuit and an ADC (Analog to Digital Converter), converts analog signals generated by the image pickup elements into digital signals (image data) and performs various signal processes. Examples of the signal processes performed by the signal processing circuit include a white balance correction process, an interpolation process, a tone correction process, a gamma correction process, a YCbCr conversion process, an edge emphasizing process, and a coding process.
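As an illustration of one of the signal processes named above, the following is a hedged sketch of gamma correction on 8-bit pixel values. The patent does not specify the circuit's actual transfer function; the function name, the default gamma of 2.2, and the list-of-ints data shape are assumptions for illustration only.

```python
def gamma_correct(pixels, gamma=2.2):
    """Apply standard gamma correction to a list of 8-bit pixel values.

    Each value p in [0, 255] is mapped to 255 * (p/255)**(1/gamma),
    which brightens midtones for gamma > 1 while keeping 0 and 255 fixed.
    """
    return [round(255 * (p / 255) ** (1.0 / gamma)) for p in pixels]
```

In a real pipeline this step would run between white balance correction and YCbCr conversion, on each color channel of the digitized sensor data.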
- the information processing apparatus 200 may not include the operation input device 240 and the display device 242 shown in FIG. 3 , for example.
- FIG. 4 is a flowchart of the display process executed by the information processing apparatus 200 shown in FIG. 2 .
- the display controller 216 transmits a signal for displaying the content desired by the user A in the display screen of the display device 102 to the output unit 214 which transmits the received signal to the display device 102 .
- the content desired by the user A is displayed in the display screen of the display device 102 (step S 100 ).
- the communication unit 212 enters a state in which the communication unit 212 may communicate with the communication server 300 through the communication network 800 (in step S 102 ). Note that, in step S 102 , the communication unit 212 may enter a state in which the communication unit 212 may directly communicate with the information processing apparatuses 500 and 700 through the communication network 800 .
- the communication unit 212 transmits a captured image which is generated through image capturing performed by the image pickup device 104 included in the display system 100 and which is received through the input unit 202 to the communication server 300 through the communication network 800 (in step S 104 ).
- the communication unit 212 may transmit information on a face region including the face of the user A transmitted from the detection unit 204 to the communication server 300 through the communication network 800 .
- the communication unit 212 may directly transmit the captured image or the information on the face region to the information processing apparatuses 500 and 700 through the communication network 800 .
- the communication unit 212 receives an image which is captured by the display system 400 and which is transmitted from the information processing apparatus 500 through the communication server 300 . Furthermore, the communication unit 212 receives an image which is captured by the display system 600 and which is transmitted from the information processing apparatus 700 through the communication server 300 (in step S 106 ). Note that, in step S 106 , the information on the face regions including the faces of the users B and C detected in the image which is captured by the display system 400 and which is transmitted from the information processing apparatus 500 and the information on the face region including the face of the user D detected in the image which is captured by the display system 600 and which is transmitted from the information processing apparatus 700 may be received from the communication server 300 . Furthermore, in step S 106 , the communication unit 212 may directly receive the captured images and the information on the face regions from the information processing apparatuses 500 and 700 through the communication network 800 .
- the information processing apparatus 200 executes a parallel display process which will be described with reference to FIG. 5 (in step S 108 ) so that the content desired by the user A and the captured images supplied from the information processing apparatuses 500 and 700 are displayed in parallel in the display screen of the display device 102 , and thereafter, the process is terminated.
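The sequence of steps S100 to S108 above can be sketched as a small driver function. This is a hypothetical outline, not code from the patent: `display`, `comm`, `local_capture`, and `parallel_display` are stand-ins for the display controller/output path, the communication unit 212, the image pickup path, and the parallel display process of FIG. 5, respectively.

```python
def display_process(display, comm, local_capture, parallel_display):
    """Sketch of the display process of FIG. 4 (steps S100-S108)."""
    display.show_content()          # step S100: show the content desired by user A
    comm.connect()                  # step S102: enter a communicable state
    comm.send(local_capture())      # step S104: transmit the locally captured image
    remote_images = comm.receive()  # step S106: receive images from apparatuses 500/700
    parallel_display(remote_images) # step S108: parallel display process (FIG. 5)
```

Each step maps one-to-one onto the flowchart; error handling and the face-region metadata exchange described in the text are omitted for brevity.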
- FIG. 5 is a flowchart of the parallel display process performed in step S 108 included in the display process shown in FIG. 4 .
- the head-count obtaining unit 206 obtains the number of users who use the information processing apparatuses 500 and 700 in accordance with the information on the face regions including the faces of the users B, C, and D detected by the detection unit 204 (in step S 200 ). Note that, in step S 200 , the head-count obtaining unit 206 may obtain the number of users who use the information processing apparatus 200 in accordance with the information on the face region including the face of the user A detected by the detection unit 204 .
- in step S200, when the communication unit 212 receives the information on the face regions including the faces of the users B and C detected in the image captured by the display system 400 and the information on the face region including the face of the user D detected in the image captured by the display system 600, the number of users who use the information processing apparatuses 500 and 700 may be obtained in accordance with the information on the face regions.
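Under the assumption that each detected face region is reported as a rectangle, the head-count obtainment of step S200 reduces to counting face regions per apparatus. The function name and data shapes below are illustrative choices, not identifiers from the patent.

```python
def obtain_head_count(face_regions_by_apparatus):
    """Sketch of step S200: count users from detected face regions.

    face_regions_by_apparatus maps an apparatus id (e.g. "500") to a list
    of face regions, each assumed to be an (x, y, w, h) rectangle.
    Returns a dict mapping each apparatus id to its user count.
    """
    return {apparatus: len(regions)
            for apparatus, regions in face_regions_by_apparatus.items()}
```

The same counting works whether the face regions come from the local detection unit 204 or from the face-region information received by the communication unit 212.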
- the setting unit 208 sets a display region for content and a display region for captured images supplied from the information processing apparatuses 500 and 700 in the display screen of the display device 102 in accordance with information on an input performed by the user A to set the display ratio of the display region for content to the display region for captured images supplied from the information processing apparatuses 500 and 700 in the display screen of the display device 102 (in step S 202 ).
- the setting unit 208 may set the display region for content and the display region for captured images supplied from the information processing apparatuses 200 , 500 , and 700 in the display screen of the display device 102 in accordance with the information on an input performed by the user to set the display ratio of the display region for content to the display region for captured images supplied from the information processing apparatuses 200 , 500 , and 700 in the display screen of the display device 102 .
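The region setting of step S202 can be sketched as splitting the screen according to the user-set display ratio. The patent does not fix the geometry of the split; a horizontal split (content on the left, captured images on the right) is assumed here purely for illustration, and the rectangles are expressed as (x, y, w, h).

```python
def set_display_regions(screen_w, screen_h, ratio_a, ratio_b):
    """Sketch of step S202: split the screen by the display ratio A:B.

    ratio_a is the share for the display region for content and ratio_b
    the share for the display region for captured images.
    """
    content_w = screen_w * ratio_a // (ratio_a + ratio_b)
    content = (0, 0, content_w, screen_h)                      # region for content
    captured = (content_w, 0, screen_w - content_w, screen_h)  # region for captured images
    return content, captured
```

For example, a ratio of 3:1 on a 1920-pixel-wide screen gives the content region 1440 pixels of width and the captured-image region the remaining 480.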
- the display image generation unit 210 extracts portions of the image which is captured by the display system 400 and which is received by the communication unit 212 and a portion of the image which is captured by the display system 600 and which is received by the communication unit 212 in accordance with the information on the face regions including the faces of the users B, C, and D transmitted from the detection unit 204 , a result of the obtainment of the number of users who use the information processing apparatuses 500 and 700 which is supplied from the head-count obtaining unit 206 , and the information on the display region for captured images supplied from the information processing apparatuses 500 and 700 which is supplied from the setting unit 208 (in step S 204 ), and generates a display image to be displayed in the display region for captured images supplied from the information processing apparatuses 500 and 700 in the display screen of the display device 102 (in step S 206 ).
- the display image generation unit 210 may extract a portion of the image captured by the display system 100, portions of the image captured by the display system 400, and a portion of the image captured by the display system 600 in accordance with the information on the face region including the face of the user A transmitted from the detection unit 204, the information on the face regions including the faces of the users B, C, and D transmitted from the detection unit 204, a result of the obtainment of the number of users who use the information processing apparatus 200 which is supplied from the head-count obtaining unit 206, a result of the obtainment of the number of users who use the information processing apparatuses 500 and 700 which is supplied from the head-count obtaining unit 206, and the information on the display region for captured images supplied from the information processing apparatuses 200, 500, and 700 which is supplied from the setting unit 208, and generate a display image to be displayed in the display region for captured images supplied from the information processing apparatuses 200, 500, and 700 in the display screen of the display device 102.
- in step S204 and step S206, when the communication unit 212 receives the information on the face regions including the faces of the users B and C detected in the image captured by the display system 400 and the information on the face region including the face of the user D detected in the image captured by the display system 600, the display image generation unit 210 may generate a display image in accordance with the information on the face regions supplied from the communication unit 212 instead of the information on the face regions detected by the detection unit 204.
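The extraction of steps S204/S206 can be sketched as computing one face-centered crop per detected user, sized so the crops tile the display region for captured images side by side. This is an illustrative outline only: the `scale` parameter (a display-to-source pixel factor), the face-center data shape, and the equal-width tiling are assumptions not stated in the patent.

```python
def extract_face_crops(face_regions, region_w, region_h, scale=2.0):
    """Sketch of steps S204/S206: one crop rectangle per detected face.

    face_regions: list of (cx, cy) face centers in the captured image.
    region_w, region_h: size of the display region for captured images.
    scale: assumed ratio of source pixels to display pixels.
    Returns a list of (x, y, w, h) crop rectangles in source coordinates.
    """
    n = len(face_regions)
    cell_w = region_w // n          # each user gets an equal-width cell
    crops = []
    for cx, cy in face_regions:
        w, h = int(cell_w * scale), int(region_h * scale)
        crops.append((cx - w // 2, cy - h // 2, w, h))  # centered on the face
    return crops
```

Scaling the crops down to cell size then yields the display image, with background areas outside the crops discarded.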
- the display controller 216 transmits a signal for displaying the content which is desired by the user A and which has a reduced size in the display region for content included in the display screen of the display device 102 to the output unit 214 in accordance with the information on the display region for content supplied from the setting unit 208 .
- the content desired by the user A which corresponds to the image having the reduced size is displayed in the display region for content in the display screen of the display device 102 (in step S 208 ).
- the display controller 216 transmits a signal for displaying the display image generated by the display image generation unit 210 in the display region for captured images supplied from the information processing apparatuses 500 and 700 included in the display screen of the display device 102 to the output unit 214 in accordance with the information on the display region for the captured images supplied from the information processing apparatuses 500 and 700 which is received from the setting unit 208 .
- the display image generated by the display image generation unit 210 is displayed in the display region for captured images, and accordingly, the content desired by the user A and the captured images supplied from the information processing apparatuses 500 and 700 are displayed in parallel (in step S 210 ). Then, this process is terminated.
- the display controller 216 may transmit a signal for displaying the display image generated by the display image generation unit 210 in the display region for captured images supplied from the information processing apparatuses 200 , 500 , and 700 included in the display screen of the display device 102 to the output unit 214 in accordance with the information on the display region for captured images supplied from the information processing apparatuses 200 , 500 , and 700 which is supplied from the setting unit 208 .
- the display image generated by the display image generation unit 210 is displayed in the display region for captured image, and accordingly, the content desired by the user A and the captured images supplied from the information processing apparatuses 200 , 500 , and 700 are displayed in parallel.
- the display image generated by the display image generation unit 210 is displayed in the display region for captured images, and the content desired by the user A which corresponds to the image having the reduced size is displayed in the display region for content.
- the display image to be displayed in the display region for captured images is generated by extracting portions of the captured images in accordance with the information on the face regions, the result of the obtainment of the number of users, and the information on the display region for captured images, such as information on a size of the display region.
- the display image to be displayed in the display region for captured images is obtained by removing portions of the captured images which are unnecessary for identifying the users, that is, by removing large portions of the backgrounds of the captured images, for example. Therefore, since the portions of the captured images which are significant for identifying the users, such as portions including facial feature portions of the users, are displayed in the display region for captured images even when the size of the display region for captured images is small, it is not difficult to identify the users. Furthermore, the image quality of the display image is not deteriorated. Furthermore, since the image size of the content to be displayed in the content display region is reduced, it is unlikely that a portion of the content is hidden due to the display of the display image. Accordingly, the display of the content and the display of the captured images are appropriately performed in parallel.
- FIG. 6 and FIGS. 7A to 7D are diagrams for explanation of a first example of the present disclosure.
- a case where content desired by a user who uses the display system 100 and an image captured by the display system 400 are displayed in parallel in the display screen of the display device 102 shown in FIG. 2 will be described.
- Note that, in this first example, a case where a single user uses the display system 400 will be described.
- FIG. 6 is a diagram illustrating an image captured by the display system 400 .
- an image IM 1 captured by the display system 400 is transmitted to the detection unit 204 included in the information processing apparatus 200 as described above, and the detection unit 204 detects a face region FA 1 including a face of the user who uses the display system 400 .
- FIGS. 7A to 7D are diagrams illustrating content and a captured image supplied from the information processing apparatus 500 which are displayed in parallel in the display screen of the display device 102 .
- FIG. 7B shows a case where a display ratio of the display region for content to the display region for captured images is A2:B2 (B2<A2).
- a display image IM 2 or IM 4 which is generated by extracting a portion of the image IM 1 is displayed in the display region for captured images and content IM 3 or IM 5 which is desired by the user who uses the display system 100 and which corresponds to an image having a reduced size is displayed in the display region for content.
- the user who uses the display system 100 may view the content IM 3 or IM 5 which is not partially hidden and easily identify the user who uses the display system 400 and whose face is included in the captured image.
- FIG. 8A and FIGS. 9A to 9D are diagrams illustrating a second example of the present disclosure.
- a case where content desired by a user who uses the display system 100 and an image captured by the display system 400 are displayed in parallel in the display screen of the display device 102 shown in FIG. 2 will be described.
- FIG. 8A is a diagram illustrating an image captured by the display system 400 when two users use the display system 400 .
- an image IM 6 captured by the display system 400 is supplied to the detection unit 204 included in the information processing apparatus 200 as described above, and the detection unit 204 detects face regions FA 2 and FA 3 individually including faces of the two users who use the display system 400 .
- FIGS. 9A to 9D are diagrams illustrating content and a captured image supplied from the information processing apparatus 500 which are displayed in parallel in the display screen of the display device 102 .
- FIG. 9B shows a case where a display ratio of the display region for content to the display region for captured images is A6:B6 (B6<A6).
- a display image IM 9 or IM 11 which is generated by extracting a portion of the image IM 6 is displayed in the display region for captured images and content IM 10 or IM 12 which is desired by the user who uses the display system 100 and which corresponds to an image having a reduced size is displayed in the display region for content.
- the user who uses the display system 100 may view the content IM 10 or IM 12 which is not partially hidden and easily identify the two users who use the display system 400 and whose faces are included in the captured image.
- as shown in FIG. 8B, also in a case where a single user uses the display system 400 and a single user uses the display system 600, the content desired by the user who uses the display system 100 and images IM7 and IM8 captured by the display systems 400 and 600, respectively, may be displayed in parallel in the display screen of the display device 102 as shown in FIGS. 9A to 9D (specifically, FIGS. 9B and 9C).
- FIG. 10A and FIGS. 11A to 11D are diagrams illustrating a third example of the present disclosure. Note that frames representing face regions are omitted in FIG. 10A and the following drawings.
- in the third example, a case where content desired by a user who uses the display system 100 and an image captured by the display system 400 are displayed in parallel in the display screen of the display device 102 shown in FIG. 2 will be described. Note that a case where three users use the display system 400 will be described in the third example.
- FIG. 10A is a diagram illustrating an image captured by the display system 400 when three users use the display system 400 .
- an image IM 13 captured by the display system 400 is transmitted to the detection unit 204 included in the information processing apparatus 200 as described above, and the detection unit 204 detects face regions (not shown) individually including faces of the three users who use the display system 400 .
- FIGS. 11A to 11D are diagrams illustrating the content and an image captured by the information processing apparatus 500 which are displayed in parallel in the display screen of the display device 102 .
- FIG. 11B shows a case where a display ratio of the display region for content to the display region for captured images is A10:B10 (B10<A10).
- a display image IM 19 or IM 21 which is generated by extracting a portion of the image IM 13 is displayed in the display region for captured images and content IM 20 or IM 22 which is desired by the user who uses the display system 100 and which corresponds to an image having a reduced size is displayed in the display region for content.
- the user who uses the display system 100 may view the content IM 20 or IM 22 which is not partially hidden and easily identify the three users who use the display system 400 and whose faces are included in the captured image.
- as shown in FIG. 10B, the content desired by the user who uses the display system 100 and images IM14 and IM15 captured by the display systems 400 and 600, respectively, may be displayed in parallel in the display screen of the display device 102 as shown in FIGS. 11A to 11D (specifically, FIGS. 11B and 11C).
- the content desired by the user who uses the display system 100 and images IM16, IM17, and IM18 captured by the display systems 400 and 600 and the other display system may be displayed in parallel in the display screen of the display device 102 as shown in FIGS. 11A to 11D (specifically, FIGS. 11B and 11C).
- as shown in FIGS. 12A to 12D (specifically, FIGS. 12B and 12C), the content desired by the user who uses the display system 100 and images (not shown) captured by the display system 400 and the other systems may be displayed in parallel in the display screen of the display device 102.
- FIGS. 14A to 14C are diagrams illustrating a fourth example according to the present disclosure.
- a case where content desired by the user who uses the display system 100 and an image captured by the display system 400 are displayed in parallel in the display screen of the display device 102 shown in FIG. 2 will be described.
- FIG. 14A is a diagram illustrating the image captured by the display system 400 .
- an image IM 23 captured by the display system 400 is transmitted to the detection unit 204 included in the information processing apparatus 200 as described above, and the detection unit 204 detects a face region FA 4 including a face of the user who uses the display system 400 .
- FIGS. 14B and 14C are diagrams illustrating the content and the captured image supplied from the information processing apparatus 500 which are displayed in parallel in the display screen included in the display device 102 .
- a display image IM 24 generated by extracting a portion of the captured image IM 23 is displayed in a display region for captured images and content IM 25 which is desired by the user who uses the display system 100 and which has a reduced image size is displayed in a display region for content.
- the display image IM 24 displayed in the display region for captured images includes an invalid region R 1 which is a wasted region.
- the portion of the captured image IM 23 is extracted so that the invalid region R 1 is not displayed in the display region for captured images whereby a display image IM 26 is generated.
- for example, a portion of the captured image IM23 is extracted using a right edge of the captured image IM23 as a reference so that the display image IM26 is generated. In this case, as shown in FIG. 14C, the display image IM26 which does not include the invalid region R1 is displayed in the display region for captured images, and content IM27 which is desired by the user who uses the display system 100 and which has a reduced image size is displayed in the display region for content in the display screen of the display device 102. Accordingly, the user who uses the display system 100 may be prevented from having a feeling of strangeness or discomfort.
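The avoidance of the invalid region R1 can be sketched as clamping the face-centered crop so it never extends past the edges of the captured image; when the face is near an edge, this is equivalent to using that image edge as the reference, as in the example above. The function name and signature are illustrative, not from the patent.

```python
def clamp_crop(cx, cy, crop_w, crop_h, img_w, img_h):
    """Sketch of the fourth example: keep the crop inside the image.

    (cx, cy) is the face center; the crop is re-centered on it, then
    shifted just enough to stay within the img_w x img_h bounds, so no
    invalid (out-of-image) region appears in the display image.
    Assumes crop_w <= img_w and crop_h <= img_h.
    """
    x = min(max(cx - crop_w // 2, 0), img_w - crop_w)
    y = min(max(cy - crop_h // 2, 0), img_h - crop_h)
    return (x, y, crop_w, crop_h)
```

A crop that already fits stays centered on the face; only crops that would spill past an edge are shifted.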
- FIGS. 15A to 15D are diagrams illustrating a fifth example according to the present disclosure.
- a case where content desired by the user who uses the display system 100 and an image captured by the display system 400 are displayed in parallel in the display screen of the display device 102 shown in FIG. 2 will be described.
- FIG. 15A is a diagram illustrating an image captured by the display system 400 when two users use the display system 400 .
- an image IM 28 captured by the display system 400 is transmitted to the detection unit 204 included in the information processing apparatus 200 as described above, and the detection unit 204 detects face regions FA 5 and FA 6 individually including faces of the two users who use the display system 400 .
- FIGS. 15B to 15D are diagrams illustrating the content and the captured image supplied from the information processing apparatus 500 which are displayed in parallel in the display screen of the display device 102 .
- a display image IM 29 generated by extracting portions of the captured image IM 28 is displayed in a display region for captured images and content IM 30 which is desired by the user who uses the display system 100 and which has a reduced image size is displayed in a display region for content.
- as for the display image IM29, in a case where the two users are positioned close to each other in the captured image, when the display image IM29 is generated by extracting portions of the captured image IM28 using the face regions FA5 and FA6 as centers, the display image IM29 includes an overlapping region R2 in which the same portion of the image appears twice in the display region for captured images.
- a display image IM 31 is generated by extracting portions of the captured images IM 28 such that the overlapping region R 2 is not displayed in the display region for captured images.
- the display image IM31 which does not include the overlapping region R2 is displayed in the display region for captured images, and content IM32 which is desired by the user who uses the display system 100 and which has a reduced image size is displayed in the display region for content in the display screen of the display device 102. Accordingly, the user who uses the display system 100 may be prevented from having a feeling of strangeness or discomfort.
- a display image IM 33 may be generated by extracting a portion of the captured image IM 28 so that at least face feature portions of the two users are included in the display image IM 33 .
- the display image IM 33 which does not include the overlapping region R 2 is displayed in the display region for captured images and content IM 34 which is desired by the user who uses the display system 100 and which has a reduced image size is displayed in the display region for content in the display screen of the display device 102 .
- the display image IM33 shown in FIG. 15D is a single image, that is, an image generated by integrating the two extracted portions. Accordingly, the user who uses the display system 100 may be prevented from having a feeling of strangeness or discomfort.
- FIGS. 16A to 16F are diagrams illustrating a sixth example according to the present disclosure.
- a setting of a region to be extracted from a captured image will be described.
- FIGS. 16A to 16F are diagrams illustrating a setting of a region to be extracted from a captured image.
- extraction regions CA 1 , CA 2 , and CA 3 are set using, as a reference, a vertical axis VA which passes through the center of a face of a user, as shown in FIGS. 16A to 16C .
- extraction regions CA 4 , CA 5 , and CA 6 may be set using, as references, the vertical axis VA which passes through the center of the face of the user and a horizontal axis HA which passes through the eyes in the face of the user, for example, as shown in FIGS. 16D to 16F .
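A minimal sketch of this region setting (the function name and the default eye-line ratio are assumptions for illustration, not values from the patent): the region is horizontally centered on the vertical axis VA through the face center, and the horizontal axis HA through the eyes is pinned at a fixed fraction of the region height.

```python
def extraction_region(face_center_x, eye_line_y, region_w, region_h,
                      eye_line_ratio=0.4):
    """Extraction rectangle (left, top, width, height): the vertical axis VA
    passes through face_center_x, and the horizontal axis HA through the
    eyes sits at eye_line_ratio of the region height (an assumed default)."""
    left = face_center_x - region_w // 2               # center on axis VA
    top = eye_line_y - int(region_h * eye_line_ratio)  # pin axis HA
    return (left, top, region_w, region_h)
```

Using only VA (FIGS. 16A to 16C) fixes the horizontal placement; adding HA (FIGS. 16D to 16F) also fixes the vertical placement relative to the eyes.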
- FIG. 13 is a diagram illustrating the functional configuration of the information processing apparatus of this embodiment.
- An information processing apparatus 250 serving as the information processing apparatus of this embodiment is different from the information processing apparatus of the first embodiment described above only in that the information processing apparatus 250 includes an image pickup unit 252 and a display unit 254 . Therefore, descriptions of redundant configurations and operations are omitted, and configurations and operations different from those of the first embodiment will be described hereinafter.
- in FIG. 13 , a user A who uses the information processing apparatus 250 , a communication network 800 to which the information processing apparatus 250 is connectable, a communication server 300 and information processing apparatuses 550 and 750 which are connectable to the communication network 800 , users B and C who use the information processing apparatus 550 , and a user D who uses the information processing apparatus 750 are shown.
- the information processing apparatuses 550 and 750 have the same configuration as the information processing apparatus 250 , and therefore, detailed descriptions thereof are omitted.
- the information processing apparatus 250 includes the image pickup unit 252 , a detection unit 204 , a head-count obtaining unit 206 , a setting unit 208 , a display image generation unit 210 , a communication unit 212 , the display unit 254 , and a display controller 216 .
- the image pickup unit 252 is an example of an image pickup unit according to the present disclosure and may capture a still image or a moving image of a user A who watches a display screen of the display unit 254 . Then, the image pickup unit 252 may transmit a captured image generated through image capturing to the communication unit 212 , the detection unit 204 , and the display image generation unit 210 .
- the display unit 254 is an example of a display unit of the present disclosure and may display content of a still image or a moving image and captured images supplied from the information processing apparatuses 550 and 750 in parallel.
- the display unit 254 may display content of a still image or a moving image in a display region for content and display captured images supplied from the information processing apparatuses 550 and 750 in a display region for captured images supplied from the information processing apparatuses 550 and 750 in the display screen.
- in this embodiment, when the information processing apparatus 250 executes the display process described with reference to FIG. 4 , advantages similar to those of the first embodiment described above may be attained. Furthermore, this embodiment may be carried out without separately providing the display system 100 shown in FIG. 2 , for example.
- an object of the present disclosure may be achieved by supplying a storage medium, such as a non-transitory storage medium, which stores a program code of software which realizes functions of the foregoing embodiments to a system or an apparatus and reading and executing the program code stored in the storage medium using a computer (or a CPU, an MPU, or the like) of the system or the apparatus.
- the program code which is read from the storage medium realizes the functions of the foregoing embodiments, and the program code and the storage medium which stores the program code are included in the present disclosure.
- examples of the storage medium to which the program code is supplied include a floppy (registered trademark) disk, a hard disk, a magneto-optical disc, an optical disc such as a CD-ROM, a CD-R, a CD-RW, a DVD-ROM, a DVD-RAM, a DVD-RW, or a DVD+RW, a magnetic tape, a nonvolatile memory card, and a ROM.
- the program code may be downloaded through a network.
- the present disclosure includes, in addition to the case where the functions of the foregoing embodiments are realized by executing the program code read by the computer, a case where an OS (Operating System) operating in a computer performs a part of or entire actual process in accordance with an instruction of the program code and the functions of the foregoing embodiments are realized by the process.
- the present disclosure further includes a case where, after the program code read from the storage medium is written to a memory included in a function expansion board inserted into the computer or a function expansion unit connected to the computer, a CPU or the like included in the expansion board or the expansion unit executes a part of or entire actual process in accordance with an instruction of the program code and the functions of the foregoing embodiments are realized by the process.
- the communication server 300 may detect face regions including faces of users who use the information processing apparatuses 500 and 700 in captured images supplied from the information processing apparatuses 500 and 700 .
- the information processing apparatus 200 may transmit content desired by the user A to the information processing apparatuses 500 and 700 so that the content is shared by the users.
Abstract
Description
- The present application is a continuation of U.S. patent application Ser. No. 13/235,609, filed on Sep. 19, 2011, which claims priority from Japanese Patent Application No. 2010-221550 filed in the Japan Patent Office on Sep. 30, 2010 and Japanese Patent Application No. 2010-289780 filed in the Japan Patent Office on Dec. 27, 2010, the entire contents of which are hereby incorporated by reference.
- The present disclosure relates to an information processing apparatus and an information processing method.
- In general, as devices used for communication between users in different locations, telephones, so-called videophones, and video conference systems have been used. In addition, text chat and video chat including video images and audio have also been performed through the Internet using personal computers.
- Meanwhile, electronic devices and similar devices have been offering higher functionality and an increasing range of functions. For example, a television receiver includes a network communication function which enables not only reception of video and audio content of a program from a broadcasting station and display of the content but also transmission of various information to and reception of various information from another receiver.
- For example, Japanese Unexamined Patent Application Publication No. 2006-50370 discloses a technique of displaying, when a user views program content of television broadcasting using a television receiver, information on registered other users (such as thumbnail images of the other users, and names, channels, and video images of content viewed by the other users) together with the program content.
- Here, when display of content and display of an image externally received through network communication (hereinafter referred to as a “communication image”) are performed in parallel in a display device such as a television receiver, a PIP (Picture in Picture) display method or a POP (Picture on Picture) display method is generally used.
- However, there arises a problem in that, when the PIP display method or the POP display method is used, a portion of the displayed content is hidden by the displayed communication image, and accordingly, it is difficult to view the content. Furthermore, in general, since a region for displaying the communication image is limited in a display screen, a size of the communication image is reduced in the display screen. Therefore, there arises a problem in that sizes of images of other users included in the communication image become small, and accordingly, it is difficult to identify the users and image quality of the communication image is deteriorated.
- Accordingly, it is desirable to provide a novel and improved information processing apparatus and information processing method capable of appropriately performing display of content and display of a communication image when the content and the communication image are displayed in parallel.
- In accordance with one aspect of the embodiments, an information processing apparatus may include an obtaining unit to obtain a number of users from information on detection of a face region including a face in a captured image provided at the apparatus. The apparatus also may include a setting unit to set a display region for content and a display region for a captured image in a display screen. Further, the apparatus may include a display image generation unit to generate a display image to be displayed in the display region for a captured image, in accordance with the information on the detection, the number of users, and the display region set for a captured image.
- In accordance with another aspect of the embodiments, a method may include obtaining a number of users from information on detection of a face region including a face in a captured image; setting a display region for content and a display region for a captured image in a display screen; and generating a display image to be displayed in the display region for a captured image, in accordance with the information on the detection, the number of users, and the display region set for a captured image. In such method, at least one of the obtaining, the setting and the generating may be by a processor.
- In accordance with another aspect of the embodiments, a non-transitory recording medium may be recorded with a computer-readable program having instructions executable by a processor. The program may include obtaining a number of users from information on detection of a face region including a face in a captured image; setting a display region for content and a display region for a captured image in a display screen; and generating a display image to be displayed in the display region for a captured image, in accordance with the information on the detection, the number of users, and the display region set for a captured image.
- As described above, display of content and display of a communication image may be appropriately performed in parallel.
- FIG. 1 is a diagram illustrating a display system according to an embodiment of the present disclosure;
- FIG. 2 is a diagram illustrating a functional configuration of an information processing apparatus according to a first embodiment of the present disclosure;
- FIG. 3 is a diagram illustrating a hardware configuration of the information processing apparatus shown in FIG. 2 ;
- FIG. 4 is a flowchart of a display process executed by the information processing apparatus shown in FIG. 2 ;
- FIG. 5 is a flowchart of a parallel display process included in the display process shown in FIG. 4 ;
- FIG. 6 is a diagram illustrating an image captured in another display system;
- FIGS. 7A to 7D are diagrams illustrating content and captured images supplied from another information processing apparatus which are displayed in parallel in a display screen of a display device;
- FIG. 8A is a diagram illustrating an image captured by another display system when two users are using the display system;
- FIG. 8B is a diagram illustrating images captured by other display systems when each of the display systems is used by a single user;
- FIGS. 9A to 9D are diagrams illustrating content and a captured image supplied from another information processing apparatus which are displayed in parallel in the display screen of the display device;
- FIG. 10A is a diagram illustrating an image captured by another display system when three users are using the display system;
- FIG. 10B is a diagram illustrating images captured by other display systems when one of the display systems is used by two users and the other is used by a single user;
- FIG. 10C is a diagram illustrating images captured by other display systems when each of the display systems is used by a single user;
- FIGS. 11A to 11D are diagrams illustrating content and a captured image supplied from another information processing apparatus which are displayed in parallel in the display screen of the display device;
- FIGS. 12A to 12D are diagrams illustrating content and a captured image supplied from another information processing apparatus which are displayed in parallel in the display screen of the display device;
- FIG. 13 is a diagram illustrating a functional configuration of an information processing apparatus according to a second embodiment of the present disclosure;
- FIGS. 14A to 14C are diagrams illustrating a fourth embodiment of the present disclosure;
- FIGS. 15A to 15D are diagrams illustrating a fifth embodiment of the present disclosure; and
- FIGS. 16A to 16F are diagrams illustrating a sixth embodiment of the present disclosure.
- Embodiments of the present disclosure will be described with reference to the accompanying drawings. Note that, in this specification and the drawings, components having substantially the same functions are denoted by the same reference numerals so that redundant descriptions are avoided.
- Note that a description will be made in the following order.
- 1. Display System
- 2. Functional Configuration of Information Processing Apparatus (First Embodiment)
- 3. Hardware Configuration of Information Processing Apparatus
- 4. Display Process
- 5. Examples
- 6. Functional Configuration of Information Processing Apparatus (Second Embodiment)
- First, a display system according to an embodiment of the present disclosure will be described.
FIG. 1 is a diagram illustrating a display system according to an embodiment. FIG. 1 is a front view of the display system viewed from the front.
- In FIG. 1, a display system 100 includes a display device 102 and an image pickup device 104, for example.
- The display device 102 is an example of a display device of the present disclosure, and displays still images or moving images in accordance with driving signals. For example, the display device 102 displays a still image or a moving image using liquid crystal. Note that the display device 102 may display a still image or a moving image using a self-luminous display device such as an organic EL (Electroluminescence) device.
- The image pickup device 104, which is an example of an image pickup device according to the present disclosure, is disposed in an upper center portion of the display device 102 and captures a subject located in a display direction of the display device 102. The image pickup device 104 may take still images or moving images using a CCD (Charge Coupled Device) image sensor or a CMOS (Complementary Metal Oxide Semiconductor) image sensor.
- Note that, although the image pickup device 104 is disposed in the upper center portion of the display device 102 in this embodiment, the location where the image pickup device 104 is disposed is not limited to the upper center portion of the display device 102. For example, the image pickup device 104 may be disposed in a lower center portion of the display device 102. Furthermore, although the single image pickup device 104 is disposed in this embodiment, the number of image pickup devices 104 is not limited to one. For example, two or more image pickup devices 104 may be disposed. Furthermore, although the display device 102 and the image pickup device 104 are integrally configured in this embodiment, the display device 102 and the image pickup device 104 may be separately configured.
- Note that the display system 100 may include a sensor (not shown) which detects the presence or absence of a user positioned in front of the display device 102 and a signal reception unit (not shown) capable of receiving a control signal through infrared or wireless communication from a remote controller (not shown). Furthermore, the sensor may detect a distance between the display device 102 and the user positioned in front of the display device 102.
- The display device 102 of this embodiment may display, in parallel, content corresponding to a still image or a moving image and images captured by the other information processing apparatuses 500 and 700 shown in FIG. 2, which will be described hereinafter. For example, the display device 102 may display content corresponding to a still image or a moving image in a content display region in a display screen and display images captured by the other information processing apparatuses 500 and 700 in a display region for captured images supplied from the other information processing apparatuses 500 and 700 in the display screen.
- Furthermore, the image pickup device 104 of this embodiment may capture a still image and a moving image regarding a user A who is watching the display screen of the display device 102 shown in FIG. 2.
- Next, a functional configuration of an information processing apparatus according to a first embodiment of the present disclosure will be described.
FIG. 2 is a diagram illustrating a functional configuration of the information processing apparatus according to the first embodiment of the present disclosure. Note that FIG. 2 includes a display system 100 which transmits a captured image to an information processing apparatus 200 serving as the information processing apparatus according to this embodiment and which receives a signal for driving the display device 102 from the information processing apparatus 200, and a user A who uses the display system 100 and the information processing apparatus 200. Furthermore, FIG. 2 includes a communication network 800 to which the information processing apparatus 200 is connectable, a communication server 300 and other information processing apparatuses 500 and 700 which are connectable to the communication network 800, a display system 400 which transmits a captured image to the information processing apparatus 500 and which receives a signal from the information processing apparatus 500, users B and C who use the display system 400 and the information processing apparatus 500, a display system 600 which transmits a captured image to the information processing apparatus 700 and which receives a signal from the information processing apparatus 700, and a user D who uses the display system 600 and the information processing apparatus 700. The display systems 400 and 600 have the same configuration as the display system 100, and therefore, detailed descriptions thereof are omitted. Furthermore, the information processing apparatuses 500 and 700 have the same configuration as the information processing apparatus 200, and therefore, detailed descriptions thereof are omitted.
- In FIG. 2, the information processing apparatus 200 includes an input unit 202, a detection unit 204, a head-count obtaining unit 206, a setting unit 208, a display image generation unit 210, a communication unit 212, an output unit 214, and a display controller 216.
- The input unit 202 receives a captured image which is generated by the image pickup device 104 through image capturing. The captured image generated by the image pickup device 104 through the image capturing is an example of another captured image according to the present disclosure. Then, the input unit 202 transmits the received (input) captured image to the communication unit 212. Note that the input unit 202 may transmit the received captured image to the detection unit 204 and the display image generation unit 210. Furthermore, the input unit 202 accepts an input, performed by the user A, for a setting of a display ratio of a display region for content to a display region for captured images supplied from the information processing apparatuses 500 and 700 in the display screen of the display device 102, for example. Then, the input unit 202 transmits information on the received input for a setting of a display ratio to the setting unit 208.
- The detection unit 204, which is an example of a detection unit according to the present disclosure, receives the captured images which are supplied from the information processing apparatuses 500 and 700 and received by the communication unit 212, and detects face regions including faces of the users B, C, and D in the received captured images. Then, the detection unit 204 transmits information on the face regions as results of the detection to the head-count obtaining unit 206 and the display image generation unit 210. Note that the detection unit 204 may receive a captured image supplied from the input unit 202 and detect a face region including a face of the user A in the received captured image. Then, the detection unit 204 may transmit information on the face region including the face of the user A as a result of the detection to the head-count obtaining unit 206, the display image generation unit 210, and the communication unit 212. A technique disclosed in Japanese Unexamined Patent Application Publication No. 2007-65766 and a technique disclosed in Japanese Unexamined Patent Application Publication No. 2005-44330 may be used for the detection of a face region performed on a captured image by the detection unit 204. Hereinafter, the detection of a face region will be briefly described.
- After the face feature positions are detected, local feature values of the detected face feature positions are individually calculated. When the local feature values are calculated and the calculated local feature values are stored along with the face image, a face is identified in the image captured by the
image pickup device 104. A technique disclosed in Japanese Unexamined Patent Application Publication No. 2007-65766 or a technique disclosed in Japanese Unexamined Patent Application Publication No. 2005-44330 may be used as a method for identifying a face, and therefore, a detailed description thereof is omitted here. Furthermore, in accordance with the face image and the face feature positions, a gender and an age of a face included in the received captured image may be determined. Moreover, by storing information on faces in advance, the face of the user included in the received captured image may be obtained from among the stored faces so as to specify the user. - The head-
count obtaining unit 206 which is an example of an obtaining unit according to the present disclosure receives information on the face regions including the faces of the users B, C, and D detected by thedetection unit 204. Then, the head-count obtaining unit 206 obtains the number of users who use theinformation processing apparatuses count obtaining unit 206 transmits a result of the obtainment of the number of users who use theinformation processing apparatuses image generation unit 210. Note that the head-count obtaining unit 206 may receive information on a face region including the face of the user A detected by thedetection unit 204 and obtain the number of users who use theinformation processing apparatus 200 in accordance with received information on the face region including the face of the user A. Then, the head-count obtaining unit 206 may transmit a result of the obtainment of the number of users who use theinformation processing apparatus 200 to the displayimage generation unit 210 and thedisplay controller 216. Furthermore, when thecommunication unit 212 receives the information on the face regions including the faces of the users B and C detected in the image captured using thedisplay system 400 and information on the face region including the face of the user D detected in the image captured using thedisplay system 600, the head-count obtaining unit 206 may receive the information on the face regions and obtain the number of users who use theinformation processing apparatuses detection unit 204 described above may not detect the face regions including the faces of the users B, C and D in the captured images supplied from theinformation processing apparatuses - The
setting unit 208 which is an example of a setting unit according to the present disclosure receives information on the input for the setting of the display ratio from theinput unit 202 and sets the display region for content and a display region for captured images supplied from theinformation processing apparatuses display device 102 in accordance with the received information on the input for the setting of the display ratio. Furthermore, thesetting unit 208 may set a display region for content and a display region for captured images supplied from theinformation processing apparatuses display device 102 in accordance with the received information on the input for the setting of the display ratio. Then, thesetting unit 208 transmits information on the set display region for captured images to the displayimage generation unit 210 and thedisplay controller 216 and transmits information on the set display region for content to thedisplay controller 216. For example, thesetting unit 208 sets a size of the display region for content and a size of the display region for captured images supplied from theinformation processing apparatuses display device 102 and transmits information on the set size of the display region for content and information on the set size of the display region for captured images. - The display
image generation unit 210, which is an example of a generation unit according to the present disclosure, receives the information on the face regions including the faces of the users B, C, and D from the detection unit 204, receives the result of the obtainment of the number of users who use the information processing apparatuses 500 and 700 from the head-count obtaining unit 206, receives the information on the display region for captured images supplied from the information processing apparatuses 500 and 700 from the setting unit 208, and receives the image captured using the display system 400 and the image captured using the display system 600 which are received by the communication unit 212. Then, in accordance with the received information on the face regions including the faces of the users B, C, and D, the received result of the obtainment of the number of users who use the information processing apparatuses 500 and 700, and the received information on the display region for captured images supplied from the information processing apparatuses 500 and 700, the display image generation unit 210 generates a display image to be displayed in the display region for captured images supplied from the information processing apparatuses 500 and 700 in the display screen of the display device 102 using the image captured by the display system 400 and the image captured by the display system 600. Thereafter, the display image generation unit 210 transmits the generated display image to the display controller 216.
- For example, the display image generation unit 210 extracts a portion of the image captured by the display system 400 corresponding to one third of the display region for captured images such that at least face feature portions of the user B are included in the portion, in accordance with the information on the face region including the face of the user B, the result of the obtainment of the number of users who use the information processing apparatuses 500 and 700, and the information on the display region for captured images supplied from the information processing apparatuses 500 and 700. Furthermore, the display image generation unit 210 extracts a portion of the image captured by the display system 400 corresponding to one third of the display region for captured images such that at least face feature portions of the user C are included in the portion, in accordance with the information on the face region including the face of the user C, the information representing that the number of users who use the information processing apparatuses 500 and 700 is three, and the information on the display region for captured images supplied from the information processing apparatuses 500 and 700. Moreover, the display image generation unit 210 extracts a portion of the image captured by the display system 600 corresponding to one third of the display region for captured images such that at least face feature portions of the user D are included in the portion, in accordance with the information on the face region including the face of the user D, the information representing that the number of users who use the information processing apparatuses 500 and 700 is three, and the information on the display region for captured images supplied from the information processing apparatuses 500 and 700. Thereafter, the display image generation unit 210 arranges the extracted captured images in the display region for captured images supplied from the information processing apparatuses 500 and 700 in the display screen of the display device 102 such that the faces of the users who use the information processing apparatuses 500 and 700 are displayed in parallel.
- When receiving the image captured by the display system 100 from the input unit 202, receiving the information on the face region including the face of the user A and the information on the face regions including the faces of the users B, C, and D from the detection unit 204, receiving the result of the obtainment of the number of users
who use the information processing apparatus 200 and the result of the obtainment of the number of users who use the information processing apparatuses 500 and 700 from the head-count obtaining unit 206, receiving the information on the display regions for captured images of the information processing apparatuses 200, 500, and 700 from the setting unit 208, and receiving the image captured by the display system 400 and the image captured by the display system 600 which are received by the communication unit 212, the display image generation unit 210 may generate a display image to be displayed in the display region for captured images supplied from the information processing apparatuses 200, 500, and 700 included in the display screen of the display device 102 using the image captured by the display system 100, the image captured by the display system 400, and the image captured by the display system 600 in accordance with the received information on the face region including the face of the user A, the received information on the face region including the faces of the users B and C, the received information on the face region including the face of the user D, the received result of the obtainment of the number of users who use the information processing apparatus 200, the received result of the obtainment of the number of users who use the information processing apparatuses 500 and 700, and the information on the display region for captured images supplied from the information processing apparatuses 200, 500, and 700.
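As a rough sketch of the flow described above (the function names, the side-by-side screen split, and the equal-width per-user slots are illustrative assumptions; the patent does not prescribe this exact layout), the setting unit's region setting and the display image generation unit's per-user extraction could be combined as:

```python
import numpy as np

def split_screen(screen_w, screen_h, content_ratio):
    """Set a display region for content and a display region for captured
    images, side by side, from the user-selected display ratio.
    Each region is (left, top, width, height)."""
    content_w = int(screen_w * content_ratio)
    return (0, 0, content_w, screen_h), (content_w, 0, screen_w - content_w, screen_h)

def build_display_image(captured_images, face_centers_x, region_w, region_h):
    """One equal-width slot per detected user (the head count is the total
    number of detected face regions); each slot is a crop of that user's
    captured image, horizontally centered on the detected face."""
    total_users = sum(len(centers) for centers in face_centers_x)
    slot_w = region_w // total_users
    slots = []
    for image, centers in zip(captured_images, face_centers_x):
        h, w = image.shape[:2]
        for cx in centers:
            # Center the slot on the face, clamped to the image bounds.
            left = max(0, min(cx - slot_w // 2, w - slot_w))
            slots.append(image[:region_h, left:left + slot_w])
    return np.hstack(slots)
```

With three users (B and C on one apparatus, D on another), each slot is one third of the captured-image region, matching the one-third extraction described above; with four users including the user A, each slot would be a quarter.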
- In this case, the display image generation unit 210 extracts a portion of the image captured by the display system 100 corresponding to a quarter of the display region for captured images such that at least face feature portions of the user A are included in the portion, in accordance with the information on the face region including the face of the user A, the result of the obtainment of the number of users who use the information processing apparatuses 200, 500, and 700, and the information on the display region for captured images. Furthermore, the display image generation unit 210 extracts a portion of the image captured by the display system 400 corresponding to a quarter of the display region for captured images such that at least face feature portions of the user B are included in the portion, in accordance with the information on the face region including the face of the user B, the information representing the number of users who use the information processing apparatuses 200, 500, and 700, and the information on the display region for captured images. Similarly, the display image generation unit 210 extracts a portion of the image captured by the display system 400 corresponding to a quarter of the display region for captured images such that at least face feature portions of the user C are included in the portion, in accordance with the information on the face region including the face of the user C, the information representing the number of users, and the information on the display region for captured images. Moreover, the display image generation unit 210 extracts a portion of the image captured by the display system 600 corresponding to a quarter of the display region for captured images such that at least face feature portions of the user D are included in the portion, in accordance with the information on the face region including the face of the user D, the information representing the number of users, and the information on the display region for captured images. Then, the display image generation unit 210 arranges the extracted captured images in the display region for captured images in the display screen of the display device 102 such that the faces of the users who use the information processing apparatuses 200, 500, and 700 are displayed in parallel, to thereby generate a display image.
- Note that, when the
communication unit 212 receives the information on the face regions including the faces of the users B and C detected in the image captured by the display system 400 or the information on the face region including the face of the user D detected in the image captured by the display system 600, the display image generation unit 210 may generate a display image in accordance with the information on the face regions received by the communication unit 212 instead of the information on the face regions detected by the detection unit 204.
- Furthermore, the display image generation unit 210 may extract portions of the captured images such that invalid regions, which are wasted regions, are not included in the display image, in accordance with the received information on the face regions, to thereby generate the display image. Moreover, the display image generation unit 210 may extract portions of the captured images such that portions which display the same images are not included twice in the display image, to thereby generate the display image. In addition, the display image generation unit 210 may generate a display image regarding a plurality of users as a single user when the users are positioned near one another in a captured image, in accordance with the received information on face regions.
- The
communication unit 212, which is an example of a reception unit according to the present disclosure, receives an image captured using the display system 400 from the communication server 300 through the communication network 800. Furthermore, the communication unit 212 receives an image captured by the display system 600 from the communication server 300 through the communication network 800. Note that the communication unit 212 may directly receive an image captured by the display system 400 from the information processing apparatus 500 through the communication network 800. Similarly, the communication unit 212 may directly receive an image captured by the display system 600 from the information processing apparatus 700 through the communication network 800.
- Moreover, the communication unit 212 may receive a captured image supplied from the input unit 202 and transmit the received captured image to the communication server 300 through the communication network 800. Furthermore, the communication unit 212 may receive the information on the face region including the face of the user A detected in the image captured using the display system 100 by the detection unit 204 and transmit the received information on the face region to the communication server 300 through the communication network 800. Note that the communication unit 212 may directly transmit the received captured image and the information on the face region to the information processing apparatuses 500 and 700 through the communication network 800.
- Furthermore, the communication unit 212 may receive the information on the face regions including the faces of the users B and C detected in the image captured using the display system 400 and the information on the face region including the face of the user D detected in the image captured using the display system 600 from the communication server 300 through the communication network 800. Note that the communication unit 212 may directly receive the information on the face regions including the faces of the users B and C detected in the image captured using the display system 400 from the information processing apparatus 500 through the communication network 800. Similarly, the communication unit 212 may directly receive the information on the face region including the face of the user D detected in the image captured using the display system 600 from the information processing apparatus 700 through the communication network 800.
- The
output unit 214 receives a signal used to drive the display device 102 from the display controller 216 and transmits the received signal to the display device 102.
- The display controller 216, which is an example of a controller according to the present disclosure, receives, from the setting unit 208, the information on the display region for content and the information on the display region for captured images supplied from the information processing apparatuses 500 and 700. Furthermore, the display controller 216 receives content corresponding to a still image or a moving image. Then, the display controller 216 transmits a signal for displaying the display image generated by the display image generation unit 210 in the display region for captured images supplied from the information processing apparatuses 500 and 700 in the display screen of the display device 102 to the output unit 214. Moreover, the display controller 216 transmits a signal for displaying content having a reduced image size in the display region for content in the display screen of the display device 102 to the output unit 214.
- Note that, when receiving the information on the display region for content and the information on the display region for captured images supplied from the information processing apparatuses 200, 500, and 700 from the setting unit 208 and receiving the display image from the display image generation unit 210, the display controller 216 may transmit to the output unit 214 a signal for displaying the display image generated by the display image generation unit 210 in the display region for captured images supplied from the information processing apparatuses 200, 500, and 700 in the display screen of the display device 102.
- Next, a hardware configuration of the
information processing apparatus 200 shown in FIG. 2 will be described. FIG. 3 is a diagram illustrating the hardware configuration of the information processing apparatus 200 shown in FIG. 2.
- In FIG. 3, the information processing apparatus 200 includes an MPU 230, a ROM 232, a RAM 234, a recording medium 236, an input/output interface 238, an operation input device 240, a display device 242, and a communication interface 244. Furthermore, in the information processing apparatus 200, these components are connected to one another through a bus 246 serving as a data transmission path.
- The
MPU 230 includes an MPU (Micro Processing Unit) and an integrated circuit in which a plurality of circuits are integrated to realize various functions such as image processing, and functions as a controller (not shown) which controls the entire information processing apparatus 200. Furthermore, in the information processing apparatus 200, the MPU 230 functions as the detection unit 204, the head-count obtaining unit 206, the setting unit 208, the display image generation unit 210, and the display controller 216.
- The ROM 232 stores programs and control data such as calculation parameters used by the MPU 230. The RAM 234 temporarily stores programs executed by the MPU 230, for example.
- The recording medium 236 stores applications, for example. Here, examples of the recording medium 236 include a magnetic recording medium such as a hard disk and a nonvolatile memory such as an EEPROM (Electrically Erasable and Programmable Read Only Memory), a flash memory, an MRAM (Magnetoresistive Random Access Memory), an FeRAM (Ferroelectric Random Access Memory), or a PRAM (Phase change Random Access Memory). Furthermore, the information processing apparatus 200 may include a recording medium 236 which is detachable from the information processing apparatus 200.
- The input/
output interface 238 is connected to the operation input device 240 and the display device 242, for example. Furthermore, the input/output interface 238 functions as the input unit 202 and the output unit 214. The operation input device 240 functions as an operation unit (not shown), and the display device 242 functions as a display unit 254 which will be described with reference to FIG. 13. Here, examples of the input/output interface 238 include a USB (Universal Serial Bus) terminal, a DVI (Digital Visual Interface) terminal, an HDMI (High-Definition Multimedia Interface) terminal, and various processing circuits. Furthermore, the operation input device 240 is disposed on the information processing apparatus 200, for example, and is connected to the input/output interface 238 inside the information processing apparatus 200. Examples of the operation input device 240 include a button, a direction key, a rotatable selector such as a jog dial, and a combination thereof. Furthermore, the display device 242 is disposed on the information processing apparatus 200, for example, and is connected to the input/output interface 238 inside the information processing apparatus 200. Examples of the display device 242 include an LCD (Liquid Crystal Display) and an organic EL (ElectroLuminescence) display (which is also referred to as an OLED (Organic Light Emitting Diode) display). Note that the input/output interface 238 may also be connected to an operation input device (for example, a keyboard or a mouse) serving as an external apparatus of the information processing apparatus 200, or to an external device such as a display device (for example, an external display device such as the display device 102) or an image pickup device (for example, the image pickup device 104). Furthermore, the display device 242 may be a device, such as a touch screen, which is capable of performing display and which allows a user's operation.
- The communication interface 244, which is a communication unit included in the information processing apparatus 200, functions as the communication unit 212 which performs communication with external apparatuses including the communication server 300 and the information processing apparatuses 500 and 700. Here, examples of the communication interface 244 include a combination of a communication antenna and an RF circuit (wireless communication), a combination of an IEEE 802.15.1 port and a transmission/reception circuit (wireless communication), a combination of an IEEE 802.11b port and a transmission/reception circuit (wireless communication), and a combination of a LAN terminal and a transmission/reception circuit (wired communication).
- Note that the hardware configuration of the
information processing apparatus 200 according to this embodiment is not limited to the configuration shown in FIG. 3. For example, the information processing apparatus 200 may include an audio output device which serves as an audio output unit (not shown) and which includes a DSP (Digital Signal Processor), an amplifier, and a speaker.
- Furthermore, the information processing apparatus 200 may include an image pickup device which serves as an image pickup unit 252 shown in FIG. 13 and which includes a lens-and-image pickup element and a signal processing circuit. In this case, the information processing apparatus 200 may process a captured image generated by itself. Here, the lens-and-image pickup element includes an optical lens and an image sensor including a plurality of image pickup elements such as CCD (Charge Coupled Device) sensors or CMOS (Complementary Metal Oxide Semiconductor) sensors. The signal processing circuit, which includes an AGC (Automatic Gain Control) circuit and an ADC (Analog to Digital Converter), converts analog signals generated by the image pickup elements into digital signals (image data) and performs various signal processes. Examples of the signal processes performed by the signal processing circuit include a white balance correction process, an interpolation process, a tone correction process, a gamma correction process, a YCbCr conversion process, an edge emphasizing process, and a coding process.
- Moreover, the information processing apparatus 200 may not include the operation input device 240 and the display device 242 shown in FIG. 3, for example.
- Next, a display process executed by the
information processing apparatus 200 shown in FIG. 2 will be described. FIG. 4 is a flowchart of the display process executed by the information processing apparatus 200 shown in FIG. 2.
- In FIG. 4, first, when the user A inputs an instruction for displaying content desired by the user A on the display device 102 using the information processing apparatus 200, the display controller 216 transmits a signal for displaying the content desired by the user A in the display screen of the display device 102 to the output unit 214, which transmits the received signal to the display device 102. By this, the content desired by the user A is displayed in the display screen of the display device 102 (in step S100).
- Next, when the user A inputs an instruction for connection to the users B, C, and D through the network using the information processing apparatus 200, the communication unit 212 enters a state in which the communication unit 212 may communicate with the communication server 300 through the communication network 800 (in step S102). Note that, in step S102, the communication unit 212 may enter a state in which the communication unit 212 may directly communicate with the information processing apparatuses 500 and 700 through the communication network 800.
- Next, the communication unit 212 transmits a captured image which is generated through image capturing performed by the image pickup device 104 included in the display system 100 and which is received through the input unit 202 to the communication server 300 through the communication network 800 (in step S104). Note that, in step S104, the communication unit 212 may transmit the information on the face region including the face of the user A transmitted from the detection unit 204 to the communication server 300 through the communication network 800. Furthermore, in step S104, the communication unit 212 may directly transmit the captured image or the information on the face region to the information processing apparatuses 500 and 700 through the communication network 800.
- Subsequently, the communication unit 212 receives an image which is captured by the display system 400 and which is transmitted from the information processing apparatus 500 through the communication server 300. Furthermore, the communication unit 212 receives an image which is captured by the display system 600 and which is transmitted from the information processing apparatus 700 through the communication server 300 (in step S106). Note that, in step S106, the information on the face regions including the faces of the users B and C detected in the image which is captured by the display system 400 and which is transmitted from the information processing apparatus 500 and the information on the face region including the face of the user D detected in the image which is captured by the display system 600 and which is transmitted from the information processing apparatus 700 may be received from the communication server 300. Furthermore, in step S106, the communication unit 212 may directly receive the captured images and the information on the face regions from the information processing apparatuses 500 and 700 through the communication network 800.
- Subsequently, the
information processing apparatus 200 executes a parallel display process which will be described with reference to FIG. 5 (in step S108) so that the content desired by the user A and the captured images supplied from the information processing apparatuses 500 and 700 are displayed in parallel in the display screen of the display device 102, and thereafter, the process is terminated.
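The sequence of steps S100 to S108 described above can be sketched as follows. All class and method names here are hypothetical stand-ins for the units of the disclosure (the display controller 216, the communication unit 212, and the image pickup device 104), not an actual implementation.

```python
class Stub:
    """Minimal stand-in used only to make the sketch runnable."""
    def __init__(self):
        self.log = []
    def __getattr__(self, name):
        def call(*args):
            self.log.append(name)
            return []
        return call

def display_process(content, network, display, camera):
    # Hypothetical orchestration of the display process of FIG. 4.
    display.show(content)                # S100: display the content desired by user A
    network.connect()                    # S102: enter a communicable state
    network.send(camera.capture())       # S104: transmit the locally captured image
    remote_images = network.receive()    # S106: receive images from systems 400 and 600
    display.show_parallel(content, remote_images)  # S108: parallel display (FIG. 5)
```

The orchestration only fixes the order of operations; the substance of step S108 is the parallel display process detailed below.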
- FIG. 5 is a flowchart of the parallel display process performed in step S108 included in the display process shown in FIG. 4.
- In
FIG. 5, first, the head-count obtaining unit 206 obtains the number of users who use the information processing apparatuses 500 and 700 (in step S200). Note that, in step S200, the head-count obtaining unit 206 may obtain the number of users who use the information processing apparatus 200 in accordance with the information on the face region including the face of the user A detected by the detection unit 204. Furthermore, in step S200, when the communication unit 212 receives the information on the face regions including the faces of the users B and C detected in the image captured by the display system 400 and the information on the face region including the face of the user D detected in the image captured by the display system 600, the number of users who use the information processing apparatuses 500 and 700 may be obtained in accordance with the received information on the face regions.
- Next, the setting unit 208 sets the display region for content and the display region for captured images supplied from the information processing apparatuses 500 and 700 in the display screen of the display device 102 in accordance with information on an input performed by the user A to set the display ratio of the display region for content to the display region for captured images (in step S202). Note that, in step S202, the setting unit 208 may set the display region for content and the display region for captured images supplied from the information processing apparatuses 200, 500, and 700 in the display screen of the display device 102 in accordance with the information on an input performed by the user A to set the display ratio of the display region for content to the display region for captured images in the display screen of the display device 102.
- Subsequently, the display
image generation unit 210 extracts portions of the image which is captured by the display system 400 and which is received by the communication unit 212 and a portion of the image which is captured by the display system 600 and which is received by the communication unit 212, in accordance with the information on the face regions including the faces of the users B, C, and D transmitted from the detection unit 204, a result of the obtainment of the number of users who use the information processing apparatuses 500 and 700 which is supplied from the head-count obtaining unit 206, and the information on the display region for captured images supplied from the information processing apparatuses 500 and 700 which is supplied from the setting unit 208, and generates a display image to be displayed in the display region for captured images in the display screen of the display device 102 (in step S204 and step S206).
- Note that, in step S204 and step S206, the display image generation unit 210 may extract a portion of the image which is captured by the display system 100, portions of the image captured by the display system 400, and a portion of the image captured by the display system 600 in accordance with the information on the face region including the face of the user A transmitted from the detection unit 204, the information on the face regions including the faces of the users B, C, and D transmitted from the detection unit 204, a result of the obtainment of the number of users who use the information processing apparatus 200 which is supplied from the head-count obtaining unit 206, a result of the obtainment of the number of users who use the information processing apparatuses 500 and 700 which is supplied from the head-count obtaining unit 206, and the information on the display region for captured images supplied from the information processing apparatuses 200, 500, and 700 which is supplied from the setting unit 208, and generate a display image to be displayed in the display region for captured images supplied from the information processing apparatuses 200, 500, and 700 in the display screen of the display device 102.
- Furthermore, in step S204 and step S206, when the communication unit 212 receives the information on the face regions including the faces of the users B and C detected in the image captured by the display system 400 and the information on the face region including the face of the user D detected in the image captured by the display system 600, the display image generation unit 210 may generate a display image in accordance with the information on the face regions supplied from the communication unit 212 instead of the information on the face regions detected by the detection unit 204.
- Next, the
display controller 216 transmits a signal for displaying the content which is desired by the user A and which has a reduced size in the display region for content included in the display screen of the display device 102 to the output unit 214 in accordance with the information on the display region for content supplied from the setting unit 208. By this, the content desired by the user A, which corresponds to an image having the reduced size, is displayed in the display region for content in the display screen of the display device 102 (in step S208).
- Next, the display controller 216 transmits a signal for displaying the display image generated by the display image generation unit 210 in the display region for captured images supplied from the information processing apparatuses 500 and 700 in the display screen of the display device 102 to the output unit 214 in accordance with the information on the display region for the captured images supplied from the information processing apparatuses 500 and 700 which is supplied from the setting unit 208. By this, in the display screen of the display device 102, the display image generated by the display image generation unit 210 is displayed in the display region for captured images, and accordingly, the content desired by the user A and the captured images supplied from the information processing apparatuses 500 and 700 are displayed in parallel (in step S210).
- Note that, in step S210, the display controller 216 may transmit a signal for displaying the display image generated by the display image generation unit 210 in the display region for captured images supplied from the information processing apparatuses 200, 500, and 700 in the display screen of the display device 102 to the output unit 214 in accordance with the information on the display region for captured images supplied from the information processing apparatuses 200, 500, and 700 which is supplied from the setting unit 208. In this case, in the display screen of the display device 102, the display image generated by the display image generation unit 210 is displayed in the display region for captured images, and accordingly, the content desired by the user A and the captured images supplied from the information processing apparatuses 200, 500, and 700 are displayed in parallel.
- According to the display process shown in
FIG. 4, in the display screen of the display device 102, the display image generated by the display image generation unit 210 is displayed in the display region for captured images, and the content desired by the user A, which corresponds to an image having the reduced size, is displayed in the display region for content. The display image to be displayed in the display region for captured images is generated by extracting portions of the captured images in accordance with the information on the face regions, the result of the obtainment of the number of users, and the information on the display region for captured images, such as information on a size of the display region. Therefore, when the size of the display region for captured images is small, the display image to be displayed in the display region for captured images is obtained by removing portions of the captured images which are wasted for identifying the users, that is, by removing large portions of the backgrounds of the captured images, for example. Therefore, since the portions of the captured images which are significant for identifying the users, such as portions including face feature portions of the users, are displayed in the display region for captured images even when the size of the display region for captured images is small, it is not difficult to identify the users. Furthermore, the image quality of the display image is not deteriorated. Furthermore, since the image size of the content to be displayed in the display region for content is reduced, it is unlikely that a portion of the content is hidden due to the display of the display image. Accordingly, the display of the content and the display of the captured images are appropriately performed in parallel.
- Next, examples of the present disclosure will be described.
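Before the examples, the extraction just summarized — keeping face feature portions and trimming background when the display region for captured images is small — can be illustrated with a small sketch. The function name, the padding margin, and the aspect-ratio handling are assumptions for illustration, not the disclosed implementation.

```python
def extract_face_portion(image_w, image_h, face, region_w, region_h, margin=0.5):
    """Crop rectangle (x, y, w, h) around a detected face region `face`
    = (x, y, w, h), matched to the aspect ratio of the display region for
    captured images. Hypothetical sketch only."""
    fx, fy, fw, fh = face
    # Pad the face region so that face feature portions are retained.
    cw = fw * (1 + 2 * margin)
    ch = fh * (1 + 2 * margin)
    # Grow the crop to the aspect ratio of the target display region.
    target = region_w / region_h
    if cw / ch < target:
        cw = ch * target
    else:
        ch = cw / target
    # Center the crop on the face.
    cx = fx + fw / 2 - cw / 2
    cy = fy + fh / 2 - ch / 2
    # Clamp the crop inside the captured image so that no invalid (wasted)
    # region appears, as when a user sits at an edge of the captured image.
    cx = min(max(cx, 0.0), image_w - cw)
    cy = min(max(cy, 0.0), image_h - ch)
    return (cx, cy, cw, ch)
```

The clamping step corresponds to the handling of invalid regions discussed in the fourth example below.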
- FIG. 6 and FIGS. 7A to 7D are diagrams for explanation of a first example of the present disclosure. In the first example, a case where content desired by a user who uses the display system 100 and an image captured by the display system 400 are displayed in parallel in the display screen of the display device 102 shown in FIG. 2 will be described. Note that, in this first example, a case where a single user uses the display system 400 will be described.
- FIG. 6 is a diagram illustrating an image captured by the display system 400. In FIG. 6, an image IM1 captured by the display system 400 is transmitted to the detection unit 204 included in the information processing apparatus 200 as described above, and the detection unit 204 detects a face region FA1 including a face of the user who uses the display system 400.
- FIGS. 7A to 7D are diagrams illustrating content and a captured image supplied from the information processing apparatus 500 which are displayed in parallel in the display screen of the display device 102.
- FIG. 7A shows a case where the display ratio of the display region for content to the display region for captured images supplied from the information processing apparatus 500, which is obtained through an input for setting of the display ratio, is A1:B1 (B1=0). FIG. 7B shows a case where the display ratio is A2:B2 (B2<A2). FIG. 7C shows a case where the display ratio is A3:B3 (B3=A3). FIG. 7D shows a case where the display ratio is A4:B4 (A4=0).
- As shown in FIGS. 7B and 7C, in the display screen of the display device 102, a display image IM2 or IM4 which is generated by extracting a portion of the image IM1 is displayed in the display region for captured images, and content IM3 or IM5 which is desired by the user who uses the display system 100 and which corresponds to an image having a reduced size is displayed in the display region for content. By this, the user who uses the display system 100 may view the content IM3 or IM5, which is not partially hidden, and easily identify the user who uses the display system 400 and whose face is included in the captured image.
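The ratio-driven division of the display screen in FIGS. 7A to 7D can be sketched as follows. The horizontal side-by-side split and the function name are assumptions for illustration; the disclosure only specifies that the regions follow the set ratio A:B.

```python
def split_screen(screen_w, screen_h, content_ratio, captured_ratio):
    """Split the display screen between the display region for content
    (ratio A) and the display region for captured images (ratio B).
    Hypothetical sketch; returns two (x, y, w, h) rectangles."""
    total = content_ratio + captured_ratio
    if total == 0:
        raise ValueError("display ratios must not both be zero")
    content_w = screen_w * content_ratio / total
    content_region = (0, 0, content_w, screen_h)
    captured_region = (content_w, 0, screen_w - content_w, screen_h)
    return content_region, captured_region
```

With B=0 (FIG. 7A) the content region fills the screen, and with A=0 (FIG. 7D) the captured-image region does.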
- FIG. 8A and FIGS. 9A to 9D are diagrams illustrating a second example of the present disclosure. In the second example, a case where content desired by a user who uses the display system 100 and an image captured by the display system 400 are displayed in parallel in the display screen of the display device 102 shown in FIG. 2 will be described. Note that, in the second example, a case where two users use the display system 400 will be described.
- FIG. 8A is a diagram illustrating an image captured by the display system 400 when two users use the display system 400. In FIG. 8A, an image IM6 captured by the display system 400 is supplied to the detection unit 204 included in the information processing apparatus 200 as described above, and the detection unit 204 detects face regions FA2 and FA3 individually including the faces of the two users who use the display system 400.
- FIGS. 9A to 9D are diagrams illustrating the content and a captured image supplied from the information processing apparatus 500 which are displayed in parallel in the display screen of the display device 102.
- FIG. 9A shows a case where the display ratio of the display region for content to the display region for captured images supplied from the information processing apparatus 500, which is obtained through an input for setting of the display ratio, is A5:B5 (B5=0). FIG. 9B shows a case where the display ratio is A6:B6 (B6<A6). FIG. 9C shows a case where the display ratio is A7:B7 (B7=A7). FIG. 9D shows a case where the display ratio is A8:B8 (A8=0).
- As shown in FIGS. 9B and 9C, in the display screen of the display device 102, a display image IM9 or IM11 which is generated by extracting a portion of the image IM6 is displayed in the display region for captured images, and content IM10 or IM12 which is desired by the user who uses the display system 100 and which corresponds to an image having a reduced size is displayed in the display region for content. By this, the user who uses the display system 100 may view the content IM10 or IM12, which is not partially hidden, and easily identify the two users who use the display system 400 and whose faces are included in the captured image.
- Note that, as shown in FIG. 8B, also in a case where a single user uses the display system 400 and a single user uses the display system 600, the content desired by the user who uses the display system 100 and images IM7 and IM8 captured by the display systems 400 and 600 may be displayed in parallel in the display screen of the display device 102 as shown in FIGS. 9A to 9D (specifically, FIGS. 9B and 9C).
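A head count of the kind the head-count obtaining unit 206 performs for these examples — two users at the display system 400, or one user at each of the display systems 400 and 600 — might be sketched by counting detected face regions per apparatus. This is hypothetical code; the rule of merging nearby faces is an assumption based on the earlier remark that users positioned near one another may be regarded as a single user.

```python
def obtain_head_count(face_regions_by_apparatus):
    """Count users per apparatus from detected face regions (x, y, w, h).
    Face regions whose centers lie close together are counted once.
    Hypothetical sketch, not the disclosed implementation."""
    counts = {}
    for apparatus, regions in face_regions_by_apparatus.items():
        centers = []
        for (x, y, w, h) in regions:
            cx, cy = x + w / 2, y + h / 2
            if any(abs(cx - mx) < w and abs(cy - my) < h for mx, my in centers):
                continue  # near an already-counted face: treat as the same user
            centers.append((cx, cy))
        counts[apparatus] = len(centers)
    return counts
```

The per-apparatus counts would then drive how many portions the display image generation unit extracts and arranges.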
- FIG. 10A and FIGS. 11A to 11D are diagrams illustrating a third example of the present disclosure. Note that frames representing face regions are omitted in FIG. 10A and the following drawings. In the third example, a case where content desired by a user who uses the display system 100 and an image captured by the display system 400 are displayed in parallel in the display screen of the display device 102 shown in FIG. 2 will be described. Note that a case where three users use the display system 400 will be described in the third example.
- FIG. 10A is a diagram illustrating an image captured by the display system 400 when three users use the display system 400. In FIG. 10A, an image IM13 captured by the display system 400 is transmitted to the detection unit 204 included in the information processing apparatus 200 as described above, and the detection unit 204 detects face regions (not shown) individually including the faces of the three users who use the display system 400.
- FIGS. 11A to 11D are diagrams illustrating the content and a captured image supplied from the information processing apparatus 500 which are displayed in parallel in the display screen of the display device 102.
- FIG. 11A shows a case where the display ratio of the display region for content to the display region for captured images supplied from the information processing apparatus 500, which is obtained through an input for setting of the display ratio, is A9:B9 (B9=0). FIG. 11B shows a case where the display ratio is A10:B10 (B10<A10). FIG. 11C shows a case where the display ratio is A11:B11 (B11=A11). FIG. 11D shows a case where the display ratio is A12:B12 (A12=0).
- As shown in FIGS. 11B and 11C, in the display screen of the display device 102, a display image IM19 or IM21 which is generated by extracting a portion of the image IM13 is displayed in the display region for captured images, and content IM20 or IM22 which is desired by the user who uses the display system 100 and which corresponds to an image having a reduced size is displayed in the display region for content. By this, the user who uses the display system 100 may view the content IM20 or IM22, which is not partially hidden, and easily identify the three users who use the display system 400 and whose faces are included in the captured image.
- Note that, even in a case where two users use the display system 400 and a single user uses the display system 600 as shown in FIG. 10B, the content desired by the user who uses the display system 100 and images IM14 and IM15 captured by the display systems 400 and 600 may be displayed in parallel in the display screen of the display device 102 as shown in FIGS. 11A to 11D (specifically, FIGS. 11B and 11C). Similarly, even in a case where a single user uses the display system 400, a single user uses the display system 600, and a single user uses another display system (not shown) as shown in FIG. 10C, the content desired by the user who uses the display system 100 and images IM16, IM17, and IM18 captured by the display systems 400 and 600 and the other display system may be displayed in parallel in the display screen of the display device 102 as shown in FIGS. 11A to 11D (specifically, FIGS. 11B and 11C).
- Furthermore, similarly, as shown in FIGS. 12A to 12D (specifically, FIGS. 12B and 12C), even when four users use the display system 400 and other display systems, the content desired by the user who uses the display system 100 and images (not shown) captured by the display system 400 and the other display systems may be displayed in parallel in the display screen of the display device 102.
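For three or four users, each extracted portion occupies a fraction of the display region for captured images (for example, a quarter for four users, as described earlier). A simple grid-layout sketch is given below; the exact arrangement is an assumption, since the disclosure states only the fraction each portion occupies.

```python
import math

def arrange_in_grid(region_w, region_h, n_faces):
    """Divide the display region for captured images into cells, one per
    extracted face portion (e.g., quarters when n_faces == 4).
    Hypothetical layout sketch; returns one (x, y, w, h) cell per face."""
    cols = math.ceil(math.sqrt(n_faces))
    rows = math.ceil(n_faces / cols)
    cell_w, cell_h = region_w / cols, region_h / rows
    return [((i % cols) * cell_w, (i // cols) * cell_h, cell_w, cell_h)
            for i in range(n_faces)]
```

Each cell would receive one face-centered crop, so every user's face feature portions remain visible even in a small region.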
FIGS. 14A to 14C are diagrams illustrating a fourth example according to the present disclosure. In the fourth example, a case where content desired by the user who uses the display system 100 and an image captured by the display system 400 are displayed in parallel in the display screen of the display device 102 shown in FIG. 2 will be described. Note that, in the fourth example, a case where a single user uses the display system 400 and the user is included in an end portion of an image captured by the display system 400 will be described.
- FIG. 14A is a diagram illustrating the image captured by the display system 400. In FIG. 14A, an image IM23 captured by the display system 400 is transmitted to the detection unit 204 included in the information processing apparatus 200 as described above, and the detection unit 204 detects a face region FA4 including a face of the user who uses the display system 400.
- FIGS. 14B and 14C are diagrams illustrating the content and the captured image supplied from the information processing apparatus 500 which are displayed in parallel in the display screen included in the display device 102.
- As shown in
FIG. 14B, a display image IM24 generated by extracting a portion of the captured image IM23 is displayed in a display region for captured images, and content IM25, which is desired by the user who uses the display system 100 and which has a reduced image size, is displayed in a display region for content.
- As shown in FIG. 14B, when the user is included in the end portion of the captured image and the display image IM24 is generated by extracting a portion of the captured image IM23 using the face region FA4 as a center, the display image IM24 displayed in the display region for captured images includes an invalid region R1, which is a wasted region.
- Therefore, in the fourth example, when the user is included in the end portion of the captured image, the portion of the captured image IM23 is extracted so that the invalid region R1 is not displayed in the display region for captured images, whereby a display image IM26 is generated. For example, when the user is included in a right end portion of the captured image as shown in FIG. 14A, a portion of the captured image IM23 is extracted using a right edge of the captured image IM23 as a reference so that the display image IM26 is generated. In this case, as shown in FIG. 14C, the display image IM26, which does not include the invalid region R1, is displayed in the display region for captured images, and content IM27, which is desired by the user who uses the display system 100 and which has a reduced image size, is displayed in the display region for content in the display screen of the display device 102. Accordingly, the user who uses the display system 100 may be prevented from having a feeling of strangeness or discomfort.
-
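The edge handling of the fourth example amounts to clamping the extraction window so it never extends past the image border: the window is centered on the detected face where possible, but shifted to use the image edge as a reference near a border. A minimal Python sketch follows; the names are illustrative, not from the disclosure.

```python
def face_crop(face_cx, face_cy, crop_w, crop_h, img_w, img_h):
    """Top-left corner of a crop_w x crop_h extraction window.

    The window is centered on the detected face (face_cx, face_cy) where
    possible; near a border it is shifted so the image edge becomes the
    reference, instead of padding with an invalid region such as R1."""
    # Clamp the centered window into [0, img_w - crop_w] x [0, img_h - crop_h].
    left = min(max(face_cx - crop_w // 2, 0), img_w - crop_w)
    top = min(max(face_cy - crop_h // 2, 0), img_h - crop_h)
    return left, top
```

For a face near the right edge of a 640x480 image, the window is anchored on the right border rather than extending past it.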
FIGS. 15A to 15D are diagrams illustrating a fifth example according to the present disclosure. In the fifth example, a case where content desired by the user who uses the display system 100 and an image captured by the display system 400 are displayed in parallel in the display screen of the display device 102 shown in FIG. 2 will be described. Note that, in the fifth example, a case where two users use the display system 400 and the two users are positioned close to each other in an image captured by the display system 400 will be described.
- FIG. 15A is a diagram illustrating an image captured by the display system 400 when two users use the display system 400. In FIG. 15A, an image IM28 captured by the display system 400 is transmitted to the detection unit 204 included in the information processing apparatus 200 as described above, and the detection unit 204 detects face regions FA5 and FA6 individually including the faces of the two users who use the display system 400.
- FIGS. 15B to 15D are diagrams illustrating the content and the captured image supplied from the information processing apparatus 500 which are displayed in parallel in the display screen of the display device 102.
- As shown in FIG. 15B, in the display screen of the display device 102, a display image IM29 generated by extracting portions of the captured image IM28 is displayed in a display region for captured images, and content IM30, which is desired by the user who uses the display system 100 and which has a reduced image size, is displayed in a display region for content.
- As shown in
FIG. 15B, in a case where the two users are positioned close to each other in the captured image, when the display image IM29 is generated by extracting portions of the captured image IM28 using the face regions FA5 and FA6 as centers, the display image IM29 includes an overlapping region R2, which includes the same images, in the display region for captured images.
- Therefore, in the fifth example, when the two users are positioned close to each other in the captured image, a display image IM31 is generated by extracting portions of the captured image IM28 such that the overlapping region R2 is not displayed in the display region for captured images. In this case, as shown in FIG. 15C, the display image IM31, which does not include the overlapping region R2, is displayed in the display region for captured images, and content IM32, which is desired by the user who uses the display system 100 and which has a reduced image size, is displayed in the display region for content in the display screen of the display device 102. Accordingly, the user who uses the display system 100 may be prevented from having a feeling of strangeness or discomfort.
- Furthermore, in the fifth example, when the two users are positioned close to each other in the captured image, it may be determined that a single user uses the display system 400, and a display image IM33 may be generated by extracting a portion of the captured image IM28 so that at least face feature portions of the two users are included in the display image IM33. Also in this case, as shown in FIG. 15D, the display image IM33, which does not include the overlapping region R2, is displayed in the display region for captured images, and content IM34, which is desired by the user who uses the display system 100 and which has a reduced image size, is displayed in the display region for content in the display screen of the display device 102. The display image IM31 shown in FIG. 15C is generated by arranging two images, whereas the display image IM33 shown in FIG. 15D is a single image, that is, an image generated by integrating the two images. Accordingly, the user who uses the display system 100 may be prevented from having a feeling of strangeness or discomfort.
-
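The two strategies of the fifth example, detecting that per-face crops would overlap and then producing one integrated window covering both faces (as in FIG. 15D), can be sketched as follows. This is an illustrative Python sketch, not part of the disclosure; the (left, top, width, height) rectangle convention is an assumption.

```python
def overlaps(a, b):
    """True if two extraction windows (left, top, w, h) intersect,
    i.e. they would produce an overlapping region such as R2."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def merged_window(a, b):
    """Single window covering both face crops: the integrated image of
    FIG. 15D rather than two separately arranged images."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    left, top = min(ax, bx), min(ay, by)
    right = max(ax + aw, bx + bw)
    bottom = max(ay + ah, by + bh)
    return left, top, right - left, bottom - top
```

When `overlaps` is false, the two crops can simply be arranged side by side, as in FIG. 15C.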
FIGS. 16A to 16F are diagrams illustrating a sixth example according to the present disclosure. In the sixth example, a setting of a region to be extracted from a captured image will be described.
- In the sixth example, in a captured image IM35, extraction regions CA1, CA2, and CA3 are set using, as a reference, a vertical axis VA which passes through the center of a face of a user, as shown in FIGS. 16A to 16C.
- Furthermore, in the sixth example, in the captured image IM35, extraction regions CA4, CA5, and CA6 may be set using, as references, the vertical axis VA which passes through the center of the face of the user and a horizontal axis HA which passes through the eyes of the face of the user, for example, as shown in FIGS. 16D to 16F.
- A functional configuration of an information processing apparatus according to a second embodiment of the present disclosure will now be described.
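The axis-based region setting of the sixth example can be sketched as follows. This is an illustrative Python sketch, not part of the disclosure; in particular, placing one quarter of the region height above the eye line is an assumed headroom choice, not a value from the source.

```python
def extraction_region(face_cx, eye_y, region_w, region_h):
    """Extraction region aligned on the vertical axis VA (x = face_cx,
    through the center of the face) and positioned relative to the
    horizontal axis HA (y = eye_y, through the eyes).

    The region is centered horizontally on VA; one quarter of the
    region height is kept above HA (an illustrative headroom choice)."""
    left = face_cx - region_w // 2
    top = eye_y - region_h // 4
    return left, top, region_w, region_h
```

Varying the region size while keeping VA and HA fixed yields a family of regions such as CA1 to CA6.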
FIG. 13 is a diagram illustrating the functional configuration of the information processing apparatus of this embodiment. An information processing apparatus 250 serving as the information processing apparatus of this embodiment is different from the information processing apparatus of the first embodiment described above only in that the information processing apparatus 250 includes an image pickup unit 252 and a display unit 254. Therefore, descriptions of redundant configurations and operations are omitted, and configurations and operations different from those of the first embodiment will be described hereinafter.
- Note that, in FIG. 13, a user A who uses the information processing apparatus 250, a communication network 800 to which the information processing apparatus 250 is connectable, a communication server 300 and information processing apparatuses 550 and 750 which are connected to the communication network 800, users B and C who use the information processing apparatus 550, and a user D who uses the information processing apparatus 750 are shown. The information processing apparatuses 550 and 750 have configurations similar to that of the information processing apparatus 250, and therefore, detailed descriptions thereof are omitted.
- In FIG. 13, the information processing apparatus 250 includes the image pickup unit 252, a detection unit 204, a head-count obtaining unit 206, a setting unit 208, a display image generation unit 210, a communication unit 212, the display unit 254, and a display controller 216.
- The image pickup unit 252 is an example of an image pickup unit according to the present disclosure and may capture a still image or a moving image of the user A who watches a display screen of the display unit 254. Then, the image pickup unit 252 may transmit a captured image generated through image capturing to the communication unit 212, the detection unit 204, and the display image generation unit 210.
- The display unit 254 is an example of a display unit of the present disclosure and may display content of a still image or a moving image and captured images supplied from the information processing apparatuses 550 and 750. Furthermore, the display unit 254 may display content of a still image or a moving image in a display region for content and display the captured images supplied from the information processing apparatuses 550 and 750 in display regions corresponding to the information processing apparatuses 550 and 750.
- According to this embodiment, when the information processing apparatus 250 executes the display process described with reference to FIG. 4, advantages similar to those of the first embodiment described above may be attained. Furthermore, this embodiment may be carried out without separately providing the display system 100 shown in FIG. 2, for example.
- Furthermore, an object of the present disclosure may be achieved by supplying a storage medium, such as a non-transitory storage medium, which stores a program code of software that realizes the functions of the foregoing embodiments to a system or an apparatus, and reading and executing the program code stored in the storage medium using a computer (or a CPU, an MPU, or the like) of the system or the apparatus.
- In this case, the program code read from the storage medium realizes the functions of the foregoing embodiments, and the program code and the storage medium which stores the program code are included in the present disclosure.
- Furthermore, examples of the storage medium to which the program code is supplied include a floppy (registered trademark) disk, a hard disk, a magneto-optical disc, an optical disc such as a CD-ROM, a CD-R, a CD-RW, a DVD-ROM, a DVD-RAM, a DVD-RW, or a DVD+RW, a magnetic tape, a nonvolatile memory card, and a ROM. Alternatively, the program code may be downloaded through a network.
- Furthermore, the present disclosure includes, in addition to the case where the functions of the foregoing embodiments are realized by executing the program code read by the computer, a case where an OS (Operating System) operating in the computer performs a part of or the entire actual process in accordance with an instruction of the program code and the functions of the foregoing embodiments are realized by the process.
- The present disclosure further includes a case where, after the program code read from the storage medium is written to a memory included in a function expansion board inserted into the computer or a function expansion unit connected to the computer, a CPU or the like included in the expansion board or the expansion unit executes a part of or the entire actual process in accordance with an instruction of the program code and the functions of the foregoing embodiments are realized by the process.
- Although the embodiments of the present disclosure have been described hereinabove with reference to the accompanying drawings, the present disclosure is not limited to these examples. It is apparent to those skilled in the art that various modifications and variations may be made within the scope of the present disclosure, and these modifications and variations also belong to the scope of the present disclosure.
- For example, the communication server 300 may detect the face regions including the faces of the users who use the information processing apparatuses, instead of the information processing apparatuses.
- Furthermore, the information processing apparatus 200 may transmit the content desired by the user A to the other information processing apparatuses.
- It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and alterations may occur depending on design requirements and other factors insofar as they are within the scope of the appended claims or the equivalents thereof.
Claims (16)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/571,473 US20150097920A1 (en) | 2010-09-30 | 2014-12-16 | Information processing apparatus and information processing method |
Applications Claiming Priority (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010221550 | 2010-09-30 | ||
JP2010-221550 | 2010-09-30 | ||
JP2010-289780 | 2010-12-27 | ||
JP2010289780A JP5740972B2 (en) | 2010-09-30 | 2010-12-27 | Information processing apparatus and information processing method |
US13/235,609 US8953860B2 (en) | 2010-09-30 | 2011-09-19 | Information processing apparatus and information processing method |
US14/571,473 US20150097920A1 (en) | 2010-09-30 | 2014-12-16 | Information processing apparatus and information processing method |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/235,609 Continuation US8953860B2 (en) | 2010-09-30 | 2011-09-19 | Information processing apparatus and information processing method |
Publications (1)
Publication Number | Publication Date |
---|---|
US20150097920A1 true US20150097920A1 (en) | 2015-04-09 |
Family
ID=44651246
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/235,609 Active 2032-02-10 US8953860B2 (en) | 2010-09-30 | 2011-09-19 | Information processing apparatus and information processing method |
US14/571,473 Abandoned US20150097920A1 (en) | 2010-09-30 | 2014-12-16 | Information processing apparatus and information processing method |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/235,609 Active 2032-02-10 US8953860B2 (en) | 2010-09-30 | 2011-09-19 | Information processing apparatus and information processing method |
Country Status (4)
Country | Link |
---|---|
US (2) | US8953860B2 (en) |
EP (1) | EP2437490B1 (en) |
JP (1) | JP5740972B2 (en) |
CN (1) | CN102446065B (en) |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5740972B2 (en) * | 2010-09-30 | 2015-07-01 | ソニー株式会社 | Information processing apparatus and information processing method |
JP5598232B2 (en) * | 2010-10-04 | 2014-10-01 | ソニー株式会社 | Information processing apparatus, information processing system, and information processing method |
JP6058978B2 (en) * | 2012-11-19 | 2017-01-11 | サターン ライセンシング エルエルシーSaturn Licensing LLC | Image processing apparatus, image processing method, photographing apparatus, and computer program |
EP2927902A4 (en) * | 2012-11-27 | 2016-07-06 | Sony Corp | Display device, display method, and computer program |
TWI573619B (en) * | 2012-12-21 | 2017-03-11 | 新力電腦娛樂(美國)責任有限公司 | Automatic generation of suggested mini-games for cloud-gaming based on recorded gameplay |
CN104349131B (en) * | 2013-08-09 | 2018-12-14 | 联想(北京)有限公司 | A kind of information processing method and electronic equipment |
KR102192704B1 (en) | 2013-10-22 | 2020-12-17 | 엘지전자 주식회사 | image outputting device |
KR20180070297A (en) * | 2016-12-16 | 2018-06-26 | 삼성전자주식회사 | Display apparatus and control method thereof |
CN110430384B (en) * | 2019-08-23 | 2020-11-03 | 珠海格力电器股份有限公司 | Video call method and device, intelligent terminal and storage medium |
TWI719800B (en) * | 2020-01-08 | 2021-02-21 | 華碩電腦股份有限公司 | Display device and method capable of switching display modes |
Citations (74)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5657096A (en) * | 1995-05-03 | 1997-08-12 | Lukacs; Michael Edward | Real time video conferencing system and method with multilayer keying of multiple video images |
US20020081003A1 (en) * | 2000-12-27 | 2002-06-27 | Sobol Robert E. | System and method for automatically enhancing graphical images |
US20020150280A1 (en) * | 2000-12-04 | 2002-10-17 | Pingshan Li | Face detection under varying rotation |
US20030112358A1 (en) * | 2001-09-28 | 2003-06-19 | Masao Hamada | Moving picture communication method and apparatus |
US20040263636A1 (en) * | 2003-06-26 | 2004-12-30 | Microsoft Corporation | System and method for distributed meetings |
US20050128221A1 (en) * | 2003-12-16 | 2005-06-16 | Canon Kabushiki Kaisha | Image displaying method and image displaying apparatus |
US20060259755A1 (en) * | 2001-08-20 | 2006-11-16 | Polycom, Inc. | System and method for using biometrics technology in conferencing |
US20070047775A1 (en) * | 2005-08-29 | 2007-03-01 | Atsushi Okubo | Image processing apparatus and method and program |
US20070070188A1 (en) * | 2005-05-05 | 2007-03-29 | Amtran Technology Co., Ltd | Method of audio-visual communication using a television and television using the same |
US20070079322A1 (en) * | 2002-05-13 | 2007-04-05 | Microsoft Corporation | Selectively overlaying a user interface atop a video signal |
US20070216773A1 (en) * | 2006-02-01 | 2007-09-20 | Sony Corporation | System, apparatus, method, program and recording medium for processing image |
US20080080743A1 (en) * | 2006-09-29 | 2008-04-03 | Pittsburgh Pattern Recognition, Inc. | Video retrieval system for human face content |
US7379568B2 (en) * | 2003-07-24 | 2008-05-27 | Sony Corporation | Weak hypothesis generation apparatus and method, learning apparatus and method, detection apparatus and method, facial expression learning apparatus and method, facial expression recognition apparatus and method, and robot apparatus |
US20080152197A1 (en) * | 2006-12-22 | 2008-06-26 | Yukihiro Kawada | Information processing apparatus and information processing method |
US20080240563A1 (en) * | 2007-03-30 | 2008-10-02 | Casio Computer Co., Ltd. | Image pickup apparatus equipped with face-recognition function |
US20090079813A1 (en) * | 2007-09-24 | 2009-03-26 | Gesturetek, Inc. | Enhanced Interface for Voice and Video Communications |
US7554571B1 (en) * | 2005-03-18 | 2009-06-30 | Avaya Inc. | Dynamic layout of participants in a multi-party video conference |
US20090175509A1 (en) * | 2008-01-03 | 2009-07-09 | Apple Inc. | Personal computing device control using face detection and recognition |
US20090185033A1 (en) * | 2006-06-29 | 2009-07-23 | Nikon Corporation | Replay Device, Replay System, and Television Set |
US20090190835A1 (en) * | 2008-01-29 | 2009-07-30 | Samsung Electronics Co., Ltd. | Method for capturing image to add enlarged image of specific area to captured image, and imaging apparatus applying the same |
US20090210491A1 (en) * | 2008-02-20 | 2009-08-20 | Microsoft Corporation | Techniques to automatically identify participants for a multimedia conference event |
US7598975B2 (en) * | 2002-06-21 | 2009-10-06 | Microsoft Corporation | Automatic face extraction for use in recorded meetings timelines |
US20090327418A1 (en) * | 2008-06-27 | 2009-12-31 | Microsoft Corporation | Participant positioning in multimedia conferencing |
US20100005393A1 (en) * | 2007-01-22 | 2010-01-07 | Sony Corporation | Information processing apparatus, information processing method, and program |
US20100007796A1 (en) * | 2008-07-11 | 2010-01-14 | Fujifilm Corporation | Contents display device, contents display method, and computer readable medium for storing contents display program |
US7673015B2 (en) * | 2004-08-06 | 2010-03-02 | Sony Corporation | Information-processing apparatus, information-processing methods, recording mediums, and programs |
US20100064334A1 (en) * | 2008-09-05 | 2010-03-11 | Skype Limited | Communication system and method |
US20100079675A1 (en) * | 2008-09-30 | 2010-04-01 | Canon Kabushiki Kaisha | Video displaying apparatus, video displaying system and video displaying method |
US20100149305A1 (en) * | 2008-12-15 | 2010-06-17 | Tandberg Telecom As | Device and method for automatic participant identification in a recorded multimedia stream |
US20100171807A1 (en) * | 2008-10-08 | 2010-07-08 | Tandberg Telecom As | System and associated methodology for multi-layered site video conferencing |
US20100225815A1 (en) * | 2009-03-05 | 2010-09-09 | Vishal Vincent Khatri | Systems methods and apparatuses for rendering user customizable multimedia signals on a display device |
US20100238262A1 (en) * | 2009-03-23 | 2010-09-23 | Kurtz Andrew F | Automated videography systems |
US20100271457A1 (en) * | 2009-04-23 | 2010-10-28 | Optical Fusion Inc. | Advanced Video Conference |
US7847815B2 (en) * | 2006-10-11 | 2010-12-07 | Cisco Technology, Inc. | Interaction based on facial recognition of conference participants |
US20110044444A1 (en) * | 2009-08-21 | 2011-02-24 | Avaya Inc. | Multiple user identity and bridge appearance |
US20110050842A1 (en) * | 2009-08-27 | 2011-03-03 | Polycom, Inc. | Distance learning via instructor immersion into remote classroom |
US20110074915A1 (en) * | 2003-09-19 | 2011-03-31 | Bran Ferren | Apparatus and method for presenting audio in a video teleconference |
US20110096137A1 (en) * | 2009-10-27 | 2011-04-28 | Mary Baker | Audiovisual Feedback To Users Of Video Conferencing Applications |
US20110116685A1 (en) * | 2009-11-16 | 2011-05-19 | Sony Corporation | Information processing apparatus, setting changing method, and setting changing program |
US20110115877A1 (en) * | 2009-11-17 | 2011-05-19 | Kang Sung Suk | Method for user authentication, and video communication apparatus and display apparatus thereof |
US20110181683A1 (en) * | 2010-01-25 | 2011-07-28 | Nam Sangwu | Video communication method and digital television using the same |
US8006276B2 (en) * | 2006-02-13 | 2011-08-23 | Sony Corporation | Image taking apparatus and method with display and image storage communication with other image taking apparatus |
US20110216155A1 (en) * | 2004-07-27 | 2011-09-08 | Sony Corporation | Information-processing apparatus, information-processing methods, recording mediums, and programs |
US20110271210A1 (en) * | 2010-04-30 | 2011-11-03 | American Teleconferncing Services Ltd. | Conferencing Application Store |
US20110271197A1 (en) * | 2010-04-30 | 2011-11-03 | American Teleconferncing Services Ltd. | Distributing Information Between Participants in a Conference via a Conference User Interface |
US20110292232A1 (en) * | 2010-06-01 | 2011-12-01 | Tong Zhang | Image retrieval |
US20120027256A1 (en) * | 2010-07-27 | 2012-02-02 | Google Inc. | Automatic Media Sharing Via Shutter Click |
US20120038742A1 (en) * | 2010-08-15 | 2012-02-16 | Robinson Ian N | System And Method For Enabling Collaboration In A Video Conferencing System |
US20120057794A1 (en) * | 2010-09-06 | 2012-03-08 | Shingo Tsurumi | Image processing device, program, and image procesing method |
US20120082339A1 (en) * | 2010-09-30 | 2012-04-05 | Sony Corporation | Information processing apparatus and information processing method |
US20120128255A1 (en) * | 2010-11-22 | 2012-05-24 | Sony Corporation | Part detection apparatus, part detection method, and program |
US20120167001A1 (en) * | 2009-12-31 | 2012-06-28 | Flicklntel, LLC. | Method, system and computer program product for obtaining and displaying supplemental data about a displayed movie, show, event or video game |
US20120218375A1 (en) * | 2007-08-08 | 2012-08-30 | Qnx Software Systems Limited | Video phone system |
US20120313851A1 (en) * | 2011-06-13 | 2012-12-13 | Sony Corporation | Information processing apparatus and program |
US20120327172A1 (en) * | 2011-06-22 | 2012-12-27 | Microsoft Corporation | Modifying video regions using mobile device input |
US20130002806A1 (en) * | 2011-06-24 | 2013-01-03 | At&T Intellectual Property I, Lp | Apparatus and method for managing telepresence sessions |
US8350891B2 (en) * | 2009-11-16 | 2013-01-08 | Lifesize Communications, Inc. | Determining a videoconference layout based on numbers of participants |
US20130070973A1 (en) * | 2011-09-15 | 2013-03-21 | Hiroo SAITO | Face recognizing apparatus and face recognizing method |
US20130100238A1 (en) * | 2012-12-11 | 2013-04-25 | Vidtel, Inc. | Call routing based on facial recognition |
US8438598B2 (en) * | 2007-11-16 | 2013-05-07 | Sony Corporation | Information processing apparatus, information processing method, program, and information sharing system |
US8483428B1 (en) * | 2010-01-06 | 2013-07-09 | Kimberly Lynn Anderson | Apparatus for processing a digital image |
US20130177219A1 (en) * | 2010-10-28 | 2013-07-11 | Telefonaktiebolaget L M Ericsson (Publ) | Face Data Acquirer, End User Video Conference Device, Server, Method, Computer Program And Computer Program Product For Extracting Face Data |
US20130182914A1 (en) * | 2010-10-07 | 2013-07-18 | Sony Corporation | Information processing device and information processing method |
US20130201105A1 (en) * | 2012-02-02 | 2013-08-08 | Raymond William Ptucha | Method for controlling interactive display system |
US20130231185A1 (en) * | 2011-08-29 | 2013-09-05 | Bally Gaming, Inc. | Methodand apparatus for audio scaling at a display showing content in different areas |
US20130236069A1 (en) * | 2012-03-07 | 2013-09-12 | Altek Corporation | Face Recognition System and Face Recognition Method Thereof |
US20130293739A1 (en) * | 2007-03-15 | 2013-11-07 | Sony Corporation | Information processing apparatus, imaging apparatus, image display control method and computer program |
US8621088B2 (en) * | 2006-05-02 | 2013-12-31 | Sony Corporation | Communication system, communication apparatus, communication program, and computer-readable storage medium stored with the communication progam |
US20140055429A1 (en) * | 2012-08-23 | 2014-02-27 | Samsung Electronics Co., Ltd. | Flexible display apparatus and controlling method thereof |
US20140156364A1 (en) * | 2007-03-22 | 2014-06-05 | Sony Computer Entertainment America Llc | Scheme for determining the locations and timing of advertisements and other insertions in media |
US8788589B2 (en) * | 2007-10-12 | 2014-07-22 | Watchitoo, Inc. | System and method for coordinating simultaneous edits of shared digital data |
US20140229866A1 (en) * | 2008-11-24 | 2014-08-14 | Shindig, Inc. | Systems and methods for grouping participants of multi-user events |
US20160277712A1 (en) * | 2013-10-24 | 2016-09-22 | Telefonaktiebolaget L M Ericsson (Publ) | Arrangements and Method Thereof for Video Retargeting for Video Conferencing |
US20160371815A1 (en) * | 2015-06-17 | 2016-12-22 | Samsung Electronics Co., Ltd. | Electronic device for displaying a plurality of images and method for processing an image |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002032068A (en) * | 2000-07-18 | 2002-01-31 | Olympus Optical Co Ltd | Image processing apparatus |
JP2005039598A (en) * | 2003-07-16 | 2005-02-10 | Nippon Telegr & Teleph Corp <Ntt> | Interactive distribution system |
US20060222243A1 (en) * | 2005-04-02 | 2006-10-05 | Newell Martin E | Extraction and scaled display of objects in an image |
WO2007063922A1 (en) * | 2005-11-29 | 2007-06-07 | Kyocera Corporation | Communication terminal and communication system, and display method of communication terminal |
JP4832869B2 (en) * | 2005-11-29 | 2011-12-07 | 京セラ株式会社 | Communication terminal and display method thereof |
JP2008085546A (en) * | 2006-09-27 | 2008-04-10 | Funai Electric Co Ltd | Video output device |
JP2008236679A (en) * | 2007-03-23 | 2008-10-02 | Sony Corp | Videoconference apparatus, control method, and program |
KR101513616B1 (en) * | 2007-07-31 | 2015-04-20 | 엘지전자 주식회사 | Mobile terminal and image information managing method therefor |
JP4462334B2 (en) * | 2007-11-16 | 2010-05-12 | ソニー株式会社 | Information processing apparatus, information processing method, program, and information sharing system |
JP4322945B2 (en) * | 2007-12-27 | 2009-09-02 | 株式会社東芝 | Electronic device and image display control method |
JP4535150B2 (en) * | 2008-03-18 | 2010-09-01 | ソニー株式会社 | Image processing apparatus and method, program, and recording medium |
JP5087477B2 (en) * | 2008-06-12 | 2012-12-05 | 株式会社日立製作所 | Information recording / reproducing apparatus and information recording method |
JP2010016482A (en) * | 2008-07-01 | 2010-01-21 | Sony Corp | Information processing apparatus, and information processing method |
JP2010221550A (en) | 2009-03-24 | 2010-10-07 | Seiko Epson Corp | Method of manufacturing eccentric article and eccentric article |
- 2010-12-27 JP JP2010289780A patent/JP5740972B2/en not_active Expired - Fee Related
- 2011-09-06 EP EP11180184.1A patent/EP2437490B1/en active Active
- 2011-09-19 US US13/235,609 patent/US8953860B2/en active Active
- 2011-09-22 CN CN201110282919.0A patent/CN102446065B/en active Active
- 2014-12-16 US US14/571,473 patent/US20150097920A1/en not_active Abandoned
Patent Citations (75)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5657096A (en) * | 1995-05-03 | 1997-08-12 | Lukacs; Michael Edward | Real time video conferencing system and method with multilayer keying of multiple video images |
US20020150280A1 (en) * | 2000-12-04 | 2002-10-17 | Pingshan Li | Face detection under varying rotation |
US20020081003A1 (en) * | 2000-12-27 | 2002-06-27 | Sobol Robert E. | System and method for automatically enhancing graphical images |
US20060259755A1 (en) * | 2001-08-20 | 2006-11-16 | Polycom, Inc. | System and method for using biometrics technology in conferencing |
US20030112358A1 (en) * | 2001-09-28 | 2003-06-19 | Masao Hamada | Moving picture communication method and apparatus |
US20070079322A1 (en) * | 2002-05-13 | 2007-04-05 | Microsoft Corporation | Selectively overlaying a user interface atop a video signal |
US7598975B2 (en) * | 2002-06-21 | 2009-10-06 | Microsoft Corporation | Automatic face extraction for use in recorded meetings timelines |
US20040263636A1 (en) * | 2003-06-26 | 2004-12-30 | Microsoft Corporation | System and method for distributed meetings |
US20090046139A1 (en) * | 2003-06-26 | 2009-02-19 | Microsoft Corporation | system and method for distributed meetings |
US7379568B2 (en) * | 2003-07-24 | 2008-05-27 | Sony Corporation | Weak hypothesis generation apparatus and method, learning apparatus and method, detection apparatus and method, facial expression learning apparatus and method, facial expression recognition apparatus and method, and robot apparatus |
US20110074915A1 (en) * | 2003-09-19 | 2011-03-31 | Bran Ferren | Apparatus and method for presenting audio in a video teleconference |
US20050128221A1 (en) * | 2003-12-16 | 2005-06-16 | Canon Kabushiki Kaisha | Image displaying method and image displaying apparatus |
US20110216155A1 (en) * | 2004-07-27 | 2011-09-08 | Sony Corporation | Information-processing apparatus, information-processing methods, recording mediums, and programs |
US7673015B2 (en) * | 2004-08-06 | 2010-03-02 | Sony Corporation | Information-processing apparatus, information-processing methods, recording mediums, and programs |
US7554571B1 (en) * | 2005-03-18 | 2009-06-30 | Avaya Inc. | Dynamic layout of participants in a multi-party video conference |
US20070070188A1 (en) * | 2005-05-05 | 2007-03-29 | Amtran Technology Co., Ltd | Method of audio-visual communication using a television and television using the same |
US20070047775A1 (en) * | 2005-08-29 | 2007-03-01 | Atsushi Okubo | Image processing apparatus and method and program |
US20070216773A1 (en) * | 2006-02-01 | 2007-09-20 | Sony Corporation | System, apparatus, method, program and recording medium for processing image |
US8006276B2 (en) * | 2006-02-13 | 2011-08-23 | Sony Corporation | Image taking apparatus and method with display and image storage communication with other image taking apparatus |
US8621088B2 (en) * | 2006-05-02 | 2013-12-31 | Sony Corporation | Communication system, communication apparatus, communication program, and computer-readable storage medium stored with the communication program |
US20090185033A1 (en) * | 2006-06-29 | 2009-07-23 | Nikon Corporation | Replay Device, Replay System, and Television Set |
US20080080743A1 (en) * | 2006-09-29 | 2008-04-03 | Pittsburgh Pattern Recognition, Inc. | Video retrieval system for human face content |
US7847815B2 (en) * | 2006-10-11 | 2010-12-07 | Cisco Technology, Inc. | Interaction based on facial recognition of conference participants |
US20080152197A1 (en) * | 2006-12-22 | 2008-06-26 | Yukihiro Kawada | Information processing apparatus and information processing method |
US20100005393A1 (en) * | 2007-01-22 | 2010-01-07 | Sony Corporation | Information processing apparatus, information processing method, and program |
US20130293739A1 (en) * | 2007-03-15 | 2013-11-07 | Sony Corporation | Information processing apparatus, imaging apparatus, image display control method and computer program |
US20140156364A1 (en) * | 2007-03-22 | 2014-06-05 | Sony Computer Entertainment America Llc | Scheme for determining the locations and timing of advertisements and other insertions in media |
US20080240563A1 (en) * | 2007-03-30 | 2008-10-02 | Casio Computer Co., Ltd. | Image pickup apparatus equipped with face-recognition function |
US20120218375A1 (en) * | 2007-08-08 | 2012-08-30 | Qnx Software Systems Limited | Video phone system |
US20090079813A1 (en) * | 2007-09-24 | 2009-03-26 | Gesturetek, Inc. | Enhanced Interface for Voice and Video Communications |
US8788589B2 (en) * | 2007-10-12 | 2014-07-22 | Watchitoo, Inc. | System and method for coordinating simultaneous edits of shared digital data |
US8438598B2 (en) * | 2007-11-16 | 2013-05-07 | Sony Corporation | Information processing apparatus, information processing method, program, and information sharing system |
US20090175509A1 (en) * | 2008-01-03 | 2009-07-09 | Apple Inc. | Personal computing device control using face detection and recognition |
US20090190835A1 (en) * | 2008-01-29 | 2009-07-30 | Samsung Electronics Co., Ltd. | Method for capturing image to add enlarged image of specific area to captured image, and imaging apparatus applying the same |
US20090210491A1 (en) * | 2008-02-20 | 2009-08-20 | Microsoft Corporation | Techniques to automatically identify participants for a multimedia conference event |
US20090327418A1 (en) * | 2008-06-27 | 2009-12-31 | Microsoft Corporation | Participant positioning in multimedia conferencing |
US20100007796A1 (en) * | 2008-07-11 | 2010-01-14 | Fujifilm Corporation | Contents display device, contents display method, and computer readable medium for storing contents display program |
US20100064334A1 (en) * | 2008-09-05 | 2010-03-11 | Skype Limited | Communication system and method |
US20100079675A1 (en) * | 2008-09-30 | 2010-04-01 | Canon Kabushiki Kaisha | Video displaying apparatus, video displaying system and video displaying method |
US20100171807A1 (en) * | 2008-10-08 | 2010-07-08 | Tandberg Telecom As | System and associated methodology for multi-layered site video conferencing |
US20140229866A1 (en) * | 2008-11-24 | 2014-08-14 | Shindig, Inc. | Systems and methods for grouping participants of multi-user events |
US20100149305A1 (en) * | 2008-12-15 | 2010-06-17 | Tandberg Telecom As | Device and method for automatic participant identification in a recorded multimedia stream |
US20100225815A1 (en) * | 2009-03-05 | 2010-09-09 | Vishal Vincent Khatri | Systems methods and apparatuses for rendering user customizable multimedia signals on a display device |
US20100238262A1 (en) * | 2009-03-23 | 2010-09-23 | Kurtz Andrew F | Automated videography systems |
US20100271457A1 (en) * | 2009-04-23 | 2010-10-28 | Optical Fusion Inc. | Advanced Video Conference |
US20110044444A1 (en) * | 2009-08-21 | 2011-02-24 | Avaya Inc. | Multiple user identity and bridge appearance |
US20110050842A1 (en) * | 2009-08-27 | 2011-03-03 | Polycom, Inc. | Distance learning via instructor immersion into remote classroom |
US20110096137A1 (en) * | 2009-10-27 | 2011-04-28 | Mary Baker | Audiovisual Feedback To Users Of Video Conferencing Applications |
US8350891B2 (en) * | 2009-11-16 | 2013-01-08 | Lifesize Communications, Inc. | Determining a videoconference layout based on numbers of participants |
US20110116685A1 (en) * | 2009-11-16 | 2011-05-19 | Sony Corporation | Information processing apparatus, setting changing method, and setting changing program |
US20110115877A1 (en) * | 2009-11-17 | 2011-05-19 | Kang Sung Suk | Method for user authentication, and video communication apparatus and display apparatus thereof |
US20120167001A1 (en) * | 2009-12-31 | 2012-06-28 | FlickIntel, LLC | Method, system and computer program product for obtaining and displaying supplemental data about a displayed movie, show, event or video game |
US8483428B1 (en) * | 2010-01-06 | 2013-07-09 | Kimberly Lynn Anderson | Apparatus for processing a digital image |
US20110181683A1 (en) * | 2010-01-25 | 2011-07-28 | Nam Sangwu | Video communication method and digital television using the same |
US20110271197A1 (en) * | 2010-04-30 | 2011-11-03 | American Teleconferncing Services Ltd. | Distributing Information Between Participants in a Conference via a Conference User Interface |
US20110271210A1 (en) * | 2010-04-30 | 2011-11-03 | American Teleconferncing Services Ltd. | Conferencing Application Store |
US20110292232A1 (en) * | 2010-06-01 | 2011-12-01 | Tong Zhang | Image retrieval |
US20120027256A1 (en) * | 2010-07-27 | 2012-02-02 | Google Inc. | Automatic Media Sharing Via Shutter Click |
US20120038742A1 (en) * | 2010-08-15 | 2012-02-16 | Robinson Ian N | System And Method For Enabling Collaboration In A Video Conferencing System |
US20120057794A1 (en) * | 2010-09-06 | 2012-03-08 | Shingo Tsurumi | Image processing device, program, and image processing method |
US20120082339A1 (en) * | 2010-09-30 | 2012-04-05 | Sony Corporation | Information processing apparatus and information processing method |
US20130182914A1 (en) * | 2010-10-07 | 2013-07-18 | Sony Corporation | Information processing device and information processing method |
US20130177219A1 (en) * | 2010-10-28 | 2013-07-11 | Telefonaktiebolaget L M Ericsson (Publ) | Face Data Acquirer, End User Video Conference Device, Server, Method, Computer Program And Computer Program Product For Extracting Face Data |
US20120128255A1 (en) * | 2010-11-22 | 2012-05-24 | Sony Corporation | Part detection apparatus, part detection method, and program |
US20120313851A1 (en) * | 2011-06-13 | 2012-12-13 | Sony Corporation | Information processing apparatus and program |
US20120327172A1 (en) * | 2011-06-22 | 2012-12-27 | Microsoft Corporation | Modifying video regions using mobile device input |
US20130002806A1 (en) * | 2011-06-24 | 2013-01-03 | At&T Intellectual Property I, Lp | Apparatus and method for managing telepresence sessions |
US20130231185A1 (en) * | 2011-08-29 | 2013-09-05 | Bally Gaming, Inc. | Method and apparatus for audio scaling at a display showing content in different areas |
US20130070973A1 (en) * | 2011-09-15 | 2013-03-21 | Hiroo SAITO | Face recognizing apparatus and face recognizing method |
US20130201105A1 (en) * | 2012-02-02 | 2013-08-08 | Raymond William Ptucha | Method for controlling interactive display system |
US20130236069A1 (en) * | 2012-03-07 | 2013-09-12 | Altek Corporation | Face Recognition System and Face Recognition Method Thereof |
US20140055429A1 (en) * | 2012-08-23 | 2014-02-27 | Samsung Electronics Co., Ltd. | Flexible display apparatus and controlling method thereof |
US20130100238A1 (en) * | 2012-12-11 | 2013-04-25 | Vidtel, Inc. | Call routing based on facial recognition |
US20160277712A1 (en) * | 2013-10-24 | 2016-09-22 | Telefonaktiebolaget L M Ericsson (Publ) | Arrangements and Method Thereof for Video Retargeting for Video Conferencing |
US20160371815A1 (en) * | 2015-06-17 | 2016-12-22 | Samsung Electronics Co., Ltd. | Electronic device for displaying a plurality of images and method for processing an image |
Also Published As
Publication number | Publication date |
---|---|
EP2437490A3 (en) | 2013-10-16 |
EP2437490A2 (en) | 2012-04-04 |
JP2012095258A (en) | 2012-05-17 |
EP2437490B1 (en) | 2018-05-30 |
CN102446065A (en) | 2012-05-09 |
CN102446065B (en) | 2017-10-10 |
US20120082339A1 (en) | 2012-04-05 |
US8953860B2 (en) | 2015-02-10 |
JP5740972B2 (en) | 2015-07-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8953860B2 (en) | Information processing apparatus and information processing method | |
US9674488B2 (en) | Information processing device and information processing method | |
US9704028B2 (en) | Image processing apparatus and program | |
JP5625643B2 (en) | Information processing apparatus and information processing method | |
EP2453384A1 (en) | Method and apparatus for performing gesture recognition using object in multimedia device | |
US9426270B2 (en) | Control apparatus and control method to control volume of sound | |
KR102208893B1 (en) | Display apparatus and channel map manage method thereof | |
US9805390B2 (en) | Display control apparatus, display control method, and program | |
US20160343158A1 (en) | Effect control device, effect control method, and program | |
US20160171308A1 (en) | Electronic device and image processing method | |
US9420218B2 (en) | Television system | |
JP2014209778A (en) | Information processing device and information processing method | |
CN104602101A (en) | Television system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: SONY CORPORATION, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SAKAI, YUSUKE;KONDO, MASAO;REEL/FRAME:034575/0319 Effective date: 20110822 |
|
AS | Assignment |
Owner name: SATURN LICENSING LLC, NEW YORK Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SONY CORPORATION;REEL/FRAME:041455/0195 Effective date: 20150911 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |