US20050001024A1 - Electronic apparatus, electronic camera, electronic device, image display apparatus, and image transmission system - Google Patents

Electronic apparatus, electronic camera, electronic device, image display apparatus, and image transmission system

Info

Publication number
US20050001024A1
US20050001024A1 (application US10/497,146)
Authority
US
United States
Prior art keywords
user
electronic apparatus
electronic
wireless communication
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US10/497,146
Other versions
US7382405B2
Inventor
Yosuke Kusaka
Shoei Nakamura
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nikon Corp
Original Assignee
Nikon Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2001368014A (JP4114348B2)
Priority claimed from JP2001373831A (JP4158376B2)
Priority claimed from JP2001375568A (JP4114350B2)
Application filed by Nikon Corp
Assigned to NIKON CORPORATION. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: NAKAMURA, SHOEI; KUSAKA, YOSUKE
Publication of US20050001024A1
Priority to US12/149,231 (US7864218B2)
Application granted
Publication of US7382405B2
Priority to US12/926,564 (US8482634B2)
Priority to US13/909,283 (US8804006B2)
Priority to US14/327,903 (US20140340534A1)
Priority to US14/808,549 (US20150370360A1)
Priority to US15/131,395 (US9894220B2)
Priority to US15/160,379 (US9578186B2)
Priority to US15/213,845 (US10015403B2)
Priority to US15/342,266 (US9838550B2)
Priority to US15/802,033 (US20180069968A1)
Priority to US15/859,966 (US20180124259A1)
Priority to US16/270,870 (US20190174066A1)
Legal status: Active
Adjusted expiration

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/00127 Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture
    • H04N1/00281 Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture with a telecommunication apparatus, e.g. a switched network of teleprinters for the distribution of text-based information, a selective call terminal
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/002 Specific input/output arrangements not covered by G06F3/01 - G06F3/16
    • G06F3/005 Input arrangements through a video camera
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00 Burglar, theft or intruder alarms
    • G08B13/18 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19654 Details concerning communication with a camera
    • G08B13/19656 Network used to communicate with a camera, e.g. WAN, LAN, Internet
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/00095 Systems or arrangements for the transmission of the picture signal
    • H04N1/00103 Systems or arrangements for the transmission of the picture signal specially adapted for radio transmission, e.g. via satellites
    • H04N1/00106 Systems or arrangements for the transmission of the picture signal specially adapted for radio transmission, e.g. via satellites using land mobile radio networks, e.g. mobile telephone
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/00095 Systems or arrangements for the transmission of the picture signal
    • H04N1/00114 Systems or arrangements for the transmission of the picture signal with transmission of additional information signals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/00127 Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture
    • H04N1/00129 Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture with a display device, e.g. CRT or LCD monitor
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/00127 Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture
    • H04N1/00132 Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture in a digital photofinishing system, i.e. a system where digital photographic images undergo typical photofinishing processing, e.g. printing ordering
    • H04N1/00148 Storage
    • H04N1/00151 Storage with selective access
    • H04N1/00153 Storage with selective access for sharing images with a selected individual or correspondent
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/00127 Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture
    • H04N1/00249 Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture with a photographic apparatus, e.g. a photographic printer or a projector
    • H04N1/00251 Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture with a photographic apparatus, e.g. a photographic printer or a projector with an apparatus for taking photographic images, e.g. a camera
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/00962 Input arrangements for operating instructions or parameters, e.g. updating internal software
    • H04N1/0097 Storage of instructions or parameters, e.g. customised instructions or different parameters for different user IDs
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/21 Intermediate information storage
    • H04N1/2104 Intermediate information storage for one or a few pictures
    • H04N1/2112 Intermediate information storage for one or a few pictures using still video cameras
    • H04N1/2125 Display of information relating to the still picture recording
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/32 Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N1/32101 Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/56 Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/57 Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/61 Control of cameras or camera modules based on recognised objects
    • H04N23/611 Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/62 Control of parameters via user interfaces
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/64 Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/66 Remote control of cameras or camera parts, e.g. by remote control devices
    • H04N23/661 Transmitting camera control signals through networks, e.g. control via the Internet
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/667 Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/44 Receiver circuitry for the reception of television signals according to analogue transmission standards
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048 Indexing scheme relating to G06F3/048
    • G06F2203/04808 Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2218/00 Aspects of pattern recognition specially adapted for signal processing
    • G06F2218/12 Classification; Matching
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2201/00 Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N2201/0077 Types of the still picture apparatus
    • H04N2201/0084 Digital still camera

Definitions

  • the present invention relates to an electronic apparatus that identifies a user and performs an operation corresponding to the user and a method for identifying a user of an electronic apparatus and identifying an electronic apparatus to be utilized by the user and, more specifically, it relates to an electronic apparatus that identifies a user through communication executed between the electronic apparatus and the user and a method for identifying an electronic apparatus user and an electronic apparatus to be utilized by a user by executing communication between the electronic apparatus and the user.
  • the present invention also relates to an electronic camera, an image display apparatus and an image display system that stores electronic image data obtained by photographing a desired subject with the electronic camera into a storage medium, reads out the electronic image data from the storage medium and displays an image reproduced from the electronic image data at a specific screen and, more specifically, it relates to an electronic camera, an image display apparatus and an image display system that brings up a display of information related to the image data at the screen at which the image reproduced from the image data is displayed.
  • the present invention relates to an information transmission system and an information transmission method through which data are exchanged among portable terminals or the like and, more specifically, it relates to an image transmission system and an image transmission method through which electronic image data obtained by photographing a desired subject with a photographing apparatus such as an electronic camera are transmitted to an electronic instrument such as a portable telephone terminal having an image display function.
  • Japanese Laid-Open Patent Publication No. H 11-164282 discloses a digital camera having a card reader unit, which allows a user to enter personal information into the digital camera through the card reader unit, automatically connects with a data server corresponding to the user based upon the entered personal information and automatically transfers image data obtained through a photographing operation performed with the digital camera to the data server.
  • the electronic apparatus in the related art described above requires a magnetic card or a memory card having stored therein the personal information to be directly loaded at the electronic apparatus or necessitates a wired connection of the magnetic card or the memory card to the electronic apparatus so that the personal information can be read into the electronic apparatus.
  • the personal information is utilized for limited purposes such as the automatic selection of the recipient to which images are to be transmitted, and setting and display operations are performed in the electronic apparatus without fully reflecting the personal information.
  • the user may not always be satisfied with the ease of operation it affords.
  • the design of the electronic apparatus, in particular, gives little consideration to users such as first-time users, seniors and children, who may not be accustomed to operating electronic apparatuses.
  • the user reviewing electronic image data in the related art as described above is simply allowed to view the reproduced images on display, and there is difficulty in expanding the range of information utilization beyond reviewing of the images.
  • image data obtained through a photographing operation of an electronic camera are transferred to another image display terminal for reviewing the image data through an off-line information transmission method, in which the image data are temporarily saved into a memory card and then the memory card is loaded at the other image display terminal or through an on-line information transmission method, in which the image data are transmitted through a wireless or wired communication from the electronic camera to the other image display terminal, in the related art.
  • Japanese Laid-Open Patent Publication No. H 10-341388 discloses an image transmission system which enables a direct transmission of image data from an electronic camera to another electronic camera through optical communication achieved by using infrared light.
  • the present invention provides, preferably, an electronic apparatus that has a user identification function for performing identification of the user of the electronic apparatus through a wireless method so as to speed up the user identification without placing an onus on the user and for effectively preventing an erroneous identification that would identify a party other than the user when executing the user identification through the wireless method and a user identification method that may be adopted in the electronic apparatus.
  • the present invention also provides, preferably, an electronic apparatus identification method in which identification of an electronic apparatus to be utilized by a user is executed through a wireless method so as to speed up the identification without placing an onus on the user, and an erroneous identification that would identify another electronic apparatus different from the intended electronic apparatus as the electronic apparatus to be utilized is effectively prevented when electronic apparatus identification is executed through the wireless method.
  • the present invention further provides, preferably, an electronic apparatus that has a user identification function that allows beginners/inexperienced users, elderly persons and children as well as experienced users to operate the same electronic apparatus with ease, without requiring any special data input or registration setting to enable the electronic apparatus to execute a user identification.
  • the present invention further provides, preferably, an electronic camera, an image display apparatus and an image display method that enable an easy and prompt review of information related to a subject photographed in an image, which interests the user reviewing a reproduced display of electronic image data in linkage with the reproduced display of the electronic image data.
  • the present invention also provides, preferably, an image transmission system and an image transmission method that allow images to be transmitted to an image recipient side in a short time without placing an onus on the image sender side during an image transmission. More specifically, it provides an image transmission system and an image transmission method that allow image data obtained through a photographing operation executed in an electronic camera to be promptly transmitted to a partner-side electronic instrument without placing a great onus on the electronic camera. Furthermore, the present invention provides an image transmission system and an image transmission method that enable distribution of image data obtained through a photographing operation executed in an electronic camera to portable terminals carried by a plurality of parties who are subjects without placing a great onus on the electronic camera.
  • the electronic apparatus automatically identifies the user of the electronic apparatus through wireless communication executed between the electronic apparatus and an electronic instrument having stored therein user personal information. In addition, if the electronic apparatus identifies a plurality of possible users, the electronic apparatus automatically selects the correct user based upon specific user identification processing.
  • an electronic instrument carried by a user automatically identifies an utilization-object, or target, electronic apparatus through wireless communication executed between the electronic apparatus and the electronic instrument.
  • if the electronic instrument identifies a plurality of possible utilization-object electronic apparatuses, it automatically selects the correct electronic apparatus based upon specific identification processing.
  • the electronic apparatus engages in wireless communication with an electronic instrument having stored therein personal information that is available for general purposes as well as for use in the electronic apparatus, so as to have the personal information transferred to the electronic apparatus.
  • the user is categorized based upon the personal information available for general purposes and various operational settings are automatically selected for the electronic apparatus based upon the categorization results.
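A minimal sketch (not taken from the patent) of how general-purpose personal information might be categorized and mapped to operational settings, as described above; every field and setting name here is an illustrative assumption.

```python
from datetime import date

def select_camera_settings(personal_info: dict, today: date = date.today()) -> dict:
    """Illustrative mapping from general-purpose personal information
    (date of birth, preferred language, hand preference, ...) to camera
    settings. Field and setting names are assumptions for this sketch."""
    birth = personal_info.get("date_of_birth")            # e.g. date(1950, 4, 1)
    age = (today - birth).days // 365 if birth else None
    settings = {
        "menu_language": personal_info.get("preferred_language", "en"),
        "shutter_side": "left" if personal_info.get("hand_preference") == "left" else "right",
    }
    # Categorize the user: a simplified interface for children and seniors.
    if age is not None and (age < 10 or age >= 70):
        settings["menu_level"] = "simple"          # larger text, fewer items
        settings["exposure_mode"] = "full_auto"
    else:
        settings["menu_level"] = "standard"
    return settings

# Example: information transferred from the user's portable telephone (UIM card).
print(select_camera_settings({"preferred_language": "ja",
                              "date_of_birth": date(1948, 5, 2),
                              "hand_preference": "right"}))
```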
  • information related to a subject which is made to correlate to the position of the subject in the image plane is appended to the electronic image data and stored in memory and the information related to the subject is displayed in correspondence to the subject position within the image plane in a reproduced display of the electronic image data.
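One way such subject-related information could be correlated to positions in the image plane is sketched below; the record layout and field names are assumptions, not the patented format.

```python
# Hypothetical record appended to an image file: information about each
# photographed subject, correlated with the subject's position in the image
# plane so it can be displayed at that position on reproduction.
photo_info = {
    "image_file": "DSC0001.JPG",                  # illustrative file name
    "subjects": [
        {"name": "Subject A", "position": (0.32, 0.55),   # normalized (x, y)
         "info": {"mail_address": "a@example.com"}},
        {"name": "Subject B", "position": (0.71, 0.48),
         "info": {"mail_address": "b@example.com"}},
    ],
}

def info_at(tap_xy, record, radius=0.1):
    """Return subject information recorded near a tapped display position."""
    return [s for s in record["subjects"]
            if abs(s["position"][0] - tap_xy[0]) <= radius
            and abs(s["position"][1] - tap_xy[1]) <= radius]

print(info_at((0.30, 0.52), photo_info))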
  • an image generating apparatus (a photographing apparatus, or an electronic camera) has a function which allows it to be connected to the Internet, information (or an image) that has been generated and appended with identification data (or a file name), is first transmitted to a specific information server (or image server) on the Internet where it is temporarily saved (or uploaded), a direct connection is established with an electronic instrument which is the recipient of the information (image) through a short-distance communication, information indicating the address of the Information server (or image server) on the Internet and the identification data corresponding to the generated information (image) are transmitted to the electronic instrument and the electronic instrument utilizes the information (image) by reading it out (or downloading) from the Information server (or image server) on the Internet based upon the address information and the identification data having been received.
  • an image generating apparatus achieves a direct connection with an electronic instrument which is the recipient of information (or image) through short-distance communication, receives information indicating the address of the electronic instrument on the Internet from the electronic instrument on the partner side and transmits (or uploads), on a temporary basis, the address information of the electronic instrument together with the information (or image) which has been generated to a specific information server (or image server) on the Internet, and the information server (or image server), in turn, automatically transmits the information (or image) which has been received to the electronic instrument corresponding to the received address information.
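A rough sketch of the first relay flow described above (upload to an image server, then hand the server address and identification data to the recipient over the short-distance link). The in-memory dictionary stands in for the Internet image server, and the server address is a hypothetical placeholder.

```python
# In-memory stand-in for the image server on the Internet (illustrative only).
IMAGE_SERVER = {}

def camera_upload(image_bytes: bytes, file_id: str) -> dict:
    """Step 1: the camera uploads the image to the image server, then hands
    this small 'notice' (server address plus identification data) to the
    recipient over the short-distance link."""
    IMAGE_SERVER[file_id] = image_bytes
    return {"server": "http://image-server.example/", "file_id": file_id}  # hypothetical address

def recipient_download(notice: dict) -> bytes:
    """Step 2: the recipient's instrument later fetches the image from the
    server using the received address information and identification data."""
    return IMAGE_SERVER[notice["file_id"]]

notice = camera_upload(b"...JPEG data...", "DSC0001.JPG")
assert recipient_download(notice) == b"...JPEG data..."
```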
  • FIG. 1 is a conceptual diagram of the system configuration adopted in a first embodiment of the present invention
  • FIG. 2 is an external view (a front view) of the electronic camera achieved in the first embodiment of the present invention
  • FIG. 3 is an external view (a rear view) of the electronic camera achieved in the first embodiment of the present invention.
  • FIG. 4 is a block diagram of the electrical structure adopted in the electronic camera in the first embodiment
  • FIG. 5 shows the structure of the user registration information
  • FIG. 6 shows the structure of the personal information data
  • FIG. 7 is a transition diagram of the state assumed in the electronic camera according to the present invention.
  • FIG. 8 presents a correspondence list relevant to the user identification operation
  • FIG. 9 presents a flowchart of the main routine executed at the CPU
  • FIG. 10 presents a flowchart of a subroutine
  • FIG. 11 presents a flowchart of a subroutine
  • FIG. 12 presents a flowchart of a subroutine
  • FIG. 13 presents an example of screen display
  • FIG. 14 presents a flowchart of a subroutine
  • FIG. 15 presents a flowchart of a subroutine
  • FIG. 16 presents an example of screen display
  • FIG. 17 presents a flowchart of a subroutine
  • FIG. 18 presents a flowchart of a subroutine
  • FIG. 19 presents an example of screen display
  • FIG. 20 presents a flowchart of a subroutine
  • FIG. 21 presents a flowchart of a subroutine
  • FIG. 22 presents an example of screen display
  • FIG. 23 presents an example of screen display
  • FIG. 24 presents an example of screen display
  • FIG. 25 presents an example of screen display
  • FIG. 26 presents a flowchart of a subroutine
  • FIG. 27 presents an example of screen display
  • FIG. 28 presents an example of screen display
  • FIG. 29 presents an example of screen display
  • FIG. 30 presents an example of screen display
  • FIG. 31 presents a flowchart of a subroutine
  • FIG. 32 presents a flowchart of a subroutine
  • FIG. 33 presents a flowchart of a subroutine
  • FIG. 34 presents a flowchart of a subroutine
  • FIG. 35 illustrates the structure of the wireless circuit
  • FIG. 36 illustrates the communication directivity
  • FIG. 37 illustrates the positional detection
  • FIG. 38 presents a flowchart of a subroutine
  • FIG. 39 presents a flowchart of a subroutine
  • FIG. 40 presents an example of screen display
  • FIG. 41 presents a flowchart of a subroutine
  • FIG. 42 presents a flowchart of a subroutine
  • FIG. 43 presents a flowchart of a subroutine
  • FIG. 44 presents an example of screen display
  • FIG. 45 illustrates the diopter adjustment
  • FIG. 46 presents a flowchart of a subroutine
  • FIG. 47 illustrates communication achieved through a user internal path
  • FIG. 48 presents a flowchart of a subroutine
  • FIG. 49 is a conceptual diagram of an example of a variation of the system configuration
  • FIG. 50 shows an example of a variation of the communication sequence
  • FIG. 51 illustrates a communication mode
  • FIG. 52 presents an external view of a portable telephone
  • FIG. 53 presents an example of screen display
  • FIG. 54 presents an example of screen display
  • FIG. 55 presents an example of screen display
  • FIG. 56 illustrates the concept adopted in a second embodiment of the present invention
  • FIG. 57 illustrates the system configuration adopted in the second embodiment of the present invention.
  • FIG. 58 is an external view (a front view) of the electronic camera achieved in the second embodiment of the present invention.
  • FIG. 59 is an external view (a rear view) of the electronic camera achieved in the second embodiment of the present invention.
  • FIG. 60 is a block diagram of the electrical structure adopted in the electronic camera in the second embodiment.
  • FIG. 61 shows the structure of the data in the memory
  • FIG. 62 shows the structure of the photograph information data
  • FIG. 63 shows the structure of the general information data
  • FIG. 64 shows the structure of the personal information data
  • FIG. 65 is a transition diagram of the state assumed in an electronic camera according to the present invention.
  • FIG. 66 presents a flowchart of the main program
  • FIG. 67 presents a flowchart of a subroutine
  • FIG. 68 presents an example of screen display
  • FIG. 69 presents a flowchart of a subroutine
  • FIG. 70 presents a flowchart of a subroutine
  • FIG. 71 illustrates the communication sequence
  • FIG. 72 illustrates the position information detection
  • FIG. 73 illustrates the position information detection
  • FIG. 74 illustrates the position information detection
  • FIG. 75 presents a flowchart of a subroutine
  • FIG. 76 presents a flowchart of a subroutine
  • FIG. 77 presents an example of screen display
  • FIG. 78 presents a flowchart of a subroutine
  • FIG. 79 presents an example of screen display
  • FIG. 80 presents a flowchart of a subroutine
  • FIG. 81 presents an example of screen display
  • FIG. 82 presents an example of screen display
  • FIG. 83 presents an example of screen display
  • FIG. 84 presents an example of screen display
  • FIG. 85 presents an example of screen display
  • FIG. 86 illustrates the structure of the wireless circuit
  • FIG. 87 illustrates the communication directivity
  • FIG. 88 presents an external view of a portable telephone
  • FIG. 89 presents an example of screen display
  • FIG. 90 illustrates the image-plane position information detection
  • FIG. 91 illustrates the communication directivity
  • FIG. 92 presents a flowchart of a subroutine
  • FIG. 93 presents an example of screen display
  • FIG. 94 presents an example of screen display
  • FIG. 95 illustrates how a subject is specified
  • FIG. 96 presents a flowchart of a subroutine
  • FIG. 97 illustrates an example of a variation of the system configuration
  • FIG. 98 illustrates the concept adopted in a third embodiment of the present invention.
  • FIG. 99 illustrates the system configuration adopted in the third embodiment of the present invention.
  • FIG. 100 is an external view (front view) of the electronic camera achieved in the third embodiment of the present invention.
  • FIG. 101 is an external view (rear view) of the electronic camera achieved in the third embodiment of the present invention.
  • FIG. 102 presents an external view (front view) of a portable telephone terminal
  • FIG. 103 is a block diagram of the electrical structure adopted in the electronic camera.
  • FIG. 104 is a block diagram of the electrical structure adopted at the portable telephone terminal.
  • FIG. 105 presents a list of upload dial numbers and the corresponding image servers
  • FIG. 106 shows the structure of the data in the memory card
  • FIG. 107 shows the structure of the photograph information data
  • FIG. 108 presents a transition diagram of the operating state of the electronic camera
  • FIG. 109 presents a flowchart of the main program
  • FIG. 110 presents a flowchart of a subroutine
  • FIG. 111 presents an example of screen display
  • FIG. 112 presents a flowchart of a subroutine
  • FIG. 113 illustrates the communication sequence
  • FIG. 114 presents a flowchart of a subroutine
  • FIG. 115 presents a flowchart of a subroutine
  • FIG. 116 presents an example of screen display
  • FIG. 117 presents a flowchart of a subroutine
  • FIG. 118 presents an example of screen display
  • FIG. 119 illustrates the communication sequence
  • FIG. 120 presents a flowchart of a subroutine
  • FIG. 121 presents an example of screen display
  • FIG. 122 illustrates the communication sequence
  • FIG. 123 illustrates the concept adopted in an embodiment of the present invention
  • FIG. 124 presents a flowchart of a subroutine
  • FIG. 125 presents an example of screen display
  • FIG. 126 illustrates the structure of the wireless circuit
  • FIG. 127 illustrates the communication directivity
  • FIG. 128 illustrates the concept adopted in an embodiment of the present invention.
  • FIG. 1 is a conceptual diagram of an electronic camera adopting the present invention and an electronic image communication system achieved by utilizing the electronic camera.
  • an electronic camera 100 that includes a memory card saves electronic image data or digital image data (hereafter referred to as image data) obtained through a photographing operation into the memory card.
  • the electronic camera 100 has a short-distance wireless communication function (e.g., Bluetooth (registered trademark), enabling communication within a range of approximately 10 m) and engages in communication with a portable telephone 160 that also has a short-distance wireless communication function. It is assumed that this portable telephone 160 is carried by a user of the electronic camera 100 and that a UIM card (user identification module card) 170 having stored therein user personal information (or user information) can be loaded at the portable telephone 160.
  • the user personal information is transferred to the electronic camera 100 and the electronic camera 100 , in turn, sets itself up for the specific user in conformance to the transferred personal information.
  • image data obtained through a photographing operation at the electronic camera 100 are transferred to the portable telephone 160 from the electronic camera 100, and then the image data are transferred to a personal computer 140 for personal use or an image server 150, first through a wireless portable telephone line via a wireless base station 120 and then via a wired or wireless public telephone line or the Internet 130.
  • FIGS. 2 and 3 present external views (a front view and a rear view) of an embodiment of the electronic camera 100 in FIG. 1 .
  • a photographic lens 10 which forms a subject image
  • a viewfinder 11 used to check the photographic image plane
  • a strobe 12 used to illuminate the subject during a photographing operation
  • a photometering circuit 13 that detects the brightness of the subject
  • a grip portion 14 that projects out from the camera main body to allow the electronic camera 100 to be held by hand
  • a shutter release button 16 operated to issue an instruction for a photographing start and a power switch 17 (a momentary switch which is alternately set to ON or OFF each time it is operated) through which the on/off state of the power to the electronic camera 100 is controlled are provided at the upper surface.
  • an eyepiece unit of the viewfinder 11, a MODE dial 19 that can be rotated to select a user identification mode (a registration mode selected for user registration, an automatic mode selected for automatic user identification or a fixed mode in which the user setting is fixed), a dot 20 used to indicate the setting position of the MODE dial 19, a left LCD (left screen) 21 having a substantially quadrangular screen for displaying text and images and a right LCD (right screen) 22 having a substantially quadrangular screen for displaying text and images are provided.
  • an up button 23 and a down button 24 operated to switch the image displayed at the left screen 21 are provided, whereas under the right screen 22 and the left screen 21 , a SET button 25 operated to set various operations of the electronic camera 100 , a SEND button 26 operated to issue an instruction for an image data transmission to the outside, a CONFIRM button 27 used to finalize user selections and setting items and an INITIALIZE button 29 operated to initialize the various operational settings in the electronic camera 100 are provided.
  • a PHOTOGRAPH/DISPLAY button 28 operated to select a reproduction mode in which image data saved in a memory card 104 are displayed at the left screen 21 or a photographing mode in which image data are obtained through photographing (the photographing mode or the reproduction mode is alternately selected each time the button is operated) is provided.
  • a memory card slot 30 in which the memory card 104 can be loaded is provided.
  • the shutter release button 16, the MODE dial 19, the up button 23, the down button 24, the SET button 25, the SEND button 26, the CONFIRM button 27, the PHOTOGRAPH/DISPLAY button 28 and the INITIALIZE button 29 are all operating keys operated by the user.
  • touch tablets 66 having a function of outputting position data corresponding to a position indicated through a finger contact operation are provided, and they can be used to make a selection from selection items displayed on the screens or to select specific image data.
  • the touch tablets 66 are each constituted of a transparent material such as a glass resin so that the user can observe images and text formed inside the touch tablets 66 through the touch tablets 66 .
  • In FIG. 4, which presents an example of an internal electrical structure that may be assumed in the electronic camera 100 shown in FIGS. 2 and 3, various components are connected with one another via a data/control bus 51 through which various types of information data and control data are transmitted.
  • the components are divided into the following five primary blocks, i.e., 1) a block, the core of which is a photographing control circuit 60 that executes an image data photographing operation, 2) a block, the core of which is constituted by a wireless transmission circuit 71 and a wireless reception circuit 72 that execute transmission/reception of image data to and from the outside, 3) a block constituted of the memory card 104 in which image data are stored and saved, 4) a block, the core of which is a screen control circuit 92 that executes display of image data and information related to the image data and 5) a block, the core of which is constituted of user interfaces such as operating keys 65 and a CPU 50 that implements integrated control on various control circuits.
  • the CPU 50 which is a means for implementing overall control on the electronic camera 100 , issues various instructions for the photographing control circuit 60 , the wireless transmission circuit 71 , the wireless reception circuit 72 , the screen control circuit 92 and a power control circuit 64 in conformance to information input from the operating keys 65 , the touch tablets 66 , various detection circuits 70 , the power switch 17 , a timer 74 , the photometering circuit 13 , a GPS circuit 61 and an attitude change detection circuit 62 .
  • the various detection circuits 70 include a grip sensor to be detailed later and the like.
  • the photometering circuit 13 measures the brightness of the subject and outputs photometric data indicating the results of the measurement to the CPU 50 .
  • the CPU 50 sets the length of the exposure time and the sensitivity of a CCD 55 through a CCD drive circuit 56 , and in addition, it controls the aperture value for an aperture 53 with an aperture control circuit 54 via the photographing control circuit 60 in conformance to the data indicating the settings.
  • the CPU 50 controls the photographing operation via the photographing control circuit 60 in response to an operation of the shutter release button 16.
  • the CPU 50 engages the strobe 12 in a light emission operation via a strobe drive circuit 73 during the photographing operation.
  • the GPS circuit 61 (global positioning system circuit) detects information indicating the position of the electronic camera 100 by using information provided by a plurality of satellites orbiting around the earth and provides the detected position information to the CPU 50 . When an image is photographed, the CPU 50 transfers this position information or processed position information (the name of the site, the geographic location or the like) together with the image data to the memory card 104 for storage.
  • the attitude-change detection circuit 62 which is constituted of an attitude sensor of the known art or the like, capable of detecting a change in the attitude of the electronic camera 100 , provides information indicating the extent of the attitude change having occurred between the immediately preceding photographing operation and the current photographing operation to the CPU 50 . Based upon the attitude change information, a decision can be made with regard to whether or not the photographic composition has been changed for the current photographing operation from that for the immediately preceding photographing operation. The CPU 50 transfers this attitude change information together with the image data to the memory card 104 for storage.
  • the timer 74 having an internal clock circuit provides time information corresponding to the current time point to the CPU 50 .
  • the CPU 50 transfers the time point information indicating a photographing operation time point together with the image data to the memory card 104 for storage.
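A brief sketch of the per-shot metadata that, as described above, could be transferred to the memory card together with the image data (GPS position, attitude-change information and the photographing time point). The field names are assumptions for illustration only.

```python
from datetime import datetime

def build_shot_metadata(gps_position, attitude_change, composition_changed: bool) -> dict:
    """Illustrative per-shot metadata stored with the image data: position
    from the GPS circuit, attitude-change information from the attitude-change
    detection circuit, and the photographing time point from the timer."""
    return {
        "position": gps_position,                 # e.g. (lat, lon) or a site name
        "attitude_change": attitude_change,       # extent of change since the last shot
        "composition_changed": composition_changed,
        "timestamp": datetime.now().isoformat(timespec="seconds"),
    }

print(build_shot_metadata((35.68, 139.77), attitude_change=2.5, composition_changed=True))
```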
  • the CPU 50 controls the various units in conformance to a control program stored in a ROM 67 (read-only memory).
  • the CPU 50 implements control on a power supply 63 via the power control circuit 64 by detecting the operating state of the power switch 17 .
  • the photographing control circuit 60 focuses or zooms the photographic lens 10 through a lens drive circuit 52 , controls the exposure quantity at the CCD 55 through control implemented on the aperture 53 by engaging the aperture control circuit 54 and controls the operation of the CCD 55 through the CCD drive circuit 56 .
  • the photographic lens 10 forms a subject image onto the CCD 55 with a light flux from the subject via the aperture 53 which adjusts the light quantity. This subject image is captured by the CCD 55 .
  • the CCD 55 charge-coupled device having a plurality of pixels is a charge storage type image sensor that captures a subject image, and outputs electrical image signals corresponding to the intensity of the subject image formed on the CCD 55 to an analog processing unit 57 in response to a drive pulse supplied by the CCD drive circuit 56 .
  • the analog processing unit 57 samples the image signals having undergone photoelectric conversion at the CCD 55 with predetermined timing and amplifies the sampled signals to a predetermined level.
  • An A/D conversion circuit 58 (analog digital conversion circuit) converts the image signals sampled at the analog processing unit 57 to digital data through digitization and the digital data are temporarily stored into a photographic buffer memory 59 .
  • the photographing control circuit 60 repeatedly executes the operation described above and the screen control circuit 92 repeatedly executes an operation in which the digital data sequentially stored into the photographic buffer memory 59 are read out via the data/control bus 51, the digital data thus read out are temporarily stored into a frame memory 69, the digital data are converted to display image data and are stored back into the frame memory 69 and the display image data are displayed at the left screen 21.
  • the screen control circuit 92 obtains text display information from the CPU 50 as necessary, converts it to display text data which are then stored into the frame memory 69 and displays the display text data at the left screen 21 and the right screen 22 . Since the image currently captured by the CCD 55 is displayed at the left screen 21 in real time in the photographing mode in this manner, the user is able to set the composition for the photographing operation by using this through screen (through image) as a monitor screen.
  • the photographing control circuit 60 detects the state of the focal adjustment at the photographic lens 10 by analyzing the degree of the high-frequency component of the digital data stored in the photographic buffer memory 59 and performs a focal adjustment for the photographic lens 10 with the lens drive circuit 52 based upon the results of the detection.
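The focal-adjustment detection above relies on the amount of high-frequency content in the digital data; a minimal sketch of such a contrast (high-frequency) measure is shown below. The function is an assumption for illustration, not the circuit's actual processing.

```python
def focus_measure(pixels):
    """Rough contrast measure over a 2-D list of pixel values: the sum of
    absolute differences between horizontally adjacent pixels. A lens drive
    that maximizes such a measure corresponds to the focal adjustment
    described above (sketch only)."""
    return sum(abs(row[i + 1] - row[i])
               for row in pixels
               for i in range(len(row) - 1))

sharp   = [[0, 255, 0, 255], [255, 0, 255, 0]]
blurred = [[120, 130, 125, 128], [128, 126, 130, 124]]
assert focus_measure(sharp) > focus_measure(blurred)
```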
  • the photographing control circuit 60 engages the CCD 55 to capture a subject image via the CCD drive circuit 56 and temporarily stores image signals generated through the image-capturing operation as digital data (raw data) into the photographic buffer memory 59 via the analog processing unit 57 and the A/D converter circuit 58 .
  • the photographing control circuit 60 then generates image data by converting or compressing the digital data stored in the photographic buffer memory 59 on a temporary basis to a predetermined recording format (such as JPEG) and stores the image data back into the photographic buffer memory 59 .
  • the photographing control circuit 60 stores the image data stored back into the photographic buffer memory 59 into the memory card 104 as an image file.
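The still-capture sequence just described (expose the CCD, digitize, convert/compress to the recording format, store on the memory card) can be outlined as below; the callables and names are placeholders, not the actual firmware interfaces.

```python
def capture_still(ccd, adc, compress, memory_card, file_name="DSC0001.JPG"):
    """Outline of the capture sequence, each stage passed in as a callable."""
    analog_signals = ccd()               # CCD drive circuit -> analog processing unit
    raw_data = adc(analog_signals)       # A/D conversion -> photographic buffer memory
    image_data = compress(raw_data)      # predetermined recording format such as JPEG
    memory_card[file_name] = image_data  # saved on the memory card as an image file
    return file_name

card = {}
capture_still(ccd=lambda: [0.1, 0.8, 0.5],
              adc=lambda sig: bytes(int(v * 255) for v in sig),
              compress=lambda raw: b"JPEG:" + raw,
              memory_card=card)
print(card)
```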
  • the screen control circuit 92 reads out an image file specified by the CPU 50 from the memory card 104 , temporarily stores the image-file into the frame memory 69 , converts the image data to display image data and stores the display image data back into the frame memory 69 and then displays the display image data at the left screen 21 .
  • In the reproduction mode, upon receiving a transmission instruction from the CPU 50, the wireless transmission circuit 71 reads out a specified image file from the memory card 104 and outputs this image file to the outside through wireless transmission.
  • the wireless transmission circuit 71 and the wireless reception circuit 72 engage in communication with an external wireless instrument during a user identification operation and provide the results of the communication to the CPU 50.
  • FIGS. 5 and 6 show the data structure of user registration information saved within the EEPROM 68 .
  • the user registration information is prepared for individual users (user A, user B, default user and most recent user in FIG. 5 ).
  • the default user is a general-purpose user set for use in case a user identification cannot be performed successfully, and the most recent user is the user who last operated the electronic camera 100.
  • Each set of individual user information includes identification data (individual identification data used to identify the instrument at the communication partner), personal information data (see FIG. 6 ), camera data and priority order data.
  • the camera data are constituted of camera information, communication information, custom setting data, last operating time point data and last setting data.
  • the personal information data are general personal-information data having no relevance to the settings at the electronic camera 100 , which can be utilized for multiple purposes. For this reason, it is not necessary to specially set these personal information data in order to operate the electronic camera 100 .
  • the personal information data include the name, the date of birth and the preferred language of the individual, physical data (visual acuity, diopter (eyesight) and hand preference), tastes (favorite color, etc.) and other data, and the electronic camera 100 can be customized by using the personal information data.
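A rough sketch of the per-user registration record and general-purpose personal information described above (FIGS. 5 and 6), expressed as data classes; the field names are illustrative assumptions rather than the stored format.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class PersonalInfo:
    """General-purpose personal information (cf. FIG. 6); none of it is
    camera-specific, so no special registration input is needed."""
    name: str
    date_of_birth: Optional[str] = None
    preferred_language: str = "en"
    visual_acuity: Optional[float] = None
    hand_preference: str = "right"
    favorite_color: Optional[str] = None

@dataclass
class UserRegistration:
    """Per-user record held in the EEPROM (cf. FIG. 5); names are assumptions."""
    identification_data: str                         # identifies the partner instrument
    personal_info: PersonalInfo
    camera_data: dict = field(default_factory=dict)  # camera/communication info, custom
                                                     # settings, last operating time point,
                                                     # last settings
    priority_rank: int = 0

user_a = UserRegistration("BT:00:11:22:33:44:55",
                          PersonalInfo("User A", preferred_language="ja"),
                          priority_rank=1)
```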
  • FIG. 7 is a transition diagram of the state assumed in the electronic camera in the embodiment of the present invention.
  • wireless user identification processing is executed.
  • a communication operation and a camera setting operation are performed.
  • the operation shifts to the photographing mode in which a photographing operation and a camera setting operation are performed.
  • the operation shifts from the photographing mode to the reproduction mode or from the reproduction mode to the photographing mode.
  • an image data reproducing operation, a camera setting operation and an image data transmission operation are performed.
  • the operation shifts to user registration processing from the photographing mode or the reproduction mode, a communication operation and a registration operation for user registration are performed, and then when these operations are completed, the operation shifts back into the original mode, i.e., the photographing mode or the reproduction mode.
  • the table in FIG. 8, provided to facilitate an explanation of the user identification processing operation, indicates that the default user is set as the current user if the number of users detected through the wireless communication operation is 0, the detected user is set as the current user if the number of detected users is one, and the user selected through automatic selection processing, which is to be detailed later, is set as the current user if the number of detected users is two or more.
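The correspondence in FIG. 8 amounts to a simple case split; a minimal sketch is given below, with the automatic selection processing passed in as a callable (names are placeholders).

```python
def decide_current_user(detected_registered_users, default_user, auto_select):
    """Correspondence of FIG. 8: no detected registered user -> default user;
    exactly one -> that user; two or more -> the result of the automatic
    selection processing."""
    if len(detected_registered_users) == 0:
        return default_user
    if len(detected_registered_users) == 1:
        return detected_registered_users[0]
    return auto_select(detected_registered_users)

# e.g. decide_current_user(["user A", "user B"], "default user",
#                          auto_select=lambda users: users[0])
```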
  • FIG. 9 presents a chart of the main flow of the operations executed in the electronic camera 100 (the CPU 50 ) in the embodiment described above.
  • the power is turned on as the power switch 17 is operated in S 10 , and a user identification processing subroutine to be detailed later is executed in S 20 to identify the current user of the electronic camera 100 .
  • a photographing mode subroutine is executed in S 30 and thus the camera enters a photographing-enabled state. If the shutter release button 16 is operated in the photographing mode, a release interrupt processing subroutine in S 40 is executed to perform a photographing operation.
  • If the PHOTOGRAPH/DISPLAY button 28 is operated in the photographing mode, a photographing/reproduction interrupt processing subroutine in S 70 is executed and a reproduction mode subroutine in S 80 is executed to display the reproduced image of image data stored in the memory card 104 at the left screen 21. If, on the other hand, the PHOTOGRAPH/DISPLAY button 28 is operated in the reproduction mode, the photographing/reproduction interrupt processing subroutine in S 70 is executed and the photographing mode subroutine in S 30 is executed.
  • If the SEND button 26 is operated in the reproduction mode, a transmission interrupt processing subroutine in S 90 is executed to transmit the image data currently reproduced to the outside.
  • If the SET button 25 is operated, setting interrupt processing in S 60 is executed to enable manual setting of the camera.
  • If the INITIALIZE button 29 is operated, initialize interrupt processing in S 95 is executed to initialize the camera settings in correspondence to the specific user, and if the MODE dial 19 is set for registration in the photographing mode or the reproduction mode, registration interrupt processing in S 50 is executed to enable a new user registration.
  • the last user is set as the most recent user in S 201 after the subroutine is started up in S 20 .
  • the MODE dial 19 is checked to ascertain whether or not it is set to "auto"; if the MODE dial 19 is not set to "auto", the most recent user is set as the current user and, accordingly, the most recent user settings are selected for the camera before the operation proceeds to S 210. Namely, if the MODE dial 19 is set to "fixed", the last user who operated the camera is set as the current user and the camera settings are selected accordingly without executing a user identification.
  • if the MODE dial 19 is set to "auto", the operation proceeds to step S 204 to attempt communication with the outside for user detection by engaging the wireless transmission circuit 71 and the wireless reception circuit 72.
  • the user detection is performed over the maximum wireless communication range by setting the output signal level G of the wireless transmission circuit 71 to the maximum (Gmax).
  • In S 205, a verification is made as to whether or not the number of registered users among the detected users is 0, and if the number of registered users is 0, the default user is set as the current user, the camera settings for the default user are selected and the most recent user is updated as the default user in S 206 before the operation proceeds to S 210. If, on the other hand, it is ascertained in S 205 that the number of registered users is not 0, a verification is made in S 207 as to whether or not the number of registered users among the detected users is 1, and if the number of registered users is 1, the detected registered user is set as the current user, the camera settings for the detected registered user are selected and the most recent user is updated as the detected registered user in S 208 before the operation proceeds to S 209.
  • S 207 If it is ascertained in S 207 that the number of registered users among the detected users is not 1, i.e., if number of registered users among the detected users is at least 2, a single registered user is selected through an automatic user selection subroutine executed in S 25 which is to be detailed later, and then the operation proceeds to S 209 .
  • S 209 the individual user information of the registered user having determined to be the current user is read and stored into the EEPROM 68 by communicating with the selected registered user again.
  • step S 210 a verification as to whether or not the most recent user whose individual user information is stored in the EEPROM 68 matches the last user is achieved , and if they match, the operation makes a direct return in S 211 . If, on the other hand, they do not match, an automatic setting subroutine in S 22 which is to be detailed later is executed to initialize the camera settings before the operation makes a return in S 211 .
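  • By way of illustration only, the following Python sketch outlines the S 201 -S 211 flow described above; the names Camera, detect_registered_users, select_among_many and run_automatic_setting are hypothetical stand-ins that do not appear in the specification.

      class Camera:
          """Hypothetical container for the state manipulated by the flow."""
          def __init__(self):
              self.most_recent_user = "default"
              self.last_user = None
              self.current_user = None
              self.settings = {}

          def apply_settings(self, user):
              # Stand-in for selecting the camera settings registered for this user.
              self.settings["configured_for"] = user


      def identify_user(camera, mode_dial, detect_registered_users,
                        select_among_many, run_automatic_setting):
          camera.last_user = camera.most_recent_user              # S 201: remember the previous user
          if mode_dial != "auto":                                  # "fixed": skip wireless detection
              camera.current_user = camera.most_recent_user
              camera.apply_settings(camera.current_user)
          else:
              users = detect_registered_users()                    # S 204: search at maximum output Gmax
              if not users:                                        # S 205/S 206: no registered user nearby
                  camera.current_user = "default"
              elif len(users) == 1:                                # S 207/S 208: exactly one registered user
                  camera.current_user = users[0]
              else:                                                # S 25: automatic selection among several
                  camera.current_user = select_among_many(users)
              camera.apply_settings(camera.current_user)
              camera.most_recent_user = camera.current_user        # S 209: store the user info (EEPROM 68)
          if camera.most_recent_user != camera.last_user:          # S 210: the user has changed
              run_automatic_setting(camera)                        # S 22: re-initialize the camera settings


      cam = Camera()
      identify_user(cam, "auto", lambda: ["user A"], lambda users: users[0], lambda c: None)
      print(cam.current_user)    # -> user A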
  • the output signal level G of the wireless transmission circuit 71 is first initialized to the minimum (Gmin) in S 251 to reduce the wireless communication range to the minimum, and then a wireless communication is attempted at the output signal level setting G to communicate with a registered user in S 252 .
  • it is verified in S 253 whether or not communication with a registered user has been achieved, and if no communication has been achieved, the output signal level G of the wireless transmission circuit 71 is increased by ΔG in S 254.
  • the registered user ranked highest in the priority order among the registered users with whom communication has been achieved is selected in S 259, and the registered user thus selected is set as the current user, the camera settings for the selected registered user are selected and the most recent user is updated as the selected registered user in S 260 before the operation makes a return in S 261.
  • the default user is set as the current user, the camera settings for the default user are selected and the most recent user is updated as the default user in S 256 before the operation makes a return in S 261 .
  • since the registered users are searched while gradually increasing the wireless communication range in the user automatic selection subroutine shown in FIG. 11, the registered user present within the shortest distance from the camera (likely to be the current user of the camera) can be automatically selected and, even if communication is achieved simultaneously with a plurality of registered users, the user most likely to be operating the camera can be selected in conformance to the predetermined priority order; a minimal sketch of this search follows.
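  • A minimal sketch of the FIG. 11 search (S 251 -S 261 ), assuming hypothetical helpers try_communicate and priority_rank, is given below: the first output level at which anyone answers corresponds to the nearest registered user.

      def select_nearest_user(try_communicate, priority_rank, g_min=1, g_max=10, g_step=1):
          g = g_min                                            # S 251: start at the minimum range Gmin
          while g <= g_max:
              reached = try_communicate(g)                     # S 252: attempt communication at level G
              if reached:                                      # S 253: at least one registered user answered
                  return min(reached, key=priority_rank)       # S 259/S 260: honour the priority order
              g += g_step                                      # S 254: widen the range by delta G
          return "default"                                     # S 256: nobody reached even at Gmax

      # Example: user B answers once the range reaches level 3, user A only at level 5,
      # so user B (the nearer one) is selected without the range ever reaching user A.
      answers = {3: ["user B"], 4: ["user B"], 5: ["user A", "user B"]}
      print(select_nearest_user(lambda g: answers.get(g, []),
                                priority_rank={"user A": 1, "user B": 2}.get))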
  • In S 505, data indicating the selected user and his priority order rank and data related to the selected user (the identification data and the personal information data) are registered and saved at the EEPROM 68.
  • the setting interrupt processing subroutine is executed to select the camera settings and the like for the newly registered user, and then in S 506 , data indicating the selected settings and the like are registered and saved at the EEPROM 68 as the camera data of the newly registered user and also the setting data are sent back through the wireless transmission circuit 71 to be saved into an instrument such as a portable telephone carried by the user before the operation proceeds to S 509 .
  • output levels of the signal transmitted by the electronic camera and received by the users are detected on the reception side, and the data indicating the detected reception signal output levels are sent back to the camera, where the registered user with the highest reception signal output level (i.e., the user present within the shortest distance) is selected.
  • the electronic camera communicates with detected registered users and obtains data indicating the output levels of signals transmitted by the user-side instruments from the user-side instruments in S 281 .
  • the signals transmitted by the user-side are received and the output levels of the received signals are detected.
  • the registered user with the lowest signal-attenuation rate is selected by comparing the reception-signal output levels with the transmission signal output-level data.
  • the registered user thus selected is set as the current user, the camera settings for the selected registered user are selected and the most recent user is updated as the selected registered user, and then the operation makes a return in S 285 .
  • the registered user with the lowest signal attenuation rate at the electronic camera (the registered user within the shortest distance) is selected.
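  • An illustrative comparison of the kind just described might look as follows; the per-user (transmission level, reception level) pairs and the common scale are assumptions made only for this sketch.

      def select_by_attenuation(reports):
          """reports maps each registered user to (tx_level, rx_level) on a common scale."""
          def attenuation(item):
              tx_level, rx_level = item[1]
              return tx_level - rx_level        # smaller loss implies a shorter path to the camera
          user, _ = min(reports.items(), key=attenuation)
          return user

      # User A's signal loses 5 units on the way to the camera, user B's loses 18,
      # so user A (presumably at the shorter distance) is selected as the current user.
      print(select_by_attenuation({"user A": (20, 15), "user B": (20, 2)}))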
  • the registered user whose last operation of the electronic camera is the most recent (most likely to be the current user) is selected from the detected registered users.
  • the level of the user identification capability can be improved.
  • the wireless transmission circuit 71 and the wireless reception circuit 72 are enclosed by an electromagnetic shield film 79 over the camera front surface (along the photographic optical axis), the camera lower surface, the camera upper surface and the camera left and right side surfaces. Accordingly, the wireless transmission circuit 71 and the wireless reception circuit 72 are allowed to achieve communication readily with a portable instrument 180 A carried by a user likely to be present to the rear of the camera, i.e., the photographer (user A in the figure) and are not allowed to achieve ready communication with a portable instrument 180 B carried by the subject (user B in the figure) or the like present along the photographing direction.
  • FIG. 35 illustrates the structure of the wireless transmission circuit 71 and the wireless reception circuit 72 within the electronic camera.
  • the directivity may be adjusted by adopting a specific antenna structure as well as by using an electromagnetic shield film.
  • FIG. 37 illustrates a user selection achieved based upon the positional relationships between the camera and users.
  • camera user A and camera user B individually detect position data (x, y, z) with GPS (global positioning system) units, and the detected position data (x 1 , y 1 , z 1 ) and (x 2 , y 2 , z 2 ) of user A and user B respectively are provided to the camera which calculates the relative distances between the camera and user A and between the camera and user B based upon its own position data (x 0 , y 0 , z 0 ) and selects the user present within the shortest relative distance.
  • the relative distance between the camera and user A can be calculated as the square root of the sum of the squares of the coordinate differences, i.e., √((x1−x0)² + (y1−y0)² + (z1−z0)²).
  • the position data indicating the user positions are obtained by communicating with the detected registered users in S 391 .
  • position data indicating the position of the camera itself are detected.
  • the relative distances between the camera and detected registered users are calculated.
  • the registered user present within the shortest relative distance is selected.
  • the registered user thus selected is set as the current user, the camera settings for the selected registered user are selected and the most recent user is updated as the selected registered user before the operation makes a return in S 396 .
  • the registered user present within the shortest distance from the camera is selected based upon the positions detected with the GPS units.
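  • The selection based upon the GPS positions (S 391 -S 396 ) reduces to computing the straight-line distance described above for each detected registered user; the sketch below assumes, purely for illustration, that all positions are expressed in one common coordinate frame.

      import math

      def relative_distance(camera_pos, user_pos):
          (x0, y0, z0), (x1, y1, z1) = camera_pos, user_pos
          return math.sqrt((x1 - x0) ** 2 + (y1 - y0) ** 2 + (z1 - z0) ** 2)

      def select_closest_user(camera_pos, user_positions):
          # Pick the registered user whose reported position is nearest to the camera.
          return min(user_positions, key=lambda u: relative_distance(camera_pos, user_positions[u]))

      # User A is 5 units away, user B roughly 14.2, so user A is selected.
      positions = {"user A": (3.0, 4.0, 0.0), "user B": (10.0, 10.0, 1.0)}
      print(select_closest_user((0.0, 0.0, 0.0), positions))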
  • the grip-sensor-interrupt-processing subroutine in S 45 is started up as the grip sensor senses that there has been a change at the grip portion 14 , i.e., a change from a non-gripped state to a gripped state and the user identification processing subroutine in S 20 is executed accordingly before the operation makes a return in S 451 .
  • through the grip sensor interrupt subroutine shown in FIG. 41 as described above, the current user can be identified with a high degree of reliability even when a change of photographer occurs while the power is on.
  • the timer interrupt processing in S 46 is started up over a predetermined time interval, a verification is achieved in S 461 as to whether or not the camera is currently set in a self timer photographing mode (a mode set with an operating member (not shown), in which an image-capturing operation is executed with a predetermined delay following a shutter release operation) or a remote control mode (a mode set with an operating member (not shown), in which the shutter is released in the camera through a remote control operation), and if the camera is set in either the self timer photographing mode or the remote control mode, the operation makes a return in S 462 without executing a user identification. If, on the other hand, the camera is not set in the self timer photographing mode or the remote control mode, the user identification processing subroutine in S 20 is executed and then the operation makes a return in S 462 .
  • through the timer interrupt processing subroutine shown in FIG. 42, the user identification is executed over predetermined time intervals to achieve a reliable user identification even when the photographer hands the camera over to another operator while the power is on.
  • since the user identification is not executed in an operating mode (the self timer photographing mode or the remote control mode) in which the photographer is likely to be away from the camera, the reliability of the user identification is improved.
  • the camera operation is set based upon the personal information data.
  • the default settings for the preferred language, the display character size, the preferred text style and the operating mode are selected in S 961 .
  • In S 962, it is verified whether or not the most recent user is the default user, and if the most recent user is the default user, the operation makes a return in S 570.
  • it is verified in S 963 whether or not the preferred language of the most recent user is Japanese, and if the preferred language is Japanese, the operation proceeds to S 965. If, on the other hand, the preferred language of the most recent user is not Japanese, English is set for the preferred language in S 964 before the operation proceeds to S 965. In S 965, it is verified whether or not the visual acuity of the most recent user is equal to or higher than 1.0, and if the visual acuity of the most recent user is not equal to or higher than 1.0, the operation proceeds to S 967.
  • the character size is set to a standard size before the operation proceeds to S 967 .
  • In S 967, it is verified whether or not the age of the most recent user is equal to or higher than 15 and equal to or lower than 60, and if the age of the most recent user is outside this range, the operation makes a return in S 570.
  • the text style is set to “standard” in S 968
  • the operating mode is set to “manual” in S 969 and then the operation makes a return in S 570 .
  • the settings thus selected are reflected in the camera operation and the display operation executed after the return.
  • based upon the personal information data (the age, the visual acuity and the preferred language), the operating mode of the camera is adjusted to satisfy individual user needs (a user likely to be inexperienced can use the camera with ease in the full auto (fully automatic) mode, whereas an experienced user can use the camera in a more advanced application through manual setting), the character size is adjusted for an easy-to-check display (a display of larger-sized characters for a user with poor vision), the text style is adjusted to ensure that an error-free operation is performed (an easy text style, free of technical terms, and graphics are used for seniors and children) and a reliable man-machine interface is provided by adjusting the preferred language; a minimal sketch of this logic follows.
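  • In the sketch below, the acuity threshold of 1.0 and the age range of 15 to 60 are taken from the flow described above, while the default values chosen (a larger character size, an easy text style and the full auto mode) and the dataclass fields are illustrative assumptions.

      from dataclasses import dataclass

      @dataclass
      class UserInfo:
          preferred_language: str     # e.g. "Japanese" or "German"
          visual_acuity: float        # decimal visual acuity
          age: int

      def derive_settings(user):
          # S 961: start from the defaults (the concrete values here are illustrative).
          settings = {"language": "Japanese", "char_size": "large",
                      "text_style": "easy", "mode": "full auto"}
          if user is None:                                  # S 962: the default user keeps the defaults
              return settings
          if user.preferred_language != "Japanese":         # S 963/S 964: fall back to English
              settings["language"] = "English"
          if user.visual_acuity >= 1.0:                     # S 965/S 966: good vision, standard size
              settings["char_size"] = "standard"
          if 15 <= user.age <= 60:                          # S 967-S 969: age group familiar with cameras
              settings["text_style"] = "standard"
              settings["mode"] = "manual"
          return settings

      print(derive_settings(UserInfo("Japanese", 1.2, 30)))   # manual mode, standard characters
      print(derive_settings(UserInfo("German", 0.6, 70)))     # English, larger characters, full auto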
  • a Kepler viewfinder (a real-image finder) is constituted with an objective lens 110 , an erecting prism 111 and an eyepiece lens 112 and the photographic field can be observed with an eye 113 set on a viewfinder optical axis 115 .
  • the eyepiece lens 112 By moving the eyepiece lens 112 along the viewfinder optical axis, the diopter can be adjusted.
  • the eyepiece lens 112 is moved by a motor 114 which is controlled by the CPU 50 .
  • FIG. 46 presents a detailed flowchart of an automatic diopter (eyesight) adjustment performed for each user by adopting the configuration shown in FIG. 45 through the initialize interrupt processing subroutine.
  • the diopter is set to the default value in S 971 , a verification is achieved in S 972 as to whether or not the most recent user is the default user and if the most recent user is the default user, the operation proceeds to S 975 . If, on the other hand, the most recent user is not the default user, the operation proceeds to S 973 to make a verification as to whether or not the diopter data of the most recent user are available and if the diopter data are not available, the operation proceeds to S 975 .
  • the diopter is set as indicated by the diopter data of the most recent user and then the operation proceeds to S 975 .
  • the CPU 50 drives the motor 114 in conformance to the diopter that has been set to adjust the position of the eyepiece lens 112 , thereby completing the diopter adjustment executed in conformance to the user diopter before the operation makes a return in S 976 .
  • data indicating the extent of user fatigue may be read out by communicating with the user over predetermined time intervals to fine-adjust the diopter in conformance to the fatigue data, or the length of time over which the electronic camera 100 has been continuously used since the user started operating the electronic camera 100 may be measured to fine-adjust the diopter in conformance to the length of the continuous operation.
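  • The per-user diopter adjustment of FIG. 46, including the optional fine adjustment for fatigue or long continuous use mentioned above, could be sketched as below; the default value, the correction factors and the motor interface are illustrative assumptions rather than values given in the specification.

      DEFAULT_DIOPTER = -1.0          # illustrative default diopter value (dpt)

      def target_diopter(user_diopter=None, fatigue_level=0.0, minutes_in_use=0.0):
          # S 971-S 974: use the user's registered diopter when available, otherwise the default.
          diopter = DEFAULT_DIOPTER if user_diopter is None else user_diopter
          # Optional fine adjustment: tired eyes or long continuous use shift the target slightly.
          diopter += -0.1 * fatigue_level - 0.001 * minutes_in_use
          return diopter

      def drive_eyepiece(diopter, move_motor):
          # S 975: the CPU 50 drives the motor 114 so that the eyepiece lens 112 reaches the
          # position corresponding to the requested diopter (the conversion is left to move_motor).
          move_motor(diopter)

      drive_eyepiece(target_diopter(user_diopter=-2.5, fatigue_level=1.0, minutes_in_use=30),
                     move_motor=lambda d: print(f"eyepiece positioned for {d:+.2f} dpt"))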
  • FIG. 47 shows the structure that may be adopted in an example of a variation of the wireless communication between the electronic camera and an instrument carried by the user.
  • a weak electrical current (a weak electromagnetic wave) traveling through a path 202 inside the body (a hand 201 ) of the user holding the grip portion 14 of the electronic camera 100 is used to enable communication, through a transmission/reception circuit 75, between the CPU 50 and a wristwatch-type instrument 200 worn by the user in which the user information is stored.
  • on the surface of the grip portion 14, an electrical contact layer that transmits/receives electrical signals through the hand placed in contact with the surface is formed. This contact layer is connected with the transmission/reception circuit 75, so as to enable communication with the instrument 200 carried by the user via the hand holding the grip portion 14.
  • the instrument 200 too, includes an electrical contact layer to be placed in contact with the hand 201 of the user, so that data can be received and stored data can be transmitted through the contact layer.
  • the default user is set as the current user, the camera settings for the default user are selected and the most recent user is updated as the default user in S 556 , and then the operation makes a return in S 557 .
  • the operator (the user) of the electronic camera 100 can be selected with a high degree of reliability.
  • FIG. 49 is another conceptual diagram of an electronic camera adopting the present invention and an electronic image communication system achieved by utilizing this electronic camera. While the personal information data stored in the UIM card 170 loaded at the portable telephone 160 are read into the electronic camera 100 through a short-distance wireless communication between the electronic camera 100 and the portable telephone 180 carried by the user in the electronic image communication system in FIG. 1 , personal information data stored in a wireless tag (a non-contact IC card) are automatically read from the wireless tag into the electronic camera 100 as the wireless tag enters the vicinity of the electronic camera 100 internally provided with a non-contact wireless read unit in the electronic image communication system shown in FIG. 49 .
  • the electronic camera 100 also has a function of engaging in wireless communication through a wireless telephone line 190 and thus is capable of transmitting image data obtained through an image-capturing operation to an external database 140 and the like. As described above, in the electronic image communication system shown in FIG. 49 , the operations of the electronic camera 100 can be customized by the user simply by holding a compact wireless tag (a non-contact IC card) which does not require a power source.
  • as described above, a wireless communication between the electronic apparatus and a portable instrument carried by a user is achieved simply as the user approaches the electronic apparatus; the personal information on the user is transferred to the electronic apparatus and the electronic apparatus automatically selects operational settings that are optimal for the user in conformance to the personal information.
  • the user is able to immediately operate the electronic apparatus which has been customized for his use without having to perform a special operation.
  • the likelihood of communicating with a partner (or an opponent) that is the current user is increased.
  • the user identification is executed at power ON before the main operation of the electronic apparatus starts and the settings of the electronic apparatus are selected in correspondence to the identified user.
  • a given user can always operate the electronic apparatus at the settings customized for the user.
  • since the communication partner most likely to be the current user is automatically selected as the user if wireless communication with a plurality of partners has been achieved during the wireless user identification in the embodiment described above (see FIG. 10 ), the confusion that might otherwise result when a plurality of partners with whom wireless communication is enabled are present in the vicinity of the electronic apparatus is prevented.
  • since the predetermined default user is set as the current user if wireless communication has not been achieved with any partner during the wireless user identification in the embodiment described above (see FIG. 10 ), even a user who is not capable of transferring his personal information data to the electronic apparatus can operate the electronic apparatus at the default settings.
  • since the electronic apparatus can be operated at the settings corresponding to the most recent user without executing a user identification if the MODE dial 19 is set to “fixed” in the embodiment described above (see FIG. 10 ), the confusion that might otherwise result when a plurality of partners with whom wireless communication is enabled are present in the vicinity of the electronic apparatus is prevented.
  • the electronic apparatus can be initialized to the custom settings set in advance for a given user through a single action, and thus, the camera can be set to the user's preferred settings promptly to correspond to a specific photographing condition.
  • the names of the partners with whom wireless communication has been achieved are brought up on display at the screen and the user himself selects the name of the user to be newly registered in the embodiment described above (see FIGS. 21 and 23 ) and, as a result, the user to be newly registered can be specified with a high degree of reliability.
  • the communication partner reporting the highest-reception-signal level among a plurality of partners with whom wireless communication has been achieved is selected as the current user during the wireless user identification and, as a result, the partner most likely to be the current user (the partner present at the shortest distance from the electronic apparatus) can be selected, thereby preventing the confusion which might otherwise result when a plurality of partners with whom wireless communication is enabled are present in the vicinity of the electronic apparatus.
  • the communication partner with the lowest transmission signal attenuation rate among a plurality of partners with whom wireless communication has been achieved is selected as the current user during the wireless user identification and, as a result, the partner most likely to be the current user (the partner present at the shortest distance from the electronic apparatus) can be selected, thereby preventing the confusion which might otherwise result when a plurality of partners with whom wireless communication is enabled are present in the vicinity of the electronic apparatus.
  • the communication partner who has operated the electronic apparatus most recently among a plurality of partners with whom wireless communication has been achieved is selected as the current user during the wireless user identification and, as a result, the partner most likely to be the current user can be selected, thereby preventing the confusion which might otherwise result when a plurality of partners with whom wireless communication is enabled are present in the vicinity of the electronic apparatus.
  • the directivity of the wireless communication in the electronic apparatus is restricted along the direction in which the operator of the electronic apparatus is likely to be present relative to the position of the electronic apparatus main unit by using an electromagnetic shield or the like and, as a result, the likelihood of wireless communication being achieved with the true user is increased and the likelihood of wireless communication being achieved with parties other than the true user is reduced during the user identification executed through a wireless communication, thereby preventing the confusion which might otherwise result when a plurality of partners with whom wireless communication is enabled are present in the vicinity of the electronic apparatus.
  • the communication partner who is present within the shortest distance from the electronic apparatus among a plurality of partners with whom wireless communication has been achieved is selected as the current user during the wireless user identification and, as a result, the partner most likely to be the current user can be selected, thereby preventing the confusion which might otherwise result when a plurality of partners with whom wireless communication is enabled are present in the vicinity of the electronic apparatus.
  • since the user identification is executed by detecting that the electronic apparatus is held by the user in the embodiment described above (see FIG. 41 ), the user identification is executed only if there is a user who is bound to be operating the electronic apparatus and thus, an erroneous user identification resulting from wireless communication achieved with a non-user partner when there is no true user present in the vicinity of the electronic apparatus is prevented.
  • the user identification is executed over predetermined time intervals.
  • a change-over of the user taking place when the power to the electronic apparatus is on can be automatically detected with a high degree of reliability and the electronic apparatus can be set for operation by the new user.
  • since the operational settings and the display settings are selected for the electronic apparatus in conformance to the extent of expertise of a given user in handling the electronic apparatus, which is indirectly assessed based upon the general personal information on the user (information bearing no direct relevance to the electronic apparatus), in the embodiment described above (see FIGS. 43 and 44 ), inexperienced users (e.g. seniors and children) unaccustomed to handling electronic apparatuses can use the electronic apparatus through an easy and simple operation and, at the same time, users in the age group likely to be familiar with handling electronic apparatuses can utilize the electronic apparatus in a more advanced application through manual settings and the like.
  • since the diopter adjustment of the viewfinder is executed based upon the information indicating the user's visual acuity in the embodiment described above (see FIGS. 45 and 46 ), it is not necessary to manually perform a viewfinder diopter adjustment each time one of a plurality of users sharing the camera operates the camera and, as a result, a photographing operation can be started promptly without missing a good photo opportunity.
  • since the electronic apparatus communicates with an instrument worn by the user through the hand of the user holding the electronic apparatus in the embodiment described above (see FIGS. 47 and 48 ), the user can be identified with a high degree of reliability.
  • the electronic instrument may be embedded in the body of the user.
  • since the electronic apparatus includes a read circuit that reads a non-contact IC card (wireless tag) and thus is able to perform a non-contact read of the personal information from the non-contact IC card (wireless tag) in the embodiment described above (see FIG. 49 ), the personal information on the user carrying a lightweight and compact non-contact IC card that does not require a power supply is provided to the electronic apparatus without requiring the user to perform any special operation and thus, the electronic apparatus can be operated with the settings preferred by the user.
  • FIG. 50 illustrates the sequence of a wireless communication exchange between an electronic instrument carried by the user and the electronic apparatus that the user wishes to utilize, which is initiated on the electronic instrument side.
  • the electronic instrument is a portable telephone 160 shown in FIG. 52 at which a UIM card or the like is loaded and thus the personal information on the user carrying the portable telephone 160 is stored and saved.
  • the portable telephone 160 includes a display screen 161 and operating buttons 162 . As shown in FIG.
  • FIG. 53 presents an example of a display in which the electronic camera is selected as the category of the electronic apparatus the user wishes to use.
  • the electronic instrument transmits electronic instrument identification information, information indicating the selected category and the user identification information to the electronic apparatuses present in the vicinity of the electronic instrument together with a utilization state information request 1 .
  • among the electronic apparatuses having received the utilization state information request 1 from the electronic instrument, any electronic apparatus that does not belong to the category indicated by the category information does not return any response.
  • otherwise, the electronic apparatus verifies the current utilization state (i.e. whether or not the user indicated by the user information can use the apparatus) and then executes communication 2 to transmit the utilization state information resulting from the verification, to which both the identification information of the sender electronic apparatus and the identification information of the recipient electronic instrument are appended.
  • the electronic instrument that has received the utilization state information 2 from eligible electronic apparatuses displays their utilization states, i.e., the identities of the electronic apparatuses that are available for use, as shown in FIG. 54, at the display screen 161 and then automatically selects a single electronic apparatus based upon, for instance, the varying distances between the electronic instrument and the electronic apparatuses. Alternatively, the user may be allowed to select the specific electronic apparatus that he wishes to use with a selection member (not shown).
  • FIG. 54 presents an example of a display in which the electronic camera A is selected as the electronic apparatus to be used.
  • after the electronic apparatus to be used is selected at the electronic instrument, the electronic instrument issues a utilization request to the selected electronic apparatus by transmitting the identification information corresponding to the selected electronic apparatus, the identification information corresponding to the sender electronic instrument and the user personal information through a communication 3.
  • upon receiving the utilization request 3, the electronic apparatus selected by the electronic instrument performs a custom setting operation for the electronic apparatus based upon the user personal information and also sends a verification notification, with the identification information corresponding to the sender electronic apparatus and the identification information corresponding to the recipient electronic instrument appended, through a communication 4 to the electronic instrument that has transmitted the utilization request.
  • the electronic instrument, i.e., the recipient of the verification notification, displays a message indicating that the selected electronic apparatus is now available for use at the display screen 161, as shown in FIG. 55, for the user. Subsequently, the user can operate the electronic apparatus at the settings customized for his use.
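  • A rough sketch of the four-message exchange described above is given below; the Apparatus class, the dictionary-shaped messages and the distance callback are illustrative stand-ins and do not define the actual radio protocol.

      class Apparatus:
          def __init__(self, apparatus_id, category, busy=False):
              self.apparatus_id, self.category, self.busy = apparatus_id, category, busy

          def handle_state_request(self, request):               # communication 1 -> communication 2
              if request["category"] != self.category:
                  return None                                     # wrong category: no response at all
              return {"from": self.apparatus_id, "to": request["from"], "available": not self.busy}

          def handle_utilization_request(self, request):          # communication 3 -> communication 4
              print(f'{self.apparatus_id}: customized for {request["personal_info"]["name"]}')
              return {"from": self.apparatus_id, "to": request["from"], "verified": True}


      def use_nearest_apparatus(instrument_id, category, personal_info, apparatuses, distance):
          request_1 = {"from": instrument_id, "category": category, "user": personal_info["name"]}
          replies = [(a, a.handle_state_request(request_1)) for a in apparatuses]      # communication 2
          available = [a for a, reply in replies if reply and reply["available"]]
          if not available:
              return None
          chosen = min(available, key=distance)                   # e.g. the apparatus at the shortest distance
          request_3 = {"from": instrument_id, "to": chosen.apparatus_id, "personal_info": personal_info}
          return chosen.handle_utilization_request(request_3)     # communication 4 (verification notification)


      cameras = [Apparatus("electronic camera A", "electronic camera"),
                 Apparatus("electronic camera B", "electronic camera", busy=True)]
      print(use_nearest_apparatus("portable telephone 160", "electronic camera", {"name": "user A"},
                                  cameras, distance=lambda a: {"electronic camera A": 2.0,
                                                               "electronic camera B": 5.0}[a.apparatus_id]))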
  • the user-side electronic instrument, having received the electronic apparatus utilization information from an electronic apparatus (e.g., information indicating that the electronic apparatus is currently being used by another person, or information indicating that only limited users are allowed use of the electronic apparatus), makes a decision as to whether or not the user can use this particular electronic apparatus. If the user-side electronic instrument decides that there is a single electronic apparatus in its vicinity that can be used by the user, it selects this electronic apparatus as the apparatus to be used by the user and transmits a message indicating the user's intention to use the electronic apparatus and the user personal information to the electronic apparatus.
  • the electronic apparatus Upon receiving the message indicating the user's intention to use the electronic apparatus and the user personal information from the electronic instrument, the electronic apparatus identifies the user carrying the electronic instrument as the user of the electronic apparatus and customizes the settings of the electronic apparatus for the user based upon the user personal information.
  • if the user-side electronic instrument determines that there are a plurality of electronic apparatuses in its vicinity that can be used by the user, it automatically selects the electronic apparatus that the user most likely wishes to use through a method similar to any of those adopted in the embodiment (the electronic apparatus present at the shortest distance, the electronic apparatus which is highest in the priority order, manual selection, etc.) and transmits a message indicating the user's intention to use the electronic apparatus and the user personal information to the selected electronic apparatus.
  • the electronic apparatus identifies the user carrying the electronic instrument as the user of the electronic apparatus and customizes the settings of the electronic apparatus for the user based upon the user personal information.
  • the user-side electronic instrument may transmit a message indicating the user's intention to use the electronic apparatus and the user personal information to the plurality of electronic apparatuses.
  • each of the plurality of electronic apparatuses identifies the user carrying the electronic instrument as the user of the electronic apparatus and, as a result, the settings at the plurality of electronic apparatuses are simultaneously customized for the user based upon the user personal information.
  • the need to execute the user identification at each electronic apparatus is eliminated when the user wishes to use a plurality of electronic apparatuses at the same time.
  • the user-side electronic instrument may immediately transmit a message indicating the user's intention and the user personal information to any electronic apparatus in its vicinity without verifying the number of electronic apparatuses in its vicinity that can be used by the user.
  • each electronic apparatus having received the message indicating the user's intention to use and the personal information from the electronic instrument identifies the user carrying the electronic instrument as the user of the electronic apparatus and, as a result, the settings at a plurality of electronic apparatuses or a single electronic apparatus can be customized for the user based upon the user personal information.
  • the electronic apparatus automatically selected by the user-side electronic instrument may be displayed at the display unit of the electronic instrument for user verification.
  • a list of electronic apparatuses available for use by the user may be displayed at the user-side electronic instrument based upon the results of the communication to allow the user to select the electronic apparatus he wishes to use.
  • the user identification is achieved through a wireless (electromagnetic wave) communication between the electronic apparatus and the electronic instrument carried by the user.
  • a redundant communication with an electronic instrument carried by a party other than the true user should be prevented by keeping the wireless transmission signal output level within a range lower than a predetermined value and thus, limiting the communication-enabled distance range based upon the assumption that electronic instruments have the average reception capability.
  • This distance range is determined by taking into consideration the standard manner in which the electronic apparatus is operated, whether or not an object that blocks the electromagnetic waves is present between the electronic apparatus and the electronic instrument, the performance of the receiver at the electronic instrument and the like. It is desirable to set the range equal to or less than approximately 10 m under normal circumstances and to set the range equal to or less than 2 m and preferably equal to or less than 1 m in the case of an electronic apparatus that is primarily operated by the user carrying the apparatus.
  • while the user identification is achieved through a wireless (electromagnetic wave) communication between the electronic apparatus and the electronic instrument carried by the user in the embodiment described above, the user identification may instead be achieved through either a non-contact method or a contact method other than wireless communication.
  • for instance, a contact-type user identification method (e.g., fingerprint detection) or a non-contact user identification method (e.g., pupil pattern detection, retina pattern detection, face image detection or voiceprint detection) may be adopted, and by combining such a method with the wireless user detection method that uses the electromagnetic wave described above, an even more reliable user identification is enabled.
  • the electronic apparatus detects the user's physical characteristics and then identifies the communication partner whose physical characteristics read out from his electronic instrument present in the vicinity of the electronic apparatus through wireless communication match the detected physical characteristics as the true user.
  • the user identification is executed through wireless communication between the electronic apparatus and the electronic instrument carried by the user and the user verifies that he has been identified by the electronic apparatus by checking the user name displayed at the screen of the electronic apparatus.
  • the electronic apparatus information may be transmitted by the electronic apparatus to the electronic instrument of the user identified by the electronic apparatus and the user's instrument, upon receiving the information, may display at the user-side electronic instrument, information indicating specifically which electronic apparatus has identified the user.
  • the electronic instrument may notify the user that he has been identified by an electronic apparatus by utilizing a means for sound generation. In such a case, one of various sound patterns may be used in correspondence to a specific electronic apparatus type so that a user can ascertain the type of electronic apparatus which has identified him without having to check the display.
  • the electronic apparatus wirelessly communicates with an electronic instrument carried by a user, and if a plurality of users are detected as a result, the electronic apparatus attempts communication with the electronic instruments by gradually increasing the transmission output and selects the user able to use the electronic apparatus with whom communication is achieved initially, as the correct user.
  • the electronic apparatus may attempt communication with electronic instruments by gradually increasing the transmission output from the initial state and select the user able to use the electronic apparatus with whom communication is achieved first as the correct user, instead.
  • the user identification is executed with the timing with which the electronic apparatus is held by the user.
  • the user identification may instead be executed through another method or with timing other than that used in the embodiment.
  • the user identification may be executed with the timing with which a given operating member of the electronic apparatus is operated (e.g., the timing with which the shutter release button 16 of the electronic camera 100 is pressed halfway down). Since this allows the user identification to be executed at a time when the user of the electronic apparatus is surely present in the vicinity of the electronic apparatus, a reliable user identification is achieved and, at the same time, the need for a special detection sensor such as a grip sensor is eliminated.
  • a pair of elements, i.e., a light-emitting element and a light-receiving element, may be provided near the eyepiece unit of the viewfinder 11 to execute the user identification with the timing with which an increase in the quantity of reflected light is detected as a user moves his eyes or face closer to the eyepiece unit.
  • the user identification may be executed with the timing with which the memory card 104 is detached.
  • the user identification may be executed with the timing with which the photographic lens is replaced, a zooming operation is performed, a focusing operation is performed or an external strobe is detached/attached in the electronic camera.
  • the remote control device may be equipped with a user identification function, instead. In this case, a user identification is enabled even in the remote control mode and, at the same time, it is not necessary to execute the user identification at individual electronic apparatuses when a plurality of electronic apparatuses are controlled by using a single remote control device
  • the camera setting operation and the display operation are customized based upon the personal information on the user identified by the electronic apparatus. However, processing other than the camera setting operation and the display operation may be executed in conformance to the personal information which has been obtained.
  • the functions of operating members may be changed (e.g., the functions of the shutter release button 16 and the power switch 17 in FIG. 2 may be switched) in conformance to the information indicating the user's favored hand (left-handed or right-handed).
  • the level of force (resistance) required to depress the shutter release button 16 may be adjusted in correspondence to the age or gender of the user.
  • any portion of the display on the screen that uses different hues may instead be displayed with varying levels of contrast if the user has a vision problem such as color blindness.
  • the electronic camera 100 communicates with the electronic instrument carried by the photographer (user) and identifies the photographer operating the electronic camera 100 .
  • the electronic camera 100 may instead communicate with an electronic instrument carried by a subject to set the operations of the electronic camera 100 in conformance to the personal information of the subject.
  • the electronic camera 100 engages in wireless communication with electronic instruments in the vicinity and eliminates the electronic instrument that appears to be carried by the photographer among the electronic instruments with which communication has been achieved, for instance.
  • the photographer's electronic instrument may be pre-registered at the electronic camera 100 , or the electronic instrument present at the shortest distance from the electronic camera 100 may be identified as the electronic instrument carried by the photographer and thus eliminated.
  • the electronic camera 100 sets the distance range over which the subject of the electronic camera 100 should be present by using information indicating the photographic lens focal length or information indicating the photographic lens photographing distance and then selects the electronic instrument carried by the user (subject) of the electronic camera 100 present within the subject distance range from the electronic camera 100 among the electronic instruments with which wireless communication has been achieved based upon information related to the distances between the electronic camera 100 and the electronic instruments (the signal levels, the position information or the like).
  • the photographing direction and the photographic field angle of the electronic camera 100 may be determined based upon azimuth information detected by a means for azimuth detection internally provided at the electronic camera 100 and the information indicating the photographic lens focal length, and the positions (the directions) of the electronic instruments relative to the electronic camera 100 may be calculated based upon information indicating the measured positions (position information) of the electronic camera 100 and the electronic instruments, so as to select the electronic instrument present within the photographic field angle along the photographing direction; a minimal sketch of this selection follows.
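  • In the sketch below, the 2-D geometry, the 36 mm sensor width, the distance tolerance and the coordinate conventions are simplifying assumptions made only for this illustration of the geometric selection just described.

      import math

      def horizontal_field_angle(focal_length_mm, sensor_width_mm=36.0):
          return 2.0 * math.degrees(math.atan(sensor_width_mm / (2.0 * focal_length_mm)))

      def select_subject_instruments(camera_pos, azimuth_deg, focal_length_mm,
                                     subject_distance_m, instruments, tolerance_m=2.0):
          half_angle = horizontal_field_angle(focal_length_mm) / 2.0
          chosen = []
          for name, (x, y) in instruments.items():
              dx, dy = x - camera_pos[0], y - camera_pos[1]
              distance = math.hypot(dx, dy)
              bearing = math.degrees(math.atan2(dx, dy)) % 360.0     # bearing measured like an azimuth
              off_axis = abs((bearing - azimuth_deg + 180.0) % 360.0 - 180.0)
              if off_axis <= half_angle and abs(distance - subject_distance_m) <= tolerance_m:
                  chosen.append(name)
          return chosen

      # The photographer's phone sits behind the camera and is excluded; the subject's phone,
      # about 5 m ahead along the 0-degree photographing direction, is selected.
      instruments = {"photographer phone": (0.0, -0.5), "subject phone": (0.5, 5.0)}
      print(select_subject_instruments((0.0, 0.0), azimuth_deg=0.0, focal_length_mm=50.0,
                                       subject_distance_m=5.0, instruments=instruments))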
  • the electronic camera 100 may obtain data indicating the color of the irises of the subject so that if any red-eye phenomenon occurs in the image data obtained by photographing the subject, the portion of the image data corresponding to the red-eye can be touched up by using the pupil color data.
  • the personal information may include data indicating whether or not the red-eye phenomenon tends to occur readily so that the electronic camera 100 can automatically set itself in a red-eye reduction mode (a mode for reducing the extent of the red-eye phenomenon by emitting a small quantity of light at the strobe prior to a photographing operation and causing the pupils of the subject to contract) to perform a strobe photographing operation if the personal information obtained from the subject indicates that the user tends to induce the red-eye phenomenon readily.
  • the personal information does not necessarily need to include direct data indicating whether or not the red-eye phenomenon tends to occur readily and, instead, it may include indirect data obtained by deducing the likelihood of the red-eye phenomenon based upon the general personal information such as the user's age and visual acuity. For instance, statistically, young children and myopic people tend to have large pupils and thus are likely to induce the red-eye phenomenon. Accordingly, the camera is forcibly set in the red-eye reduction mode when photographing a young child or a myopic person, as in the sketch below.
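  • As an illustration of such a deduction only, the following sketch uses guessed thresholds for "young child" and "myopic"; the threshold values are not taken from the specification.

      def should_force_red_eye_reduction(personal_info):
          # Direct flag, if the subject's personal information carries one.
          if personal_info.get("red_eye_prone"):
              return True
          # Indirect deduction: young children and myopic people tend to have large pupils.
          age = personal_info.get("age")
          acuity = personal_info.get("visual_acuity")
          if age is not None and age <= 6:             # illustrative "young child" threshold
              return True
          if acuity is not None and acuity < 0.7:      # illustrative "myopic" threshold
              return True
          return False

      print(should_force_red_eye_reduction({"age": 5}))                         # True
      print(should_force_red_eye_reduction({"age": 30, "visual_acuity": 1.2}))  # False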
  • a lamp or a very bright LED capable of continuously providing illumination over a predetermined length of time may be set in an ON state continuously prior to the photographing operation.
  • the first embodiment described above in which a prompt user identification is achieved without placing a great onus on the user by identifying the user through wireless communication and the user highly likely to be the true user is automatically selected if a plurality of user candidates are detected wirelessly, enables the user to promptly start operating the electronic apparatus such as an electronic camera without having to perform any time-consuming tasks.
  • since the settings and the display at the electronic apparatus are adjusted by automatically optimizing them for use by the specific user in conformance to the general personal information bearing no direct relevance to the electronic apparatus settings, which the electronic apparatus has obtained from the user, an environment that allows inexperienced users, seniors and children as well as seasoned users to operate the electronic apparatus with ease and comfort is provided.
  • an electronic camera that photographs a subject present within the photographing range also obtains related information from the subject and an information source present in the vicinity of the camera through wireless communication.
  • the information related to the subject which is made to correspond to the position of the subject within the photographic image plane, is appended to the image data obtained through the photographing operation together with other related information to constitute an image file.
  • the image data having been obtained through a photographing operation are first reproduced and displayed as a home screen. Then, as the position within the image plane is specified at the home screen, the information related to the subject reproduced and displayed at the position is displayed at the screen. From the related information thus displayed, further related information linked with the information can be brought up on display.
  • FIG. 57 is a conceptual diagram of an electronic camera adopting the present invention and an electronic image communication system achieved by utilizing the electronic camera.
  • an electronic camera 100 that includes a memory card saves electronic image data or digital image data (hereafter referred to as image data) obtained by photographing a photographic field contained within a photographic image plane in a predetermined size into a memory card.
  • the electronic camera 100 has a short-distance wireless communication function (e.g., Bluetooth (registered trademark) enabling communication within a range of approximately 10 meters) and engages in communication with a portable telephone 160 that also has a short-distance wireless communication function.
  • this portable telephone 160 is carried by a subject, i.e., a person to be photographed by the electronic camera 100, and a UIM card (user identification module card) 170 having user personal information stored therein can be loaded at the portable telephone 160.
  • the portable telephone 160 is internally provided with a means for positional measurement (a means for positional detection such as a GPS unit).
  • the personal information of the subject is transferred to the electronic camera 100 through communication between the electronic camera 100 and the portable telephone 160 carried by the subject.
  • the electronic camera 100 detects the position of the subject within the photographic image plane through a method which is to be detailed later and stores an image file constituted of a set of the image-plane position information and the personal information as well as the image data obtained through the photographing operation into the memory card.
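  • The image file just described, bundling the image data with the image-plane position of each subject and the corresponding personal information, could be laid out as in the following sketch; the JSON-like structure and all field names are illustrative only.

      import json

      def build_image_file(image_data, subjects, shooting_info):
          """subjects: list of (image_plane_position, personal_info) pairs."""
          return {
              "image": image_data.hex(),                       # raw picture data (hex just for the sketch)
              "shooting_info": shooting_info,                  # e.g. camera position, attitude, time point
              "subjects": [
                  {"image_plane_position": {"x": x, "y": y},   # where the subject appears in the frame
                   "personal_info": info}
                  for (x, y), info in subjects
              ],
          }

      record = build_image_file(
          image_data=b"\x00\x01\x02",
          subjects=[((0.42, 0.37), {"name": "subject A"})],
          shooting_info={"time": "2001-12-01T10:00:00", "position": (35.0, 139.0, 20.0)},
      )
      print(json.dumps(record, indent=2))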
  • the electronic camera 100 also transfers the image file to a personal computer 140 for personal use or an image server 150 via a wireless base station 120 through a wireless portable telephone line 190 and then via a wired or wireless public telephone line or the Internet 130 .
  • the electronic camera 100 includes a liquid crystal display screen at which an image file can be reproduced and displayed by reading out the image file saved in the memory card.
  • the electronic camera 100 is capable of accessing various information servers and information sites based upon the information related to the subject via the wireless portable telephone line 190 and the Internet 130 , obtaining detailed information on the subject and reproducing and displaying the detailed information at the liquid crystal display screen.
  • the personal computer 140 for personal use includes a CRT display screen at which an image file can be reproduced and displayed by reading out the image file saved at the image server 150 .
  • the personal computer 140 for personal use is also capable of accessing various information servers and information sites based upon the information related to the subject via the Internet 130 , obtaining detailed information on the subject and reproducing and displaying the detailed information at the CRT display screen.
  • FIGS. 58 and 59 present external views (a front view and a rear view) of the electronic camera 100 in FIG. 57 , as achieved in an embodiment.
  • The electronic camera 100 includes a photographic lens 10 which forms a subject image, a viewfinder 11 through which the photographic image plane is checked, a strobe 12 used to illuminate the subject during a photographing operation, a photometering circuit 13 that detects the brightness of the subject and a grip portion 14 that projects out from the camera main body to allow the electronic camera 100 to be held by hand.
  • a shutter release button 16 operated to issue an instruction for a photographing start and a power switch 17 (a momentary switch which is alternately set to ON and OFF each time it is operated) through which the on/off state of the power to the electronic camera 100 is controlled are provided at the upper side of the camera.
  • in the vicinity of an eyepiece unit of the viewfinder 11, a left LCD (left screen) 21 having a substantially quadrangular screen for text and image display and a right LCD (right screen) 22 having a substantially quadrangular screen for text and image display are provided, and an up button 23 and a down button 24 for switching the image displayed at the left screen 21 are provided in the vicinity of, and to the left of, the left LCD 21, whereas the photographing mode button 25 operated to set the electronic camera 100 in a photographing mode, a reproduction mode button 26 operated to set the electronic camera 100 in a reproduction mode, an information display mode button 27 operated to set the electronic camera 100 in an information display mode and a CONFIRM button 29 used to set a selection item are provided around the right screen 22 and the left screen 21.
  • a memory card slot 30 at which a memory card 104 may be loaded is provided.
  • the shutter release button 16 , the up button 23 , the down button 24 , the photographing mode button 25 , the reproduction mode button 26 , the information display mode button 27 , the SEND button 28 and the CONFIRM button 29 are all operating keys operated by the user.
  • touch tablets 66 having a function of outputting position data corresponding to a position indicated through a finger contact operation are provided, and they can be used to make a selection from selection items displayed on the screens or to select a subject displayed on the screen.
  • the touch tablets 66 are each constituted of a transparent material such as a glass resin so that the user can observe images and text formed inside the touch tablets 66 through the touch tablets 66 .
  • In FIG. 60, which presents an example of an internal electrical structure that may be assumed in the electronic camera 100 shown in FIGS. 58 and 59, various components are connected with one another via a data/control bus 51 through which various types of information data and control data are transmitted.
  • the components are divided into the following five primary blocks: a block, the core of which is a photographing control circuit 60 that executes an image data photographing operation; a block, the core of which is constituted of a wireless communication circuit 71 that communicates with electronic instruments in the vicinity of the electronic camera 100 through wireless communication (Bluetooth or the like) and a wireless telephone circuit 72 that exchanges data with the outside through a wireless portable telephone line; a block constituted of the memory card 104 in which image files are stored and saved; a block, the core of which is a screen control circuit 92 that executes display of image data and information related to the image data; and a block, the core of which is constituted of user interfaces such as operating keys 65 and a CPU 50 that implements integrated control on the various control circuits.
  • the CPU 50 (central processing unit), which is a means for implementing overall control on the electronic camera 100, issues various instructions for the photographing control circuit 60, the wireless communication circuit 71, the wireless telephone circuit 72, the screen control circuit 92 and a power control circuit 64 in conformance to information input from the operating keys 65, the touch tablets 66, various detection circuits 70, the power switch 17, a timer 74, the photometering circuit 13, a GPS circuit 61 and an attitude detection circuit 62.
  • An audio recording circuit 80 records sound generated by the subject during a photographing operation and sound made by the photographer after the photographing operation as a memorandum as audio data in conformance to control implemented by the CPU 50 , and such audio data are stored in the image file together with the image data.
  • An audio reproduction circuit 81 reproduces the audio data which are stored in the image file together with the image data during an image data reproduction or other audio data, in conformance to control implemented by the CPU 50 .
  • the photometering circuit 13 measures the brightness of the subject and outputs photometric data indicating the results of the measurement to the CPU 50 .
  • the CPU 50 sets the length of the exposure time and the sensitivity of a CCD 55 through a CCD drive circuit 56 , and in addition, it controls the aperture value for an aperture 53 with an aperture control circuit 54 via the photographing control circuit 60 in conformance to the data indicating the settings.
  • the CPU 50 controls the photographing operation via the photographing control circuit 60 in response to an operation of the shutter release button 16. If the photometric data indicate a low subject brightness, the CPU 50 engages the strobe 12 in a light emission operation via a strobe drive circuit 73 during the photographing operation.
  • the GPS circuit 61 (global positioning system circuit) detects information indicating the position of the electronic camera 100 by using information provided by a plurality of satellites orbiting around the earth and provides the photographing position information to the CPU 50 when an image is photographed.
  • the attitude change detection circuit 62 which is constituted of an attitude sensor of the known art (a gyro sensor, an azimuth sensor or the like) capable of detecting the attitude of the electronic camera 100 , detects information indicating the camera attitude and provides the photographing attitude information to the CPU 50 during the image photographing operation.
  • The timer 74, having an internal clock circuit, detects information indicating the current time point and provides the photographing time point (day and time) information to the CPU 50 when a photographing operation is executed. The CPU 50 controls the various units in conformance to a control program stored in a ROM 67 (read only memory).
  • An EEPROM 68 (electrically erasable/programmable ROM) is also provided.
  • the CPU 50 implements control on a power supply 63 via the power control circuit 64 by detecting the operating state of the power switch 17 .
  • the photographing control circuit 60 focuses or zooms the photographic lens 10 through a lens drive circuit 52, controls the exposure quantity at the CCD 55 through control implemented on the aperture 53 by engaging the aperture control circuit 54 and controls the operation of the CCD 55 through the CCD drive circuit 56.
  • the photographic lens 10 forms a subject image onto the CCD 55 with a light flux from the subject via the aperture 53 which adjusts the light quantity. This subject image is captured by the CCD 55 .
  • The CCD 55 (charge-coupled device), having a plurality of pixels, is a charge storage type image sensor that captures a subject image and outputs electrical image signals corresponding to the intensity of the subject image formed on the CCD 55 to an analog processing unit 57 in response to a drive pulse supplied by the CCD drive circuit 56. It is to be noted that the photographic field contained inside the angle of field, which is determined by the effective image plane size at the CCD 55 and the focal length of the photographic lens 10, is photographed into the photographic image plane as image data.
  • the photographing control circuit 60 repeatedly executes the operation described above and, concurrently, the screen control circuit 92 repeatedly executes an operation in which the digital data sequentially stored into a photographic buffer memory 59 are read out via the data/control bus 51 .
  • the digital data thus read out are temporarily stored into a frame memory 69 , the digital data are converted to display image data and are restored into the frame memory 69 and the display image data are displayed at the left screen 21 .
  • the screen control circuit 92 obtains text display information from the CPU 50 as necessary, converts it to display text data which are then stored into the frame memory 69 and displays the display text data at the left screen 21 or the right screen 22 . Since the image currently captured by the CCD 55 is displayed at the left screen 21 in real time in the photographing mode in this manner, the user is able to set the composition for the photographing operation by using this through screen as a monitor screen.
  • the photographing control circuit 60 detects the state of the focal adjustment at the photographic lens 10 by analyzing the degree of the high-frequency component of the digital data stored in the photographic buffer memory 59 and performs a focal adjustment for the photographic lens 10 with the lens drive circuit 52 based upon the results of the detection.
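  • As a rough illustration of this kind of focus evaluation, which gauges the degree of the high-frequency component in the buffered digital data, the following Python sketch computes a simple gradient-based sharpness score and drives a lens in the direction that increases it; the capture_frame and move_lens callables and the hill-climbing loop are assumptions for illustration, not the circuit's actual implementation.

      import numpy as np

      def sharpness_score(raw_frame: np.ndarray) -> float:
          """Crude focus metric: mean squared image gradient. A well-focused
          frame contains more high-frequency detail and scores higher."""
          gy, gx = np.gradient(raw_frame.astype(np.float64))
          return float(np.mean(gx ** 2 + gy ** 2))

      def hill_climb_focus(capture_frame, move_lens, steps=20, step_size=1):
          """Move the lens while the focus metric keeps improving, analogous to
          adjusting the photographic lens 10 via the lens drive circuit 52 until
          the high-frequency content peaks (hypothetical interfaces)."""
          best = sharpness_score(capture_frame())
          for _ in range(steps):
              move_lens(step_size)
              score = sharpness_score(capture_frame())
              if score < best:          # passed the peak: back up one step and stop
                  move_lens(-step_size)
                  break
              best = score
          return best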
  • the photographing control circuit 60 engages the CCD 55 to capture a subject image via the CCD drive circuit 56 and temporarily stores image signals generated through the image-capturing operation as digital data (raw data) into the photographic buffer memory 59 via the analog processing unit 57 and the A/D conversion circuit 58 .
  • the photographing control circuit 60 then generates image data by converting or compressing the digital data stored in the photographic buffer memory 59 on a temporary basis in a predetermined recording format (such as JPEG) and stores the image data back into the photographic buffer memory 59 .
  • the CPU 50 communicates with the electronic instrument such as a portable telephone carried by the subject via the wireless communication circuit 71 to collect information related to the subject and also to obtain information indicating the position of the electronic instrument.
  • the CPU 50 obtains the photographing position information from the GPS circuit 61 , the photographing attitude information from the attitude detection circuit 62 and the focal length information with regard to the focal length of the photographic lens 10 from the photographing control circuit 60 and executes an arithmetic operation to obtain image-plane position information indicating the subject position within the image plane through a method which is to be detailed later based upon the information indicating the electronic instrument position, the photographing position information, the photographing attitude information and the focal length information.
  • the CPU 50 stores the image data, the information related to the subject and the image plane subject position information into the memory card 104 as an image file.
  • the screen control circuit 92 reads out an image file specified by the CPU 50 from the memory card 104 , temporarily stores the image file into the frame memory 69 , converts the image data to display image data and stores the display image data back into the frame memory 69 and then displays the display image data at the left screen 21 . It also stores text data of the reproduction mode instructions and the like into the frame memory 69 and displays the text data at the right screen 22 in response to an instruction issued by the CPU 50 .
  • Upon receiving a transmission instruction issued by the CPU 50, the wireless telephone circuit 72 reads out the specified image file from the memory card 104 and wirelessly transmits the image file to the outside.
  • In the information display mode, the CPU 50 first reproduces and displays specific image data at the left screen 21 via the screen control circuit 92, as in the reproduction mode, and also brings up a superimposed display of an information icon at the position in the image plane of the subject corresponding to the position information. In addition, it displays instruction text data at the right screen 22. If the information icon is selected through the touch tablet 66 in the information display mode, the CPU 50 accesses the information source (homepage or the like) on the Internet via the wireless telephone circuit 72 based upon the subject-related information corresponding to the image-plane position information, displays a screen or the like of the homepage at the left screen 21 and displays an operational instruction screen at the right screen 22.
  • FIG. 61 shows the data structure of image files stored in the memory card 104 .
  • the image files are each constituted of image data and additional information data.
  • the additional information data are constituted of photographing information data (see FIG. 62 ) indicating various settings selected for the photographing operation, time point information data indicating the photographing time point, the position information data indicating the photographing position, the attitude information data indicating the camera attitude during the photographing operation, audio information data recorded during or after the photographing operation, general information data (see FIG. 63 ) containing general information related to the photographing operation input during or after the photographing operation and subject information data containing information related to subjects which is entered during or after the photographing operation.
  • the subject information data are constituted of information related to the subjects (each of which may be a person, a building, a landscape or the like) photographed in the photographic image plane.
  • the subject information data may be constituted of personal information data (see FIG. 64 ) of a subject person and general information data related to a subject building.
  • FIG. 62 shows the structure of the photographing information data constituted of setting information indicating the photographic lens setting and the camera settings selected for the photographing operation.
  • FIG. 63 shows the structure of the general information data constituted of Internet access data (homepage addresses or the like) indicating how individual sets of information may be accessed on the Internet, data indicating the contents of the individual sets of information and position information data indicating the positions of the originators of the individual sets of information.
  • FIG. 64 shows the structure of the personal information data constituted of the image-plane position information indicating a specific subject within the image plane to which the personal information corresponds, position information indicating the position of the electronic instrument from which the personal information has been transmitted, Internet access information data (the homepage URL and the like) of the individual, the e-mail address of the individual, the individual's name, date of birth, preferred language, physical data (visual acuity, diopter, favored hand), tastes or person's liking (favorite color, etc.) and other data.
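  • As a rough paraphrase of the file structure of FIGS. 61-64, the following Python sketch models an image file as image data plus additional information data; every field name below is an illustrative assumption, since the figures only name the categories of information and not an on-card layout.

      from dataclasses import dataclass, field
      from typing import List, Optional, Tuple

      @dataclass
      class PersonalInfo:                       # FIG. 64
          image_plane_position: Tuple[float, float]        # subject position within the image plane
          instrument_position: Tuple[float, float, float]  # position of the transmitting instrument
          homepage_url: Optional[str] = None
          email: Optional[str] = None
          name: Optional[str] = None
          date_of_birth: Optional[str] = None
          preferred_language: Optional[str] = None
          physical_data: dict = field(default_factory=dict)  # visual acuity, diopter, favored hand
          tastes: dict = field(default_factory=dict)          # favorite color, etc.

      @dataclass
      class GeneralInfo:                        # FIG. 63
          internet_access: str                  # homepage address or the like
          contents: str
          originator_position: Tuple[float, float, float]

      @dataclass
      class AdditionalInfo:                     # FIG. 61, additional information data
          photographing_info: dict              # FIG. 62: lens and camera settings
          time_point: str                       # photographing day and time
          position: Tuple[float, float, float]  # photographing position
          attitude: Tuple[float, float, float]  # camera attitude
          audio: Optional[bytes] = None
          general_info: List[GeneralInfo] = field(default_factory=list)
          subject_info: List[PersonalInfo] = field(default_factory=list)

      @dataclass
      class ImageFile:                          # FIG. 61
          image_data: bytes                     # e.g. JPEG-compressed image data
          additional_info: Optional[AdditionalInfo] = None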
  • FIG. 65 is a transition diagram of the state assumed by the electronic camera 100 in the embodiment of the present invention.
  • The electronic camera assumes one of three operating modes, i.e., the photographing mode, the reproduction mode and the information display mode, and shifts among the modes in response to an operation of one of the three operating buttons (the photographing mode button 25, the reproduction mode button 26 and the information display mode button 27).
  • When the power is turned on, the electronic camera shifts to the photographing mode first.
  • In the photographing mode, a photographing operation, a wireless communication operation and an image file preparation and storage operation are executed.
  • In the reproduction mode, an image file reproducing operation and a transmission operation to transmit an image file to the outside are executed.
  • In the information display mode, information related to the image data is collected by, for instance, accessing the Internet based upon the additional information data corresponding to the image data, and the information thus collected is displayed.
  • FIG. 66 presents a flowchart of the main flow of the operations executed in the electronic camera 100 (the CPU 50 ) in the embodiment described above.
  • the power is turned on as the power switch 17 is operated in S 10 , and then, a photographing mode subroutine is executed in S 20 to enter a photographing-enabled state.
  • a release interrupt processing subroutine in S 30 is executed to perform a photographing operation.
  • a wireless interrupt subroutine in S 40 is called up from the release interrupt processing subroutine in S 30 to engage the electronic camera in wireless communication with electronic instruments present in the vicinity of the electronic camera to collect subject information data and general information data.
  • the subject information data and the general information data thus collected constitute the additional information data together with the photographing information data, and the additional information data are stored together with the image data into the memory card 104 as an image file.
  • mode switching interrupt processing in S 50 is started up to switch over to the mode selected in correspondence to the operating button which has been operated.
  • In the reproduction mode, an image file stored in the memory card 104 is read out and the image data are reproduced and displayed at the display screen, and if the SEND button 28 is operated, a transmission interrupt processing subroutine in S 70 is executed to transmit the image file containing the image data currently reproduced in the reproduction mode to an external recipient.
  • In the information display mode, image data are first reproduced and displayed at the display screen as detailed later, and then either the subject information itself, which includes the image-plane position information corresponding to the position specified by the user through the touch tablet 66 on the display screen, or information collected by accessing the Internet based upon the Internet access information contained in that subject information is brought up on display at the display screen.
  • an image file is generated by incorporating the image data and the additional information data and the image file is saved into the memory card 104 .
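  • The overall control flow of FIG. 66 can be summarized in the following Python-style sketch; the camera object, its methods and the event loop are purely illustrative assumptions, while the step numbers in the comments refer to the steps named above.

      def main_flow(camera):
          camera.power_on()                        # S10: power switch 17 operated
          camera.enter_photographing_mode()        # S20: photographing-enabled state
          while camera.powered:
              event = camera.wait_for_event()
              if event == "release":               # S30: release interrupt processing
                  image = camera.photograph()
                  info = camera.wireless_interrupt()       # S40: collect subject/general information
                  camera.store_image_file(image, info)     # image data + additional information data
              elif event == "mode_button":         # S50: mode switching interrupt
                  camera.switch_mode()
              elif event == "send" and camera.mode == "reproduction":
                  camera.transmit_current_file()   # S70: transmission interrupt processing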
  • the image data are displayed at the screen over a predetermined length of time, and then the operation makes a return in S 305. It is to be noted that the image data have a predetermined photographic image plane size, so that a photographed subject is bound to be present somewhere within this photographic image plane in accordance with the photographic composition.
  • the signal transmission output for the wireless communication is adjusted in conformance to the focal length of the photographic lens. For instance, since it can be assumed that the subject is present at a long distance when the focal length is large, the signal transmission output level is set higher accordingly. If it is decided in S 403 that no communication has been achieved, the operation makes a return in S 408. If communication has been achieved, on the other hand, the position information data indicating the electronic instrument position and the subject information data (or the general information data) are obtained through wireless communication from the electronic instrument of the communication partner with whom communication has been achieved in S 404.
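  • One simple way to realize such an adjustment is sketched below: the transmission output is scaled between a minimum and a maximum level as the focal length grows. The linear mapping, the 35 mm/300 mm reference focal lengths and the dBm range are assumptions for illustration only.

      def tx_power_for_focal_length(focal_length_mm: float,
                                    min_dbm: float = 0.0,
                                    max_dbm: float = 20.0) -> float:
          """Longer focal length implies a more distant subject, so a higher
          wireless transmission output level is selected (illustrative values)."""
          # Treat 35 mm as a standard lens and 300 mm as a long telephoto.
          span = (focal_length_mm - 35.0) / (300.0 - 35.0)
          span = max(0.0, min(1.0, span))
          return min_dbm + span * (max_dbm - min_dbm)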
  • the position information indicating the position of the electronic camera 100 itself is received from the GPS circuit 61
  • the attitude information indicating the attitude of the electronic camera 100 itself is received from the attitude detection circuit 62
  • the position of the subject (electronic instrument) relative to the electronic camera 100 is determined by using the electronic instrument position information, the position information indicating the position of the electronic camera 100 and the attitude information indicating the attitude of the electronic camera 100 , as explained later.
  • the focal length information indicating the focal length of the photographic lens 10 is received from the photographing control circuit 60 and the angle of field for the photographing operation is calculated based upon the focal length information and the effective image plane size at the image-capturing element as detailed later, and also, based upon the position information indicating the position of the subject relative to the electronic camera 100 , data indicating the position of the subject within the photographed image plane (image-plane position information data) are generated before the operation makes a return in S 408 .
  • a shutter release operation is first performed at the electronic camera 100 , a photographing operation is performed in response and then wireless communication is attempted at the electronic camera 100 by transmitting a communication request 1 to electronic instruments present in the vicinity.
  • the communication request 1 includes the identification information of the electronic camera 100 indicating the sender.
  • An electronic instrument having received the communication request 1 returns a response 2 to the electronic camera 100 by transmitting the identification information of the sender electronic instrument and the identification information of the recipient electronic camera.
  • Upon receiving the response 2, the electronic camera 100 verifies the communication partner based upon the electronic instrument identification information and transmits an information request 3 so that information related to the communication partner can be requested by specifying the communication partner's electronic instrument.
  • the information request 3 transmitted to the electronic instrument includes the identification information of the recipient electronic instrument, the identification information of the sender electronic camera 100 , the contents of the information being requested and the like.
  • the electronic instrument corresponding to the electronic instrument identification information included in the information request 3 transmits information 4 , i.e., the requested related information (the position information, the access information, etc.) to the electronic camera.
  • the information 4 transmitted to the electronic camera includes the identification information corresponding to the electronic camera 100 , i.e., the recipient, the identification information corresponding to the electronic instrument, i.e., the sender and the related information.
  • the electronic camera 100 receives the information 4 transmitted by the electronic instrument.
  • the electronic camera 100 sequentially issues the information request 3 to all the electronic instruments having responded to the communication request 1 .
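  • The communication request 1 / response 2 / information request 3 / information 4 exchange described above might be organized along the lines of the following sketch; the radio object, the message field names and the timeout value are assumptions, since the text only specifies that each message carries sender and recipient identification together with the payload noted.

      def collect_subject_information(camera_id, radio):
          # 1. Communication request broadcast to electronic instruments in the vicinity
          radio.broadcast({"type": "communication_request", "sender": camera_id})

          # 2. Responses identify each instrument willing to communicate
          responders = radio.collect_responses(timeout_s=1.0)

          collected = []
          for instrument_id in responders:
              # 3. Information request addressed to one instrument at a time
              radio.send({"type": "information_request",
                          "sender": camera_id,
                          "recipient": instrument_id,
                          "requested": ["position", "access_information"]})
              # 4. Related information (position information, access information, etc.)
              reply = radio.receive(from_id=instrument_id)
              collected.append(reply)
          return collected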
  • FIGS. 72 and 73 illustrate how the image-plane position information data indicating the position of a subject within the image plane may be obtained.
  • the subject position information data and the photographing position information data are provided as positional coordinates in a three-dimensional coordinate system the center of which is set at a predetermined reference origin point.
  • the reference coordinate system is converted to a coordinate system (an axis P: extending along the camera optical axis, an axis Q: extending along the lateral side of the camera, an axis R: extending along the longitudinal side of the camera) through a parallel translation of the origin point of the reference coordinate system to a central point 3 (the photographing position) of the electronic camera 100 , and the positional coordinates of the subject 203 after the conversion are determined.
  • an angle of field 5 (indicated by the one-dot-and-bar chain line) is calculated along the photographic optical axis (along the P axis) based upon the photographic lens focal length information and the effective image plane size of the CCD.
  • the direction in which a straight line connecting the origin point 3 and the subject position 203 extends is calculated.
  • a decision can be made as to whether or not the subject is contained in the image plane and the position of the subject within the image plane (the image-plane position information data) can be calculated as shown in FIG. 74 .
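  • A minimal numerical sketch of this geometry is given below, assuming the attitude information is available as a rotation matrix and that a simple pinhole projection is adequate; those two modelling choices, as well as the normalized output coordinates, are assumptions added for illustration.

      import numpy as np

      def image_plane_position(subject_xyz, camera_xyz, attitude_matrix,
                               focal_length_mm, sensor_w_mm, sensor_h_mm):
          """Return the normalized image-plane position (x, y), each in [-1, 1], of
          the subject, or None when the subject lies outside the angle of field.
          attitude_matrix rotates reference coordinates into the camera axes
          (row 0 = P / optical axis, row 1 = Q / lateral, row 2 = R / longitudinal)."""
          # Parallel translation of the origin to the photographing position 3
          relative = np.asarray(subject_xyz, float) - np.asarray(camera_xyz, float)
          # Express the subject direction in the camera's P/Q/R coordinate system
          p, q, r = np.asarray(attitude_matrix, float) @ relative
          if p <= 0:
              return None                      # subject is behind the camera
          # Pinhole projection; the half image plane sizes together with the focal
          # length define the angle of field against which the direction is tested
          x = (q / p) * focal_length_mm / (sensor_w_mm / 2.0)
          y = (r / p) * focal_length_mm / (sensor_h_mm / 2.0)
          if abs(x) > 1.0 or abs(y) > 1.0:
              return None                      # outside the photographic image plane
          return (x, y)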
  • the image file containing the image data currently reproduced and displayed at the left screen 21 is read out from the memory card 104 and this image file is transmitted through the wireless telephone circuit 72 to the recipient specified by the user before the operation makes a return in S 703 .
  • the home screen of the information display mode is first displayed in S 801 .
  • image data are displayed at the left screen 21 , as shown in FIG. 77 . If the operation has shifted into the information display mode from the reproduction mode, the image data that were reproduced and displayed at the left screen 21 in the reproduction mode are kept on display, whereas if the operation has shifted into the information display mode from the photographing mode, the image data of the image file obtained through the most recent photographing operation are displayed.
  • graphics 82 and 83 indicating subject information are displayed at positions corresponding to individual sets of image-plane position information data contained in the additional information data and also an icon 85 for bringing up a display of a map based upon the photographing position information data, an icon 84 for bringing up a display of news items reported on the photographing date based upon the photographing time point information data and an icon 86 for bringing up a display of the general information are displayed, all superimposed over the image data.
  • the operating instructions with respect to the left screen 21 are displayed.
  • the operation branches to one of the following steps in correspondence to the position on the screen that has been touched by the user.
  • If the user touches the map icon 85, the operation branches to S 803, in which a map database on the Internet is accessed via the wireless telephone circuit 72, map data containing the photographing position are downloaded from the map database in conformance to the photographing position information corresponding to the image data displayed at the home screen and the downloaded map data are displayed at the left screen 21, as shown in FIG. 82.
  • a reduction icon 87 and an enlargement icon 88 are superimposed over the map display and, as the user touches one of the icons, the map display is either reduced or enlarged in correspondence to the icon that has been touched.
  • a thumbnail image (reduced image) of the image data having been displayed at the home screen is displayed and the home screen display in S 801 is restored if the user touches this thumbnail image.
  • If the user touches the news icon 84, the operation branches to S 804, in which a news database on the Internet is accessed via the wireless telephone circuit 72, news data of the news reported on the photographing date are downloaded from the news database based upon the photographing time point information corresponding to the image data displayed at the home screen and the downloaded news data are displayed at the left screen 21, as shown in FIG. 83.
  • a list of news items and detail graphics 89 are displayed and if a user touches a specific detail icon, details of the corresponding news item are displayed at the left screen 21 .
  • If the user touches a graphic or icon indicating subject information, the operation branches to S 805, in which an information source (a homepage or a database) where detailed information related to the subject is stored is accessed on the Internet via the wireless telephone circuit 72 in conformance to the Internet access information data contained in the subject information indicated by the touched graphic or icon, and the top screen of the homepage is brought up on display.
  • the top screen of the homepage of the subject person is displayed at the left screen 21 , as shown in FIG. 84 , to enable the user to display a screen at a lower layer or link to another homepage through the touch tablet 66 .
  • a commentary screen related to a historical structure photographed in the image is brought up on display at the left screen 21 , as shown in FIG. 85 , to enable the user to display a more detailed description or an image of the historical structure at the left screen 21 through the touch tablet 66 .
  • a thumbnail image (reduced image) of the image data having been displayed at the home screen is displayed and the home screen display in S 801 is restored if the user touches this thumbnail image.
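  • The branching on the touched home screen position described above might be organized as in the following sketch; the icon identifiers, the browser helper object and the reuse of the illustrative ImageFile fields from the earlier structure sketch are assumptions.

      def on_home_screen_touch(touched, image_file, browser):
          info = image_file.additional_info
          if touched == "map_icon":             # icon 85: map of the photographing position
              browser.show_map(info.position)
          elif touched == "news_icon":          # icon 84: news of the photographing date
              browser.show_news(info.time_point)
          elif touched == "general_info_icon":  # icon 86: general information
              browser.show_general_info(info.general_info)
          elif touched.startswith("subject_"):  # graphics 82/83: a subject's information source
              index = int(touched.split("_")[1])
              browser.open(info.subject_info[index].homepage_url)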
  • the wireless communication circuit 71 is enclosed by an electromagnetic shield film 79 along the camera rear surface (in the direction opposite from the direction along which the photographic optical axis extends, i.e., the side on which the photographer is present), the camera lower surface, the camera upper surface and the camera left and right side surfaces.
  • the wireless communication circuit 71 is allowed to communicate readily with the portable telephone 160 carried by the subject present in the direction along which the photographing operation is performed with the camera to collect the subject information with a high degree of reliability.
  • By allowing the wireless communication circuit 71 to manifest extreme directivity along the direction in which the photographing operation is performed with the camera through the structure shown in FIG. 86 as described above, the likelihood of communicating with the electronic instrument carried by the subject is increased. It is to be noted that the directivity may be adjusted by adopting a specific antenna structure instead of by using an electromagnetic shield film.
  • the electronic camera 100 collects subject information by communicating with electronic instruments in the vicinity during the photographing operation.
  • the subject information may be transmitted to the electronic camera engaged in the photographing operation from a subject-side electronic instrument either before or after the photographing operation, instead.
  • In the portable telephone 160, a UIM card or the like having stored and saved therein the personal information of the user carrying the portable telephone 160 is loaded.
  • the portable telephone 160 includes a display screen 161 and operating buttons 162 .
  • When the subject person wishes to transmit his personal information to the electronic camera, he first operates an operating button 162 of the portable telephone 160 to set the portable telephone 160 in a personal information transmission mode, enabling transmission of the subject information to the electronic camera. Once the transmission is completed, a message indicating the recipient electronic camera and the transmission completion is displayed at the display screen 161, as shown in FIG. 89. Since this allows the subject to decide whether or not his personal information should be transmitted, random circulation of the personal information is prevented.
  • FIG. 90 shows an example in which the subject position within the photographic image plane is detected by using a highly-directional wireless antenna 75 (its directivity 43 manifests with great intensity along a single direction 42 , as shown in FIG. 91 ) which is utilized as a communication antenna of the wireless communication circuit 71 . It is to be noted that details of a structure that may be adopted in such a compact directional antenna are disclosed in, for instance, Japanese Laid-Open Patent Publication No. 2000-278037.
  • the electronic camera 100 is internally provided with the highly-directional wireless antenna 75 and a drive unit 76 that adjusts the direction of the highly-directional wireless antenna 75 by scanning the highly-directional wireless antenna 75 through mechanical drive. While the wireless communication processing is in progress during the photographing operation, the CPU 50 engages the drive unit 76 to drive the highly-directional wireless antenna so as to scan inside the photographic image plane and it concurrently communicates with a subject present within the image plane to collect the subject information data. This makes it possible to correlate scanning position of the highly-directional wireless antenna 75 (i.e., image-plane position information data) with the subject information data.
  • wireless communication with electronic instruments present in the vicinity of the electronic camera 100 (a portable telephone carried by a subject person, an electronic sightseeing guide apparatus installed near a historical site and capable of transmitting sightseeing information with regard to the historical site through wireless communication) is attempted with the means for wireless communication 71 . If it is decided in S 414 that no communication has been achieved, the operation proceeds to S 417 . If, on the other hand, communication has been achieved, the subject information data are obtained through wireless communication from the electronic instrument of the communication partner with whom communication has been achieved in S 415 . In S 416 , image-plane position information contained in the subject information data is designated as the information indicating the scanning position of the highly-directional wireless antenna 75 .
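  • The scanning-antenna variant can be pictured as in the sketch below: the highly-directional wireless antenna 75 is swept across the photographic image plane, and any instrument that answers at a given pointing direction is tagged with that direction as its image-plane position. The antenna_drive and radio interfaces, the scan grid and the timeout are hypothetical.

      def scan_for_subjects(antenna_drive, radio, grid=(5, 5)):
          found = []
          rows, cols = grid
          for i in range(rows):
              for j in range(cols):
                  # Normalized image-plane coordinates of the current pointing direction
                  x = (j + 0.5) / cols * 2.0 - 1.0
                  y = (i + 0.5) / rows * 2.0 - 1.0
                  antenna_drive.point_at(x, y)
                  reply = radio.try_communicate(timeout_s=0.05)
                  if reply is not None:
                      # The scanning position doubles as the image-plane position data
                      found.append({"image_plane_position": (x, y),
                                    "subject_info": reply})
          return found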
  • FIG. 93 shows a display example of the home screen brought up in the information display mode; unlike the example shown in FIG. 81, no graphics are displayed at the left screen 21.
  • the contact position is detected through the touch tablet 66, and the subject information containing the image-plane position information data corresponding to the contact position is brought up on display at the right screen 22, as shown in FIG. 94.
  • an arrow mark 44 is displayed at the subject position corresponding to the subject information.
  • the outlines of the subjects in the displayed image may be extracted through an image analysis, so as to enable a display of the subject information corresponding to the image-plane position information data contained in an area at the left screen 21 if the user touches the area near an outline (the area 46 in the vicinity of the person 45 in FIG. 95).
  • the subject information corresponding to the subject person can be brought up on display.
  • FIG. 96 presents a detailed flowchart of an information display mode subroutine achieved by executing additional processing between S 801 and S 802 in the information display mode subroutine shown in FIG. 80 .
  • In the additional processing, the subject information contained in an image file is transplanted into a similar image file that does not have any subject information.
  • an image analysis is executed on the image data reproduced and displayed at the left screen 21 in S 811 to identify the primary subjects (a primary subject may be, for instance, a person or a building taking up a relatively large area in the image plane) and then a verification is achieved in S 812 as to whether or not there is any subject information containing image-plane position information data present in the area occupied by a primary subject. If it is decided that there is such subject information, the operation proceeds to S 802.
  • image files with the photographing time point data indicating the same photographing date as the photographing date indicated by the photographing time point data of the image data the reproduced image of which is currently on display are searched in the memory card 104 and are extracted in S 813 .
  • an image analysis is executed on the image data in these files and image data having a subject similar to the primary subject of the image data the reproduced image of which is currently on display are extracted.
  • the image file containing the image data the reproduced image of which is currently on display is updated by attaching image-plane position information indicating the position of the primary subject within the image plane determined through the image analysis.
  • the photographing position information data, the photographing attitude information data and the focal length information data are read out from the image file containing the image data the reproduced image of which is currently on display, and a primary subject is identified based upon data corresponding to those information data, obtained from a map database on the Internet, and the results of the image analysis executed in S 813. For instance, if a triangular object appearing to be a mountain is shown in an image plane photographed along a southerly direction from the vicinity of Lake Yamanaka, the object is identified as Mount Fuji.
  • a search is conducted on the Internet by using the name of the primary subject as a keyword, access information with regard to a homepage providing information on the primary subject is obtained and the image file containing the image data the reproduced image of which is currently on display is updated by appending image-plane position information corresponding to the position of the primary subject within the image plane ascertained through the image analysis.
  • the operation proceeds to S 802 .
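  • The same-day transplant branch of this subroutine (S 811 through S 813, plus the file update described above) might look roughly like the following sketch, reusing the illustrative ImageFile structure from earlier; the find_primary_region and looks_similar callables stand in for the image analysis, and the S 814 map-database fallback is deliberately left out.

      import copy

      def transplant_subject_info(current, same_day_files, find_primary_region, looks_similar):
          # S811: image analysis identifies the primary subject's region (x0, y0, x1, y1)
          region = find_primary_region(current.image_data)
          # S812: is any stored subject information already located inside that region?
          if any(_inside(region, s.image_plane_position)
                 for s in current.additional_info.subject_info):
              return current
          # S813: search image files with the same photographing date for a similar
          # subject and copy its subject information across
          for donor in same_day_files:
              for s in donor.additional_info.subject_info:
                  if looks_similar(donor.image_data, current.image_data, s):
                      copied = copy.copy(s)
                      copied.image_plane_position = _center(region)
                      current.additional_info.subject_info.append(copied)
                      return current          # image file updated as described above
          return current

      def _inside(region, point):
          x0, y0, x1, y1 = region
          return x0 <= point[0] <= x1 and y0 <= point[1] <= y1

      def _center(region):
          x0, y0, x1, y1 = region
          return ((x0 + x1) / 2.0, (y0 + y1) / 2.0)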
  • Even when no subject information related to a subject has been obtained during the photographing operation (e.g. when there is no sightseeing guide apparatus capable of wirelessly providing subject information present in the vicinity of the electronic camera used to photograph a landscape image or the like), information related to the subject can be displayed when the image data are reproduced in the information display mode through this subroutine.
  • the image files searched in S 813 are not limited to those obtained on the same photographing date. For instance, a predetermined number of image files photographed before and after the image file with no subject information may be searched, instead.
  • the camera automatically identifies the subject by using the photographing position information data, the photographing attitude information data and the focal length information data and downloads information on the identified subject from the Internet in S 814 .
  • Alternatively, the user may identify the subject by reviewing the image data, may input information on the specified subject or download it from the Internet, and may update the image file by pasting the information onto the subject position on the screen image.
  • FIG. 97 is another conceptual diagram of an electronic camera adopting the present invention and an electronic image communication system achieved by using the electronic camera.
  • the personal information data stored in the UIM card 170 loaded at the portable telephone 160 are read into the electronic camera 100 through a short-distance wireless communication between the electronic camera 100 and the portable telephone 160 carried by the user.
  • In the electronic image communication system in FIG. 97, on the other hand, as a wireless tag (a non-contact IC card) having stored therein personal information data moves closer to the electronic camera 100 having an internal non-contact wireless read unit, the personal information data are automatically read from the wireless tag into the electronic camera 100.
  • the electronic image communication system shown in FIG. 97 allows the subject to communicate with the electronic camera 100 simply by carrying a compact wireless tag (a non-contact IC card) which does not require a power source.
  • subject information (including position information) is automatically transferred to the electronic camera through wireless communication between the electronic camera and a portable instrument carried by the subject during an image data photographing operation and, as a result, the need to perform a special operation to obtain the subject information is eliminated.
  • the image-plane position information data indicating a position accurately corresponding to the actual position on the image plane can be obtained by using the position information.
  • the subject position information should be obtained without a significant time lag relative to the timing with which the subject is photographed (it is desirable to obtain the subject position information within about 1 second prior to or following the photographing operation) in order to ensure that a position accurately corresponding to the actual position on the image plane is ascertained.
  • information related to a subject in the image which interests the user viewing the reproduced image data on display can be reviewed with ease and speed simply by touching the icon displayed in correspondence to the image-plane position information at the screen or by touching the subject.
  • subject information is obtained from a subject through a short-distance wireless communication to raise the probability of achieving communication with the subject present in the vicinity of the electronic camera and also to reduce the likelihood of unnecessarily communicating with an electronic instrument that is present at a long distance from the electronic camera and that has no relation to the subject.
  • the signal transmission output level for the wireless communication is adjusted in conformance to the photographic lens focal length, a reliable wireless communication with the subject is achieved and, at the same time, the likelihood of achieving wireless communication with a party other than the subject is reduced.
  • the image data reproduced and displayed in the reproduction mode can be transmitted to a transfer address indicated in subject information data obtained during the image data photographing operation.
  • image data can be transmitted with ease to a stranger met by chance on a trip or the like without having to keep track of the postal address, the e-mail address or the like of the stranger.
  • This feature is particularly convenient when there are numerous people photographed in a group picture because it relieves the user from the task of performing a special operation to send the image data to multiple recipients.
  • the original image data in which the subject is photographed are displayed (as a thumbnail image) at the same time, enabling the user to understand the relationship between the image data and the subject intuitively, and also, the home screen display can be promptly recovered by specifying the image data through the touch tablet 66 which enables the user to quickly and easily review the subject information in reference to the image data.
  • the wireless communication directivity of the wireless communication circuit 71 is restricted so as to manifest along the direction in which the subject is likely to be present relative to the electronic apparatus main unit by using an electromagnetic shield or the like and thus, reliable wireless communication with the subject is achieved to obtain the subject information wirelessly and the likelihood of achieving wireless communication with a party other than the subject is reduced.
  • a scan within the image plane is performed with a highly-directional wireless antenna to collect the image plane position information corresponding to a subject, which allows the system to be configured at low cost without having to provide a special means for positional measurement such as a GPS unit at the electronic instrument on the subject side.
  • the image plane position information corresponding to a specific subject is detected through an image analysis and the subject information is copied from a similar image file, to enable the user to reference the subject information for a review even when no subject information has been obtained during the photographing operation.
  • since information indicating the image-plane position of a subject is detected through image analysis, the subject is identified based upon the results of the image analysis and the position information, and subject information related to the identified subject is collected from an information source on the Internet or the like, the user is able to reference the subject information for review even when no subject information has been obtained during the photographing operation.
  • the electronic camera includes a non-contact IC card (wireless tag) read circuit to execute a non-contact read of subject information from a non-contact IC card (wireless tag) carried by the subject.
  • the subject only needs to hold a compact and light-weight non-contact IC card which does not require a power source to ensure that his subject information is provided to the electronic camera without having to perform any operation.
  • subject information is obtained through a wireless (electromagnetic wave) communication between the electronic camera and the electronic instrument carried by a subject.
  • a redundant communication with electronic instruments carried by parties other than the true user should be prevented by keeping the wireless transmission signal output level within a range lower than a predetermined value and thus limiting the communication-enabled distance range based upon the assumption that electronic instruments have the average reception capability.
  • This distance range is determined by taking into consideration the standard manner in which the electronic apparatus is operated, whether or not an object that blocks electromagnetic waves is present between the electronic apparatus and the electronic instrument, the performance of the receiver at the electronic instrument and the like. However, it is desirable to set the distance range to approximately 10 m or less in correspondence to the focal length of the standard photographic lens.
  • the wireless communication circuit engages in communication through a short-distance wireless system such as Bluetooth.
  • the short-distance wireless communication may be achieved through a system other than Bluetooth, such as wireless LAN (IEEE802.11) or infrared communication (IrDA)
  • communication may be achieved by simultaneously utilizing a plurality of short-distance wireless systems. In the latter case, an electronic instrument with any type of short-distance wireless function carried by a subject can engage in wireless communication with the electronic camera.
  • subject information is transferred through a wireless (electromagnetic wave) communication between the electronic camera and the electronic instrument carried by a subject.
  • the subject information may instead be transferred through another non-contact or contact method.
  • sound can be recorded with the audio recording circuit 80 during the photographing operation.
  • a highly directional microphone may be utilized during a photographing operation to record sound generated by a subject present within the image plane, the recorded audio data may be stored into the image file in correspondence to the image plane subject position information and the recorded audio may be reproduced if the subject in the reproduced image display is specified through the touch tablet or the like in the information display mode, instead.
  • a plurality of microphones with extreme directivity may be utilized to record sound generated by a plurality of subjects in the image plane at the same time and the sound corresponding to a subject specified through the touch tablet or the like in the reproduced image display in the information display mode may be reproduced.
  • sound generated by a subject or voice commentary related to the subject may be transmitted as subject audio information to the electronic camera from an electronic instrument mounted with a microphone which is carried by the subject or an electronic instrument having pre-stored audio data of the information related to the subject any time, regardless of whether or not a photographing operation is in progress and regardless of photographing timing.
  • subject audio information may be appended to image data obtained by photographing the subject and saved in correspondence to the position of the subject within the photographic image plane indicated by the image-plane position information.
  • a position on the left screen 21 or the right screen 22 is specified through the touch tablet 66 .
  • another pointing device may be utilized for this purpose.
  • a track ball or a mouse may be utilized as a means for screen position specification.
  • in an electronic camera provided with an electronic viewfinder (a viewfinder that enlarges, through an optical system, a screen display brought up at a compact means for screen display to facilitate user observation), the direction along which the line of sight of the user extends (the sight line position, the visual focus position) may be detected to enable a screen position specification.
  • a member (pen) provided exclusively for position specification may be used instead of the user's finger to ensure that a position is specified with greater accuracy and that unnecessary subject information is not displayed in response to an inadvertent finger contact.
  • subject information is transferred through wireless (electromagnetic wave) communication between the electronic camera and the electronic instrument carried by a subject.
  • a notification acknowledging the reception of the subject information may be issued from the electronic camera to the electronic instrument having transmitted the subject information to the electronic camera, and a message indicating that the subject information has been transferred may be brought up on display at the subject-side electronic instrument as well.
  • the electronic instrument may issue an audio message for the subject person to notify him that the subject information has been transferred by employing a means for sound generation.
  • the subject can verify that the subject information has been transferred.
  • the subject provided with an audio notification can verify that the subject information has been transferred even if his electronic instrument does not include a means for display, while the electronic instrument is left in a pocket or the like.
  • the electronic camera engages in the photographing mode operation, the reproduction mode operation and the information display mode operation.
  • image data may be generated through a photographing operation performed in the electronic camera
  • an image file may be prepared by collecting subject information and image-plane position information through wireless communication and this image file may be transferred on line or off-line to an image display apparatus other than the electronic camera, which then executes the reproduction mode operation and the information display mode operation, instead.
  • the switch over between the reproduction mode and the information display mode is achieved through a manual operation.
  • the user may specify a given subject at a specific position within the reproduced image display through the touch tablet in the reproduction mode to bring up a display of the subject information related to this particular subject. This allows the user to check the information on the subject immediately as he becomes interested in the subject while reviewing the image data, without having to switch from the reproduction mode to the information display mode.
  • audio data recorded with the audio recording circuit 80 during a photographing operation may be reproduced with the audio reproduction circuit 81 when reproducing the image data.
  • a music database on the Internet may be accessed based upon the photographing time point data to download music which was popular at the time of the photographing operation and this downloaded music may be automatically reproduced as background music when reproducing the image data, as well. In this case, the user, who is reminded both visually and audibly of the time when the picture was taken, can fully enjoy the viewing experience.
  • the electronic camera does not provide the photographer with a report of the subject information collected through communication with subjects during the photographing operation.
  • a list of collected subject information may be brought up on display together with the image data obtained through the photographing operation, instead. Since this allows the photographer to verify exactly what subject information was collected during the photographing operation, he is then able to take further action to collect more information if there is any missing information.
  • the electronic camera attempts wireless communication with electronic instruments present in the vicinity of the electronic camera to collect subject information during a photographing operation.
  • the electronic camera may be set in advance so as not to engage in wireless communication with the electronic instrument carried by the photographer operating the electronic camera in this situation.
  • the electronic camera may engage in wireless communication with the electronic instrument carried by the photographer during the photographing operation and record information related to the photographer collected through the wireless communication as photographer information constituting part of the photographing information data.
  • the electronic camera attempts wireless communication with electronic instruments present in the vicinity of the electronic camera to collect subject information during a photographing operation.
  • the transmission signal output level for the wireless communication (by means of e.g. electromagnetic waves, infrared light) may be adjusted in conformance to the subject distance (the output level is raised as the distance to the subject increases).
  • the distance to a subject can be obtained through a detection performed by a distance detection device of the known art, or the photographing distance of the photographic lens, which is manually set, may be used as the subject distance.
  • the electronic camera is able to achieve communication with the electronic instrument held by a subject with a high degree of probability, while the likelihood of achieving communication with an electronic instrument held by a party other than the subject is reduced.
  • subject information is automatically transferred to the electronic camera through a wireless communication between the electronic camera and the electronic instrument held by a subject during a photographing operation.
  • the electronic instrument may notify the subject of the reception so as to determine whether or not any further communication is to be carried out in conformance to an instruction from the subject. Since the subject is given the option of refusing further communication as necessary in this manner, uncontrolled circulation of his personal information to the outside is prevented.
  • the electronic instrument may compare the identification number of the electronic camera having transmitted the request with the identification numbers of electronic cameras preregistered at the electronic instrument and may refuse any further communication if the electronic camera is not registered. In this case, the extent to which the personal information is allowed to circulate can be automatically controlled.
  • subject information is automatically transferred to the electronic camera through a wireless communication between the electronic camera and the electronic instrument held by a subject during a photographing operation.
  • the subject information may be automatically transferred into the electronic camera through a wireless communication between the electronic camera and the electronic instrument held by the subject while a photographing operation is not in progress.
  • a series of pictures can be taken quickly, and also, subject information corresponding to the plurality of sets of image data can be collected through a single wireless communication executed either automatically or manually when the photographing operation has been completed or the like.
  • subject information is automatically transferred to the electronic camera through a wireless communication between the electronic camera and the electronic instrument held by a subject, which is carried out at all times during a photographing operation.
  • the wireless communication operation for subject information collection may be disallowed to prevent a delay in the photographing operation. For instance, when a continuous photographing (continuous shooting) operation is performed with the electronic camera, wireless communication may be automatically disallowed to prevent a significant lag between continuously photographed image frames due to the wireless communication operation.
  • information indicating the position of a subject within the image plane is calculated by using the electronic camera attitude information and the positional measurement information for the electronic camera and the electronic apparatus obtained through the GPS units.
  • the image plane subject position information may be detected through another method. For instance, if the electronic camera and the electronic instrument held by a subject engage in communication with each other by using infrared light, the position of the electronic instrument within the image plane may be specified by detecting the light-receiving position on a two-dimensional infrared light image-capturing element provided at the electronic camera based upon the image data captured at the image-capturing element.
  • the image-plane position information alone may be obtained by detecting the position of a point light source of infrared light originating from the electronic instrument with a two-dimensional infrared light image-capturing element, and the subject information may be obtained through a wireless communication (electromagnetic waves).
  • the infrared point light source should be turned on with the timing with which the electronic instrument engages in wireless communication.
  • the subject information corresponding to a position at which the user touches the screen is accessed on the Internet and is brought up on display in the information display mode.
  • the Internet does not always need to be accessed, and if sufficient subject information has been collected during the photographing operation, such subject information alone may be brought up on display at the screen. Since this eliminates the need to access the Internet to display subject information, a prompt subject information display is achieved, thereby reducing the stress of the user as he no longer needs to wait a long time for the subject information to come up on display for review.
  • image data having been photographed in a predetermined photographic image plane size are reproduced and displayed in a size matching the display screen size and the subject information corresponding to a position on the display screen specified by the user is brought up on display at the display screen in the information display mode.
  • the sizes of the photographic image plane and the display screen do not necessarily need to match each other, and the photographic image plane may be displayed either in an enlargement or in a reduction on the display screen, instead.
  • the image data may be modified, rotated or the like for display at the display screen.
  • the coordinates of a position specified on the display screen are converted to positional coordinates on the photographic image plane (through enlargement, reduction, shifting, rotation or the like) and the subject information is displayed based upon the positional coordinates resulting from the conversion. Since this eliminates the need to ensure that the shape of the photographic image plane and the shape of the display screen are congruent with each other, it becomes possible to utilize various types of image display apparatus in the information display mode and to specify a desired subject by enlarging the image at the display screen if multiple subjects are present in close proximity to one another in the image.
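  • A minimal sketch of such a coordinate conversion is shown below, assuming the display shows the photographic image plane with a simple zoom factor and pan offset; that parameterization, and the omission of rotation, are illustrative simplifications.

      def display_to_image_plane(touch_xy, display_rect, plane_size,
                                 zoom=1.0, pan=(0.0, 0.0)):
          """Convert a position specified on the display screen into positional
          coordinates on the photographic image plane.
          touch_xy:     (x, y) pixel position touched on the display
          display_rect: (x0, y0, width, height) of the area showing the image
          plane_size:   (width, height) of the photographic image plane"""
          tx, ty = touch_xy
          dx, dy, dw, dh = display_rect
          pw, ph = plane_size
          # Fraction of the displayed area that was touched
          u = (tx - dx) / dw
          v = (ty - dy) / dh
          # Undo the enlargement/shift so the result can be matched against the
          # stored image-plane position information
          px = pan[0] + (u - 0.5) * pw / zoom + pw / 2.0
          py = pan[1] + (v - 0.5) * ph / zoom + ph / 2.0
          return (px, py)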
  • the electronic camera includes two display screens; at one of them subject information is displayed while the image data obtained by photographing the subjects are displayed (as a thumbnail image) at the other screen in the information display mode.
  • the subject information may be displayed over the entire screen with a superimposed display of the image data containing the subjects brought up over part of the screen. As this allows simultaneous review of the subject information and the corresponding image data, the user can intuitively understand the relationship between the subject information and the image data even when the electronic camera has a single display screen.
  • information related to a subject, which is made to correlate to a subject position within the image plane, is appended to the electronic image data for storage, and the subject-related information is brought up on display in correspondence to the subject position within the image plane when the electronic image data are reproduced and displayed.
  • FIG. 98 is a conceptual diagram of the third embodiment of the present invention.
  • An electronic camera photographs subjects within the photographing range, assigns a file name to image data obtained through the photographing operation, connects with an image server on the Internet and uploads the image data into a specific folder at the image server.
  • the electronic camera transmits address information (the Internet address (URL, Uniform Resource Locator) of the image server, the folder name and the image data file name at the image server) to electronic instruments (such as portable telephones) carried by the subjects in the vicinity of the camera through short-distance wireless communication.
  • the electronic instrument (such as a portable telephone) carried by a subject connects to the image server on the Internet based upon the information received from the electronic camera through short-distance wireless communication and displays at its screen the image data downloaded from the specific folder at the image server.
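  • As an illustration of this FIG. 98 flow, the following minimal Python sketch models the camera uploading a picture, handing the address information to a nearby instrument over the short-distance link, and the instrument downloading the picture. All names, the example URL and the in-memory stand-in for the image server are hypothetical and are not part of this description.

```python
from dataclasses import dataclass

# In-memory stand-in for the image server: {(folder, filename): image data}.
FAKE_IMAGE_SERVER: dict[tuple[str, str], bytes] = {}

@dataclass(frozen=True)
class UploadInfo:
    server_url: str   # Internet address (URL) of the image server
    folder: str       # folder name at the image server
    filename: str     # image data file name

def camera_side(image_bytes: bytes) -> UploadInfo:
    """Upload the photographed image data and return the address information that
    the camera would pass to nearby instruments over the short-distance link."""
    info = UploadInfo("https://image-server.example", "camera_folder_001", "IMG_0001.JPG")
    FAKE_IMAGE_SERVER[(info.folder, info.filename)] = image_bytes  # upload over the telephone line
    return info                                                    # announced via short-range wireless

def phone_side(info: UploadInfo) -> bytes:
    """Download the announced image data, ready to display at the phone screen."""
    return FAKE_IMAGE_SERVER[(info.folder, info.filename)]

if __name__ == "__main__":
    announced = camera_side(b"...jpeg data...")
    assert phone_side(announced) == b"...jpeg data..."
```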
  • FIG. 99 is a conceptual diagram of an information transmission system achieved by utilizing an electronic camera, an image server and a portable telephone terminal adopting the present invention.
  • an electronic camera 100 photographs a subject present within its photographic range and generates electronic image data or digital image data (hereafter referred to as “image data”).
  • the electronic camera 100 having a wireless telephone function is capable of connecting with a public telephone line network 140 via a base station 120 through a wireless portable telephone line 190 .
  • the public telephone line network 140 is also connected with the Internet 130 to allow the electronic camera 100 to connect with an image server 150 accessible via the Internet 130 .
  • the image server 150 having a large capacity memory for saving image data saves image data uploaded via the Internet 130 into folders set in correspondence to individual users and downloads specified image data via the Internet 130 if a read request is issued from an external partner via the Internet 130 .
  • the electronic camera 100 uploads and saves image data into a specific folder.
  • the electronic camera 100 has a short-distance wireless communication function such as Bluetooth (registered trademark) and transfers identification data (URL) of the image server to which image data have been uploaded, folder identification data (folder name) and image data identification data (filename) to a portable telephone terminal 160 through communication achieved via a short-distance wireless line 180 with the portable telephone 160 having a matching short-distance wireless communication function.
  • the portable telephone 160 is assumed to be carried by a person who is the subject of a photographing operation executed with the electronic camera 100 , and a UIM card (user identity module card) 170 having stored therein user personal information at which image data can also be stored is loaded at the portable telephone 160 .
  • the portable telephone 160 having a wireless telephone function is capable of connecting with public telephone line network 140 through the wireless portable telephone line 190 .
  • the public telephone line network 140 is also connected with the Internet 130 and thus allows the portable telephone 160 to connect with the image server 150 installed on the Internet 130 .
  • the portable telephone 160 downloads the image data in the specific folder, saves the image data into the UIM card 170 , and also displays the downloaded image data at its screen.
  • Bluetooth refers to a mobile communication technology (for portable electronic instruments) with a low transmission output for short-distance communication, which covers a short range of approximately 10 m by using wireless electromagnetic waves in the 2.45 GHz band and realizes a one-to-one or one-to-multiple-partner connection at a transfer rate of 1 Mbps.
  • FIGS. 100 and 101 present external views (a front view and a rear view) of an embodiment of the electronic camera 100 in FIG. 99 .
  • a photographic lens 10 which forms a subject image
  • a viewfinder 11 used to check the photographic image plane
  • a strobe 12 used to illuminate the subject during a photographing operation
  • a photometering circuit 13 that detects the brightness of the subject
  • a grip portion 14 that projects out from the camera main body to allow the electronic camera 100 to be held by hand
  • a shutter release button 16 operated to issue an instruction for a photographing start and a power switch 17 (a momentary switch which is alternately set to on and off each time it is operated) through which the on/off state of the power to the electronic camera 100 is controlled are provided at the upper surface.
  • At the rear surface of the electronic camera 100, an eyepiece unit of the viewfinder 11, a left LCD (left screen) 21 having a substantially quadrangular screen for text and image display and a right LCD (right screen) 22 having a substantially quadrangular screen for text and image display are provided. An up button 23 and a down button 24 for scrolling the image displayed at the left screen 21 are provided in the vicinity of, and to the left of, the left screen 21, whereas a photographing mode button 25 operated to set the electronic camera 100 in a photographing mode, a reproduction mode button 26 operated to set the electronic camera 100 in a reproduction mode, a PRINT button 27 operated to print image data, a SEND button 28 operated to transmit image data and a DOWNLOAD button 29 operated to download image data are provided around the right screen 22 and the left screen 21.
  • An upload dial 32 is an operating member rotated to select an image server to which specific image data are to be uploaded and as it is rotated, the image server corresponding to the number at the position of a dot 31 is selected.
  • At a side surface, a memory card slot 30 at which a memory card 77 can be loaded is provided.
  • the shutter release button 16 , the up button 23 , the down button 24 , the photographing mode button 25 , the reproduction mode button 26 , the PRINT button 27 , the SEND button 28 , the DOWNLOAD button 29 and the upload dial 32 are all operating keys operated by the user.
  • touch screens 66 having a function of outputting position data corresponding to a position indicated through a finger contact operation are provided, and they can be used to make a selection from selection items or make an image data selection on the screens.
  • the touch screens 66 are each constituted of a transparent material such as glass or resin so that the user can observe images and text formed inside the touch screens 66 through the touch screens 66.
  • FIG. 102 presents an example of an external view (front view) of the portable telephone 160 shown in FIG. 99 .
  • the portable telephone 160 includes a display screen 161 , operating keys 162 and a UIM card loading slot 163 .
  • In FIG. 103, which presents an example of an internal electrical structure that may be adopted in the electronic camera 100 shown in FIGS. 100 and 101, various components are connected with one another via a data/control bus 51 through which various types of information data and control data are transmitted.
  • the components are divided into the following five primary blocks, i.e., (1) a block, the core of which is a photographing control circuit 60 that executes an image data photographing operation, (2) a block, the core of which is constituted of a wireless communication circuit 71 that communicates with electronic instruments in the vicinity of the electronic camera 100 through short-distance wireless communication (Bluetooth or the like) and a wireless telephone circuit 72 that exchanges data with the outside through a wireless portable telephone line, (3) a block constituted of the memory card 77 in which image files are stored and saved, (4) a block, the core of which is a screen control circuit 92 that executes display of image data and information related to the image data and (5) a block, the core of which is constituted of user interfaces such as operating keys 65 and a CPU 50 that implements integrated control on various control circuits.
  • the CPU 50 (central processing unit), which is a means for implementing overall control on the electronic camera 100, issues various instructions to the photographing control circuit 60, the wireless communication circuit 71, the wireless telephone circuit 72, the screen control circuit 92 and a power control circuit 64 in conformance to information input from the operating keys 65, the touch screen 66, the power switch 17, a timer 74 and the photometering circuit 13.
  • the photometering circuit 13 measures the brightness of the subject and outputs photometric data indicating the results of the measurement to the CPU 50 .
  • the CPU 50 sets the length of the exposure time and the sensitivity of a CCD 55 through a CCD drive circuit 56 , and in addition, it controls the aperture value for an aperture 53 with an aperture control circuit 54 via the photographing control circuit 60 in conformance to the data indicating the settings.
  • the CPU 50 controls the photographing operation via the photographing control circuit 60 in response to an operation of the shutter release button 16.
  • the CPU 50 engages the strobe 12 in a light emission operation via a strobe drive circuit 73 during the photographing operation.
  • the timer 74 having an internal clock circuit detects information indicating a current time point and provides the photographing time point information to the CPU 50 during the photographing operation.
  • the CPU 50 controls the various units in conformance to a control program stored in a ROM 67 (read only memory).
  • an EEPROM 68 (electrically erasable/programmable ROM)
  • a RAM 70 which is a volatile memory is used as a work area of the CPU 50 .
  • the CPU 50 implements control on a power supply 63 via the power control circuit 64 by detecting the operating state of the power switch 17 .
  • the photographing control circuit 60 focuses or zooms the photographic lens 10 through the lens drive circuit 52, controls the exposure quantity at the CCD 55 through control implemented on the aperture 53 by engaging the aperture control circuit 54, and also controls the operation of the CCD 55 through a CCD drive circuit 56.
  • the photographic lens 10 forms a subject image onto the CCD 55 with a light flux from the subject via the aperture 53 which adjusts the light quantity and this subject image is then captured by the CCD 55 .
  • the CCD 55 (charge-coupled device), having a plurality of pixels, is a charge storage type image sensor that captures a subject image and outputs electrical image signals corresponding to the intensity of the subject image formed on the CCD 55 to an analog processing unit 57 in response to a drive pulse supplied by the CCD drive circuit 56.
  • the photographing control circuit 60 repeatedly executes the operation described above, and the screen control circuit 92 repeatedly executes an operation in which the digital data sequentially stored into the photographic buffer memory 59 are read out via the data/control bus 51, the digital data thus read out are temporarily stored into a frame memory 69, the digital data are converted to display image data and stored back into the frame memory 69, and the display image data are displayed at the left screen 21.
  • the screen control circuit 92 obtains text display information from the CPU 50 as necessary, converts it to display text data which are then stored into the frame memory 69 and displays the display text data at the left screen 21 and the right screen 22 .
  • the photographing control circuit 60 detects the state of the focal adjustment at the photographic lens 10 by analyzing the degree of the high-frequency component of the digital data stored in the photographic buffer memory 59 and performs a focal adjustment for the photographic lens 10 with the lens drive circuit 52 based upon the results of the detection.
  • the photographing control circuit 60 engages the CCD 55 to capture a subject image via the CCD drive circuit 56 and temporarily stores image signals generated through the image-capturing operation as digital data (raw data) into the photographic buffer memory 59 via the analog processing unit 57 and the A/D conversion circuit 58 .
  • the photographing control circuit 60 then generates image data by converting or compressing the digital data stored in the photographic buffer memory 59 on a temporary basis in a predetermined recording format (such as JPEG) and stores the image data back into the photographic buffer memory 59 .
  • the CPU 50 engages the wireless telephone circuit 72 to wirelessly output the image data through an antenna 76 and thus uploads the image data into the folder at the image server on the Internet. It also engages the wireless communication circuit 71 to wirelessly output the identification data of the image server to which the image data have been uploaded, the folder identification data and the image data identification data through an antenna 75 . Subsequently, the CPU 50 converts the image data stored in the photographic buffer memory 59 to save them as reduced image data (hereafter referred to as “thumbnail image data”) with a smaller data volume and stores the thumbnail image data appended with the related information (the image server identification data, the folder identification data and the image data identification data), into the memory card 77 .
  • the screen control circuit 92 reads out thumbnail image data specified by the CPU 50 from the memory card 77 , temporarily stores the thumbnail image data thus read out into the frame memory 69 and displays the thumbnail image data at the left screen 21 . It also stores text data such as reproduction mode instructions into the frame memory 69 and displays the text data at the right screen 22 , in response to an instruction issued by the CPU 50 . If the PRINT button 27 is operated in the reproduction mode, the CPU 50 executes an operation for having the image data corresponding to the reproduced thumbnail image data printed on a printer. If, on the other hand, the SEND button is operated in the reproduction mode, the CPU 50 executes an operation for having the image data corresponding to the reproduced thumbnail image data transmitted to a recipient electronic instrument.
  • the CPU 50 downloads the image data corresponding to the reproduced thumbnail image data from the image server through the wireless telephone circuit 72 and sets the downloaded image data into the frame memory 69.
  • the screen control circuit 92 converts the image data set in the frame memory 69 to display image data, stores the display image data back into the frame memory 69 and displays the display image data at the left screen 21 .
  • In FIG. 104, which presents a block diagram of an example of an electrical structure that may be adopted in the portable telephone 160 shown in FIG. 102, various components are connected with a CPU 151 at their center.
  • the CPU 151 (central processing unit) controls a wireless communication circuit 171, a wireless telephone circuit 172, a display screen 161 and a power source 163 in conformance to information input through the operating keys 162, a touch screen 166, a power switch 117 and a timer 174.
  • the CPU 151 converts sound input through a microphone 164 to audio data and outputs the audio data in a wireless transmission to the outside through an antenna 176 by engaging the wireless telephone circuit 172, and reproduces through a speaker 165 audio data received from the outside at the wireless telephone circuit 172 via the antenna 176.
  • the CPU 151 engages the antenna 175 and the wireless communication circuit 171 to exchange information related to image data with the electronic camera 100 through short-distance wireless communication and, based upon related information (the image server identification data, the folder identification data and the image data identification data) thus obtained, it connects with the image server on the Internet via the wireless telephone circuit 172, downloads the desired image data, displays an image reproduced from the image data at its display screen and saves the image data into the UIM card or a flash memory 168 (electrically erasable programmable memory).
  • the CPU 151 controls the various components in conformance to a control program stored in a ROM 167 (read only memory).
  • a RAM 169 is a volatile memory which is used as a work area of the CPU 151 .
  • the CPU 151 controls the power source 163 by detecting the operating state of the power switch 117 .
  • FIG. 105 shows the relationship between the numbers set at the upload dial 32 of the electronic camera 100 shown in FIG. 101 and the image servers that are upload recipients. As shown in FIG. 105, numbers 1, 2, 3 and 4 respectively correspond to image servers W, X, Y and Z.
  • FIG. 106 shows the structure of the data stored in the memory card 77 .
  • a plurality of image files are saved in the memory card 77 .
  • Each image file is constituted of thumbnail image data and additional information data.
  • the additional information data is constituted of photographing information data (see FIG. 107 ) indicating the various settings selected for the photographing operation and upload information data obtained when the image data are uploaded to an image server.
  • the upload information data include date/time data indicating the date and time point at which the image data were uploaded to the image server, the image server identification data (upload destination (location) URL), the folder name, the file name, the volume of the image data, the format adopted in recording the image data and the like.
  • the photographing information data are constituted of photographing date/time point data and setting information data related to the photographic lens setting and the camera settings selected for the photographing operation.
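  • One way to picture the FIG. 106 and FIG. 107 layout is the following sketch of an image file record; the field names are illustrative only, and the actual on-card encoding is not specified here.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class UploadInfoData:
    uploaded_at: datetime    # date/time point at which the image data were uploaded
    server_url: str          # image server identification data (upload destination URL)
    folder_name: str
    file_name: str
    data_volume: int         # volume of the uploaded image data, in bytes
    recording_format: str    # e.g. "JPEG"

@dataclass
class PhotographingInfoData:
    captured_at: datetime    # photographing date/time point
    lens_settings: dict      # photographic lens settings selected for the operation
    camera_settings: dict    # other camera settings selected for the operation

@dataclass
class ImageFile:
    thumbnail: bytes                          # reduced image data kept on the memory card 77
    photographing_info: PhotographingInfoData
    upload_info: UploadInfoData
```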
  • FIG. 108 is a transition diagram for the state of the electronic camera according to the embodiment of the present invention.
  • the electronic camera 100, which may be set in either the photographing mode or the reproduction mode, shifts from one mode to the other in response to an operation of one of two operating buttons (the photographing mode button 25 and the reproduction mode button 26).
  • the electronic camera first shifts to the photographing mode.
  • In the photographing mode, the electronic camera executes a photographing operation, an upload operation to upload the image data to an image server, an operation for converting the image data to thumbnail image data and then storing the thumbnail image data into the memory card 77, and a short-distance wireless communication operation for communicating with a portable telephone.
  • In the reproduction mode, it executes a reproducing operation for reproducing the thumbnail image data, a download operation for downloading image data from an image server, a transmission operation for transmitting the image data to a portable telephone and a print operation for printing the image data on the printer.
  • FIG. 108 presents a flowchart of the main flow of the operations executed in the electronic camera 100 (the CPU 50 ) in the embodiment described above.
  • When the power switch 17 is operated, the power is turned on in S 10, and then a photographing mode subroutine is executed in S 20, thereby setting the electronic camera in a photographing-enabled state. If the shutter release button 16 is operated in the photographing mode, a release interrupt processing subroutine in S 30 is executed to perform a photographing operation.
  • the image data obtained through the photographing operation are uploaded into an image server on the Internet specified with the upload dial 32 through a wireless telephone line, the image data are also converted to thumbnail image data and stored into the memory card 77 , and information with regard to the image data upload is automatically transmitted to the portable telephone 160 carried by a subject in the vicinity of the electronic camera 100 through short-distance wireless communication.
  • mode switching interrupt processing in S 50 is started up to switch over to a mode corresponding to the operating button that has been operated.
  • a transmission interrupt processing subroutine in S 70 is executed to transmit information with regard to the upload of the image data corresponding to the selected thumbnail image data to the portable telephone 160 carried by the subject present in the vicinity of the electronic camera 100 through short-distance wireless communication.
  • the portable telephone 160 in turn, automatically connects with the image server on the Internet based upon the information received from the electronic camera 100 and downloads the image data which are then displayed at the screen and saved into the UIM card.
  • a print interrupt processing subroutine in S 80 is executed to transmit the information related to the upload of the image data corresponding to the selected thumbnail image data to a printer present in the vicinity of the electronic camera 100 through short-distance wireless communication, and the printer, in turn, automatically connects with the image server on the Internet based upon the information received from the electronic camera 100, downloads the image data and then executes a printing operation.
  • a download interrupt processing subroutine S 90 is executed to download through a wireless telephone line the image data corresponding to the selected thumbnail image data from the image server at which the image data have been uploaded and the downloaded image data are reproduced and displayed at the display screen.
  • the image data are uploaded into a specific folder (a folder inherent to the photographer or the electronic camera 100 ) at the image server specified through the upload dial 32 .
  • a notification of the image data upload completion is transmitted to the electronic instrument in the vicinity of the electronic camera 100 through the wireless communication circuit 71 in S 309.
  • the image data display ends and then the operation makes a return in S 311 to start a through image display in the photographing mode again.
  • a photographing operation is executed in response to a shutter release operation performed at the electronic camera 100 , and then the image data are converted to thumbnail image data which are saved.
  • the electronic camera 100 attempts a wireless search communication 1 with an electronic instrument in the vicinity.
  • Through the search communication 1, the identification information of the electronic camera 100, indicating the sender, is transmitted.
  • An electronic instrument having received the search communication 1 returns a response 2 to the electronic camera 100 in response, through which the identification information of the sender electronic instrument and the identification information of the recipient electronic camera are transmitted.
  • Upon receiving the response 2, the electronic camera 100 verifies the communication partner based upon the electronic instrument identification information, and also executes an information transmission 3 to transmit the upload information to the specific communication partner's electronic instrument. Through the information transmission 3, the identification information of the recipient electronic instrument, the identification information of the sender electronic camera 100, the upload information, the thumbnail image data and the like are transmitted.
  • the electronic instrument corresponding to the electronic instrument identification information transmitted through the information transmission 3 executes an information transmission 4 to the electronic camera 100 in response.
  • Through the information transmission 4, the identification information of the recipient electronic camera 100, the identification information of the sender electronic instrument and related information (which includes specific image server information originating from the electronic instrument) are transmitted.
  • Upon receiving the information 4 transmitted by the electronic instrument, the electronic camera 100 uploads the image data to the image server on the Internet through a wireless telephone line. Once the image data upload is completed, the electronic camera 100 transmits an upload completion notification 5 to the electronic instrument. The electronic instrument having received this notification connects with the image server on the Internet through a wireless telephone line based upon the upload information received through the information transmission 3, and downloads and utilizes the specified image data.
  • the electronic camera 100 may either execute the information transmission 3 through a simultaneous one-to-multiple partner communication with the electronic instruments or may individually execute the information transmission 3 through a one-to-one communication with each electronic instrument. In the latter case, the information transmission 4 must be executed a plurality of times.
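  • The five-step exchange just described (search communication 1, response 2, information transmission 3, information transmission 4, upload completion notification 5) can be summarized with the following sketch; the message fields are illustrative assumptions, and a real implementation would ride on the short-distance wireless layer rather than an in-memory log.

```python
from dataclasses import dataclass, field

@dataclass
class Message:
    kind: str                 # "search", "response", "info3", "info4", "upload_done"
    sender: str
    recipient: str = ""
    payload: dict = field(default_factory=dict)

def exchange(camera_id: str, nearby_ids: list[str], upload_info: dict) -> list[Message]:
    """Replay the sequence: 1 search, 2 response, 3 upload info + thumbnail,
    4 instrument-side related info, then 5 upload completion notification."""
    log = [Message("search", camera_id)]                                        # 1
    for inst in nearby_ids:
        log.append(Message("response", inst, camera_id))                        # 2
        log.append(Message("info3", camera_id, inst,
                           {"upload_info": upload_info, "thumbnail": "..."}))   # 3
        log.append(Message("info4", inst, camera_id,
                           {"preferred_server": None}))                         # 4
    # (the camera uploads the image data over the wireless telephone line here)
    log.extend(Message("upload_done", camera_id, inst) for inst in nearby_ids)  # 5
    return log

for m in exchange("camera-100", ["phone-160a", "phone-160b"],
                  {"folder": "camera_folder_001", "file": "IMG_0001.JPG"}):
    print(m.kind, m.sender, "->", m.recipient or "(broadcast)")
```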
  • thumbnail image data in image files stored in the memory card 77 are sequentially read out starting with the data photographed most recently, and are reproduced and displayed at the left screen 21 while concurrently displaying relevant operational instructions at the right screen 22 , as shown in FIG. 116 .
  • the thumbnail image data displayed at the left screen 21 are scrolled in response to an operation of the direction button 23 or 24 .
  • one of the plurality of sets of thumbnail image data displayed at the left screen 21 is selected by the user through the touch screen 66 , and the thumbnail image data thus selected subsequently undergo a transmission operation, a print operation or a download operation as described later.
  • the electronic instruments with which communication has been achieved are displayed at the right screen 22 as possible image data recipients in S 704 to allow the user of the electronic camera 100 to select the desired recipient through the touch screen 66 , as shown in FIG. 118 .
  • the operation waits in standby for the SEND button 28 to be operated again, and if the SEND button 28 is operated, the image data upload information is transmitted to the selected recipient through short-distance wireless communication in S 706 before the operation makes a return in S 707.
  • the image data to be transmitted are first selected at the electronic camera 100 . Then, as the SEND button 28 is operated, a wireless search communication 1 with electronic instruments present within the vicinity is attempted by the electronic camera 100 . Through the search communication 1 , the identification information of the sender electronic camera 100 is transmitted.
  • Each electronic instrument having received the search communication 1 returns a response 2 to the electronic camera 100 in response by transmitting the identification information of the sender electronic instrument and the identification information of the recipient electronic camera.
  • the electronic camera 100 having received the response 2 verifies the communication partners with whom the communication has been achieved based upon the electronic instrument identification information and selects a recipient. Next, it reads out the upload information corresponding to the selected image data from the memory card 77 . As the SEND button 28 is operated again, the electronic camera 100 executes an information transmission 3 to transmit the upload information to the specific electronic instrument carried by the selected recipient. Through the information transmission 3 , the identification information of the selected recipient's electronic instrument, the identification information of the sender electronic camera 100 , the upload information, the thumbnail image data and the like are transmitted.
  • the selected recipient's electronic instrument having received the information 3 connects with the image server on the Internet through a wireless telephone line based upon the upload information, and downloads and displays the specified image data.
  • the printers with which communication has been achieved are displayed at the right screen 22 in S 804 to allow the user of the electronic camera 100 to select the desired printer through the touch screen 66 , as shown in FIG. 121 .
  • the operation waits, in standby state, for the PRINT button 27 to be operated again, and if the PRINT button 27 is operated, the image data upload information and a print instruction are transmitted to the selected printer through short-distance wireless communication in S 806 before the operation makes a return in S 807.
  • the image data to be transmitted are first selected at the electronic camera 100 . Then, as the PRINT button 27 is operated, a wireless printer search communication 1 with any printer within the vicinity is attempted by the electronic camera 100 . Through the printer search communication 1 , the identification information of the sender electronic camera 100 and function identification information (“printer” in this case) indicating the function of electronic device with which the electronic camera-side wishes to communicate are transmitted.
  • a printer having received the printer search communication 1 returns a response 2 to the electronic camera 100 in response by transmitting the identification information of the sender printer, the identification information of the recipient electronic camera and printer specification information and the like.
  • the electronic camera 100 having received the response 2 verifies all the communication partners with whom communication has been achieved based upon the printer identification information and selects a desired printer. Next, it reads out the upload information corresponding to the selected image data from the memory card 77. As the PRINT button 27 is operated again, the electronic camera 100 executes an information transmission 3 to transmit the upload information to the specific selected printer. Through the information transmission 3, the identification information of the selected printer, the identification information of the sender electronic camera 100, the upload information, the print instruction information and the like are transmitted.
  • the selected printer having received the information 3 connects with the image server on the Internet through a wireless telephone line based upon the upload information, downloads the specified image data, converts the image data to image data for printing based upon the print instruction information and executes a printing operation.
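  • As a small illustration of the function identification step above, the responder list could be filtered on the declared function before a print recipient is chosen; the dictionary fields are hypothetical.

```python
def select_printers(responses: list[dict]) -> list[dict]:
    """Keep only partners that declared the requested function ("printer") in response 2."""
    return [r for r in responses if r.get("function") == "printer"]

responses = [
    {"id": "phone-042", "function": "portable telephone"},
    {"id": "printer-07", "function": "printer", "spec": {"paper": "A4", "dpi": 600}},
]
print(select_printers(responses))  # only printer-07 remains as a candidate print recipient
```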
  • In the conceptual diagram presented in FIG. 123, illustrating a mode of the information transmission among the electronic camera, a printer and the image server executed during the print interrupt processing described above, the electronic camera uploads image data obtained through a photographing operation to the image server on the Internet through a wireless telephone line.
  • the electronic camera transmits the image data upload information (the Internet address (URL, Uniform Resource Locator) of the image server, the folder name and the image data filename at the image server) and control information (the print instruction information) to an electronic device (printer) within short-distance wireless communication range through short-distance wireless communication.
  • Based upon the information received from the electronic camera through the short-distance wireless communication, the electronic device (printer) connects with the image server on the Internet through a wireless telephone line, downloads the image data from the specific folder at the image server and executes an operation (printing processing) on the downloaded image data based upon the control information (print instruction information).
  • FIG. 126 illustrates the structure of the wireless communication circuit 71 in the electronic camera.
  • the wireless communication circuit 71 is enclosed by an electromagnetic shield film 79 along the camera rear surface (in the direction opposite from the direction along which the photographic optical axis extends, i.e., the side on which the photographer is present), the camera lower surface, the camera upper surface and the camera left and right side surfaces.
  • the wireless communication circuit 71 is allowed to communicate readily with the portable telephone 160 carried by a subject present in the direction along which the photographing operation is performed with the camera, and since it cannot readily communicate with a portable telephone or an electronic device carried by a non-subject party who is not present along the photographic optical axis, the image data recipient can be limited to the portable telephone carried by the subject who has been photographed.
  • FIG. 127 shows the communication directivity of the wireless communication circuit 71 , and extreme directivity manifests to the front of the electronic camera 100 (along the optical axis 40 ) and little directivity to the rear.
  • the wireless communication circuit 71 By allowing the wireless communication circuit 71 to manifest extreme directivity along the direction in which the photographing operation is performed with the camera through the structure shown in FIG. 126 as described above, the likelihood of communicating with the electronic instrument carried by the subject is increased. It is to be noted that the directivity may be adjusted by adopting a specific antenna structure instead of by using an electromagnetic shield film. A compact directional antenna that may be adopted is disclosed, for instance, in Japanese Laid-Open Patent Publication No. 2000-278037.
  • the directivity of the wireless communication circuit 71 may be adjusted in conformance to the focal length of the photographic lens. For instance, the communication directivity should be reduced if the focal length is small since the photographic field angle will be wide in such a case, whereas the communication directivity should be increased if the focal length is larger since the photographic field angle will be narrow in such a case.
  • the directivity of the wireless communication circuit 71 may be adjusted by mechanically moving the electromagnetic shield film or by selectively using one of a plurality of directional antennas with varying directional characteristics. By implementing such an adjustment, the likelihood of the electronic camera 100 achieving communication with an electronic instrument held by a photographed subject is increased without allowing ready communication with an electronic instrument carried by a party other than the subject.
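  • A simple way to tie the communication directivity to the focal length, as suggested above, is to compute the photographic field angle and pick the antenna (or shield position) whose beam just covers it; the available beam widths and the 35 mm-format sensor width in the sketch below are illustrative assumptions.

```python
import math

def field_angle_deg(focal_length_mm: float, sensor_width_mm: float = 36.0) -> float:
    """Horizontal photographic field angle for the given focal length."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

def pick_beam_width(focal_length_mm: float, beams_deg=(90.0, 60.0, 30.0)) -> float:
    """Choose the narrowest available beam that still covers the field angle, so a
    wide-angle lens gets low directivity and a telephoto lens gets high directivity."""
    angle = field_angle_deg(focal_length_mm)
    usable = [b for b in beams_deg if b >= angle]
    return min(usable) if usable else max(beams_deg)

print(pick_beam_width(28.0))   # wide-angle lens -> wide, weakly directional beam
print(pick_beam_width(200.0))  # telephoto lens  -> narrow, strongly directional beam
```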
  • the electronic camera uploads image data obtained through a photographing operation to an image server on the Internet
  • an electronic instrument such as a portable telephone carried by a subject connects with the image server on the Internet based upon related information received from the electronic camera through short-distance wireless communication and displays at its screen the image data downloaded from a specific folder at the image server.
  • the electronic camera may obtain upload information (indicating an image server on the Internet, a folder name and an image data filename specified by the electronic instrument-side) from the electronic instrument during the short-distance wireless communication with the electronic instrument carried by the subject and upload the image data obtained through a photographing operation, appended with the filename specified by the electronic instrument into the specified folder at the specified image server on the Internet based upon the upload information.
  • the electronic instrument downloads the image data assigned with the specified file name from the specified folder of the image server specified by itself.
  • This variation relieves the subject from a task of saving the downloaded image data into his own image server.
  • the location at which an image is to be saved (a specific folder at a specific image server) may be specified by the electronic instrument and a specific image data file name may be assigned by the electronic camera.
  • the electronic camera may specify an image server and a folder, and the electronic instrument may assign a specific image data filename.
  • the electronic camera may obtain upload information (indicating an image server on the Internet, a folder name and an image data file name) specified by the electronic instrument side from an electronic instrument carried by a subject during the short-distance wireless communication with the electronic instrument and transmit the upload information obtained from the electronic instrument, together with image data obtained through a photographing operation, to a preassigned image server on the Internet so as to enable the preassigned server to transmit the received image data, appended with the specified file name, into the specified folder at the specified image server on the Internet, instead.
  • the electronic instrument can download the image data assigned with the specified file name from the specified folder on the image server specified by itself.
  • short-distance communication between the electronic camera and a portable instrument carried by a subject is executed while obtaining image data through a photographing operation, or either immediately before or after the photographing operation, to automatically transfer the image data upload information to the portable instrument, which eliminates the need for performing a special operation such as manually writing down the upload information and also ensures that the image data upload information is transmitted to the portable instrument of a person highly likely to be a subject photographed in the image data.
  • image data are not directly exchanged between the electronic camera used to obtain the image data through a photographing operation and the portable telephone which is the ultimate recipient of the image data and as a result, the electronic camera does not need to adjust the image data in conformance to various image display specifications of different portable telephones or to engage in communication with the individual portable telephones over an extended period of time in order to transfer the image data to multiple users.
  • the onus placed on the electronic camera is reduced.
  • image data upload information is exchanged between the electronic camera and an electronic instrument through short-distance wireless communication.
  • the likelihood of communicating with limited partners such as a portable telephone carried by a subject and a printer in the vicinity of the electronic camera is increased and, at the same time, the likelihood of communicating with an irrelevant electronic instrument present at a large distance from the electronic camera is practically eliminated.
  • When the signal transmission output level for the wireless communication is adjusted in conformance to the focal length of the photographic lens, a reliable wireless communication with a subject is achieved while reducing the likelihood of wirelessly communicating with a party other than the subject.
  • image data upload information is exchanged between the electronic camera and an electronic instrument through short-distance wireless communication. Since the short-distance wireless communication is achieved at a lower information transmission rate than that of a wireless portable telephone line, the structure of the short-distance wireless communication circuit can be simplified, thereby making it possible to reduce the cost of the electronic camera and the cost of the portable telephone and also to reduce their sizes.
  • the electronic camera which only needs to upload image data obtained through a photographing operation to an image server on the Internet and does not need to individually transmit the image data to a plurality of electronic instruments is able to start the next photographing operation immediately after the image data upload.
  • the individual electronic instruments can download the image data simultaneously at high speed and the downloaded image data can be promptly reviewed at the individual electronic instruments.
  • thumbnail image data corresponding to a specific set of image data are transmitted by the electronic camera to the electronic instrument while the image data upload information is exchanged between the electronic camera and the electronic instrument through a wireless communication.
  • the basic contents of the image data can be checked at the electronic instrument prior to downloading the image data by reviewing the thumbnail image data and, if necessary, an action such as cancellation of the downloading can be taken.
  • the electronic camera generates thumbnail image data corresponding to image data obtained through a photographing operation, uploads the image data to an image server on the Internet and saves the upload information, appended to the thumbnail image data, in the electronic camera.
  • desirable image data can be selected and promptly downloaded from the image server for use.
  • the wireless communication directivity of the wireless communication circuit 71 is restricted so that the directivity manifests in a direction along which a subject is highly likely to be present relative to the electronic camera main unit by using an electromagnetic shield film or the like.
  • reliable wireless communication with the portable telephone carried by the subject can be achieved to exchange the image data upload information between the electronic camera and the subject's portable telephone through wireless communication and, at the same time, the likelihood of wirelessly communicating with a party other than the subject is reduced.
  • the image data are ultimately saved at the image server specified by the subject.
  • the subject is relieved from the task of saving the image data at his own image server, whereas the electronic camera is able to start the next photographing operation immediately after the image data upload since it only needs to upload the image to a single preassigned image server.
  • the electronic camera uploads image data to an image server on the Internet and a portable telephone downloads the image data from this image server.
  • the image server does not necessarily need to be one on the Internet, and it may be an image server on a local communication network, instead.
  • the present invention may be adopted in a system in which an image server set up on a wireless LAN (local area network) conforming to the IEEE 802 standard is accessed by electronic cameras and portable telephones (electronic instruments) to upload and download image data.
  • an information transmission system and an information transmission method through which image data obtained through a photographing operation performed at an electronic camera are delivered to a portable telephone carried by a subject or a printer, are described.
  • the present invention is not limited to the application in image data transmission, and it may be adopted in general information transmission between an information sender and an information recipient present in close proximity to each other.
  • the present invention may be adopted in an audio data transmission between an information sender that is an audio recording apparatus and an information recipient that is a portable telephone, in which audio data recorded by the audio recording apparatus are transmitted to the portable telephone via an audio server on the Internet.
  • the electronic camera uploads image data to an image server on the Internet and a portable telephone (electronic instrument) downloads the image data from this image server.
  • the electronic camera may transmit the electronic instrument identification information of each electronic instrument with which communication has been achieved, together with the image data, to the image server and the image server, in turn, may only allow a download of the image data by the electronic instruments corresponding to the identification information thus received, in order to restrict recipients allowed to download the image data from the image server. In this case, an unauthorized image data download from the image server is prevented for privacy protection.
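  • A server-side sketch of that download restriction might look as follows: the camera reports the identification information of each partner with which communication was achieved, and only those instruments may retrieve the file. The function and variable names are illustrative, not part of this description.

```python
ALLOWED_RECIPIENTS: dict[str, set[str]] = {}   # image file name -> permitted instrument IDs
IMAGES: dict[str, bytes] = {}                  # image file name -> image data

def register_upload(filename: str, image_data: bytes, instrument_ids: list[str]) -> None:
    """Called when the camera uploads: store the image data together with the
    identification information received through the short-distance communication."""
    IMAGES[filename] = image_data
    ALLOWED_RECIPIENTS[filename] = set(instrument_ids)

def download(filename: str, requester_id: str) -> bytes:
    """Only an instrument whose identification was registered may download."""
    if requester_id not in ALLOWED_RECIPIENTS.get(filename, set()):
        raise PermissionError("this instrument is not authorized to download the image data")
    return IMAGES[filename]
```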
  • the electronic camera uploads image data to an image server on the Internet and a portable telephone downloads the image data from this image server.
  • the portable telephone may transfer the upload information received from the electronic camera to another electronic device (e.g., a personal computer belonging to the subject) to download the image data by accessing the image server from the other electronic device, instead.
  • the image data can be reviewed on a screen larger than the portable telephone screen and also a sufficient memory capacity for saving the image data can be assured.
  • the electronic camera uploads image data to an image server on the Internet and a portable telephone (electronic instrument) downloads the image data from this image server.
  • the electronic camera may transmit the electronic instrument identification information of each electronic instrument with which communication has been achieved, together with the image data to the image server, and the image server, in turn, may transmit the image data (in a push-type information transmission) to the electronic instrument corresponding to the identification information by issuing a transmission request to the electronic instrument based upon the identification information.
  • the electronic camera and a portable telephone exchange the image data upload information (the image server address information and the image data identification information), the electronic camera uploads the image data to the image server on the Internet and the portable telephone (electronic instrument) downloads the image data from the image server.
  • the configuration described below may be adopted to provide an image service system in which electronic cameras are installed at a plurality of photographing points in large premises such as a theme park, and a guest having taken a picture at a desired photographing point downloads the image data obtained through the photographing operation from an image server.
  • each camera does not need to include an internal means for communication for communicating with a portable telephone carried by the guest but is instead paired with a means for communication installed separately from the electronic camera. Image data obtained through a photographing operation executed with a given electronic camera by a guest are appended with specific identification information and are saved at a specific image server, and the image server address information and the image data identification information are transmitted to the portable telephone of the guest by the means for communication.
  • the communication output can be lowered to prevent any interference from an electronic device other than the portable telephone of the guest who is the subject.
  • a special electronic instrument for exclusive use in conjunction with the image transmission system according to the present invention may be offered to each guest at the time of admission to ensure that image data are transmitted to the guest who is photographed in the image data with a high degree of reliability by preventing interference from other electronic devices.
  • When the photographing side, the image recipient side and the image server are fixed as described above, it is not necessary to provide the image recipient side with the image server address information, and the image recipient side only needs to be provided with the image data identification information.
  • the image server address information is fixed and stored within the electronic cameras, the means for communication and the electronic instrument for exclusive use within the property.
  • the identification information of the special electronic instrument carried by each guest may be transmitted by the electronic instrument to the means for communication so that each set of image data, appended with the corresponding electronic instrument identification information is saved into the image server.
  • no image data identification information needs to be transmitted to the special electronic instrument carried by the guest.
  • the electronic instrument identification information is transmitted to the photographing side from the special electronic instrument during the photographing operation and the identification information is appended to the image data and is saved into the image server.
  • the image server, when accessed by the special electronic instrument carried by the guest, only needs to download the image data corresponding to the electronic instrument identification information to the electronic instrument.
  • the image server may forcibly transmit the image data to the special electronic instrument corresponding to the identification information appended to the image data.
  • the volume of data exchanged between the image data photographing side and the image data recipient side and the types of information exchanged between them can both be reduced.
  • the communication is conducted by using the special electronic instrument for exclusive use within the property, interference from other electronic devices can be prevented to improve the reliability of the communication.
  • the electronic camera and a portable telephone exchange the image data upload information (the image server address information and the image data identification information), the electronic camera uploads the image data to the image server on the Internet based upon the upload information and the portable telephone (electronic instrument) downloads the image data from the image server based upon the upload information.
  • the upload information does not need to include the image data identification information and may be constituted of the image server address information alone, instead.
  • the portable telephone connects with the image server to which the image data have been uploaded based upon the image server address information, either manually or automatically searches for desired image data based upon the photographing data and the like after the connection is achieved and downloads the image data selected through the search from the image server. Since this eliminates the need for storing in memory identification information for each set of image data at the portable telephone, a means for storage with a small storage capacity can be utilized in the portable telephone.
  • the electronic camera and a portable telephone exchange the image data upload information (the image server address information and the image data identification information), the electronic camera uploads the image data to the image server on the Internet and the portable telephone (electronic instrument) downloads the image data from the image server.
  • the electronic camera may transmit the image server address information to the portable telephone and receive the portable telephone identification information (the telephone number or the like) from the portable telephone during the photographing operation, generate identification information for the image data based upon the portable telephone identification information and upload the image data appended with the identification information thus generated to the image server.
  • the portable telephone identification information is provided to the image server so that the image data appended with the identification information corresponding to the portable telephone identification information are downloaded to the portable telephone.
  • the electronic camera may generate image data identification information (an image data filename or the like) in combination with other photographing information (e.g., the photographing date/time point (date and time) information) as well as the portable telephone identification information.
  • the portable telephone may access the image server by specifying a given photographing date/time point to selectively download the image data photographed at the photographing date/time point specified by the subject carrying the portable telephone.
  • the image server may keep track of a download history (portable telephone numbers of download recipients, download date/time points, etc.) for each set of image data so that when a portable telephone accesses the image server by specifying the download history, image data that have never been downloaded to the mobile telephone or image data that have not been downloaded over a predetermined length of time can be selectively downloaded.
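  • For the variation above, the image data identification could be generated by combining the portable telephone identification with the photographing date/time point, for example as below; hashing the telephone number is an illustrative assumption (to avoid exposing the raw number in a file name), not something stated in this description.

```python
import hashlib
from datetime import datetime

def make_image_filename(phone_number: str, shot_at: datetime) -> str:
    """Derive the image data identification (file name) from the portable telephone
    identification information and the photographing date/time point."""
    digest = hashlib.sha1(phone_number.encode()).hexdigest()[:8]
    return f"{digest}_{shot_at:%Y%m%d_%H%M%S}.jpg"

# The portable telephone can later ask the server for files matching its own digest
# and a given photographing date/time point.
print(make_image_filename("+81-90-0000-0000", datetime(2001, 12, 1, 14, 30, 5)))
```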
  • the portable telephone is able to selectively download specific image data without having to store in memory the identification information for each set of image data.
  • the image data upload information is exchanged through a wireless (electromagnetic wave) communication between the electronic camera and an electronic instrument carried by a subject.
  • a redundant communication with electronic instruments carried by parties other than the true subject should be prevented by keeping the wireless-transmission-signal output level within a range lower than a predetermined value and thus limiting the communication-enabled distance range based upon the assumption that electronic instruments have the average reception capability.
  • This distance range is determined by taking into consideration the standard manner in which the electronic camera or electronic apparatus is used, whether or not an object that blocks electromagnetic waves is present between the electronic camera and the electronic instrument, the performance of the receiver at the electronic instrument and the like. It is desirable to set the distance range to approximately 10 m or less in correspondence to the focal length of the standard photographic lens.
  • the wireless communication circuit engages in communication through a short-distance wireless system such as Bluetooth.
  • the short-distance wireless communication may be achieved through a system other than Bluetooth, such as a wireless LAN (IEEE802.11) or infrared communication (IrDA).
  • communication may be achieved by simultaneously utilizing a plurality of short-distance wireless systems. In the latter case, an electronic instrument with any type of short-distance wireless function carried by a subject, can engage in wireless communication with the electronic camera.
  • the image data upload information is exchanged through a wireless (electromagnetic wave) communication between the electronic camera and the electronic instrument carried by the subject.
  • the image data upload information may instead be exchanged through another non-contact or contact method.
  • the electronic camera attempts short-distance wireless communication with electronic instruments in the vicinity of the electronic camera in order to exchange image data upload information.
  • the electronic camera may be preset so as to disallow any wireless communication with an electronic instrument carried by the photographer operating the electronic camera.
  • an electronic instrument having obtained upload information from the electronic camera through communication between the electronic camera and the electronic instrument accesses the image server to which image data have been uploaded based upon the upload information and downloads desired image data from the image server.
  • the electronic camera may obtain the identification information of the electronic instrument and provide the electronic instrument identification information thus obtained to the image server before the electronic instrument accesses the image server.
  • the image server enables exclusive access by the electronic instrument corresponding to the identification information received from the electronic camera on a specific condition.
  • the specific condition may be a specific length of time allowed to elapse after the electronic instrument identification information is received from the electronic camera or a specific limit on the number of image data downloads.
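As a rough illustration of such a condition check on the image server side; the grant object, time limit and download limit shown here are assumptions and are not prescribed by the disclosure:

```python
import time

class AccessGrant:
    """Exclusive access granted to one electronic instrument, expiring on a specific condition."""
    def __init__(self, instrument_id: str, valid_seconds: float = 3600, max_downloads: int = 5):
        self.instrument_id = instrument_id
        self.granted_at = time.time()        # when the camera reported the instrument ID
        self.valid_seconds = valid_seconds   # specific length of time allowed to elapse
        self.max_downloads = max_downloads   # specific limit on the number of image data downloads
        self.downloads = 0

    def allow(self, requesting_id: str) -> bool:
        if requesting_id != self.instrument_id:
            return False                     # only the instrument identified by the camera
        if time.time() - self.granted_at > self.valid_seconds:
            return False                     # time condition has expired
        if self.downloads >= self.max_downloads:
            return False                     # download count exhausted
        self.downloads += 1
        return True
```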
  • information (the image server address information, the image data identification information, the electronic instrument identification information and the like) is exchanged through wireless (electromagnetic wave) communication between the electronic camera and the electronic instrument carried by the subject.
  • an open (public) key to be used to encrypt the information may be transmitted from the information recipient to the information sender, the information sender may transmit the information encrypted by using the open key, and the information recipient may decrypt the received information by using the secret key coupled with the open key.
  • the information remains secure.
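The exchange described above is ordinary public-key encryption. A minimal sketch, assuming RSA-OAEP via the third-party Python cryptography package purely as a stand-in (the disclosure does not prescribe a particular cipher):

```python
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# The recipient (e.g. the electronic instrument) generates the key pair and hands
# the open (public) key to the sender (e.g. the electronic camera).
secret_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
open_key = secret_key.public_key()

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# The sender encrypts the upload information with the open key...
upload_info = b"server=image.example.net;image_id=IMG_0001"
ciphertext = open_key.encrypt(upload_info, oaep)

# ...and only the holder of the coupled secret key can decode it.
assert secret_key.decrypt(ciphertext, oaep) == upload_info
```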
  • the electronic camera attempts wireless communication with electronic instruments present in the vicinity of the electronic camera to exchange the image data upload information during a photographing operation.
  • the transmission signal output level for the wireless communication may be adjusted in conformance to the subject distance (the output level is raised as the distance to the subject increases).
  • the distance to a subject can be obtained through a detection performed by a distance detection device of the known art, or a photographing distance of the photographic lens which is manually set may be used as the subject distance.
  • the electronic camera is able to achieve communication with the electronic instrument held by a subject with a high degree of reliability and the likelihood of achieving communication with an electronic instrument held by a party other than the subject is reduced.
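One simple way to realize the distance-dependent adjustment is a linear mapping of subject distance to output level, capped at the level corresponding to the maximum intended communication range. The function below is an assumed sketch, not the disclosed implementation:

```python
def transmission_output_level(subject_distance_m: float,
                              g_min: float = 10.0, g_max: float = 100.0,
                              max_range_m: float = 10.0) -> float:
    """Raise the wireless output level with the detected (or manually set) subject
    distance, capping it at the level for the maximum communication-enabled range."""
    distance = min(max(subject_distance_m, 0.0), max_range_m)
    return g_min + (g_max - g_min) * distance / max_range_m

print(transmission_output_level(2.5))    # modest output level for a nearby subject
print(transmission_output_level(25.0))   # capped at g_max beyond the intended range
```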
  • the electronic camera uploads image data obtained through a photographing operation to an image server on the Internet through its internal-wireless-telephone circuit.
  • the electronic camera may first transmit the image data to a portable telephone carried by the photographer operating the electronic camera through short-distance wireless communication and the photographer's portable telephone may upload the image data to the image server on the Internet through a wireless telephone line. Since this eliminates the need to provide an internal wireless telephone circuit at the electronic camera, the electronic camera can be miniaturized and further advantages related to power consumption and cost are also achieved.
  • the electronic camera uploads image data obtained through a photographing operation to a selected image server.
  • image data may be uploaded to a fixed image server instead.
  • For instance, by setting up a fixed image server to be exclusively used for image data upload, marketing the electronic camera in a package in which image services are offered through that image server and ensuring that all the images photographed with the camera are uploaded to the image server, the purchaser of the electronic camera is relieved from the task of signing up with an image server himself and the camera can be operated with ease even by users not familiar with the Internet or the like.
  • the address information of the fixed image server is stored in a nonvolatile memory such as an EEPROM in the electronic camera.
  • the electronic camera automatically uploads all image data photographed in the photographing mode to the image server (the upload recording mode).
  • the electronic camera may be allowed to assume another photographing mode (a memory card recording mode) for recording photographed image data into a memory card loaded at the electronic camera and may be manually or automatically switched between the upload recording mode and the memory card recording mode.
  • the electronic camera having been operated in the memory card recording mode may be automatically switched to the upload recording mode as the remaining capacity of the memory card becomes low, or the electronic camera having been operated in the upload recording mode may be automatically switched to the memory card recording mode when the electronic camera moves out of the wireless telephone line communication range.
  • Such features allow the photographer to take pictures with the electronic camera without having to worry about the state of the memory card or the state of the wireless telephone line.
  • image data may be uploaded to the image server and also saved into the memory card at the same time under normal circumstances, and either the upload or the save may be skipped as necessary. This eliminates the need to create an image data back-up later on and thus, the image data can be saved with a higher degree of reliability.
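The switching rules above amount to a small per-image decision. One possible sketch, in which the destination names and thresholds are assumptions for illustration:

```python
def choose_recording_destinations(card_free_bytes: int, image_bytes: int,
                                  telephone_line_available: bool) -> set:
    """Mirror the rules above: record to both destinations when possible,
    dropping whichever one is currently unusable."""
    destinations = set()
    if telephone_line_available:
        destinations.add("upload to image server")
    if card_free_bytes >= image_bytes:
        destinations.add("save to memory card")
    return destinations

# e.g. a full memory card with the telephone line available falls back to upload only
print(choose_recording_destinations(card_free_bytes=0, image_bytes=500_000,
                                    telephone_line_available=True))
```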
  • the image data upload information is exchanged between the electronic camera and the electronic instrument through short-distance wireless communication and then the electronic camera uploads the image data to the image server.
  • this order may be reversed, i.e., the electronic camera may first upload the image data to the image server and then the image data upload information may be exchanged between the electronic camera and the electronic instrument through a short-distance wireless communication.
  • a photographing operation is executed in response to an operation of the shutter release button provided at the electronic camera.
  • a photographing operation in the electronic camera may be started up through short-distance wireless communication in response to remote control implemented from a portable telephone carried by a person to be photographed.
  • the short-distance wireless circuit can be effectively utilized and, at the same time, the photographing operation can be executed with the timing desired by the subject.
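A remote-release command of this kind, together with the subject-initiated request for upload information mentioned a few items below, could be dispatched roughly as follows; the JSON command format and the stub camera interface are assumptions made only for illustration:

```python
import json

class CameraStub:
    """Minimal stand-in for the camera control firmware."""
    def photograph(self):
        print("photographing operation started")
    def send_upload_info(self, recipient_id):
        print(f"sending upload information to {recipient_id}")

def handle_short_distance_packet(packet: bytes, camera: CameraStub) -> None:
    """Dispatch a received short-distance wireless packet: a 'release' command from
    the subject's portable telephone starts a photographing operation."""
    message = json.loads(packet.decode("utf-8"))
    if message.get("command") == "release":
        camera.photograph()                       # same path as the shutter release button
    elif message.get("command") == "request_upload_info":
        camera.send_upload_info(message.get("sender_id", "unknown"))

handle_short_distance_packet(b'{"command": "release"}', CameraStub())
```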
  • the short-distance wireless communication between the electronic camera and an electronic instrument in the vicinity is initiated by the electronic camera in order to transmit the image data upload information from the electronic camera to the electronic instrument.
  • a communication request may be first issued to the electronic camera from an electronic instrument carried by the subject and then the image data upload information may be transmitted by the electronic camera to the electronic instrument, instead.
  • a request for the image data upload information may be issued from the portable telephone 160 to an electronic camera in its vicinity in response to a specific operation of an operating key 162 at the portable telephone 160 shown in FIG. 102 .
  • Since this enables the subject side to initiate the short-distance wireless communication between the electronic camera and the portable telephone, the electronic camera does not need to engage in short-distance wireless communication each time a photographing operation is performed unless necessary, and thus the electronic camera can perform the next photographing operation immediately after the photographed image data are uploaded to the image server.
  • the image data upload information is exchanged through wireless (electromagnetic wave) communication between the electronic camera and the electronic instrument carried by the subject.
  • the electronic instrument having received search messages from the electronic camera may display a message indicating a communication from the electronic camera has been received or such a message may be provided to the subject via a means for sound generation on the subject side.
  • Such measures make it possible for the subject to verify that a communication from the electronic camera has been received.
  • the subject, i.e., the user, provided with an audio message can verify that a communication from the electronic camera has been received while keeping the electronic instrument, which may not include a means for display, in a pocket or the like.
  • the subject having verified that the communication has been received may refuse to download the image data if necessary by cutting off further communication.
  • the image data upload information is automatically exchanged through wireless communication between the electronic camera and the electronic instrument held by the subject.
  • the electronic instrument having received the search communication from the electronic camera may compare the identification number of the electronic camera initiating the search communication with the identification numbers of electronic cameras preregistered at the electronic instrument and may refuse any further communication with the electronic camera if it is determined to be an unregistered camera. In this case, the subject can reject any image data photographed by a stranger.
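Such preregistration filtering reduces to a set-membership check on the instrument side, for example (the identification numbers shown are placeholders):

```python
REGISTERED_CAMERA_IDS = {"CAM-001234", "CAM-009876"}   # cameras preregistered by the subject

def accept_search_communication(camera_id: str) -> bool:
    """Continue communication only with cameras preregistered at the electronic instrument."""
    return camera_id in REGISTERED_CAMERA_IDS

print(accept_search_communication("CAM-001234"))   # True: known camera
print(accept_search_communication("CAM-555555"))   # False: treated as a stranger's camera
```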
  • the image data upload information is exchanged between the electronic camera and an electronic instrument through wireless communication.
  • the electronic camera may transmit information on the specifications of the image data (the data volume, the recording format, the image aspect ratio, the number of pixels, etc.) to the electronic instrument to allow the electronic instrument side to decide whether or not the image data are to be downloaded in correspondence to a specific purpose of use for the image data based upon the received image data specification information. For instance, if the party on the electronic instrument side only wishes to review image data displayed at the screen of his portable telephone, he may choose to download image data with a small data volume or a small number of pixels.
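On the instrument side this decision could be a simple threshold test against the received specification information; the field names and limits below are assumed, not specified by the disclosure:

```python
def should_download(spec: dict, max_bytes: int = 200_000, max_pixels: int = 640 * 480) -> bool:
    """Decide whether the image data are worth downloading for small-screen review,
    based on the specification information received from the camera."""
    return spec.get("data_volume", 0) <= max_bytes and spec.get("pixels", 0) <= max_pixels

print(should_download({"data_volume": 2_500_000, "pixels": 2048 * 1536}))  # False: full-size frame
print(should_download({"data_volume": 80_000, "pixels": 320 * 240}))       # True: small preview
```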
  • the image data upload information is automatically exchanged between the electronic camera and the electronic instrument held by a subject through wireless communication during a photographing operation.
  • photographed image data may be saved into a RAM or a memory card on a temporary basis
  • a series of sets of image data may be uploaded to the image server and the image data upload information may be exchanged between the electronic camera and the electronic instrument held by the subject through wireless communication when a series of photographing operations has been completed. Since this eliminates the need to upload the image data and engage in wireless communication each time one of a plurality of pictures taken without changing the composition (as in an exposure bracket photographing operation) is photographed, the series of pictures can be taken quickly. Then, after the series of photographing operations is completed or the like, the plurality of sets of image data can be uploaded in a batch and the upload information with regard to the plurality of sets of image data can be exchanged through a single wireless communication, either automatically or manually.
  • image data are uploaded to the image server and the electronic camera and the electronic instrument held by the subject engage in wireless communication every time a photographing operation is performed.
  • the image data upload operation and the wireless communication operation may be disallowed if the image data upload and the wireless communication are likely to cause a delay in the photographing operation. For instance, by automatically disallowing the image data upload operation and the wireless communication operation when performing a continuous photographing (continuous shooting) operation, it is possible to prevent any significant lag attributable to the image data upload operation and the wireless communication operation from manifesting between the image frames obtained through the continuous photographing operation.
  • the image data obtained through the continuous photographing operation should be saved into RAM or a memory card on a temporary basis so that the series of image data can be uploaded in a batch to the image server and the image data upload information can be exchanged between the electronic camera and the electronic instrument held by the subject through wireless communication either automatically or manually after the series of pictures are taken.
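The temporary buffering and batched exchange described in the last few items might be organized as follows; this is a sketch in which the uploader and notifier callables stand in for the upload path and the short-distance communication path:

```python
class BurstBuffer:
    """Hold frames during continuous shooting, then upload in a batch and exchange
    the upload information in a single wireless communication."""
    def __init__(self, uploader, notifier):
        self.uploader = uploader      # callable: list of frames -> list of upload info
        self.notifier = notifier      # callable: list of upload info -> None
        self.frames = []

    def add_frame(self, image_data: bytes) -> None:
        self.frames.append(image_data)             # no upload, no wireless traffic per frame

    def finish_burst(self) -> None:
        if not self.frames:
            return
        upload_info = self.uploader(self.frames)   # single batched upload to the image server
        self.notifier(upload_info)                 # single exchange with the electronic instrument
        self.frames.clear()

buf = BurstBuffer(uploader=lambda frames: [f"upload-info-{i}" for i, _ in enumerate(frames)],
                  notifier=lambda infos: print("notify instrument:", infos))
for frame in (b"frame1", b"frame2", b"frame3"):
    buf.add_frame(frame)
buf.finish_burst()
```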
  • the upload information and the print instruction information corresponding to the image data to be printed are transmitted to the printer from the electronic camera in response to an operation of the PRINT button.
  • the print instruction information transmitted through this process may include information on various printer settings (such as the print size, the print quality and the printing color).
  • the upload information and the print instruction information corresponding to the image data to be printed are transmitted to the printer from the electronic camera in response to an operation of the PRINT button.
  • the electronic camera may obtain from the printer the identification information (an IP address) used to identify the printer on the network and may transmit this identification information to the image server to enable the image server to transmit the image data to be printed to the printer corresponding to the identification information.
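The printer-side variation could proceed roughly as sketched below, with stub objects standing in for the short-distance link to the printer and for the image server; none of these names come from the disclosure:

```python
class PrinterLinkStub:
    def query_printer_address(self) -> str:
        return "192.0.2.17"        # documentation-range address standing in for the printer

class ImageServerStub:
    def push_to_printer(self, image_id: str, printer_ip: str) -> None:
        print(f"server sends {image_id} to printer at {printer_ip}")

def print_via_image_server(printer_link, image_server, image_id: str) -> None:
    printer_ip = printer_link.query_printer_address()    # obtained over short-distance wireless
    image_server.push_to_printer(image_id, printer_ip)   # the server, not the camera, sends the data

print_via_image_server(PrinterLinkStub(), ImageServerStub(), "IMG_0001")
```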
  • information (address information) with regard to an upload of information (image) to an information server (image server) on the Internet is exchanged through short-distance communication between the information (image) sender side and the information (image) recipient side, the information (image) is first uploaded to the information server on the Internet from the information (image) sender side and the information (image) recipient side downloads the information (image) for utilization from the information server (image server) on the Internet to which the information (image) has been uploaded.
  • an information transmission system (image transmission system) and an information transmission method (image transmission method) that enable a speedy transmission of the information (image) to the information (image) recipient side without placing a great onus on the information (image) sender side during the information (image) transmission are provided.
  • Since the electronic camera is only required to execute the processing for transmitting the upload information to the individual portable telephones through short-distance communication and the processing for uploading the image data to the image server on the Internet, the onus of having to transmit the image data to the individual portable telephones through short-distance communication is lifted from the electronic camera and the desired image data can be obtained promptly at each portable telephone as well.
  • the various programs explained in reference to the embodiments can be provided in a recording medium such as a CD-ROM or through data signals on the Internet.
  • a program executed on the personal computer 140 in FIG. 1 may be read from a CD-ROM drive device (not shown) mounted at the personal computer 140 .
  • the electronic camera 100 or the portable telephone 160 may connect with the personal computer 140 to download an upgrade program.
  • the programs may be downloaded via the Internet 130 by adopting the configuration shown in FIG. 1 as well.
  • by providing an application program server in place of the image server 150 in the configuration, the programs may be downloaded from this server via the Internet.
  • When providing the programs via the Internet, they are embodied as data signals on a carrier wave to be transmitted via a communication line.
  • programs can each be distributed as a computer-readable computer program product assuming any of the various modes such as a recording medium or carrier wave.

Abstract

In an electronic apparatus having a user identification function or in a user identification method, an electronic apparatus and an electronic instrument having stored therein user personal information communicate wirelessly with each other so as to enable the electronic apparatus to automatically identify the electronic apparatus user. If the electronic apparatus identifies a plurality of possible users, the electronic apparatus automatically selects the user by executing specific user identification processing.

Description

    INCORPORATION BY REFERENCE
  • The disclosures of the following applications are herein incorporated by reference:
      • Japanese Patent Application No. 2001-368014 filed Dec. 3, 2001
      • Japanese Patent Application No. 2001-373831 filed Dec. 7, 2001
      • Japanese Patent Application No. 2001-375568 filed Dec. 10, 2001
    TECHNICAL FIELD
  • The present invention relates to an electronic apparatus that identifies a user and performs an operation corresponding to the user and a method for identifying a user of an electronic apparatus and identifying an electronic apparatus to be utilized by the user and, more specifically, it relates to an electronic apparatus that identifies a user through communication executed between the electronic apparatus and the user and a method for identifying an electronic apparatus user and an electronic apparatus to be utilized by a user by executing communication between the electronic apparatus and the user.
  • The present invention also relates to an electronic camera, an image display apparatus and an image display system that stores electronic image data obtained by photographing a desired subject with the electronic camera into a storage medium, reads out the electronic image data from the storage medium and displays an image reproduced from the electronic image data at a specific screen and, more specifically, it relates to an electronic camera, an image display apparatus and an image display system that brings up a display of information related to the image data at the screen at which the image reproduced from the image data is displayed.
  • In addition, the present invention relates to an information transmission system and an information transmission method through which data are exchanged among portable terminals or the like and, more specifically, it relates to an image transmission system and an image transmission method through which electronic image data obtained by photographing a desired subject with a photographing apparatus such as an electronic camera are transmitted to an electronic instrument such as a portable telephone terminal having an image display function.
  • BACKGROUND ART
  • Electronic apparatuses such as cameras that identify electronic apparatus users and engage in a specific operation corresponding to a given user are known in the related art. For instance, Japanese Laid-Open Patent Publication No. H 11-164282 discloses a digital camera having a card reader unit, which allows a user to enter personal information into the digital camera through the card reader unit, automatically connects with a data server corresponding to the user based upon the entered personal information and automatically transfers image data obtained through a photographing operation performed with the digital camera to the data server.
  • The electronic apparatus in the related art described above requires a magnetic card or a memory card having stored therein the personal information to be directly loaded at the electronic apparatus or necessitates a wired connection of the magnetic card or the memory card to the electronic apparatus so that the personal information can be read into the electronic apparatus.
  • However, having to directly load a magnetic card or a memory card at the electronic apparatus or connect the magnetic card or the memory card to the electronic apparatus, by means of wiring, each time the user needs to use the electronic apparatus places a considerable onus on the user and, moreover, there is a problem in that the electronic apparatus cannot be used immediately in a situation that arises suddenly. For instance, there is a risk of missing a good photo opportunity while setting the personal information in the digital camera.
  • In addition, in the electronic apparatus in the related art described above, the personal information is utilized for limited purposes such as the automatic selection of the recipient to which images are to be transmitted, and setting and display operations are performed in the electronic apparatus without fully reflecting the personal information. As a result, the user may not always be satisfied with the ease of operation it affords. In particular, the design of the electronic apparatus gives little consideration to users such as first-time users, seniors and children, who may not be accustomed to operating electronic apparatuses.
  • People also store image data obtained through a photographing operation performed with an electronic camera on a trip or the like into a memory card and later review images reproduced from the image data displayed at a liquid crystal display screen of the electronic camera in which the memory card is loaded or at a screen of a personal computer to which the image data have been transferred from the memory card.
  • The user reviewing electronic image data in the related art as described above is simply allowed to view the reproduced images on display, and it is difficult to expand the range of information utilization beyond reviewing the images.
  • For instance, if the user becomes interested in a historical structure or the like photographed in the image he is currently reviewing, he has to stop reviewing images temporarily to look it up in an encyclopedia or to conduct a search on the Internet. In addition, if the user becomes interested in a person (the person's name, hobby, etc.) photographed in the image he is reviewing, the user cannot obtain that person's personal information immediately on the spot.
  • In addition, image data obtained through a photographing operation of an electronic camera are transferred to another image display terminal for reviewing the image data through an off-line information transmission method, in which the image data are temporarily saved into a memory card and then the memory card is loaded at the other image display terminal or through an on-line information transmission method, in which the image data are transmitted through a wireless or wired communication from the electronic camera to the other image display terminal, in the related art. For instance, Japanese Laid-Open Patent Publication No. H 10-341388 discloses an image transmission system which enables a direct transmission of image data from an electronic camera to another electronic camera through optical communication achieved by using infrared light.
  • There are disadvantages to the off-line image transmission system and the off-line image transmission method in the related art described above in that since they necessitate an exchange of an information storage medium such as a memory card between the electronic camera and the image display terminal, the storage medium must be frequently loaded and unloaded, and data cannot be obtained and recorded through a photographing operation in the meantime, and in that since the recording medium must be physically transported, there is a time lag before the data can be utilized on the recipient side.
  • While the need to transport the information recording medium is eliminated by adopting the on-line transmission system and the on-line image transmission method in the related art described above, there is a problem in that data cannot be obtained and recorded through a photographing operation or the like while communication is in progress at a low transmission speed. In particular, the length of time required to transmit image data the volume of which is much larger than that of standard text data or the like tends to be very significant. Furthermore, when image data of a group picture of a number of people taken with an electronic camera are directly transmitted by the electronic camera to portable telephones or the like of the plurality of subjects, the image data transmission takes a great deal of time.
  • DISCLOSURE OF THE INVENTION
  • The present invention provides, preferably, an electronic apparatus that has a user identification function for performing identification of the user of the electronic apparatus through a wireless method so as to speed up the user identification without placing an onus on the user and for effectively preventing an erroneous identification that would identify a party other than the user when executing the user identification through the wireless method and a user identification method that may be adopted in the electronic apparatus.
  • The present invention also provides, preferably, an electronic apparatus identification method in which identification of an electronic apparatus to be utilized by a user is executed through a wireless method so as to speed up the identification without placing an onus on the user, and an erroneous identification that would identify another electronic apparatus different from the intended electronic apparatus as the electronic apparatus to be utilized is effectively prevented when electronic apparatus identification is executed through the wireless method.
  • The present invention further provides, preferably, an electronic apparatus that has a user identification function that allows beginners/inexperienced users, elderly persons and children as well as experienced users to operate the same electronic apparatus with ease, without requiring any special data input or registration setting to enable the electronic apparatus to execute a user identification.
  • The present invention further provides, preferably, an electronic camera, an image display apparatus and an image display method that enable an easy and prompt review of information related to a subject photographed in an image, which interests the user reviewing a reproduced display of electronic image data in linkage with the reproduced display of the electronic image data.
  • The present invention also provides, preferably, an image transmission system and an image transmission method that allow images to be transmitted to an image recipient side in a short time without placing an onus on the image sender side during an image transmission. More specifically, it provides an image transmission system and an image transmission method that allow image data obtained through a photographing operation executed in an electronic camera to be promptly transmitted to a partner-side electronic instrument without placing a great onus on the electronic camera. Furthermore, the present invention provides an image transmission system and an image transmission method that enable distribution of image data obtained through a photographing operation executed in an electronic camera to portable terminals carried by a plurality of parties who are subjects without placing a great onus on the electronic camera.
  • In the electronic apparatus having a user identification function and the identification method according to the present invention, the electronic apparatus automatically identifies the user of the electronic apparatus through wireless communication executed between the electronic apparatus and an electronic instrument having stored therein user personal information. In addition, if the electronic apparatus identifies a plurality of possible users, the electronic apparatus automatically selects the correct user based upon specific user identification processing.
  • In the electronic apparatus identification method according to the present invention, an electronic instrument carried by a user automatically identifies an utilization-object, or target, electronic apparatus through wireless communication executed between the electronic apparatus and the electronic instrument. In addition, if the electronic instrument identifies a plurality of possible utilization-object electronic apparatuses, the electronic instrument automatically selects the correct electronic apparatus based upon specific identification processing.
  • In the electronic apparatus having a user identification function and the identification method adopted in the electronic apparatus according to the present invention, the electronic apparatus engages in wireless communication with an electronic instrument having stored therein personal information that is available for general purposes and in the electronic apparatus, to have the personal information transferred to the electronic apparatus. The user is categorized based upon the personal information available for general purposes and various operational settings are automatically selected for the electronic apparatus based upon the categorization results.
  • In the electronic camera, the image display apparatus and the image display method according to the present invention, information related to a subject, which is made to correlate to the position of the subject in the image plane is appended to the electronic image data and stored in memory and the information related to the subject is displayed in correspondence to the subject position within the image plane in a reproduced display of the electronic image data.
  • In the information transmission system (or image transmission system) and the information transmission method (or image transmission method) according to the present invention, an image generating apparatus (a photographing apparatus, or an electronic camera) has a function which allows it to be connected to the Internet; information (or an image) that has been generated and appended with identification data (or a file name) is first transmitted to a specific information server (or image server) on the Internet where it is temporarily saved (or uploaded); a direct connection is established with an electronic instrument which is the recipient of the information (image) through a short-distance communication; information indicating the address of the information server (or image server) on the Internet and the identification data corresponding to the generated information (image) are transmitted to the electronic instrument; and the electronic instrument utilizes the information (image) by reading it out (or downloading it) from the information server (or image server) on the Internet based upon the address information and the identification data having been received.
  • In the information transmission system (or image transmission system) and the information transmission method (or image transmission method) according to the present invention, an image generating apparatus (a photographing apparatus, or an electronic camera) achieves a direct connection with an electronic instrument which is the recipient of information (or image) through short-distance communication, receives information indicating the address of the electronic instrument on the Internet from the electronic instrument on the partner side and transmits (or uploads), on a temporary basis, the address information of the electronic instrument together with the information (or image) which has been generated to a specific information server (or image server) on the Internet, and the information server (or image server), in turn, automatically transmits the information (or image) which has been received to the electronic instrument corresponding to the received address information.
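The two transmission variants described above can be summarized as the following toy message sequence; the simulator classes and file names are purely illustrative and do not appear in the disclosure:

```python
class ImageServerSim:
    """Toy stand-in for the image server on the Internet, used in both variants."""
    def __init__(self):
        self.store = {}
    def upload(self, file_name: str, image: bytes) -> None:
        self.store[file_name] = image
    def download(self, file_name: str) -> bytes:
        return self.store[file_name]
    def upload_and_forward(self, file_name: str, image: bytes, recipient) -> None:
        self.upload(file_name, image)
        recipient.receive(file_name, image)      # second variant: server pushes to the recipient

class InstrumentSim:
    def __init__(self):
        self.received = {}
    def receive(self, file_name: str, image: bytes) -> None:
        self.received[file_name] = image

server = ImageServerSim()

# Variant 1: the camera uploads, then hands (server address, file name) to the
# instrument over short-distance communication; the instrument downloads.
server.upload("IMG_0001.jpg", b"...jpeg bytes...")
upload_info = ("image.example.net", "IMG_0001.jpg")      # passed over the short-distance link
image = server.download(upload_info[1])

# Variant 2: the camera first learns the instrument's address over short-distance
# communication, uploads it together with the image, and the server forwards automatically.
instrument = InstrumentSim()
server.upload_and_forward("IMG_0002.jpg", b"...jpeg bytes...", instrument)
```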
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a conceptual diagram of the system configuration adopted in a first embodiment of the present invention;
  • FIG. 2 is an external view (a front view) of the electronic camera achieved in the first embodiment of the present invention;
  • FIG. 3 is an external view (a rear view) of the electronic camera achieved in the first embodiment of the present invention;
  • FIG. 4 is a block diagram of the electrical structure adopted in the electronic camera in the first embodiment;
  • FIG. 5 shows the structure of the user registration information;
  • FIG. 6 shows the structure of the personal information data;
  • FIG. 7 is a transition diagram of the state assumed in the electronic camera according to the present invention;
  • FIG. 8 presents a correspondence list relevant to the user identification operation;
  • FIG. 9 presents a flowchart of the main routine executed at the CPU;
  • FIG. 10 presents a flowchart of a subroutine;
  • FIG. 11 presents a flowchart of a subroutine;
  • FIG. 12 presents a flowchart of a subroutine;
  • FIG. 13 presents an example of screen display;
  • FIG. 14 presents a flowchart of a subroutine;
  • FIG. 15 presents a flowchart of a subroutine;
  • FIG. 16 presents an example of screen display;
  • FIG. 17 presents a flowchart of a subroutine;
  • FIG. 18 presents a flowchart of a subroutine;
  • FIG. 19 presents an example of screen display;
  • FIG. 20 presents a flowchart of a subroutine;
  • FIG. 21 presents a flowchart of a subroutine;
  • FIG. 22 presents an example of screen display;
  • FIG. 23 presents an example of screen display;
  • FIG. 24 presents an example of screen display;
  • FIG. 25 presents an example of screen display;
  • FIG. 26 presents a flowchart of a subroutine;
  • FIG. 27 presents an example of screen display;
  • FIG. 28 presents an example of screen display;
  • FIG. 29 presents an example of screen display;
  • FIG. 30 presents an example of screen display;
  • FIG. 31 presents a flowchart of a subroutine;
  • FIG. 32 presents a flowchart of a subroutine;
  • FIG. 33 presents a flowchart of a subroutine;
  • FIG. 34 presents a flowchart of a subroutine;
  • FIG. 35 illustrates the structure of the wireless circuit;
  • FIG. 36 illustrates the communication directivity;
  • FIG. 37 illustrates the positional detection;
  • FIG. 38 presents a flowchart of a subroutine;
  • FIG. 39 presents a flowchart of a subroutine;
  • FIG. 40 presents an example of screen display;
  • FIG. 41 presents a flowchart of a subroutine;
  • FIG. 42 presents a flowchart of a subroutine;
  • FIG. 43 presents a flowchart of a subroutine;
  • FIG. 44 presents an example of screen display;
  • FIG. 45 illustrates the diopter adjustment;
  • FIG. 46 presents a flowchart of a subroutine;
  • FIG. 47 illustrates communication achieved through a user internal path;
  • FIG. 48 presents a flowchart of a subroutine;
  • FIG. 49 is a conceptual diagram of an example of a variation of the system configuration;
  • FIG. 50 shows an example of a variation of the communication sequence;
  • FIG. 51 illustrates a communication mode;
  • FIG. 52 presents an external view of a portable telephone;
  • FIG. 53 presents an example of screen display;
  • FIG. 54 presents an example of screen display;
  • FIG. 55 presents an example of screen display;
  • FIG. 56 illustrates the concept adopted in a second embodiment of the present invention;
  • FIG. 57 illustrates the system configuration adopted in the second embodiment of the present invention;
  • FIG. 58 is an external view (a front view) of the electronic camera achieved in the second embodiment of the present invention;
  • FIG. 59 is an external view (a rear view) of the electronic camera achieved in the second embodiment of the present invention;
  • FIG. 60 is a block diagram of the electrical structure adopted in the electronic camera in the second embodiment;
  • FIG. 61 shows the structure of the data in the memory;
  • FIG. 62 shows the structure of the photograph information data;
  • FIG. 63 shows the structure of the general information data;
  • FIG. 64 shows the structure of the personal information data;
  • FIG. 65 is a transition diagram of the state assumed in an electronic camera according to the present invention;
  • FIG. 66 presents a flowchart of the main program;
  • FIG. 67 presents a flowchart of a subroutine;
  • FIG. 68 presents an example of screen display;
  • FIG. 69 presents a flowchart of a subroutine;
  • FIG. 70 presents a flowchart of a subroutine;
  • FIG. 71 illustrates the communication sequence;
  • FIG. 72 illustrates the position information detection;
  • FIG. 73 illustrates the position information detection;
  • FIG. 74 illustrates the position information detection;
  • FIG. 75 presents a flowchart of a subroutine;
  • FIG. 76 presents a flowchart of a subroutine;
  • FIG. 77 presents an example of screen display;
  • FIG. 78 presents a flowchart of a subroutine;
  • FIG. 79 presents an example of screen display;
  • FIG. 80 presents a flowchart of a subroutine;
  • FIG. 81 presents an example of screen display;
  • FIG. 82 presents an example of screen display;
  • FIG. 83 presents an example of screen display;
  • FIG. 84 presents an example of screen display;
  • FIG. 85 presents an example of screen display;
  • FIG. 86 illustrates the structure of the wireless circuit;
  • FIG. 87 illustrates the communication directivity;
  • FIG. 88 presents an external view of a portable telephone;
  • FIG. 89 presents an example of screen display;
  • FIG. 90 illustrates the image-plane position information detection;
  • FIG. 91 illustrates the communication directivity;
  • FIG. 92 presents a flowchart of a subroutine;
  • FIG. 93 presents an example of screen display;
  • FIG. 94 presents an example of screen display;
  • FIG. 95 illustrates how a subject is specified;
  • FIG. 96 presents a flowchart of a subroutine;
  • FIG. 97 illustrates an example of a variation of the system configuration;
  • FIG. 98 illustrates the concept adopted in a third embodiment of the present invention;
  • FIG. 99 illustrates the system configuration adopted in the third embodiment of the present invention;
  • FIG. 100 is an external view (front view) of the electronic camera achieved in the third embodiment of the present invention;
  • FIG. 101 is an external view (rear view) of the electronic camera achieved in the third embodiment of the present invention;
  • FIG. 102 presents an external view (front view) of a portable telephone terminal;
  • FIG. 103 is a block diagram of the electrical structure adopted in the electronic camera;
  • FIG. 104 is a block diagram of the electrical structure adopted at the portable telephone terminal;
  • FIG. 105 presents a list of upload dial numbers and the corresponding image servers;
  • FIG. 106 shows the structure of the data in the memory card;
  • FIG. 107 shows the structure of the photograph information data;
  • FIG. 108 presents a transition diagram of the operating state of the electronic camera;
  • FIG. 109 presents a flowchart of the main program;
  • FIG. 110 presents a flowchart of a subroutine;
  • FIG. 111 presents an example of screen display;
  • FIG. 112 presents a flowchart of a subroutine;
  • FIG. 113 illustrates the communication sequence;
  • FIG. 114 presents a flowchart of a subroutine;
  • FIG. 115 presents a flowchart of a subroutine;
  • FIG. 116 presents an example of screen display;
  • FIG. 117 presents a flowchart of a subroutine;
  • FIG. 118 presents an example of screen display;
  • FIG. 119 illustrates the communication sequence;
  • FIG. 120 presents a flowchart of a subroutine;
  • FIG. 121 presents an example of screen display;
  • FIG. 122 illustrates the communication sequence;
  • FIG. 123 illustrates the concept adopted in an embodiment of the present invention;
  • FIG. 124 presents a flowchart of a subroutine;
  • FIG. 125 presents an example of screen display;
  • FIG. 126 illustrates the structure of the wireless circuit;
  • FIG. 127 illustrates the communication directivity; and
  • FIG. 128 illustrates the concept adopted in an embodiment of the present invention.
  • BEST MODE FOR CARRYING OUT THE INVENTION FIRST EMBODIMENT
  • The first embodiment of the present invention is explained in reference to the drawings. FIG. 1 is a conceptual diagram of an electronic camera adopting the present invention and an electronic image communication system achieved by utilizing the electronic camera. In FIG. 1, an electronic camera 100 that includes a memory card saves electronic image data or digital image data (hereafter referred to as image data) obtained through a photographing operation into the memory card. In addition, the electronic camera 100 has a short-distance wireless communication function (e.g., Bluetooth (registered trademark), enabling communication within a range of approximately 10 m) and engages in communication with a portable telephone 160 that also has a short-distance wireless communication function. It is assumed that this portable telephone 160 is carried by a user of the electronic camera 100 and that a UIM card (user identification module card) 170 having stored therein user personal information (or user information) can be loaded at the portable telephone 160.
  • Through communication between the electronic camera 100 and the portable telephone 160 carried by the user, the user personal information is transferred to the electronic camera 100 and the electronic camera 100, in turn, sets itself up for the specific user in conformance to the transferred personal information. In addition, image data obtained through a photographing operation at the electronic camera 100 are transferred to the portable telephone 160 from the electronic camera 100, and then the image data are transferred to a personal computer 140 for personal use or an image server 150, first via a wireless base station 120 through a wireless portable telephone line 120 and then via a wired or wireless public telephone line or the Internet 130.
  • FIGS. 2 and 3 present external views (a front view and a rear view) of an embodiment of the electronic camera 100 in FIG. 1. As shown in FIG. 2, at the front of the electronic camera 100, a photographic lens 10 which forms a subject image, a viewfinder 11 used to check the photographic image plane, a strobe 12 used to illuminate the subject during a photographing operation, a photometering circuit 13 that detects the brightness of the subject and a grip portion 14 that projects out from the camera main body to allow the electronic camera 100 to be held by hand are provided, whereas a shutter release button 16 operated to issue an instruction for a photographing start and a power switch 17 (a momentary switch which is alternately set to ON or OFF each time it is operated) through which the on/off state of the power to the electronic camera 100 is controlled are provided at the upper surface.
  • As shown in FIG. 3, at the rear of the electronic camera 100, an eyepiece unit of the viewfinder 11, a MODE dial 19 that can be rotated to select a user-identify mode (a registration mode selected for user registration, an automatic mode selected for automatic user identification or a fixed mode selected for the user), a dot 20 used to indicate the setting position of the MODE dial 19, a left LCD (left screen) 21 having a substantially quadrangular screen for displaying text and images and a right LCD (right screen) 22 having a substantially quadrangular screen for displaying text and images are provided. In the vicinity of the left screen 21 on its left side, an up button 23 and a down button 24 operated to switch the image displayed at the left screen 21 are provided, whereas under the right screen 22 and the left screen 21, a SET button 25 operated to set various operations of the electronic camera 100, a SEND button 26 operated to issue an instruction for an image data transmission to the outside, a CONFIRM button 27 used to finalize user selections and setting items and an INITIALIZE button 29 operated to initialize the various operational settings in the electronic camera 100 are provided. Above the left screen 21, a PHOTOGRAPH/DISPLAY button 28 operated to select a reproduction mode in which image data saved in a memory card 104 are displayed at the left screen 21 or a photographing mode in which image data are obtained through photographing (the photographing mode or the reproduction mode is alternately selected each time the button is operated) is provided. At a side surface, a memory card slot 30 in which the memory card 104 can be loaded is provided.
  • It is to be noted that the shutter release button 16, the MODE dial 19, the up button 23, the down button 24, the SET button 25, the SEND button 26, the CONFIRM button 27, the PHOTOGRAPH/DISPLAY button 28 and the INITIALIZE button 29 are all operating keys operated by the user.
  • It is also to be noted that over the surfaces of the left screen 21 and the right screen 22, so-called touch tablets 66 having a function of outputting position data corresponding to a position indicated through a finger contact operation are provided, and they can be used to make a selection from selection items displayed on the screens or to select specific image data. The touch tablets 66 are each constituted of a transparent material such as glass or resin so that the user can observe images and text formed inside the touch tablets 66 through the touch tablets 66.
  • In the block diagram in FIG. 4, which presents an example of an internal electrical structure that may be assumed in the electronic camera 100 shown in FIGS. 2 and 3, various components are connected with one another via a data/control bus 51 through which various types of information data and control data are transmitted.
  • The components are divided into the following five primary blocks, i.e., 1) a block, the core of which is a photographing control circuit 60 that executes an image data photographing operation, 2) a block, the core of which is constituted by a wireless transmission circuit 71 and a wireless reception circuit 72 that execute transmission/reception of image data to and from the outside, 3) a block constituted of the memory card 104 in which image data are stored and saved, 4) a block, the core of which is a screen control circuit 92 that executes display of image data and information related to the image data and 5) a block, the core of which is constituted of user interfaces such as operating keys 65 and a CPU 50 that implements integrated control on various control circuits.
  • The CPU 50 (central processing unit), which is a means for implementing overall control on the electronic camera 100, issues various instructions for the photographing control circuit 60, the wireless transmission circuit 71, the wireless reception circuit 72, the screen control circuit 92 and a power control circuit 64 in conformance to information input from the operating keys 65, the touch tablets 66, various detection circuits 70, the power switch 17, a timer 74, the photometering circuit 13, a GPS circuit 61 and an attitude change detection circuit 62. It is to be noted that the various detection circuits 70 include a grip sensor to be detailed later and the like.
  • The photometering circuit 13 measures the brightness of the subject and outputs photometric data indicating the results of the measurement to the CPU 50. In conformance to the photometric data, the CPU 50 sets the length of the exposure time and the sensitivity of a CCD 55 through a CCD drive circuit 56, and in addition, it controls the aperture value for an aperture 53 with an aperture control circuit 54 via the photographing control circuit 60 in conformance to the data indicating the settings. In the photographing mode, the CPU 50 controls the photographing operation via the photographing control circuit 60 in response to an operation of the shutter release button 16. In addition, if the photometric data indicate a low subject brightness, the CPU 50 engages the strobe 12 in a light emission operation via a strobe drive circuit 73 during the photographing operation.
  • The GPS circuit 61 (global positioning system circuit) detects information indicating the position of the electronic camera 100 by using information provided by a plurality of satellites orbiting around the earth and provides the detected position information to the CPU 50. When an image is photographed, the CPU 50 transfers this position information or processed position information (the name of the site, the geographic location or the like) together with the image data to the memory card 104 for storage.
  • The attitude-change detection circuit 62, which is constituted of an attitude sensor of the known art or the like, capable of detecting a change in the attitude of the electronic camera 100, provides information indicating the extent of the attitude change having occurred between the immediately preceding photographing operation and the current photographing operation to the CPU 50. Based upon the attitude change information, a decision can be made with regard to whether or not the photographic composition has been changed for the current photographing operation from that for the immediately preceding photographing operation. The CPU 50 transfers this attitude change information together with the image data to the memory card 104 for storage.
  • The timer 74, having an internal clock circuit provides time information corresponding to the current time point to the CPU 50. The CPU 50 transfers the time point information indicating a photographing operation time point together with the image data to the memory card 104 for storage. The CPU 50 controls the various units in conformance to a control program stored in a ROM 67 (read-only memory). In an EEPROM 68 (electrically erasable/programmable ROM) which is a nonvolatile memory, personal information, registration setting information and the like necessary for the operations of the electronic camera 100 are stored. The CPU 50 implements control on a power supply 63 via the power control circuit 64 by detecting the operating state of the power switch 17.
  • The photographing control circuit 60 focuses or zooms the photographic lens 10 through a lens drive circuit 52, controls the exposure quantity at the CCD 55 through control implemented on the aperture 53 by engaging the aperture control circuit 54 and controls the operation of the CCD 55 through the CCD drive circuit 56. The photographic lens 10 forms a subject image onto the CCD 55 with a light flux from the subject via the aperture 53 which adjusts the light quantity. This subject image is captured by the CCD 55. The CCD 55 (charge-coupled device) having a plurality of pixels is a charge storage type image sensor that captures a subject image, and outputs electrical image signals corresponding to the intensity of the subject image formed on the CCD 55 to an analog processing unit 57 in response to a drive pulse supplied by the CCD drive circuit 56.
  • The analog processing unit 57 samples the image signals having undergone photoelectric conversion at the CCD 55 with predetermined timing and amplifies the sampled signals to a predetermined level. An A/D conversion circuit 58 (analog-digital conversion circuit) converts the image signals sampled at the analog processing unit 57 to digital data through digitization and the digital data are temporarily stored into a photographic buffer memory 59.
  • In the photographing mode, the photographing control circuit 60 repeatedly executes the operation described above and the screen control circuit 92 repeatedly executes an operation in which the digital data sequentially stored into the photographic buffer memory 59 are read out via the data/control bus 51, the digital data thus read out are temporarily stored into a frame memory 69, the digital data are converted to display image data and are stored back into the frame memory 69 and the display image data are displayed at the left screen 21. In addition, the screen control circuit 92 obtains text display information from the CPU 50 as necessary, converts it to display text data which are then stored into the frame memory 69 and displays the display text data at the left screen 21 and the right screen 22. Since the image currently captured by the CCD 55 is displayed at the left screen 21 in real time in the photographing mode in this manner, the user is able to set the composition for the photographing operation by using this through screen (through image) as a monitor screen.
  • The photographing control circuit 60 detects the state of the focal adjustment at the photographic lens 10 by analyzing the degree of the high-frequency component of the digital data stored in the photographic buffer memory 59 and performs a focal adjustment for the photographic lens 10 with the lens drive circuit 52 based upon the results of the detection. Upon receiving a photographing instruction from the CPU 50, the photographing control circuit 60 engages the CCD 55 to capture a subject image via the CCD drive circuit 56 and temporarily stores image signals generated through the image-capturing operation as digital data (raw data) into the photographic buffer memory 59 via the analog processing unit 57 and the A/D converter circuit 58. The photographing control circuit 60 then generates image data by converting or compressing the digital data stored in the photographic buffer memory 59 on a temporary basis to a predetermined recording format (such as JPEG) and stores the image data back into the photographic buffer memory 59. The photographing control circuit 60 stores the image data stored back into the photographic buffer memory 59 into the memory card 104 as an image file.
  • In the reproduction mode, the screen control circuit 92 reads out an image file specified by the CPU 50 from the memory card 104, temporarily stores the image-file into the frame memory 69, converts the image data to display image data and stores the display image data back into the frame memory 69 and then displays the display image data at the left screen 21.
  • In the reproduction mode, upon receiving a transmission instruction from the CPU 50, the wireless transmission circuit 71 reads out a specified image file from the memory card 104 and outputs this image file to the outside through wireless transmission. The wireless transmission circuit 71 and the wireless reception circuit 72 engage in communication with an external wireless instrument during a user identification operation and provides the results of the communication to the CPU 50.
  • FIGS. 5 and 6 show the data structure of user registration information saved within the EEPROM 68. As shown in FIG. 5, the user registration information is prepared for individual users (user A, user B, default user and most recent user in FIG. 5). The default user is a general-purpose user set for use in case a user identification cannot be performed successfully, and the most recent user is the user who last operated the electronic camera 100. Each set of individual user information includes identification data (individual identification data used to identify the instrument at the communication partner), personal information data (see FIG. 6), camera data and priority order data. The camera data are constituted of camera information, communication information, custom setting data, last operating time point data and last setting data.
  • As shown in FIG. 6, the personal information data are general personal-information data having no relevance to the settings at the electronic camera 100, which can be utilized for multiple purposes. For this reason, it is not necessary to specially set these personal information data in order to operate the electronic camera 100. The personal information data include the name, the date of birth and the preferred language of the individual, physical data (visual acuity, diopter(eyesight) and hand preference), tastes (favorite color, etc.) and other data, and the electronic camera 100 can be customized by using the personal information data.
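For illustration, the registration records of FIGS. 5 and 6 might be modelled with data structures along the following lines; the field types and defaults are assumptions, as the figures do not prescribe them:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class PersonalInformation:
    """General-purpose personal information (FIG. 6): nothing camera-specific."""
    name: str
    date_of_birth: str
    preferred_language: str
    visual_acuity: Optional[float] = None
    diopter: Optional[float] = None
    hand_preference: str = "right"
    favorite_color: Optional[str] = None

@dataclass
class CameraData:
    """Camera-side data associated with one registered user (FIG. 5)."""
    camera_information: dict = field(default_factory=dict)
    communication_information: dict = field(default_factory=dict)
    custom_settings: dict = field(default_factory=dict)
    last_operating_time: Optional[str] = None
    last_settings: dict = field(default_factory=dict)

@dataclass
class UserRegistration:
    identification_data: str          # identifies the instrument at the communication partner
    personal_information: PersonalInformation
    camera_data: CameraData
    priority_order: int

# The EEPROM would hold one such record per user, plus "default" and "most recent" entries.
```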
  • FIG. 7 is a transition diagram of the state assumed in the electronic camera in the embodiment of the present invention. As the power is turned on, wireless user identification processing is executed. During the user identification processing, a communication operation and a camera setting operation are performed. Once the user identification processing at power ON is completed, the operation shifts to the photographing mode in which a photographing operation and a camera setting operation are performed. In response to an operation of the PHOTOGRAPH/DISPLAY button 28, the operation shifts from the photographing mode to the reproduction mode or from the reproduction mode to the photographing mode. In the reproduction mode, an image data reproducing operation, a camera setting operation and an image data transmission operation are performed. If the MODE dial 19 is set for registration, the operation shifts to user registration processing from the photographing mode or the reproduction mode, a communication operation and a registration operation for user registration are performed, and then when these operations are completed, the operation shifts back into the original mode, i.e., the photographing mode or the reproduction mode.
  • The table in FIG. 8, provided to facilitate an explanation of the user identification processing operation, indicates that the default user is set as the current user if the number of users detected through the wireless communication operation is 0, the detected user is set as the current user if the number of detected users is one, and the user selected through automatic selection processing which is to be detailed later is set as the current user if the number of detected users is two or more.
  • FIG. 9 presents a chart of the main flow of the operations executed in the electronic camera 100 (the CPU 50) in the embodiment described above. First, the power is turned on as the power switch 17 is operated in S10, and a user identification processing subroutine to be detailed later is executed in S20 to identify the current user of the electronic camera 100. When the user identification processing in S20 is completed, a photographing mode subroutine is executed in S30 and thus the camera enters a photographing-enabled state. If the shutter release button 16 is operated in the photographing mode, a release interrupt processing subroutine in S40 is executed to perform a photographing operation. If the PHOTOGRAPH/DISPLAY button 28 is operated in the photographing mode, a photographing/reproduction interrupt processing subroutine in S70 is executed and a reproduction mode subroutine in S80 is executed to display the reproduced image of image data stored in the memory card 104 at the left screen 21. If, on the other hand, the PHOTOGRAPH/DISPLAY button 28 is operated in the reproduction mode, the photographing/reproduction interrupt processing subroutine in S70 is executed and the photographing mode subroutine in S30 is executed.
  • If the SEND button 26 is operated in the reproduction mode, a transmission interrupt processing subroutine in S90 is executed to transmit the image data currently reproduced in the reproduction mode to the outside. In addition, if the SET button 25 is operated either in the photographing mode or the reproduction mode, a setting interrupt processing in S60 is executed to enable manual setting of the camera; if the INITIALIZE button 29 is operated either in the photographing mode or the reproduction mode, initialize interrupt processing in S95 is executed to initialize the camera settings in correspondence to the specific user; and if the MODE dial 19 is set for registration in the photographing mode or the reproduction mode, a registration interrupt processing in S50 is executed to enable a new user registration.
  • In the detailed flowchart of the user identification processing subroutine presented in FIG. 10, the last user is set as the most recent user in S201 after the subroutine is started up in S20. In S202, the MODE dial 19 is checked to ascertain whether or not it is set to “auto”, the most recent user is set as the current user if the MODE dial 19 is not set to “auto” and, accordingly, the most recent user settings are selected for the camera before the operation proceeds to S210. Namely, if the MODE dial 19 is set to “fixed”, the last user who operated the camera is set as the current user and the camera settings are selected accordingly without executing a user identification. If, on the other hand, it is ascertained in S202 that the MODE dial 19 is set to “auto”, the operation proceeds to step S204 to attempt communication with the outside for user detection by engaging the wireless transmission circuit 71 and the wireless reception circuit 72. At this time, the user detection is performed over the maximum wireless communication range by setting the output signal level G of the wireless transmission circuit 71 to the maximum (Gmax).
  • In S205, a verification as to whether or not the number of registered users among the detected users is 0 is accomplished, and if the number of registered users is 0, the default setting is selected for the user, the camera settings for the default user are selected and the most recent user is updated as the default user in S206 before the operation proceeds to S210. If, on the other hand, it is ascertained in S205 that the number of registered users is not 0, a verification is achieved in S207 as to whether or not the number of registered users among the detected users is 1, and if the number of registered users is 1, the detected registered user is set as the current user, the camera settings for the detected registered user are selected and the most recent user is updated as the detected registered user in S208 before the operation proceeds to S209.
  • If it is ascertained in S207 that the number of registered users among the detected users is not 1, i.e., if the number of registered users among the detected users is at least 2, a single registered user is selected through an automatic user selection subroutine executed in S25 which is to be detailed later, and then the operation proceeds to S209. In S209, the individual user information of the registered user having been determined to be the current user is read and stored into the EEPROM 68 by communicating with the selected registered user again.
  • In step S210, a verification as to whether or not the most recent user whose individual user information is stored in the EEPROM 68 matches the last user is achieved, and if they match, the operation makes a direct return in S211. If, on the other hand, they do not match, an automatic setting subroutine in S22 which is to be detailed later is executed to initialize the camera settings before the operation makes a return in S211.
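  • The user identification flow of FIG. 10, together with the selection rule of FIG. 8, may be summarized in the following illustrative sketch; the function and variable names are assumptions introduced here and do not appear in the embodiment itself.

    GMAX = 1.0  # assumed normalized maximum transmission output level (Gmax)

    def identify_user(camera, detect_registered_users, auto_select, initialize):
        # Rough restatement of the FIG. 10 flow (S201 through S211); all names are
        # assumptions made for this sketch, not taken from the disclosure.
        last_user = camera["most_recent_user"]              # S201
        if camera["mode_dial"] != "auto":                    # S202: dial set to "fixed"
            current = last_user                              # keep the last user, skip identification
        else:
            detected = detect_registered_users(GMAX)         # S204: search at maximum range
            if len(detected) == 0:                           # S205
                current = "default user"                     # S206: fall back to the default user
            elif len(detected) == 1:                         # S207
                current = detected[0]                        # S208: the single detected user
            else:
                current = auto_select(detected)              # S25: automatic user selection
            camera["most_recent_user"] = current             # S209: read and store the user data
        if camera["most_recent_user"] != last_user:          # S210: the user has changed
            initialize(camera)                               # S22: automatic setting subroutine
        return current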
  • In the detailed flowchart of the automatic user selection subroutine presented in FIG. 11, after the subroutine is started up in S25, the output signal level G of the wireless transmission circuit 71 is first initialized to the minimum (Gmin) in S251 to reduce the wireless communication range to the minimum, and then a wireless communication is attempted at the output signal level setting G to communicate with a registered user in S252. A verification is achieved in S253 as to whether or not communication with the registered user has been achieved, and if no communication has been achieved, the output signal level G of the wireless transmission circuit 71 is increased by ΔG in S254. In S255, a verification is achieved as to whether or not the output signal level G of the wireless transmission circuit 71 exceeds the maximum (Gmax), and if the output signal level G does not exceed the maximum level, the operation returns to S252 to attempt communication with a registered user by slightly increasing the output signal level.
  • If it is confirmed in S253 that communication with a registered user has been achieved, a verification is achieved in S257 as to whether or not communication with a plurality of registered users has been achieved at the output signal level setting, and if communication has not been achieved with a plurality of registered users, the registered user with whom communication has been achieved is accepted as the current user, the camera settings for the registered user with whom communication has been achieved are selected and the most recent user is updated as the registered user with whom communication has been achieved in S258 before the operation makes a return in S261.
  • If, on the other hand, it is verified in S257 that communication with a plurality of registered users has been achieved at the output signal level setting, the registered user ranked highest in the priority order among the registered users with whom communication has been achieved is selected as the current user in S259, and the registered user thus selected is set as the current user, the camera settings for the selected registered user are selected and the most recent user is updated as the selected registered user in S260 before the operation makes a return in S261.
  • If it is decided in S255 that the output signal level G of the wireless transmission circuit 71 exceeds the maximum (Gmax), the default user is set as the current user, the camera settings for the default user are selected and the most recent user is updated as the default user in S256 before the operation makes a return in S261.
  • As described above, since the registered users are searched while gradually increasing the wireless communication range in the user automatic selection subroutine shown in FIG. 11, the registered user present within the shortest distance from the camera (likely to be the current user of the camera) can be automatically selected and even if communication is achieved simultaneously with a plurality of registered users, the user most likely to be operating the camera can be selected in conformance to the predetermined priority order.
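  • The range-widening search of FIG. 11 could be sketched as follows; the output-level values and the helper names are assumptions made for this illustration only.

    GMIN, GMAX, DELTA_G = 0.1, 1.0, 0.1   # assumed normalized output signal levels

    def select_by_ramping_output(try_communication, priority_rank, default_user):
        # FIG. 11 sketch: widen the wireless range step by step and take the first
        # registered user(s) reached; try_communication(level) is assumed to return
        # the registered users reachable at that level, and priority_rank(user) the
        # user's rank in the priority order (1 = highest).
        level = GMIN                                     # S251: start at the minimum range
        while level <= GMAX:                             # S255: give up once Gmax is exceeded
            reached = try_communication(level)           # S252: attempt communication
            if reached:                                  # S253: someone answered
                if len(reached) == 1:                    # S257
                    return reached[0]                    # S258: unique nearest user
                return min(reached, key=priority_rank)   # S259/S260: highest-priority user
            level += DELTA_G                             # S254: enlarge the range slightly
        return default_user                              # S256: nobody reached even at Gmax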
  • In the detailed flowchart of the automatic setting processing subroutine presented in FIG. 12, after the subroutine is started up in S22, the last user and the most recent user are displayed at the left screen 21 and a message indicating that the last user and the most recent user do not match is displayed at the right screen 22 in S221. Next, the initialize interrupt processing subroutine in S95 is executed before the operation makes a return in S222.
  • In the detailed flowchart of the initialize interrupt processing subroutine presented in FIG. 14, after the subroutine is started up in S95, the camera settings are initialized to the custom settings corresponding to the most recent user in S951 and then the operation makes a return in S952.
  • In the detailed flowchart of the photographing mode subroutine presented in FIG. 15, after the subroutine is started up in S30, the processing in S301 is repeatedly executed. In S301, image data sequentially generated by the CCD 55 under the camera setting condition selected by the user or in the initialized camera setting condition are displayed at the left screen 21 and the corresponding camera setting condition and the user name are displayed in text at the right screen 22, as shown in FIG. 16.
  • In the detailed flowchart of the release interrupt processing subroutine presented in FIG. 17, after the subroutine is started up in S40, a verification is achieved in S401 as to whether or not the camera is currently set in the photographing mode, and if it is decided that the camera is not set in the photographing mode, the operation makes a return in S403. If, on the other hand, the camera is set in the photographing mode, an image-capturing operation is executed in the camera setting condition corresponding to the user and image data obtained through the image-capturing operation are stored into the memory card 104 in S402. In addition, the image data are displayed at the screen over a predetermined length of time before the operation makes a return in S403.
  • In the detailed flowchart of the reproduction mode subroutine presented in FIG. 18, after the subroutine is started up in S80, the processing in S801 is repeatedly executed. In S801, specific image data stored in the memory card 104 are selected and read out in response to an operation of the direction buttons 23 and 24 and the image data are reproduced and displayed at the left screen 21 and the operating instructions and the user name are displayed at the right screen 22, as shown in FIG. 19.
  • In the detailed flowchart of the transmission interrupt processing subroutine presented in FIG. 20, after the subroutine is started up in S90, a verification is achieved in S901 as to whether or not the camera is currently set in the reproduction mode and the operation makes a return in S903 if the camera is not set in the reproduction mode. If, on the other hand, the camera is currently set in the reproduction mode, the image data the reproduced image of which is currently displayed at the left screen 21 are read out from the memory card 104 and the image data are wirelessly transmitted to the recipient corresponding to the user through the wireless transmission circuit 71 in S902, and then the operation makes a return in S903.
  • In the detailed flowchart of the registration interrupt processing subroutine presented in FIG. 21, after the subroutine is started up in S50, two selection items, i.e., “new user registration” and “registered user selection”, are displayed at the left screen 21, as shown in FIG. 22, to allow the user to select one of the selection items in S501. If the user selects “new user registration”, a communication for user detection is attempted through the wireless transmission circuit 71 and the wireless reception circuit 72 by setting the output signal level G to the maximum (Gmax) in S502. In S503, the names of the detected users and their identification data are displayed at the left screen 21 as shown in FIG. 23 to enable a selection of the user to be registered. In S504, the registered users are displayed in conformance to the priority order at the left screen 21, as shown in FIG. 24, to enable selection of a priority order rank for the user to be newly registered.
  • In S505, data indicating the selected user and his priority order rank and data related to the selected user (the identification data and the personal information data) are registered and saved at the EEPROM 68. In S60, the setting interrupt processing subroutine is executed to select the camera settings and the like for the newly registered user, and then in S506, data indicating the selected settings and the like are registered and saved at the EEPROM 68 as the camera data of the newly registered user and also the setting data are sent back through the wireless transmission circuit 71 to be saved into an instrument such as a portable telephone carried by the user before the operation proceeds to S509.
  • If, on the other hand, “registered user selection” is selected in S501, the names of the registered users are displayed at the left screen 21 as shown in FIG. 25 to enable a selection of the desired registered user in S507, and then the current user is switched to the selected registered user, the camera settings are switched to those for the selected registered user and the most recent user is updated as the selected registered user in S508 before the operation proceeds to S509. In S509, the MODE dial 19 is repeatedly checked to verify whether or not a setting other than the registration mode has been selected through the MODE dial 19, and if a setting other than the registration mode has been selected, the operation makes a return in S510.
  • In the detailed flowchart of the setting interrupt processing subroutine presented in FIG. 26, after the subroutine is started up in S60, exposure mode selection items are displayed at the left screen 21 as shown in FIG. 27 in S601 to enable a user selection. In S602, AF mode (automatic focal adjustment mode) selection items are displayed at the left screen 21, as shown in FIG. 28, to enable a user selection. In S603, image-data-recording-mode selection items are displayed at the left screen 21, as shown in FIG. 29, to enable a user selection. In S604, image-data-display-mode selection items are displayed at the left screen 21, as shown in FIG. 30, to enable a user selection. Then, in S605, the current camera settings are updated in conformance to the selected camera settings before the operation makes a return in S606.
  • In the detailed flowchart of the photographing/reproduction interrupt processing subroutine presented in FIG. 31, after the subroutine is started up in S70, a verification is achieved as to whether or not the camera is currently set in the reproduction mode, and if it is decided that the camera is set in the reproduction mode, the operation exits the reproduction mode in S702 to shift to the photographing mode subroutine in S30. If, on the other hand, the camera is not set in the reproduction mode, the operation exits the photographing mode in S703 to shift to the reproduction mode subroutine in S80.
  • In the detailed flowchart of an example of a variation of the automatic user selection subroutine presented in FIG. 32, after the subroutine is started up in S25, communication with detected registered users is executed with the output level of the wireless transmission signal set at a constant level and data indicating the reception signal output levels at the user-side instruments are obtained from the user-side instruments in S271. Then, in S272, the registered user whose data indicate the highest reception signal output level is selected. In S273, the selected registered user is set as the current user, the camera settings for the selected registered user are selected and the most recent user is updated as the selected registered user before the operation makes a return in S274.
  • As described above, in the automatic user selection subroutine shown in FIG. 32, the output levels of the signal transmitted by the electronic camera and received at the user-side instruments are detected on the reception side, and the data indicating the detected reception signal output levels are sent back to the camera, where the registered user with the highest reception signal output level (i.e., the registered user within the shortest distance) is selected.
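  • A minimal sketch of the reception-level criterion of FIG. 32, assuming the reported levels are available to the camera as a simple mapping, is given below.

    def select_by_reception_level(reported_levels):
        # FIG. 32 sketch: reported_levels is assumed to map each detected registered
        # user to the reception signal level measured at that user's instrument while
        # the camera transmits at a constant output level.
        return max(reported_levels, key=reported_levels.get)   # S272: strongest reception

    # e.g. select_by_reception_level({"user A": -48.0, "user B": -63.5}) returns "user A"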
  • In the detailed flowchart of another example of a variation of the automatic user selection subroutine presented in FIG. 33, after the subroutine is started up in S25, the electronic camera communicates with detected registered users and obtains data indicating the output levels of signals transmitted by the user-side instruments from the user-side instruments in S281. In S282, the signals transmitted by the user-side instruments are received and the output levels of the received signals are detected. In S283, the registered user with the lowest signal-attenuation rate is selected by comparing the reception-signal output levels with the transmission signal output-level data. In S284, the registered user thus selected is set as the current user, the camera settings for the selected registered user are selected and the most recent user is updated as the selected registered user, and then the operation makes a return in S285. As described above, in the user automatic selection subroutine shown in FIG. 33, the registered user with the lowest signal attenuation rate at the electronic camera (the registered user within the shortest distance) is selected.
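  • The attenuation-rate criterion of FIG. 33 might be sketched as follows, assuming the reported transmission levels and the measured reception levels are expressed in dB.

    def select_by_attenuation(reported_tx_levels, measured_rx_levels):
        # FIG. 33 sketch: each user-side instrument reports its own transmission output
        # level (S281); the camera measures what it actually receives (S282) and picks
        # the user with the lowest attenuation, expressed here as a difference in dB (S283).
        attenuation = {user: reported_tx_levels[user] - measured_rx_levels[user]
                       for user in reported_tx_levels}
        return min(attenuation, key=attenuation.get)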
  • In the detailed flowchart of yet another example of a variation of the user automatic selection subroutine presented in FIG. 34, after the subroutine is started up in S25, the registered user whose last operation of the electronic camera is recorded with the most recent time among the detected registered users is selected in S291. In S292, the registered user thus selected is set as the current user, the camera settings for the selected registered user are selected and the most recent user is updated as the selected registered user before the operation makes a return in S285.
  • As described above, in the user automatic selection subroutine shown in FIG. 34, the registered user whose last operation of the electronic camera is the most recent (most likely to be the current user) is selected from the detected registered users. By referencing the history of the use of the camera by the users in this manner, the level of the user identification capability can be improved.
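  • The recency criterion of FIG. 34 reduces to the following one-line selection; the mapping name is an assumption of this sketch.

    def select_by_last_operation(last_operated_at):
        # FIG. 34 sketch: last_operated_at is assumed to map each detected registered
        # user to the recorded time point of that user's last camera operation (S291).
        return max(last_operated_at, key=last_operated_at.get)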
  • In FIG. 35 illustrating the structure of the wireless transmission circuit 71 and the wireless reception circuit 72 within the electronic camera, the wireless transmission circuit 71 and the wireless reception circuit 72 are enclosed by an electromagnetic shield film 79 over the camera front surface (along the photographic optical axis), the camera lower surface, the camera upper surface and the camera left and right side surfaces. Accordingly, the wireless transmission circuit 71 and the wireless reception circuit 72 are allowed to achieve communication readily with a portable instrument 180A carried by a user likely to be present to the rear of the camera, i.e., the photographer (user A in the figure) and are not allowed to achieve ready communication with a portable instrument 180B carried by the subject (user B in the figure) or the like present along the photographing direction. FIG. 36, which shows the communication directivity of the wireless transmission circuit 71 and the wireless reception circuit 72, indicates that the directivity on the side to the rear of the electronic camera 100 is extreme and that the directivity on the side to the front (along the optical axis 40) is slight.
  • By adopting the structure shown in FIG. 35 to allow the wireless transmission circuit 71 and the wireless reception circuit 72 to assume extreme directivity on the side to the rear of the camera as described above, the likelihood of detecting the photographer as the current user is increased. It is to be noted that the directivity may be adjusted by adopting a specific antenna structure as well as by using an electromagnetic shield film.
  • In FIG. 37 illustrating a user selection achieved based upon the positional relationships between the camera and users, camera user A and camera user B individually detect position data (x, y, z) with GPS (global positioning system) units, and the detected position data (x1, y1, z1) and (x2, y2, z2) of user A and user B respectively are provided to the camera which calculates the relative distances between the camera and user A and between the camera and user B based upon its own position data (x0, y0, z0) and selects the user present within the shortest relative distance. For instance, the relative distance between the camera and user A can be calculated as the square root of the sum of the squares of (x1-x0), (y1-y0) and (z1-z0).
  • In the detailed flowchart of an embodiment of the user automatic selection subroutine which is achieved by utilizing GPS units presented in FIG. 38, after the subroutine is started up in S25, the position data indicating the user positions are obtained by communicating with the detected registered users in S391. In S392, position data indicating the position of the camera itself are detected. In S393, based upon the user position data and the camera position data, the relative distances between the camera and detected registered users are calculated. In S394, the registered user present within the shortest relative distance is selected. In S395, the registered user thus selected is set as the current user, the camera settings for the selected registered user are selected and the most recent user is updated as the selected registered user before the operation makes a return in S396.
  • As described above, in the user automatic selection subroutine shown in FIG. 38, the registered user present within the shortest distance from the camera is selected based upon the positions detected with the GPS units.
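  • The GPS-based selection of FIGS. 37 and 38, together with the relative-distance calculation given above, might be sketched as follows; the function names are assumptions made for this illustration.

    import math

    def select_by_gps(camera_position, user_positions):
        # FIG. 37/38 sketch: choose the registered user whose reported GPS position is
        # closest to the camera; positions are assumed to be (x, y, z) tuples in a
        # common coordinate system.
        x0, y0, z0 = camera_position                                            # S392
        def distance(pos):                                                      # S393
            x, y, z = pos
            return math.sqrt((x - x0) ** 2 + (y - y0) ** 2 + (z - z0) ** 2)
        return min(user_positions, key=lambda u: distance(user_positions[u]))  # S394

    # e.g. select_by_gps((0, 0, 0), {"user A": (1, 2, 2), "user B": (3, 4, 0)}) returns
    # "user A", whose relative distance is 3 as against 5 for "user B".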
  • In the detailed flowchart of a user manual selection subroutine presented in FIG. 39, which is executed to manually select the current user instead of through the user automatic selection subroutine, after the subroutine is started up in S35, the registered users detected as a result of a communication trial are displayed at the left screen 21, as shown in FIG. 40, to enable a user selection in S351. In S352, the registered user that has been selected is set as the current user, the camera settings for the selected registered user are selected and the most recent user is updated as the selected registered user, before the operation makes a return in S353. As described above, in the user manual selection subroutine shown in FIG. 39, a registered user is manually selected from the detected users indicated as the results of communication and thus, the current user can be selected with a high degree of reliability.
  • In the detailed flowchart of a grip sensor interrupt processing subroutine presented in FIG. 41, through which the user identification processing can be started not only at power ON but also in response to an output from a grip sensor internally provided at the grip portion 14, the grip sensor interrupt processing subroutine in S45 is started up as the grip sensor senses that there has been a change at the grip portion 14, i.e., a change from a non-gripped state to a gripped state, and the user identification processing subroutine in S20 is executed accordingly before the operation makes a return in S451. By executing the grip sensor interrupt subroutine shown in FIG. 41 as described above, the current user can be identified with a high degree of reliability even when a change of photographer occurs while the power is on.
  • In the detailed flowchart of a timer interrupt processing subroutine presented in FIG. 42, through which the user identification processing is started up by a timer interrupt, the timer interrupt processing in S46 is started up over a predetermined time interval, a verification is achieved in S461 as to whether or not the camera is currently set in a self timer photographing mode (a mode set with an operating member (not shown), in which an image-capturing operation is executed with a predetermined delay following a shutter release operation) or a remote control mode (a mode set with an operating member (not shown), in which the shutter is released in the camera through a remote control operation), and if the camera is set in either the self timer photographing mode or the remote control mode, the operation makes a return in S462 without executing a user identification. If, on the other hand, the camera is not set in the self timer photographing mode or the remote control mode, the user identification processing subroutine in S20 is executed and then the operation makes a return in S462.
  • As described above, in the timer interrupt processing subroutine shown in FIG. 42, the user identification is executed over predetermined time intervals to achieve a reliable user identification even when the photographer hands the camera over to another operator while the power is on. In addition, since the user identification is not executed in an operating mode (the self timer photographing mode or the remote control mode) in which the photographer is likely to be away from the camera, the reliability of user identification is improved.
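  • A minimal sketch of the timer-driven re-identification of FIG. 42, with assumed mode names and an assumed callable standing in for the user identification processing subroutine, is given below.

    def on_timer_interrupt(camera, run_user_identification):
        # FIG. 42 sketch: re-run the user identification at a fixed interval, but skip it
        # in modes where the photographer is likely to be away from the camera (S461).
        if camera.get("mode") in ("self timer", "remote control"):
            return                                   # S462: return without identification
        run_user_identification(camera)              # S20: periodic re-identification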
  • In the detailed flowchart of an example of a variation of the initialize interrupt processing subroutine presented in FIG. 43, the camera operation is set based upon the personal information data. After the subroutine is started up in S95, the default settings for the preferred language, the display character size, the preferred text style and the operating mode (preferred language=Japanese, character size=enlarged size, text style=easy, operating mode=full automatic) are selected in S961. In S962, a verification is achieved as to whether or not the most recent user is the default user, and if the most recent user is the default user, the operation makes a return in S570. If, on the other hand, the most recent user is not the default user, a verification is achieved in S963 as to whether or not the preferred language of the most recent user is Japanese, and if the preferred language is Japanese, the operation proceeds to S965. If, on the other hand, the preferred language of the most recent user is not Japanese, English is set for the preferred language in S964 before the operation proceeds to S965. In S965, a verification is achieved as to whether or not the visual acuity of the most recent user is equal to or higher than 1.0, and if the visual acuity of the most recent user is not equal to or higher than 1.0, the operation proceeds to S967. If the visual acuity of the most recent user is equal to or higher than 1.0, on the other hand, the character size is set to a standard size before the operation proceeds to S967. In S967, a verification is achieved as to whether or not the age of the most recent user is equal to or higher than 15 and equal to or lower than 60, and if the age of the most recent user is not equal to or higher than 15 and equal to or lower than 60, the operation makes a return in S570. If, on the other hand, the age of the most recent user is equal to or higher than 15 and equal to or lower than 60, the text style is set to “standard” in S968, the operating mode is set to “manual” in S969 and then the operation makes a return in S570. The settings thus selected are reflected in the camera operation and the display operation executed after the return.
  • As described above, in the initialize interrupt subroutine shown in FIG. 43, the operating mode of the camera is adjusted to satisfy individual user needs (a user likely to be inexperienced can use the camera with ease in the full auto (fully automatic) mode, whereas an experienced user can use the camera in a more advanced application through manual setting), the character size is adjusted for an easy-to-check display (a display of larger-sized characters for a user with poor vision), the text style is adjusted to ensure that an error-free operation is performed (an easy text style, free of technical terms, and graphics are used for seniors and children) and a reliable man-machine interface is provided by adjusting the preferred language, based upon the personal information data (the age, the visual acuity and the preferred language). FIG. 44 presents an example of a screen display brought up in the photographing mode with the preferred language set to English, the character size set to large and the text style set to easy. Simple English text (without any technical terms) is displayed by using large-size alphanumeric characters together with graphics and, in this case, the camera assumes the full auto settings (program exposure mode, one-shot AF mode, normal recording mode and color normal display mode), thereby allowing even an inexperienced photographer to perform a reliable photographing operation without having to select special settings.
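  • The personal-information-based initialization of FIG. 43 could be sketched as follows; the dictionary keys and the way the defaults are expressed merely mirror the description above and are otherwise assumptions made for this illustration.

    def initialize_from_personal_info(user):
        # FIG. 43 sketch: derive display and operation settings from the general
        # personal information; 'user' is assumed to be a dict with optional keys,
        # and the defaults follow S961.
        settings = {"language": "Japanese", "character size": "enlarged",
                    "text style": "easy", "operating mode": "full automatic"}   # S961
        if user.get("is_default", True):                                        # S962
            return settings
        if user.get("preferred_language") != "Japanese":                        # S963/S964
            settings["language"] = "English"
        if user.get("visual_acuity", 0.0) >= 1.0:                               # S965/S966
            settings["character size"] = "standard"
        if 15 <= user.get("age", 0) <= 60:                                      # S967 to S969
            settings["text style"] = "standard"
            settings["operating mode"] = "manual"
        return settings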
  • In FIG. 45, presenting an example of a configuration that may be adopted to automatically adjust the viewfinder diopter in conformance to data indicating the user diopter (eyesight), a Kepler viewfinder (a real-image finder) is constituted with an objective lens 110, an erecting prism 111 and an eyepiece lens 112 and the photographic field can be observed with an eye 113 set on a viewfinder optical axis 115. By moving the eyepiece lens 112 along the viewfinder optical axis, the diopter can be adjusted. The eyepiece lens 112 is moved by a motor 114 which is controlled by the CPU 50.
  • FIG. 46 presents a detailed flowchart of an automatic diopter (eyesight) adjustment performed for each user by adopting the configuration shown in FIG. 45 through the initialize interrupt processing subroutine. After the subroutine is started up in S95, the diopter is set to the default value in S971, a verification is achieved in S972 as to whether or not the most recent user is the default user and if the most recent user is the default user, the operation proceeds to S975. If, on the other hand, the most recent user is not the default user, the operation proceeds to S973 to make a verification as to whether or not the diopter data of the most recent user are available and if the diopter data are not available, the operation proceeds to S975. If the diopter data are available, the diopter is set as indicated by the diopter data of the most recent user and then the operation proceeds to S975. In S975, the CPU 50 drives the motor 114 in conformance to the diopter that has been set to adjust the position of the eyepiece lens 112, thereby completing the diopter adjustment executed in conformance to the user diopter before the operation makes a return in S976. It is to be noted that data indicating the extent of user fatigue may be read out by communicating with the user over predetermined time intervals to fine-adjust the diopter in conformance to the fatigue data, or the length of time over which the electronic camera 100 has been continuously used since the user started operating the electronic camera 100 may be measured to fine-adjust the diopter in conformance to the length of the continuous operation.
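  • The diopter adjustment of FIG. 46 might be sketched as follows, with an assumed default diopter value and a placeholder callable standing in for the motor drive of the eyepiece lens.

    DEFAULT_DIOPTER = -1.0   # assumed default diopter value

    def adjust_viewfinder_diopter(user, drive_eyepiece):
        # FIG. 46 sketch: position the eyepiece lens according to the user's stored
        # diopter data; drive_eyepiece(diopter) stands in for the CPU 50 driving the
        # motor 114 and is an assumption of this illustration.
        diopter = DEFAULT_DIOPTER                        # S971: default setting
        if not user.get("is_default", True):             # S972
            if user.get("diopter") is not None:          # S973: diopter data available?
                diopter = user["diopter"]                # use the user's own data
        drive_eyepiece(diopter)                          # S975: move the eyepiece lens
        return diopter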
  • In FIG. 47 showing the structure that may be adopted in an example of a variation of the wireless communication between the electronic camera and an instrument carried by the user, a weak electrical current (a weak electromagnetic wave) traveling through a path 202 inside the body (a hand 201) of the user holding the grip portion 14 of the electronic camera 100 is used to enable communication between the CPU 50 and a wristwatch-type instrument 200 worn by the user at which the user information is stored, through a transmission/reception circuit 75. On the surface of the grip portion 14, an electrical contact layer that transmits/receives electrical signals through the hand placed in contact with the surface is formed. This contact layer is connected with the transmission/reception circuit 75, so as to enable communication with the instrument 200 carried by the user via the hand holding the grip portion 14. The instrument 200, too, includes an electrical contact layer to be placed in contact with the hand 201 of the user, so that data can be received and stored data can be transmitted through the contact layer.
  • In the detailed flowchart of a user automatic selection subroutine presented in FIG. 48, which is achieved by adopting the embodiment illustrated in FIG. 47, after the subroutine is started up in S25, a verification is achieved in S551 through the grip sensor as to whether or not the grip portion 14 is gripped and if it is judged that the grip portion 14 is not gripped, the most recent user is set as the current user, the camera settings for the most recent user are selected and the most recent user setting is left unchanged in S552 before the operation makes a return in S557. If, on the other hand, it is judged in S551 that the grip portion 14 is gripped, a communication is attempted via the path 202 within the hand 201 in S553, and if it is judged in S554 that communication has been achieved, the user with whom communication has been achieved is set as the current user, the camera settings for the user with whom communication has been achieved are selected and the most recent user is updated as the user with whom communication has been achieved in S555 before the operation makes a return in S557. If, on the other hand, it is judged in S554 that communication has not been achieved, the default user is set as the current user, the camera settings for the default user are selected and the most recent user is updated as the default user in S556, and then the operation makes a return in S557. As described above, since communication with the instrument having stored therein the user personal information is achieved through the hand holding the electronic camera 100 in the user automatic selection subroutine shown in FIG. 48, the operator (the user) of the electronic camera 100 can be selected with a high degree of reliability.
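  • A minimal sketch of the grip-mediated identification of FIG. 48, under the assumption that the grip-sensor state and the body-conduction communication are available as simple inputs, is given below.

    def identify_user_via_grip(grip_sensed, communicate_through_hand, most_recent_user):
        # FIG. 48 sketch: identify the user through the body-conduction path only while
        # the grip portion is actually held; grip_sensed and communicate_through_hand
        # are placeholder inputs assumed for this illustration.
        if not grip_sensed:                              # S551: grip portion not gripped
            return most_recent_user                      # S552: leave the setting unchanged
        user = communicate_through_hand()                # S553: talk to the worn instrument 200
        return user if user is not None else "default user"   # S555 / S556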
  • FIG. 49 is another conceptual diagram of an electronic camera adopting the present invention and an electronic image communication system achieved by utilizing this electronic camera. While the personal information data stored in the UIM card 170 loaded at the portable telephone 160 are read into the electronic camera 100 through a short-distance wireless communication between the electronic camera 100 and the portable telephone 180 carried by the user in the electronic image communication system in FIG. 1, personal information data stored in a wireless tag (a non-contact IC card) are automatically read from the wireless tag into the electronic camera 100 as the wireless tag enters the vicinity of the electronic camera 100 internally provided with a non-contact wireless read unit in the electronic image communication system shown in FIG. 49. The electronic camera 100 also has a function of engaging in wireless communication through a wireless telephone line 190 and thus is capable of transmitting image data obtained through an image-capturing operation to an external database 140 and the like. As described above, in the electronic image communication system shown in FIG. 49, the operations of the electronic camera 100 can be customized by the user simply by holding a compact wireless tag (a non-contact IC card) which does not require a power source.
  • In the embodiment described above (see FIGS. 1 and 10), a wireless communication between the electronic apparatus and a portable instrument carried by a user is achieved simply as the user approaches the electronic apparatus, the personal information on the user is transferred to the electronic apparatus and the electronic apparatus automatically selects operational settings that are optimal for the user in conformance to the personal information. As a result, the user is able to immediately operate the electronic apparatus which has been customized for his use without having to perform a special operation.
  • In the embodiment described above (see FIG. 1), in which the user identification is executed through short-distance wireless communication, the likelihood of communicating with a partner (or an opponent) that is the current user is increased.
  • In the embodiment (see FIG. 9), the user identification is executed at power ON before the main operation of the electronic apparatus starts and the settings of the electronic apparatus are selected in correspondence to the identified user. Thus, a given user can always operate the electronic apparatus at the settings customized for the user.
  • Since the communication partner most likely to be the current user is automatically selected as the user if wireless communication with a plurality of partners has been achieved during the wireless user identification in the embodiment described above (see FIG. 10), the confusion that might otherwise result when a plurality of partners with whom wireless communication is enabled are present in the vicinity of the electronic apparatus is prevented.
  • Since the user is selected from the pre-registered users if wireless communication with a plurality of partners has been achieved during the wireless user identification in the embodiment described above (see FIG. 10), the confusion that might otherwise result when a plurality of partners with whom wireless communication is enabled are present in the vicinity of the electronic apparatus is prevented.
  • Since the predetermined default user is set as the current user if wireless communication has not been achieved with any partner during the wireless user identification in the embodiment described above (see FIG. 10), even a user who is not capable of transferring his personal information data to the electronic apparatus can operate the electronic apparatus at the default settings.
  • Since the electronic apparatus can be operated at the settings corresponding to the most recent user without executing a user identification if the MODE dial 19 is set to “fixed” in the embodiment described above (see FIG. 10), the confusion that might otherwise result when a plurality of partners with whom wireless communication is enabled are present in the vicinity of the electronic apparatus is prevented.
  • In the embodiment described above (see FIG. 11), if wireless communication with a plurality of partners has been achieved during the wireless user identification, communication is attempted by gradually adjusting the transmission output from low to high and the first user with whom communication is achieved is selected as the current user. As a result, the partner most likely to be the current user (the partner present over the shortest distance from the electronic apparatus) can be selected and the confusion that might otherwise result when a plurality of partners with whom wireless communication is enabled are present in the vicinity of the electronic apparatus is prevented.
  • Since the user is selected in conformance to the predetermined priority order if wireless communication with a plurality of partners has been achieved during the wireless user identification in the embodiment described above (see FIG. 11), a partner likely to be the current user can be selected and the confusion that might otherwise result when a plurality of partners with whom wireless communication is enabled are present in the vicinity of the electronic apparatus is prevented.
  • Since the names of the last user and the current user are displayed only if the last user does not match the current user as a result of the user identification in the embodiment described above (see FIGS. 10, 12 and 13), no unnecessary display is brought up when a single user is continuously operating the electronic apparatus and, in addition, an appropriate action (registration setting) can be promptly taken manually if a user has been erroneously identified.
  • In the embodiment described above (see FIG. 14), the electronic apparatus can be initialized to the custom settings set in advance for a given user through a single action, and thus, the camera can be set to the user's preferred settings promptly to correspond to a specific photographing condition.
  • Since the name of the identified user remains on display at the screen in the embodiment described above (see FIG. 16 and the like), an appropriate action (registration setting) can be manually taken promptly even when the user has been erroneously identified.
  • When registering a new user, the names of the partners with whom wireless communication has been achieved are brought up on display at the screen and the user himself selects the name of the user to be newly registered in the embodiment described above (see FIGS. 21 and 23) and, as a result, the user to be newly registered can be specified with a high degree of reliability.
  • Since the user automatically identified by the camera can be manually changed in the embodiment described above (see FIGS. 21 and 25), an appropriate action can be promptly taken manually even when the wrong user has been identified by the camera.
  • In the embodiment described above (see FIG. 32), the communication partner reporting the highest-reception-signal level among a plurality of partners with whom wireless communication has been achieved is selected as the current user during the wireless user identification and, as a result, the partner most likely to be the current user (the partner present at the shortest distance from the electronic apparatus) can be selected, thereby preventing the confusion which might otherwise result when a plurality of partners with whom wireless communication is enabled are present in the vicinity of the electronic apparatus.
  • In the embodiment described above (see FIG. 33), the communication partner with the lowest transmission signal attenuation rate among a plurality of partners with whom wireless communication has been achieved is selected as the current user during the wireless user identification and, as a result, the partner most likely to be the current user (the partner present at the shortest distance from the electronic apparatus) can be selected, thereby preventing the confusion which might otherwise result when a plurality of partners with whom wireless communication is enabled are present in the vicinity of the electronic apparatus.
  • In the embodiment described above (see FIG. 34), the communication partner who has operated the electronic apparatus most recently among a plurality of partners with whom wireless communication has been achieved is selected as the current user during the wireless user identification and, as a result, the partner most likely to be the current user can be selected, thereby preventing the confusion which might otherwise result when a plurality of partners with whom wireless communication is enabled are present in the vicinity of the electronic apparatus.
  • In the embodiment described above (see FIGS. 35 and 36), the directivity of the wireless communication in the electronic apparatus is restricted along the direction in which the operator of the electronic apparatus is likely to be present relative to the position of the electronic apparatus main unit by using an electromagnetic shield or the like and, as a result, the likelihood of wireless communication being achieved with the true user is increased and the likelihood of wireless communication being achieved with parties other than the true user is reduced during the user identification executed through a wireless communication, thereby preventing the confusion which might otherwise result when a plurality of partners with whom wireless communication is enabled are present in the vicinity of the electronic apparatus.
  • In the embodiment described above (see FIGS. 37 and 38), the communication partner who is present within the shortest distance from the electronic apparatus among a plurality of partners with whom wireless communication has been achieved is selected as the current user during the wireless user identification and, as a result, the partner most likely to be the current user can be selected, thereby preventing the confusion which might otherwise result when a plurality of partners with whom wireless communication is enabled are present in the vicinity of the electronic apparatus.
  • In the embodiment described above (see FIGS. 39 and 40), if wireless communication has been achieved with a plurality of partners during the wireless user identification, the names of the partners with whom communication has been achieved are brought up on display at the screen and then the user himself is allowed to select the true user from the partners on display. As a result, the correct user can be specified with a high degree of reliability and the confusion that might otherwise result when a plurality of partners with whom wireless communication is enabled are present in the vicinity of the electronic apparatus is prevented.
  • Since the user identification is executed by detecting that the electronic apparatus is held by the user in the embodiment described above (see FIG. 41), the user identification is executed only if there is a user who is bound to be operating the electronic apparatus and thus, an erroneous user identification resulting from wireless communication achieved with a non-user partner when there is no true user present in the vicinity of the electronic apparatus is prevented.
  • In the embodiment described above (see FIG. 42), the user identification is executed over predetermined time intervals. As a result, a change-over of the user taking place when the power to the electronic apparatus is on can be automatically detected with a high degree of reliability and the electronic apparatus can be set for operation by the new user.
  • Since no user identification is executed through the short-distance wireless communication in an operating mode in which a true user is not likely to be in the vicinity of the electronic apparatus in the embodiment described above (see FIG. 42), the risk of erroneously identifying a party other than the true user as the current user is reduced.
  • Since the operational settings and the display settings are selected for the electronic apparatus in conformance to the extent of expertise of a given user in handling the electronic apparatus which is indirectly assessed based upon the general personal information on the user (information bearing no direct relevance to the electronic apparatus) in the embodiment described above (see FIGS. 43 and 44), inexperienced users (e.g. seniors and children) unaccustomed to handling electronic apparatuses can use the electronic apparatus through an easy and simple operation and, at the same time, users in the age group likely to be familiar with handling electronic apparatuses can utilize the electronic apparatus in a more advanced application through manual settings and the like.
  • Since the diopter adjustment of the viewfinder is executed based upon the information indicating the user's visual acuity in the embodiment described above (see FIGS. 45 and 46), it is not necessary to manually perform a viewfinder diopter adjustment each time one of a plurality of users sharing the camera operates the camera and, as a result, a photographing operation can be started promptly without missing a good photo opportunity.
  • Since the electronic apparatus communicates with an instrument worn by the user through the hand of the user holding the electronic apparatus in the embodiment described above (see FIGS. 47 and 48), the user can be identified with a high degree of reliability. Alternatively, the electronic instrument may be embedded in the body of the user.
  • Since the electronic apparatus includes a read circuit that reads a non-contact IC card (wireless tag) and thus is able to perform a non-contact read of the personal information from the non-contact IC card (wireless tag) in the embodiment described above (see FIG. 49), the personal information on the user carrying a lightweight and compact non-contact IC card that does not require a power supply is provided to the electronic apparatus without requiring the user to perform any special operation and thus, the electronic apparatus can be operated with the settings preferred by the user.
  • EXAMPLES OF VARIATIONS
  • The present invention is not limited to the embodiment described above and allows for a number of variations and modifications.
  • In the embodiment described above (see FIG. 9), communication with the electronic instrument (a portable telephone, a wireless tag) carried by the user is attempted from the electronic apparatus (electronic camera) side for user identification. However, an operating member that executes a trial communication may be provided at the user-side electronic instrument so that the electronic apparatus near the electronic instrument responds only when the user operates the operating member to enable transmission of the user personal information to the electronic apparatus, instead.
  • FIG. 50 illustrates the sequence of a wireless communication exchange between an electronic instrument carried by the user and the electronic apparatus that the user wishes to utilize, which is initiated on the electronic instrument side. The electronic instrument is a portable telephone 160 shown in FIG. 52 at which a UIM card or the like is loaded and thus the personal information on the user carrying the portable telephone 160 is stored and saved. The portable telephone 160 includes a display screen 161 and operating buttons 162. As shown in FIG. 51, an electronic camera A, an electronic camera B, a printer A and a scanner A are present in the vicinity of the electronic instrument, and the user carrying the electronic instrument (the portable telephone 160) wishing to use an electronic apparatus (the electronic camera A) first operates an operating button 162 at the portable telephone 160 to set the portable telephone 160 in the communication mode, whereby various categories of electronic apparatuses that may be used are brought on display at the display screen 161, as shown in FIG. 53, to allow the user to select the category of the electronic apparatus he wishes to use with a selection member (not shown). FIG. 53 presents an example of a display in which the electronic camera is selected as the category of the electronic apparatus the user wishes to use.
  • Once the category of the electronic apparatus that the user wishes to utilize is selected, the electronic instrument transmits electronic instrument identification information, information indicating the selected category and the user identification information to the electronic apparatuses present in the vicinity of the electronic instrument together with a utilization state information request 1. Among the electronic apparatuses having received the utilization state information request 1 from the electronic instrument, any electronic apparatus that does not belong to the category indicated by the category information does not return any response. However, if a given electronic apparatus fits into the category indicated by the category information, the electronic apparatus verifies the current utilization state (i.e. whether or not the user indicated by the user information can use the apparatus) and then executes communication 2 to transmit the utilization state information resulting from the verification, to which both the identification information of the sender electronic apparatus and the electronic instrument information corresponding to the recipient are appended.
  • The electronic instrument corresponding to the electronic instrument information that has received the utilization state information 2 from eligible electronic apparatuses displays their utilization states, i.e., the identities of the electronic apparatuses that are available for use, as shown in FIG. 54, at the display screen 161 and then automatically selects a single electronic apparatus based upon, for instance, varying distances between the electronic instrument and the electronic apparatuses. Alternatively, the user may be allowed to select the specific electronic apparatus that he wishes to use with a selection member (not shown). FIG. 54 presents an example of a display in which the electronic camera A is selected as the electronic apparatus to be used.
  • After the electronic apparatus to be used is selected at the electronic instrument, the electronic instrument issues a utilization request to the selected electronic apparatus by transmitting the identification information corresponding to the selected electronic apparatus, the identification information corresponding to the sender electronic instrument and the user personal information through a communication 3. Upon receiving the utilization request 3, the electronic apparatus selected by the electronic instrument performs a custom setting operation for the electronic apparatus based upon the user personal information and also, it sends a verification notification attached with the identification information corresponding to the sender electronic apparatus and the identification information corresponding to the recipient electronic instrument through a communication 4 with the electronic instrument that has transmitted the utilization request. Upon receiving the verification notification 4, the electronic instrument, i.e., the recipient of the verification notification, displays a message indicating that the selected electronic apparatus is now available for use at the display screen 161, as shown in FIG. 55, for the user. Subsequently, the user can operate the electronic apparatus at the settings customized for his use.
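  • The instrument-initiated exchange of FIG. 50 (communications 1 through 4) might be sketched from the instrument side as follows; the message layout, field names and example values such as "instrument 1" and "user A" are assumptions made for this illustration only.

    def instrument_side_sequence(category, send, receive_responses, choose_apparatus):
        # FIG. 50 sketch; send, receive_responses and choose_apparatus are placeholder
        # callables standing in for the wireless exchange and the selection step.
        # (1) broadcast a utilization state information request for the chosen category
        send({"type": "utilization state request", "category": category,
              "instrument id": "instrument 1", "user id": "user A"})
        # (2) apparatuses of that category answer with their utilization state
        states = receive_responses()   # e.g. [{"apparatus id": "camera A", "available": True}]
        available = [s for s in states if s["available"]]
        if not available:
            return None
        target = choose_apparatus(available)   # automatic (e.g. nearest) or manual selection
        # (3) issue a utilization request carrying the user personal information
        send({"type": "utilization request", "apparatus id": target["apparatus id"],
              "instrument id": "instrument 1", "personal information": {"name": "user A"}})
        # (4) the apparatus customizes itself and answers with a verification notification
        return target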
  • Since this allows the electronic apparatus to verify with a high degree of reliability that the electronic instrument having initiated a trial communication belongs to a user, an erroneous user identification can be prevented and, at the same time, since the electronic apparatus is able to execute the user identification with arbitrary timing, the current user of the electronic apparatus can be effectively identified even after a change-over of the user has occurred.
  • The user-side electronic instrument, having received the electronic apparatus utilization information from an electronic apparatus (e.g., information indicating that the electronic apparatus is currently being used by another person, information indicating that only limited users are allowed use of the electronic apparatus), makes a decision as to whether or not the user can use this particular electronic apparatus. If the user-side electronic instrument decides that there is a single electronic apparatus in its vicinity that can be used by the user, this electronic apparatus is selected as the apparatus to be used by the user, and the electronic instrument transmits a message indicating the user's intention to use the electronic apparatus, together with the user personal information, to the electronic apparatus. Upon receiving the message indicating the user's intention to use the electronic apparatus and the user personal information from the electronic instrument, the electronic apparatus identifies the user carrying the electronic instrument as the user of the electronic apparatus and customizes the settings of the electronic apparatus for the user based upon the user personal information.
  • If the user-side electronic instrument determines that there are a plurality of electronic apparatuses in its vicinity that can be used by the user, it automatically selects the electronic apparatus that the user most likely wishes to use through a method similar to any of those adopted in the embodiment (the electronic apparatus present at the shortest distance, the electronic apparatus which is highest in the priority order, manual selection, etc.) and transmits a message indicating the user's intention to use the electronic apparatus and the user personal information to the selected electronic apparatus. Upon receiving the message indicating the user's intention to use the electronic apparatus and the user personal information from the electronic instrument, the electronic apparatus identifies the user carrying the electronic instrument as the user of the electronic apparatus and customizes the settings of the electronic apparatus for the user based upon the user personal information.
  • Alternatively, if the user-side electronic instrument determines that there are a plurality of electronic apparatuses in its vicinity that can be used by the user, it may transmit a message indicating the user's intention to use the electronic apparatus and the user personal information to the plurality of electronic apparatuses. In this case, upon receiving the message indicating the user's intention and the personal information from the electronic instrument, each of the plurality of electronic apparatuses identifies the user carrying the electronic instrument as the user of the electronic apparatus and, as a result, the settings at the plurality of electronic apparatuses are simultaneously customized for the user based upon the user personal information. Thus, the need to execute the user identification at each electronic apparatus is eliminated when the user wishes to use a plurality of electronic apparatuses at the same time.
  • As another alternative, the user-side electronic instrument may immediately transmit a message indicating the user's intention and the user personal information to any electronic apparatus in its vicinity without verifying the number of electronic apparatuses in its vicinity that can be used by the user. In this case, each electronic apparatus having received the message indicating the user's intention to use and the personal information from the electronic instrument identifies the user carrying the electronic instrument as the user of the electronic apparatus and, as a result, the settings at a plurality of electronic apparatuses or a single electronic apparatus can be customized for the user based upon the user personal information. While this does not allow the user to select a specific electronic apparatus for use, it eliminates the step in which a decision is made as to whether or not a given electronic apparatus is available for use by the user and, for this reason, it is effective in a situation in which the user urgently needs to operate an electronic apparatus customized for his use.
  • In addition, the electronic apparatus automatically selected by the user-side electronic instrument may be displayed at the display unit of the electronic instrument for user verification. Alternatively, a list of electronic apparatuses available for use by the user may be displayed at the user-side electronic instrument based upon the results of the communication to allow the user to select the electronic apparatus he wishes to use.
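  • The automatic selection described in the preceding paragraphs might be sketched as follows in Python. This is a hypothetical outline only; the candidate attributes (estimated distance, priority, utilization state) are assumed to have been obtained through the trial wireless communication, and the field names are illustrative.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class CandidateApparatus:
    apparatus_id: str
    distance_m: float       # estimated from signal levels or position information
    priority: int           # user-assigned priority (lower value = higher priority)
    in_use_by_other: bool   # from the received utilization information
    restricted: bool        # use is limited to certain users only

def select_apparatus(candidates: List[CandidateApparatus],
                     mode: str = "nearest") -> Optional[CandidateApparatus]:
    # Discard apparatuses the user cannot use, then pick one automatically.
    usable = [c for c in candidates if not c.in_use_by_other and not c.restricted]
    if not usable:
        return None
    if mode == "nearest":
        return min(usable, key=lambda c: c.distance_m)
    if mode == "priority":
        return min(usable, key=lambda c: c.priority)
    return None  # "manual": present the list to the user instead

candidates = [CandidateApparatus("camera-100", 0.8, 2, False, False),
              CandidateApparatus("printer-300", 4.2, 1, True, False)]
chosen = select_apparatus(candidates, mode="nearest")
print(chosen.apparatus_id if chosen else "manual selection required")
```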
  • In the embodiment described above (see FIG. 9), the user identification is achieved through a wireless (electromagnetic wave) communication between the electronic apparatus and the electronic instrument carried by the user. However, during this process, a redundant communication with an electronic instrument carried by a party other than the true user should be prevented by keeping the wireless transmission signal output level within a range lower than a predetermined value and thus, limiting the communication-enabled distance range based upon the assumption that electronic instruments have the average reception capability. This distance range is determined by taking into consideration the standard manner in which the electronic apparatus is operated, whether or not an object that blocks the electromagnetic waves is present between the electronic apparatus and the electronic instrument, the performance of the receiver at the electronic instrument and the like. It is desirable to set the range equal to or less than approximately 10 m under normal circumstances and to set the range equal to or less than 2 m and preferably equal to or less than 1 m in the case of an electronic apparatus that is primarily operated by the user carrying the apparatus.
  • In the embodiment described above (see FIG. 9), the user identification is achieved through a wireless (electromagnetic wave) communication between the electronic apparatus and the electronic instrument carried by the user. However, the user identification may be instead achieved through either a non-contact method or a contact method other than wireless communication. In addition, by combining a contact-type user identification method (e.g., finger print detection) or a non-contact user identification method (e.g., pupil pattern detection, retina pattern detection, face image detection or voiceprint detection), both achieved based upon physical characteristics of the user, with the wireless user detection method that uses the electromagnetic wave described above, an even more reliable user identification is enabled. By adopting such a combined method, the electronic apparatus detects the user's physical characteristics and then identifies the communication partner whose physical characteristics read out from his electronic instrument present in the vicinity of the electronic apparatus through wireless communication match the detected physical characteristics as the true user.
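  • The combined identification described above might be outlined as in the following sketch. The similarity check is a placeholder; an actual implementation would compare fingerprint, pupil or retina pattern, face image or voiceprint data, and all field names here are assumptions for illustration.

```python
from typing import List, Optional

def features_match(detected: dict, stored: dict, threshold: float = 0.9) -> bool:
    # Hypothetical similarity check standing in for a real biometric comparison.
    common = set(detected) & set(stored)
    if not common:
        return False
    agreement = sum(1 for k in common if detected[k] == stored[k]) / len(common)
    return agreement >= threshold

def identify_true_user(detected_features: dict,
                       nearby_instruments: List[dict]) -> Optional[str]:
    # The apparatus detects the user's physical characteristics itself, then looks
    # for the nearby instrument whose stored characteristics (read out through
    # wireless communication) match; that instrument's owner is taken to be the
    # true user.
    for instrument in nearby_instruments:
        if features_match(detected_features, instrument.get("physical_features", {})):
            return instrument["user_id"]
    return None  # no match: do not identify anyone

nearby = [{"user_id": "user-A", "physical_features": {"voiceprint": "v1", "face": "f1"}},
          {"user_id": "user-B", "physical_features": {"voiceprint": "v2", "face": "f2"}}]
print(identify_true_user({"voiceprint": "v2", "face": "f2"}, nearby))  # -> user-B
```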
  • In the embodiment described above (see FIG. 9), the user identification is executed through wireless communication between the electronic apparatus and the electronic instrument carried by the user and the user verifies that he has been identified by the electronic apparatus by checking the user name displayed at the screen of the electronic apparatus. Instead, the electronic apparatus information may be transmitted by the electronic apparatus to the electronic instrument of the user identified by the electronic apparatus and the user's instrument, upon receiving the information, may display at the user-side electronic instrument, information indicating specifically which electronic apparatus has identified the user. In addition, the electronic instrument may notify the user that he has been identified by an electronic apparatus by utilizing a means for sound generation. In such a case, one of various sound patterns may be used in correspondence to a specific electronic apparatus type so that a user can ascertain the type of electronic apparatus which has identified him without having to check the display.
  • By adopting the structure described above, it becomes possible for the user to verify that he has been correctly identified by an electronic apparatus that does not include a means for display. In addition, such an audio notification allows the user to verify that he has been correctly identified by an electronic apparatus even if his electronic instrument does not include a means for display while keeping the electronic instrument in his pocket or the like.
  • In the embodiment described above (see FIG. 11), the electronic apparatus wirelessly communicates with an electronic instrument carried by a user, and if a plurality of users are detected as a result, the electronic apparatus attempts communication with the electronic instruments while gradually lowering the transmission output and selects the user able to use the electronic apparatus with whom communication is still achieved at the lowest output, as the correct user. However, the electronic apparatus may instead attempt communication with the electronic instruments by gradually increasing the transmission output from the lowest level and select the user able to use the electronic apparatus with whom communication is achieved first as the correct user, as illustrated in the sketch below.
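  • A minimal sketch of this proximity-based selection, assuming a hypothetical callback that reports which users are reachable at a given transmission output level (the power units and thresholds are arbitrary):

```python
from typing import Callable, Iterable, List, Optional

def select_nearest_user(try_communicate: Callable[[int], List[str]],
                        power_levels: Iterable[int]) -> Optional[str]:
    # Step the transmission output up from the lowest level; the first user with
    # whom communication succeeds is assumed to be the nearest, and therefore the
    # correct, user.
    for power in sorted(power_levels):
        reachable = try_communicate(power)
        if reachable:
            return reachable[0]
    return None

# Simulated environment: each user becomes reachable once the transmission
# output reaches a distance-dependent threshold (arbitrary units).
required_power = {"user_near": 2, "user_far": 8}
simulate = lambda p: [u for u, t in required_power.items() if p >= t]
print(select_nearest_user(simulate, range(1, 11)))  # -> "user_near"
```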
  • In the embodiment described above (see FIG. 41), the user identification is executed with the timing with which the electronic apparatus is held by the user. However, the user identification may instead be executed through another method or with timing other than that used in the embodiment. For instance, the user identification may be executed with the timing with which a given operating member of the electronic apparatus is operated (e.g., the timing with which the shutter release button 16 of the electronic camera 100 is pressed halfway down). Since this allows the user identification to be executed at a time when the user of the electronic apparatus is surely present in the vicinity of the electronic apparatus, a reliable user identification is achieved and, at the same time, the need for a special detection sensor such as a grip sensor is eliminated. Alternatively, a pair of elements, i.e., a light-emitting element and a light-receiving element, may be provided near the eyepiece unit of the viewfinder 11 to execute the user identification with the timing with which an increase in the quantity of reflected light is detected as the user moves his eyes or face closer to the eyepiece unit. As a further alternative, the user identification may be executed with the timing with which the memory card 104 is detached. Furthermore, the user identification may be executed with the timing with which the photographic lens is replaced, a zooming operation is performed, a focusing operation is performed or an external strobe is detached/attached in the electronic camera.
  • In the embodiment described above (see FIG. 42), no user identification is executed if the electronic apparatus is set in the remote control mode. However, the remote control device may be equipped with a user identification function, instead. In this case, a user identification is enabled even in the remote control mode and, at the same time, it is not necessary to execute the user identification at the individual electronic apparatuses when a plurality of electronic apparatuses are controlled by using a single remote control device.
  • In the embodiment described above (see FIG. 43), the camera setting operation and the display operation are customized based upon the personal information on the user identified by the electronic apparatus. However, processing other than the camera setting operation and the display operation may be executed in conformance to the personal information which has been obtained.
  • For instance, the functions of operating members may be changed (e.g., the functions of the shutter release button 16 and the power switch 17 in FIG. 2 may be switched) in conformance to the information indicating the user's favored hand (left-handed or right-handed). The level of force (resistance) required to depress the shutter release button 16 may be adjusted in correspondence to the age or gender of the user. Moreover, any portion of the display on the screen that uses different hues may instead be displayed with varying levels of contrast if the user has a vision problem such as color blindness.
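  • A minimal sketch of such customization, assuming hypothetical setting and personal-information field names introduced only for this illustration:

```python
def customize_camera(settings: dict, user_info: dict) -> dict:
    # Adjust operating-member functions and display behaviour from general
    # personal information, along the lines of the examples above.
    customized = dict(settings)
    if user_info.get("favored_hand") == "left":
        # Swap the functions assigned to the shutter release button and the power switch.
        customized["shutter_release_function"], customized["power_switch_function"] = (
            customized["power_switch_function"], customized["shutter_release_function"])
    age = user_info.get("age")
    if age is not None and (age >= 70 or age <= 10):
        customized["release_button_resistance"] = "light"   # smaller depression force
    if user_info.get("color_vision_deficiency"):
        customized["display_differentiation"] = "contrast"  # contrast instead of hue
    return customized

base = {"shutter_release_function": "release", "power_switch_function": "power"}
print(customize_camera(base, {"favored_hand": "left", "age": 72}))
```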
  • In the embodiment described above (see FIG. 9), the electronic camera 100 communicates with the electronic instrument carried by the photographer (user) and identifies the photographer operating the electronic camera 100. However, the electronic camera 100 may instead communicate with an electronic instrument carried by a subject to set the operations of the electronic camera 100 in conformance to the personal information of the subject.
  • In this case, the electronic camera 100 engages in wireless communication with electronic instruments in the vicinity and, for instance, first eliminates the electronic instrument that appears to be carried by the photographer from among the electronic instruments with which communication has been achieved. The photographer's electronic instrument may be pre-registered at the electronic camera 100, or the electronic instrument present at the shortest distance from the electronic camera 100 may be identified as the electronic instrument carried by the photographer and thus eliminated. Next, the electronic camera 100 sets the distance range over which the subject of the electronic camera 100 should be present by using information indicating the photographic lens focal length or information indicating the photographic lens photographing distance, and then selects the electronic instrument carried by the user (subject) of the electronic camera 100 present within the subject distance range from the electronic camera 100, from among the electronic instruments with which wireless communication has been achieved, based upon information related to the distances between the electronic camera 100 and the electronic instruments (the signal levels, the position information or the like). During this process, the photographing direction and the photographic field angle of the electronic camera 100 may be determined based upon azimuth information detected by a means for azimuth detection internally provided at the electronic camera 100 and the information indicating the photographic lens focal length, and the positions (directions) of the electronic instruments relative to the electronic camera 100 may be calculated based upon information indicating the measured positions (position information) of the electronic camera 100 and the electronic instruments, so as to select the electronic instrument present within the photographic field angle along the photographing direction.
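  • A simplified sketch of this subject-instrument selection, assuming that distances have already been estimated from signal levels or position information and that the photographing distance range has been derived from the lens settings; the tolerance value and field names are illustrative assumptions:

```python
from typing import List, Optional

def select_subject_instruments(instruments: List[dict],
                               photographer_instrument_id: Optional[str],
                               photographing_distance_m: float,
                               tolerance_m: float = 2.0) -> List[dict]:
    # Eliminate the instrument assumed to belong to the photographer (pre-registered,
    # or simply the nearest one), then keep the instruments whose estimated distance
    # falls within a range around the photographing distance.
    candidates = list(instruments)
    if photographer_instrument_id is not None:
        candidates = [i for i in candidates if i["id"] != photographer_instrument_id]
    elif candidates:
        candidates.remove(min(candidates, key=lambda i: i["distance_m"]))
    lower = max(0.0, photographing_distance_m - tolerance_m)
    upper = photographing_distance_m + tolerance_m
    return [i for i in candidates if lower <= i["distance_m"] <= upper]

instruments = [{"id": "phone-photographer", "distance_m": 0.3},
               {"id": "phone-subject", "distance_m": 3.1},
               {"id": "phone-passerby", "distance_m": 12.0}]
print(select_subject_instruments(instruments, "phone-photographer", 3.0))
```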
  • By selecting the electronic instrument carried by a party other than the photographer, i.e., the subject, as described above and selectively engaging in wireless communication with that electronic instrument, the electronic camera 100 may obtain data indicating the color of the irises of the subject so that, if any red-eye phenomenon occurs in the image data obtained by photographing the subject, the portion of the image data corresponding to the red-eye can be touched up by using this color data.
  • In addition, the personal information may include data indicating whether or not the red-eye phenomenon tends to occur readily so that the electronic camera 100 can automatically set itself in a red-eye reduction mode (a mode for reducing the extent of the red-eye phenomenon by emitting a small quantity of light at the strobe prior to a photographing operation and causing the pupils of the subject to contract) to perform a strobe photographing operation if the personal information obtained from the subject indicates that the subject tends to induce the red-eye phenomenon readily. Furthermore, the personal information does not necessarily need to include direct data indicating whether or not the red-eye phenomenon tends to occur readily and, instead, it may include indirect data obtained by deducing the likelihood of the red-eye phenomenon from general personal information such as the user's age and visual acuity. For instance, statistically, young children and myopic people tend to have large pupils and thus are likely to induce the red-eye phenomenon. Accordingly, the camera is forcibly set in the red-eye reduction mode when photographing a young child or a myopic person. As an alternative means to a strobe light emission for causing the pupils to contract, a lamp or a very bright LED capable of continuously providing illumination over a predetermined length of time may be set in an ON state continuously prior to the photographing operation.
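  • A minimal sketch of such a decision, preferring direct data when present and otherwise deducing the likelihood from age and visual acuity; the field names and thresholds are illustrative assumptions only:

```python
from typing import Optional

def should_use_red_eye_reduction(subject_info: dict) -> bool:
    # Prefer direct data when available; otherwise deduce the likelihood from
    # general personal information (young children and myopic people statistically
    # tend to have large pupils).
    if "red_eye_prone" in subject_info:
        return bool(subject_info["red_eye_prone"])
    age: Optional[int] = subject_info.get("age")
    if age is not None and age <= 10:
        return True
    acuity: Optional[float] = subject_info.get("visual_acuity")
    if acuity is not None and acuity < 0.3:
        return True
    return False

print(should_use_red_eye_reduction({"age": 6}))                # True: young child
print(should_use_red_eye_reduction({"red_eye_prone": False}))  # False: direct data
```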
  • In the first embodiment described above, a prompt user identification is achieved without placing a great onus on the user, since the user is identified through wireless communication and the candidate most likely to be the true user is automatically selected when a plurality of user candidates are detected wirelessly. This enables the user to promptly start operating an electronic apparatus such as an electronic camera without having to perform any time-consuming tasks.
  • In addition, since the settings and the display at the electronic apparatus are adjusted by automatically optimizing them for use by the specific user in conformance to the general personal information bearing no direct relevance to the electronic apparatus settings, which the electronic apparatus has obtained from the user, an environment that allows inexperienced users, seniors and children as well as seasoned users to operate the electronic apparatus with ease and comfort is provided.
  • SECOND EMBODIMENT
  • The second embodiment of the present invention is now explained in reference to the drawings. In the conceptual diagram of the second embodiment of the present invention presented in FIG. 56, an electronic camera that photographs a subject present within the photographing range also obtains related information from the subject and from an information source present in the vicinity of the camera through wireless communication. In addition, the information related to the subject, which is made to correspond to the position of the subject within the photographic image plane, is appended to the image data obtained through the photographing operation together with other related information to constitute an image file.
  • When reproducing an image file, the image data having been obtained through a photographing operation are first reproduced and displayed as a home screen. Then, as the position within the image plane is specified at the home screen, the information related to the subject reproduced and displayed at the position is displayed at the screen. From the related information thus displayed, further related information linked with the information can be brought up on display.
  • FIG. 57 is a conceptual diagram of an electronic camera adopting the present invention and an electronic image communication system achieved by utilizing the electronic camera. In FIG. 57, an electronic camera 100 that includes a memory card saves electronic image data or digital image data (hereafter referred to as image data), obtained by photographing a photographic field contained within a photographic image plane in a predetermined size, into the memory card. In addition, the electronic camera 100 has a short-distance wireless communication function (e.g., Bluetooth (registered trademark), enabling communication within a range of approximately 10 meters) and engages in communication with a portable telephone 160 that also has a short-distance wireless communication function. It is assumed that this portable telephone 160 is carried by a subject, i.e., a person to be photographed by the electronic camera 100, and that a UIM card (user identification module card) 170 having stored therein user personal information is loaded at the portable telephone 160. In addition, the portable telephone 160 is internally provided with a means for positional measurement (a positional detection means such as a GPS unit).
  • When the person, i.e., the subject, is photographed with the electronic camera 100, the personal information of the subject is transferred to the electronic camera 100 through communication between the electronic camera 100 and the portable telephone 160 carried by the subject. In addition, the electronic camera 100 detects the position of the subject within the photographic image plane through a method which is to be detailed later and stores an image file constituted of a set of the image-plane position information and the personal information as well as the image data obtained through the photographing operation into the memory card. The electronic camera 100 also transfers the image file to a personal computer 140 for personal use or an image server 150 via a wireless base station 120 through a wireless portable telephone line 190 and then via a wired or wireless public telephone line or the Internet 130.
  • The electronic camera 100 includes a liquid crystal display screen at which an image file can be reproduced and displayed by reading out the image file saved in the memory card. In addition, the electronic camera 100 is capable of accessing various information servers and information sites based upon the information related to the subject via the wireless portable telephone line 190 and the Internet 130, obtaining detailed information on the subject and reproducing and displaying the detailed information at the liquid crystal display screen.
  • The personal computer 140 for personal use includes a CRT display screen at which an image file can be reproduced and displayed by reading out the image file saved at the image server 150. The personal computer 140 for personal use is also capable of accessing various information servers and information sites based upon the information related to the subject via the Internet 130, obtaining detailed information on the subject and reproducing and displaying the detailed information at the CRT display screen.
  • FIGS. 58 and 59 present external views (a front view and a rear view) of the electronic camera 100 in FIG. 57, as achieved in an embodiment. As shown in FIG. 58, at the front of the electronic camera 100, a photographic lens 10 which forms a subject image, a viewfinder 11 through which the photographic image plane is checked, a strobe 12 used to illuminate the subject during a photographing operation, a photometering circuit 13 that detects the brightness of the subject and a grip portion 14 that projects out from the camera main body to allow the electronic camera 100 to be held by hand are provided, whereas a shutter release button 16 operated to issue an instruction for a photographing start and a power switch 17 (a momentary switch which is alternately set to ON and OFF each time it is operated) through which the on/off state of the power to the electronic camera 100 is controlled are provided at the upper side of the camera.
  • As shown in FIG. 59, at the rear of the electronic camera 100, an eyepiece unit of the viewfinder 11, a left LCD (left screen) 21 having a substantially quadrangular screen for text and image display and a right LCD (right screen) 22 having a substantially quadrangular screen for text and image display are provided, and an up button 23 and a down button 24 for switching the image displayed at the left screen 21 are provided in the vicinity of, and to the left of the left LCD 21, whereas the photographing mode button 25 operated to set the electronic camera 100 in a photographing mode, a reproduction mode button 26 operated to set the electronic camera 100 in a reproduction mode, an information display mode button 27 operated to set the electronic camera 100 in an information display mode and a CONFIRM button 29 used to set a selection item are provided around the right screen 22 and the left screen 21. At a side surface, a memory card slot 30 at which a memory card 104 may be loaded is provided.
  • It is to be noted that the shutter release button 16, the up button 23, the down button 24, the photographing mode button 25, the reproduction mode button 26, the information display mode button 27, the SEND button 28 and the CONFIRM button 29 are all operating keys operated by the user.
  • It is also to be noted that over the surfaces of the left screen 21 and the right screen 22, so-called touch tablets 66 having a function of outputting position data corresponding to a position indicated through a finger contact operation are provided, and they can be used to make a selection from selection items displayed on the screens or to select a subject displayed on the screen. The touch tablets 66 are each constituted of a transparent material such as a glass resin so that the user can observe images and text formed inside the touch tablets 66 through the touch tablets 66.
  • In the block diagram in FIG. 60, which presents an example of an internal electrical structure that may be assumed in the electronic camera 100 shown in FIGS. 58 and 59, various components are connected with one another via a data/control bus 51 through which various types of information data and control data are transmitted.
  • The components are divided into the following five primary blocks: (1) a block, the core of which is a photographing control circuit 60 that executes an image data photographing operation; (2) a block, the core of which is constituted of a wireless communication circuit 71 that communicates with electronic instruments in the vicinity of the electronic camera 100 through wireless communication (Bluetooth or the like) and a wireless telephone circuit 72 that exchanges data with the outside through a wireless portable telephone line; (3) a block constituted of the memory card 104 in which image files are stored and saved; (4) a block, the core of which is a screen control circuit 92 that executes display of image data and information related to the image data; and (5) a block, the core of which is constituted of user interfaces such as the operating keys 65 and a CPU 50 that implements integrated control on the various control circuits.
  • The CPU 50 (central processing unit), which is a means for implementing overall control on the electronic camera 100, issues various instructions for the photographing control circuit 60, the wireless communication circuit 71, the wireless telephone circuit 72, the screen control circuit 92 and a power control circuit 64 in conformance to information input from the operating keys 65, the touch tablets 66, various detection circuits 70, the power switch 17, a timer 74, the photometering circuit 13, a GPS circuit 61 and an attitude detection circuit 62.
  • An audio recording circuit 80 records, as audio data, sound generated by the subject during a photographing operation and sound made by the photographer after the photographing operation as a memorandum, in conformance to control implemented by the CPU 50, and such audio data are stored in the image file together with the image data. An audio reproduction circuit 81 reproduces the audio data stored in the image file together with the image data, or other audio data, during an image data reproduction, in conformance to control implemented by the CPU 50.
  • The photometering circuit 13 measures the brightness of the subject and outputs photometric data indicating the results of the measurement to the CPU 50. In conformance to the photometric data, the CPU 50 sets the length of the exposure time and the sensitivity of a CCD 55 through a CCD drive circuit 56, and in addition, it controls the aperture value for an aperture 53 with an aperture control circuit 54 via the photographing control circuit 60 in conformance to the data indicating the settings.
  • In the photographing mode, the CPU 50 controls the photographing operation via the photographing control circuit 60 in response to an operation of the shutter release button 16. If the photometric data indicate a low subject brightness, the CPU 50 engages the strobe 12 in a light emission operation via a strobe drive circuit 73 during the photographing operation.
  • The GPS circuit 61 (global positioning system circuit) detects information indicating the position of the electronic camera 100 by using information provided by a plurality of satellites orbiting the earth and provides the photographing position information to the CPU 50 when an image is photographed. The attitude detection circuit 62, which is constituted of an attitude sensor of the known art (a gyro sensor, an azimuth sensor or the like) capable of detecting the attitude of the electronic camera 100, detects information indicating the camera attitude and provides the photographing attitude information to the CPU 50 during the image photographing operation.
  • The timer 74, having an internal clock circuit, detects information indicating the current time point and provides the photographing time point (day and time) information to the CPU 50 when a photographing operation is executed. The CPU 50 controls the various units in conformance to a control program stored in a ROM 67 (read only memory). In an EEPROM 68 (electrically erasable/programmable ROM), which is a nonvolatile memory, the setting information and the like necessary for the operations of the electronic camera 100 are stored. The CPU 50 implements control on a power supply 63 via the power control circuit 64 by detecting the operating state of the power switch 17.
  • The photographing control circuit 60 focuses or zooms the photographic lens 10 through a lens drive circuit 52, controls the exposure quantity at the CCD 55 through control implemented on the aperture 53 by engaging the aperture control circuit 54 and thus controls the operation of the CCD 55 through the CCD drive circuit 56. The photographic lens 10 forms a subject image onto the CCD 55 with a light flux from the subject via the aperture 53 which adjusts the light quantity. This subject image is captured by the CCD 55. The CCD 55 (charge-coupled device) having a plurality of pixels is a charge storage type image sensor that captures a subject image, and outputs electrical image signals corresponding to the intensity of the subject image formed on the CCD 55 to an analog processing unit 57 in response to a drive pulse supplied by the CCD drive circuit 56. It is to be noted that the photographic field contained inside the angle of field which is determined by the effective image plane size at the CCD 55 and the focal length of the photographic lens 10 is photographed into the photographic image plane as image data.
  • The photographing control circuit 60 repeatedly executes the operation described above and, concurrently, the screen control circuit 92 repeatedly executes an operation in which the digital data sequentially stored into a photographic buffer memory 59 are read out via the data/control bus 51, the digital data thus read out are temporarily stored into a frame memory 69, converted to display image data and stored back into the frame memory 69, and the display image data are displayed at the left screen 21. In addition, the screen control circuit 92 obtains text display information from the CPU 50 as necessary, converts it to display text data which are then stored into the frame memory 69 and displays the display text data at the left screen 21 or the right screen 22. Since the image currently captured by the CCD 55 is displayed at the left screen 21 in real time in the photographing mode in this manner, the user is able to set the composition for the photographing operation by using this through-image display as a monitor screen.
  • The photographing control circuit 60 detects the state of the focal adjustment at the photographic lens 10 by analyzing the degree of the high-frequency component of the digital data stored in the photographic buffer memory 59 and performs a focal adjustment for the photographic lens 10 with the lens drive circuit 52 based upon the results of the detection.
  • During a photographing operation, upon receiving a photographing instruction from the CPU 50, the photographing control circuit 60 engages the CCD 55 to capture a subject image via the CCD drive circuit 56 and temporarily stores the image signals generated through the image-capturing operation as digital data (raw data) into the photographic buffer memory 59 via the analog processing unit 57 and the A/D conversion circuit 58. The photographing control circuit 60 then generates image data by converting or compressing, in a predetermined recording format (such as JPEG), the digital data temporarily stored in the photographic buffer memory 59, and stores the image data back into the photographic buffer memory 59.
  • The CPU 50 communicates with the electronic instrument such as a portable telephone carried by the subject via the wireless communication circuit 71 to collect information related to the subject and also to obtain information indicating the position of the electronic instrument. In addition, the CPU 50 obtains the photographing position information from the GPS circuit 61, the photographing attitude information from the attitude detection circuit 62 and the focal length information with regard to the focal length of the photographic lens 10 from the photographing control circuit 60 and executes an arithmetic operation to obtain image-plane position information indicating the subject position within the image plane through a method which is to be detailed later based upon the information indicating the electronic instrument position, the photographing position information, the photographing attitude information and the focal length information. The CPU 50 stores the image data, the information related to the subject and the image plane subject position information into the memory card 104 as an image file.
  • In the reproduction mode, the screen control circuit 92 reads out an image file specified by the CPU 50 from the memory card 104, temporarily stores the image file into the frame memory 69, converts the image data to display image data and stores the display image data back into the frame memory 69 and then displays the display image data at the left screen 21. It also stores text data of the reproduction mode instructions and the like into the frame memory 69 and displays the text data at the right screen 22 in response to an instruction issued by the CPU 50. In the reproduction mode, upon receiving a transmission instruction issued by the CPU 50, the wireless telephone circuit 72 reads out the specified image file from the memory card 104 and wirelessly transmits the image file to the outside.
  • In the information display mode, the CPU 50 first reproduces and displays specific image data at the left screen 21 via the screen control circuit 92 as in the reproduction mode, and also brings up a superimposed display of an information icon at the position in the image plane of the subject corresponding to the position information. In addition, it displays instruction text data at the right screen 22. If the information icon is selected through the touch tablet 66 in the information display mode, the CPU 50 accesses the information source (homepage or the like) on the Internet via the wireless telephone circuit 72 based upon the subject-related information corresponding to the image-plane position information, displays a screen or the like of the homepage at the left screen 21 and displays an operational instruction screen at the right screen 22.
  • FIG. 61 shows the data structure of image files stored in the memory card 104. As shown in FIG. 61, a plurality of image files are saved at the memory card 104. The image files are each constituted of image data and additional information data. The additional information data are constituted of photographing information data (see FIG. 62) indicating various settings selected for the photographing operation, time point information data indicating the photographing time point, the position information data indicating the photographing position, the attitude information data indicating the camera attitude during the photographing operation, audio information data recorded during or after the photographing operation, general information data (see FIG. 63) containing general information related to the photographing operation input during or after the photographing operation and subject information data containing information related to subjects which is entered during or after the photographing operation.
  • The subject information data are constituted of information related to the subjects (each of which may be a person, a building, a landscape or the like) photographed in the photographic image plane. For instance, the subject information data may be constituted of personal information data (see FIG. 64) of a subject person and general information data related to a subject building.
  • FIG. 62 shows the structure of the photographing information data constituted of setting information indicating the photographic lens setting and the camera settings selected for the photographing operation.
  • FIG. 63 shows the structure of the general information data constituted of Internet access data (homepage addresses or the like) indicating how individual sets of information may be accessed on the Internet, data indicating the contents of the individual sets of information and position information data indicating the positions of the originators of the individual sets of information.
  • FIG. 64 shows the structure of the personal information data constituted of the image-plane position information indicating a specific subject within the image plane to which the personal information corresponds, position information indicating the position of the electronic instrument from which the personal information has been transmitted, Internet access information data (the homepage URL and the like) of the individual, the e-mail address of the individual, the individual's name, date of birth, preferred language, physical data (visual acuity, diopter, favored hand), tastes or person's liking (favorite color, etc.) and other data.
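  • The image-file layout of FIGS. 61 through 64 might be modelled as in the following sketch. This is a hypothetical rendering of the described fields for illustration only, not a file format defined by the specification; the type choices are assumptions.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class PersonalInfoData:                       # subject personal information (FIG. 64)
    image_plane_position: Tuple[float, float]   # subject position within the image plane
    instrument_position: Tuple[float, float]    # measured position of the sender instrument
    homepage_url: str = ""
    email_address: str = ""
    name: str = ""
    preferred_language: str = ""
    favored_hand: str = ""

@dataclass
class AdditionalInfoData:                     # additional information data (FIG. 61)
    photographing_info: dict                  # lens and camera settings (FIG. 62)
    time_point: str                           # photographing day and time
    position: Tuple[float, float]             # photographing position
    attitude: dict                            # camera attitude during the photographing operation
    audio: Optional[bytes] = None             # sound recorded during or after the operation
    general_info: List[dict] = field(default_factory=list)            # general data (FIG. 63)
    subject_info: List[PersonalInfoData] = field(default_factory=list)

@dataclass
class ImageFile:                              # one image file saved in the memory card
    image_data: bytes
    additional_info: AdditionalInfoData
```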
  • FIG. 65 is a transition diagram of the state assumed by the electronic camera 100 in the embodiment of the present invention. The electronic camera, which assumes one of the three operating modes, i.e., the photographing mode, the reproduction mode and the information display mode, makes a shift among the modes in response to an operation of one of the three operating buttons (the photographing mode button 25, the reproduction mode button 26 and the information display mode button 27). As the power is turned on, the electronic camera shifts to the photographing mode first. In the photographing mode, a photographing operation, a wireless communication operation and an image file preparation and storage operation are executed. In the reproduction mode, an image file reproducing operation and a transmission operation to transmit an image file to the outside are executed. In the information display mode, information related to the image data is collected by, for instance, accessing the Internet based upon the additional information data corresponding to the image data and the information thus collected is displayed.
  • FIG. 66 presents a flowchart of the main flow of the operations executed in the electronic camera 100 (the CPU 50) in the embodiment described above. First, the power is turned on as the power switch 17 is operated in S10, and then, a photographing mode subroutine is executed in S20 to enter a photographing-enabled state. If the shutter release button 16 is operated in the photographing mode, a release interrupt processing subroutine in S30 is executed to perform a photographing operation. In addition, while the photographing operation is executed, a wireless interrupt subroutine in S40 is called up from the release interrupt processing subroutine in S30 to engage the electronic camera in wireless communication with electronic instruments present in the vicinity of the electronic camera to collect subject information data and general information data. The subject information data and the general information data thus collected constitute the additional information data together with the photographing information data, and the additional information data are stored together with the image data into the memory card 104 as an image file.
  • If one of the three operating buttons (the photographing mode button 25, the reproduction mode button 26 and the information display mode button 27) is operated in a given operating mode, mode switching interrupt processing in S50 is started up to switch over to the mode selected in correspondence to the operating button which has been operated.
  • In the reproduction mode in S60, an image file stored in the memory card 104 is read out and the image data are reproduced and displayed at the display screen, and if the SEND button 28 is operated, a transmission interrupt processing subroutine in S70 is executed to transmit the image file containing the image data currently reproduced in the reproduction mode to an external recipient.
  • In the information display mode in S80, image data are first reproduced and displayed at the display screen as detailed later and also, the subject information itself, which includes the image-plane position information corresponding to the position specified by the user through the touch tablet 66 on the display screen, or information collected by accessing the Internet based upon the Internet access information contained in the subject information is brought on display at the display screen.
  • In the detailed flowchart of the photographing mode subroutine presented in FIG. 67, after the subroutine is started up in S20, the processing in S201 is repeatedly executed. In S201, image data sequentially generated by the CCD 55 in conformance to the photographing setting condition selected by the user are displayed at the left screen 21 as shown in FIG. 68, and the selected photographing setting condition is displayed in text at the right screen 22.
  • In the detailed flowchart of the release interrupt processing subroutine presented in FIG. 69, after the subroutine is started up in S30, a verification is achieved in S301 as to whether or not the camera is currently set in the photographing mode, and if it is decided that the camera is not set in the photographing mode, the operation makes a return in S305. If, on the other hand, it is decided that the camera is set in the photographing mode, an image-capturing operation is executed in S302 in conformance to the photographing setting condition selected by the user to generate image data, and then the wireless communication processing subroutine in S40, which is to be detailed later, is executed to generate additional information data. In S303, an image file is generated by incorporating the image data and the additional information data, and the image file is saved into the memory card 104. In S304, the image data are displayed at the screen over a predetermined length of time, and then the operation makes a return in S305. It is to be noted that the image data assume a predetermined photographic image plane size, so that a photographed subject is bound to be present somewhere within this photographic image plane in accordance with the photographic composition.
  • In the detailed flowchart of the wireless communication processing subroutine presented in FIG. 70, after the subroutine is started up in S40, a verification is achieved in S401 as to whether or not the electronic camera is currently set in the photographing mode, and if it is decided that the camera is not set in the photographing mode, the operation makes a return in S408. If, on the other hand, it is decided that the camera is set in the photographing mode, the wireless communication circuit 71 is engaged in S402 to attempt wireless communication with electronic instruments in the vicinity of the electronic camera 100 (a portable telephone carried by a subject person, an electronic sightseeing guide apparatus installed in the vicinity of a historical site and capable of transmitting sightseeing information with regard to the historical site through wireless communication, or the like). During this process, the signal transmission output for the wireless communication is adjusted in conformance to the focal length of the photographic lens. For instance, since it can be assumed that the subject is present over a long distance when the focal length is large, the signal transmission output level is set higher accordingly, as illustrated in the sketch below. If it is decided in S403 that no communication has been achieved, the operation makes a return in S408. If communication has been achieved, on the other hand, the position information data indicating the electronic instrument position and the subject information data (or the general information data) are obtained in S404 through wireless communication from the electronic instrument of the communication partner with whom communication has been achieved. In S405, the position information indicating the position of the electronic camera 100 itself is received from the GPS circuit 61, the attitude information indicating the attitude of the electronic camera 100 itself is received from the attitude detection circuit 62, and the position of the subject (electronic instrument) relative to the electronic camera 100 is determined by using the electronic instrument position information, the position information indicating the position of the electronic camera 100 and the attitude information indicating the attitude of the electronic camera 100, as explained later. In S407, the focal length information indicating the focal length of the photographic lens 10 is received from the photographing control circuit 60, the angle of field for the photographing operation is calculated based upon the focal length information and the effective image plane size at the image-capturing element as detailed later and, based upon the position information indicating the position of the subject relative to the electronic camera 100, data indicating the position of the subject within the photographed image plane (image-plane position information data) are generated before the operation makes a return in S408.
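  • A minimal sketch of the focal-length-dependent output adjustment mentioned above; the linear mapping, the 28-300 mm span and the output units are illustrative assumptions only:

```python
def transmission_output_for(focal_length_mm: float,
                            min_output: float = 1.0,
                            max_output: float = 10.0) -> float:
    # Scale the wireless signal transmission output with the photographic lens focal
    # length: a long focal length implies a distant subject, so a higher output level
    # is selected.
    short_end, long_end = 28.0, 300.0
    clamped = min(max(focal_length_mm, short_end), long_end)
    ratio = (clamped - short_end) / (long_end - short_end)
    return min_output + ratio * (max_output - min_output)

print(round(transmission_output_for(50.0), 2))    # modest output for a normal lens
print(round(transmission_output_for(300.0), 2))   # maximum output for a telephoto lens
```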
  • In the sequence diagram provided in FIG. 71 to facilitate an explanation of the wireless communication operation between the electronic camera and an electronic instrument during the release interrupt processing and the wireless communication processing, a shutter release operation is first performed at the electronic camera 100, a photographing operation is performed in response, and then wireless communication is attempted at the electronic camera 100 by transmitting a communication request 1 to electronic instruments present in the vicinity. The communication request 1 includes the identification information of the electronic camera 100 indicating the sender.
  • An electronic instrument having received the communication request 1 returns a response 2 to the electronic camera 100 by transmitting the identification information of the sender electronic instrument and the identification information of the recipient electronic camera.
  • Upon receiving the response 2, the electronic camera 100 verifies the communication partner based upon the electronic instrument identification information and transmits an information request 3 so that information related to the communication partner can be requested by specifying the communication partner's electronic instrument. The information request 3 transmitted to the electronic instrument includes the identification information of the recipient electronic instrument, the identification information of the sender electronic camera 100, the contents of the information being requested and the like.
  • The electronic instrument corresponding to the electronic instrument identification information included in the information request 3 transmits information 4, i.e., the requested related information (the position information, the access information, etc.) to the electronic camera. The information 4 transmitted to the electronic camera includes the identification information corresponding to the electronic camera 100, i.e., the recipient, the identification information corresponding to the electronic instrument, i.e., the sender and the related information.
  • The electronic camera 100 receives the information 4 transmitted by the electronic instrument. The electronic camera 100 sequentially issues the information request 3 to all the electronic instruments having responded to the communication request 1. Upon receiving the information 4 from all the electronic instruments, it ends the wireless communication, generates an image file and saves the image file into the memory card 104.
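  • The message sequence of FIG. 71 might be modelled as in the following sketch. The message classes mirror the four communications described above; the requested items and the in-memory "instruments" mapping are assumptions introduced only for illustration.

```python
from dataclasses import dataclass
from typing import Dict, Tuple

@dataclass
class CommunicationRequest:      # message 1: broadcast by the electronic camera
    sender_camera_id: str

@dataclass
class Response:                  # message 2: returned by each electronic instrument
    sender_instrument_id: str
    recipient_camera_id: str

@dataclass
class InformationRequest:        # message 3: addressed to one responding instrument
    recipient_instrument_id: str
    sender_camera_id: str
    requested_items: Tuple[str, ...]

@dataclass
class Information:               # message 4: the requested related information
    recipient_camera_id: str
    sender_instrument_id: str
    related_info: dict

def collect_related_information(camera_id: str,
                                instruments: Dict[str, dict]) -> Dict[str, dict]:
    # Broadcast the communication request, then query every instrument that
    # responded and gather the returned information before the image file is built.
    CommunicationRequest(camera_id)                                     # message 1
    responses = [Response(i, camera_id) for i in instruments]           # message 2
    collected = {}
    for resp in responses:
        InformationRequest(resp.sender_instrument_id, camera_id,
                           ("position", "access_information"))          # message 3
        info = Information(camera_id, resp.sender_instrument_id,
                           instruments[resp.sender_instrument_id])      # message 4
        collected[info.sender_instrument_id] = info.related_info
    return collected

print(collect_related_information("camera-100",
                                  {"phone-160": {"position": (35.68, 139.69)}}))
```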
  • FIGS. 72 and 73 illustrate how the image-plane position information data indicating the position of a subject within the image plane may be obtained. The subject position information data and the photographing position information data are provided as positional coordinates in a three-dimensional coordinate system the center of which is set at a predetermined reference origin point. First, as shown in FIG. 72, the reference coordinate system is converted to a coordinate system (an axis P: extending along the camera optical axis, an axis Q: extending along the lateral side of the camera, an axis R: extending along the longitudinal side of the camera) through a parallel translation of the origin point of the reference coordinate system to a central point 3 (the photographing position) of the electronic camera 100, and the positional coordinates of the subject 203 after the conversion are determined.
  • In addition, as shown in FIG. 73, an angle of field 5 (indicated by the one-dot-and-bar chain line) is calculated along photographic optical axis (along the P axis) based upon the photographic lens focal length information and the effective image plane size of the CCD. Next, the direction in which a straight line connecting the origin point 3 and the subject position 203 extends is calculated. By comparing the angle of field 5 and the direction of the straight line 4, a decision can be made as to whether or not the subject is contained in the image plane and the position of the subject within the image plane (the image-plane position information data) can be calculated as shown in FIG. 74.
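  • The computation illustrated in FIGS. 72 through 74 might be sketched as follows for the horizontal direction only, assuming planar (top-view) geometry, a 36 mm effective image plane width and positions already converted to metres; the vertical position would be handled analogously using the camera attitude and the image plane height. This is a simplified illustration, not the method as specified.

```python
import math
from typing import Optional, Tuple

def image_plane_x(camera_pos: Tuple[float, float], camera_azimuth_deg: float,
                  subject_pos: Tuple[float, float],
                  focal_length_mm: float = 50.0,
                  sensor_width_mm: float = 36.0) -> Optional[float]:
    # Translate the subject position into camera-centred coordinates, compare its
    # direction with the horizontal angle of field derived from the focal length and
    # the effective image plane width, and return the normalised horizontal position
    # within the image plane (-1..+1), or None if the subject lies outside the plane.
    dx = subject_pos[0] - camera_pos[0]
    dy = subject_pos[1] - camera_pos[1]
    bearing = math.atan2(dx, dy)                            # direction of the straight line
    relative = bearing - math.radians(camera_azimuth_deg)   # relative to the optical axis
    relative = math.atan2(math.sin(relative), math.cos(relative))  # wrap to [-pi, pi]
    if abs(relative) >= math.pi / 2:
        return None                                         # subject behind the camera
    x_mm = focal_length_mm * math.tan(relative)             # projection onto the image plane
    if abs(x_mm) > sensor_width_mm / 2:
        return None                                         # outside the angle of field
    return x_mm / (sensor_width_mm / 2)

# A subject 10 m ahead and 1 m to the right of a camera pointing due north:
print(round(image_plane_x((0.0, 0.0), 0.0, (1.0, 10.0)), 3))
```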
  • In the detailed flowchart of the mode switching interrupt processing subroutine presented in FIG. 75, which is started up in response to an operation of an operating button (the photographing mode button 25, the reproduction mode button 26 or the information display mode button 27), after the subroutine is started up in S50, a verification is achieved in S501 as to whether or not the operated button is the photographing mode button 25, and if it is decided that the photographing mode button 25 has been operated, the operation shifts to the photographing mode subroutine in S20. If, on the other hand, it is decided that the photographing mode button 25 has not been operated, a verification is achieved in S502 as to whether or not the operated button is the reproduction mode button 26, and if it is decided that the reproduction mode button 26 has been operated, the operation shifts to the reproduction mode subroutine in S60. If it is decided that the reproduction mode button 26 has not been operated, the operation shifts to the information display mode subroutine in S80.
  • In the detailed flowchart of the reproduction mode subroutine presented in FIG. 76, after the subroutine is started up in S60, the processing in S601 is repeatedly executed. In S601, image data of an image file selected from the image files stored in the memory card 104 are read out in response to an operation of the up button 23 or the down button 24, the selected image data are reproduced and displayed at the left screen 21 and the operational instructions are displayed at the right screen 22, as shown in FIG. 77.
  • In the detailed flowchart of the transmission interrupt processing subroutine presented in FIG. 78, after the subroutine is started up in S70, a verification is achieved in S701 as to whether or not the electronic camera is currently set in the reproduction mode, and if it is decided that the electronic camera is not set in the reproduction mode, the operation makes a return in S703. If, on the other hand, it is decided that the camera is set in the reproduction mode, potential recipients are displayed at the right screen 22 in S702 as shown in FIG. 79. The recipients include the subject within the image currently reproduced and displayed at the left screen 21. Once the desired recipient is specified by the user, the image file containing the image data currently reproduced and displayed at the left screen 21 is read out from the memory card 104 and this image file is transmitted through the wireless telephone circuit 72 to the recipient specified by the user before the operation makes a return in S703.
  • In the detailed flowchart of the information display mode subroutine presented in FIG. 80, after the subroutine is started up in S80, the home screen of the information display mode is first displayed in S801. In the home screen, image data are displayed at the left screen 21, as shown in FIG. 77. If the operation has shifted into the information display mode from the reproduction mode, the image data that were reproduced and displayed at the left screen 21 in the reproduction mode are kept on display, whereas if the operation has shifted into the information display mode from the photographing mode, the image data of the image file obtained through the most recent photographing operation are displayed. In addition, at the left screen 21, icons 82 and 83 indicating subject information are displayed at positions corresponding to individual sets of image-plane position information data contained in the additional information data and also an icon 85 for bringing up a display of a map based upon the photographing position information data, an icon 84 for bringing up a display of news items reported on the photographing date based upon the photographing time point information data and an icon 86 for bringing up a display of the general information are displayed, all superimposed over the image data. At the right screen 22, the operating instructions with respect to the left screen 21 are displayed.
  • When the user touches a desired icon at the left screen 21, the position of the point on the screen touched by the user is detected through the touch tablet 66 in S802 and subsequently, the operation branches to one of the following steps in correspondence to the position on the screen that has been touched by the user.
  • If the icon 85 in FIG. 81 is touched, the operation branches to S803 in which a map database on the Internet is accessed via the wireless telephone circuit 72, map data containing the photographing position are downloaded from the map database in conformance to the photographing position information corresponding to the image data displayed at the home screen and the downloaded map data are displayed at the left screen 21, as shown in FIG. 82. At the left screen 21, a reduction icon 87 and an enlargement icon 88 are superimposed over the map display and as the user touches one of the icons, the map display is either reduced or enlarged in correspondence to the icon that has been touched. At the right screen 22, a thumbnail image (reduced image) of the image data having been displayed at the home screen is displayed and the home screen display in S801 is restored if the user touches this thumbnail image.
  • If the icon 84 in FIG. 81 is touched, the operation branches to S804 in which a news database on the Internet is accessed via the wireless telephone circuit 72, news data of the news reported on the photographing date are downloaded from the news database based upon the photographing time point information corresponding to the image data displayed at the home screen and the downloaded news data are displayed at the left screen 21, as shown in FIG. 83. At the left screen 21, a list of news items and detail icons 89 are displayed and if the user touches a specific detail icon, details of the corresponding news item are displayed at the left screen 21. At the right screen 22, a thumbnail image (reduced image for preview) of the image data having been displayed at the home screen is displayed and the home screen display in S801 is restored if the user touches this thumbnail image.
  • If the icon 82 or the icon 83 in FIG. 81 is touched, the operation branches to S805 in which an information source (a homepage or a database) where detailed information related to the subject is stored is accessed on the Internet via the wireless telephone circuit 72 in conformance to the Internet access information data contained in the subject information indicated by the icon and the top screen of the homepage is brought up on display. For instance, if the icon 82 is touched, the top screen of the homepage of the subject person is displayed at the left screen 21, as shown in FIG. 84, to enable the user to display a screen at a lower layer or link to another homepage through the touch tablet 66. If, on the other hand, the icon 83 is touched, a commentary screen related to a historical structure photographed in the image is brought up on display at the left screen 21, as shown in FIG. 85, to enable the user to display a more detailed description or an image of the historical structure at the left screen 21 through the touch tablet 66. At the right screen 22, a thumbnail image (reduced image) of the image data having been displayed at the home screen is displayed and the home screen display in S801 is restored if the user touches this thumbnail image.
  • In FIG. 86 illustrating the structure of the wireless communication circuit 71 in the electronic camera, the wireless communication circuit 71 is enclosed by an electromagnetic shield film 79 along the camera rear surface (in the direction opposite from the direction along which the photographic optical axis extends, i.e., the side on which the photographer is present), the camera lower surface, the camera upper surface and the camera left and right side surfaces. Thus, the wireless communication circuit 71 is allowed to communicate readily with the portable telephone 160 carried by the subject present in the direction along which the photographing operation is performed with the camera to collect the subject information with a high degree of reliability. At the same time, since it cannot readily communicate with a portable instrument carried by a non-subject party who is not present along the photographic optical axis, the collection of unnecessary information irrelevant to the subject can be minimized. In FIG. 87 showing the communication directivity of the wireless communication circuit 71, extreme directivity manifests to the front of the electronic camera 100 (along the optical axis 40) and the directivity to the rear is slight.
  • By allowing the wireless communication circuit 71 to manifest extreme directivity along the direction in which the photographing operation is performed with the camera through the structure shown in FIG. 86 as described above, the likelihood of communicating with the electronic instrument carried by the subject is increased. It is to be noted that the directivity may be adjusted by adopting a specific antenna structure instead of by using an electromagnetic shield film.
  • In the wireless communication processing shown in FIG. 71, the electronic camera 100 collects subject information by communicating with electronic instruments in the vicinity during the photographing operation. However, the subject information may be transmitted to the electronic camera engaged in the photographing operation from a subject-side electronic instrument either before or after the photographing operation, instead. At the portable telephone 160 carried by the subject, such as that shown in FIG. 88, a UIM card or the like having stored and saved therein the personal information on the user carrying the portable telephone 160 is loaded. The portable telephone 160 includes a display screen 161 and operating buttons 162. When the subject person wishes to transmit his personal information to the electronic camera, the subject person first operates an operating button 162 of the portable telephone 160 to set the portable telephone 160 in a personal information transmission mode, enabling transmission of the subject information to the electronic camera. Once the transmission is completed, a message indicating the recipient electronic camera and the transmission completion is displayed at the display screen 161, as shown in FIG. 89. Since this allows the subject to decide whether or not his personal information should be transmitted, random circulation of the personal information is prevented.
  • In the detection of the position of the subject within the image plane shown in FIG. 73, the information indicating the electronic camera position and the information indicating the electronic instrument position detected with GPS units are utilized. However, the subject position within the image plane may be detected through a method other than this. FIG. 90 shows an example in which the subject position within the photographic image plane is detected by using a highly-directional wireless antenna 75 (its directivity 43 manifests with great intensity along a single direction 42, as shown in FIG. 91) which is utilized as a communication antenna of the wireless communication circuit 71. It is to be noted that details of a structure that may be adopted in such a compact directional antenna are disclosed in, for instance, Japanese Laid-Open Patent Publication No. 2000-278037.
  • The electronic camera 100 is internally provided with the highly-directional wireless antenna 75 and a drive unit 76 that adjusts the direction of the highly-directional wireless antenna 75 by scanning the highly-directional wireless antenna 75 through mechanical drive. While the wireless communication processing is in progress during the photographing operation, the CPU 50 engages the drive unit 76 to drive the highly-directional wireless antenna 75 so as to scan inside the photographic image plane and it concurrently communicates with a subject present within the image plane to collect the subject information data. This makes it possible to correlate the scanning position of the highly-directional wireless antenna 75 (i.e., image-plane position information data) with the subject information data.
  • In the detailed flowchart of the wireless communication processing subroutine executed in conjunction with using the highly-directional wireless antenna 75, which is presented in FIG. 92, after the subroutine is started up in S40, a verification is achieved in S411 as to whether or not the electronic camera is currently set in the photographing mode, and if it is decided that the camera is not set in the photographing mode, the operation makes a return in S419. If, on the other hand, it is decided that the camera is set in the photographing mode, the scanning position of the highly-directional wireless antenna 75 is initialized in S412. It is to be noted that the image plane size is determined in conformance to the photographic lens focal length information and the CCD effective image plane size. In S413, wireless communication with electronic instruments present in the vicinity of the electronic camera 100 (a portable telephone carried by a subject person, an electronic sightseeing guide apparatus installed near a historical site and capable of transmitting sightseeing information with regard to the historical site through wireless communication) is attempted with the means for wireless communication 71. If it is decided in S414 that no communication has been achieved, the operation proceeds to S417. If, on the other hand, communication has been achieved, the subject information data are obtained through wireless communication from the electronic instrument of the communication partner with whom communication has been achieved in S415. In S416, image-plane position information contained in the subject information data is designated as the information indicating the scanning position of the highly-directional wireless antenna 75. In S417, the scanning position of the highly-directional wireless antenna 75 is updated. In S418, a verification is achieved as to whether or not the scanning position of the highly-directional wireless antenna 75 is at the end position and, if it is decided that the scanning position is not at the end position, the operation returns to S413 to repeatedly execute the operation described above, whereas if the scanning position is at the end position, the operation makes a return in S419.
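  • The scan loop of FIG. 92 can be summarized, purely as an editorial sketch, in the following Python fragment. The scan positions are assumed to have been derived elsewhere from the photographic lens focal length and the CCD effective image plane size, and try_communicate is a hypothetical callback that points the highly-directional wireless antenna 75, attempts a short-distance exchange through the wireless communication circuit 71 and returns the received subject information or None.

```python
def scan_image_plane(scan_positions, try_communicate):
    """Collect subject information while scanning the image plane (S411-S419)."""
    collected = []
    for position in scan_positions:                 # S412 initialisation, S417 update, S418 end test
        info = try_communicate(position)            # S413 attempt wireless communication
        if info is None:                            # S414 no partner found in this direction
            continue
        info["image_plane_position"] = position     # S415-S416 tag the data with the scan position
        collected.append(info)
    return collected
```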
  • In FIG. 93 showing a display example of the home screen brought up in the information display mode, no graphics are displayed at the left screen 21 unlike the example shown in FIG. 81. If the user touches the display of the person or the building on the left screen, the contact position is detected through the touch tablet 66, and the subject information containing the image-plane position information data corresponding to the contact position is brought up on display at the right screen 22, as shown in FIG. 94. In addition, on the left screen 21, an arrow mark 44 is displayed at the subject position corresponding to the subject information. This allows the user to view the image data reproduced and displayed at the left screen 21 without obtrusive graphics crowding the screen, to review the subject information at the right screen 22 while the image data remain on display at the left screen 21, and to verify the relationship between the subject information and the subject on the image simply by touching the subject which interests the user at the left screen 21. In addition, it makes it possible to display the subject information directly from the reproduction mode by integrating the information display mode and the reproduction mode.
  • Furthermore, the outlines of the subjects in the displayed image may be extracted through an image analysis, so as to enable a display of the subject information corresponding to image-plane position information data contained in an area 46 at the left screen 21 if the user touches the area near an outline (the area 46 in the vicinity of the person 45 in FIG. 95). In this case, even if the user touches a position other than the position of the electronic instrument which has transmitted the subject information to the electronic camera, e.g., if the user touches the face of the subject person, the subject information corresponding to the subject person can be brought up on display.
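  • The touch-driven lookup of FIGS. 93 through 95 amounts to matching the contact position against the image-plane position information (or, when outlines have been extracted, the area occupied by the subject). The record layout and the tolerance value in the following sketch are editorial assumptions.

```python
def subject_info_at(touch_xy, subject_records, tolerance=0.05):
    """Return the subject information whose image-plane position or outline area
    contains the touched position, or None if nothing matches."""
    x, y = touch_xy
    for record in subject_records:
        if "area" in record:                               # outline extracted by image analysis
            left, top, right, bottom = record["area"]
            if left <= x <= right and top <= y <= bottom:
                return record
        px, py = record["image_plane_position"]            # otherwise require a touch near the point
        if abs(x - px) <= tolerance and abs(y - py) <= tolerance:
            return record
    return None
```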
  • FIG. 96 presents a detailed flowchart of an information display mode subroutine achieved by executing additional processing between S801 and S802 in the information display mode subroutine shown in FIG. 80. Through the additional processing, the subject information contained in an image file is transplanted into a similar image file that does not have any subject information.
  • After the home screen is displayed in S801, an image analysis is executed on the image data reproduced and displayed at the left screen 21 in S811 to identify the primary subjects (a primary subject may be, for instance, a person or a building taking up a relatively large area in the image plane) and then a verification is achieved in S812 as to whether or not there is any subject information containing image-plane position information data present in the area occupied by a primary subject. If it is decided that there is such subject information, the operation proceeds to S802. If, on the other hand, it is decided that there is no such subject information, image files with the photographing time point data indicating the same photographing date as the photographing date indicated by the photographing time point data of the image data the reproduced image of which is currently on display are searched in the memory card 104 and are extracted in S813. Next, an image analysis is executed on the image data in these files and image data having a subject similar to the primary subject of the image data the reproduced image of which is currently on display are extracted. Then, a verification is achieved as to whether or not image-plane position information is attached to the similar subject in the extracted image data and, if any image-plane position information is attached, the subject information containing this image-plane position information is appropriated as the subject information of the primary subject of the image data the reproduced image of which is currently on display. Accordingly, the image file containing the image data the reproduced image of which is currently on display is updated by attaching image-plane position information indicating the position of the primary subject within the image plane determined through the image analysis.
  • In S814, the photographing position information data, the photographing attitude information data and the focal length information data are read out from the image file containing the image data the reproduced image of which is currently on display, and a primary subject is identified based upon data corresponding to the information data, which are obtained from a map database on the Internet, and the results of the image analysis executed in S813. For instance, if a triangular object appearing to be a mountain is shown in an image plane photographed along a southerly direction from the vicinity of Lake Yamanaka, the object is identified as Mount Fuji. After a specific primary subject is identified, a search is conducted on the Internet by using the name of the primary subject as a keyword, access information with regard to a homepage providing information on the primary subject is obtained and the image file containing the image data the reproduced image of which is currently on display is updated by appending image-plane position information corresponding to the position of the primary subject within the image plane ascertained through the image analysis.
  • After transplanting the subject information into the image file with no subject information from another file or from the Internet, as described above, the operation proceeds to S802. Thus, even when no subject information related to a subject has been obtained during the photographing operation (e.g. when there is no sightseeing guide apparatus capable of wirelessly providing subject information present in the vicinity of the electronic camera used to photograph a landscape image or the like), information related to the subject can be displayed when the image data are reproduced in the information display mode through this subroutine. It is to be noted that the image files searched in S813 are not limited to those obtained on the same photographing date. For instance, a predetermined number of image files photographed before and after the image file with no subject information may be searched, instead. The camera automatically identifies the subject by using the photographing position information data, the photographing attitude information data and the focal length information data and downloads information on the identified subject from the Internet in S814. However, the user may identify the subject by reviewing the image data, input or download from the Internet information on the specified subject and update the image file by pasting the information onto the subject position on the screen image instead.
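  • For illustration only, the additional processing of FIG. 96 (S811 through S814) can be outlined as follows. The analyse and identify_on_internet helpers stand in for the image analysis and the map/keyword searches described above, and the dictionary layout of the image files is an editorial assumption.

```python
def point_in_area(point, area):
    """True if an image-plane point lies inside a bounding box (left, top, right, bottom)."""
    (x, y), (left, top, right, bottom) = point, area
    return left <= x <= right and top <= y <= bottom

def transplant_subject_info(current, archive, analyse, identify_on_internet):
    """Copy subject information into an image file that has none (S811-S814)."""
    subject, area = analyse(current["image"])                      # S811: primary subject and its area
    if any(point_in_area(r["image_plane_position"], area)
           for r in current["subject_info"]):
        return current                                             # S812: already annotated
    centre = ((area[0] + area[2]) / 2, (area[1] + area[3]) / 2)
    for other in archive:                                          # S813: files taken on the same date
        if other["photographing_date"] != current["photographing_date"]:
            continue
        if analyse(other["image"])[0] != subject:                  # similar primary subject?
            continue
        for record in other["subject_info"]:
            current["subject_info"].append({**record, "image_plane_position": centre})
            return current
    record = identify_on_internet(current["photographing_position"],   # S814: identify the landmark
                                  current["attitude"], current["focal_length"])
    if record is not None:
        current["subject_info"].append({**record, "image_plane_position": centre})
    return current
```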
  • FIG. 97 is another conceptual diagram of an electronic camera adopting the present invention and an electronic image communication system achieved by using the electronic camera. In the electronic image communication system in FIG. 57, the personal information data stored in the UIM card 170 loaded at the portable telephone 160 are read into the electronic camera 100 through a short-distance wireless communication between the electronic camera 100 and the portable telephone 160 carried by the user. In the electronic image communication system in FIG. 97, on the other hand, as a wireless tag (a non-contact IC card) having stored therein personal information data moves closer to the electronic camera 100 having an internal non-contact wireless read unit, the personal information data are automatically read from the wireless tag into the electronic camera 100.
  • As described above, the electronic image communication system shown in FIG. 97 allows the subject to communicate with the electronic camera 100 simply by carrying a compact wireless tag (a non-contact IC card) which does not require a power source.
  • In the embodiment described above (see FIGS. 56 and 66), subject information (including position information) is automatically transferred to the electronic camera through wireless communication between the electronic camera and a portable instrument carried by the subject during an image data photographing operation and, as a result, the need to perform a special operation to obtain the subject information is eliminated. In addition, since the information indicating the subject position assumed at the time when the image data are photographed can be obtained, the image-plane position information data indicating a position accurately corresponding to the actual position on the image plane can be obtained by using the position information. It is to be noted that the subject position information should be obtained without a significant time lag relative to the timing with which the subject is photographed (it is desirable to obtain the subject position information within about 1 second prior to or following the photographing operation) in order to ensure that a position accurately corresponding to the actual position on the image plane is ascertained.
  • In the embodiment described above (see FIGS. 56 and 66), information related to a subject in the image which interests the user viewing the reproduced image data on display can be reviewed with ease and speed simply by touching the icon displayed in correspondence to the image-plane position information at the screen or by touching the subject.
  • In the embodiment described above (see FIGS. 56 and 70), subject information is obtained from a subject through a short-distance wireless communication to raise the probability of achieving communication with the subject present in the vicinity of the electronic camera and also to reduce the likelihood of unnecessarily communicating with an electronic instrument that is present at a long distance from the electronic camera and that has no relation to the subject. In addition, since the signal transmission output level for the wireless communication is adjusted in conformance to the photographic lens focal length, a reliable wireless communication with the subject is achieved and, at the same time, the likelihood of achieving wireless communication with a party other than the subject is reduced.
  • In the embodiment described above (see FIG. 78), the image data reproduced and displayed in the reproduction mode can be transmitted to a transfer address indicated in subject information data obtained during the image data photographing operation. Thus, image data can be transmitted with ease to a stranger met by chance on a trip or the like without having to keep track of the postal address, the e-mail address or the like of the stranger. This feature is particularly convenient when there are numerous people photographed in a group picture because it relieves the user from the task of performing a special operation to send the image data to multiple recipients.
  • In the embodiment described above (see FIG. 82, etc.), when displaying subject information specified by the user at the screen in the information display mode, the original image data in which the subject is photographed are displayed (as a thumbnail image) at the same time, enabling the user to understand the relationship between the image data and the subject intuitively, and also, the home screen display can be promptly recovered by specifying the image data through the touch tablet 66 which enables the user to quickly and easily review the subject information in reference to the image data.
  • In the embodiment described above (see FIGS. 86 and 87), the wireless communication directivity of the wireless communication circuit 71 is restricted so as to manifest along the direction in which the subject is likely to be present relative to the electronic apparatus main unit by using an electromagnetic shield or the like and thus, reliable wireless communication with the subject is achieved to obtain the subject information wirelessly and the likelihood of achieving wireless communication with a party other than the subject is reduced.
  • In the embodiment described above (see FIGS. 88 and 89), a subject information transmission to the electronic camera is initiated on the subject side and, as a result, it becomes possible to prevent uncontrolled circulation of personal information and the like to the outside, as necessary.
  • In the embodiment described above (see FIG. 92), a scan within the image plane is performed with a highly-directional wireless antenna to collect the image plane position information corresponding to a subject, to allow the system to be configured at low cost without having to provide a special means for positional measurement such as a GPS unit at the electronic instrument on the subject side.
  • In the embodiment described above (see FIG. 96), the image plane position information corresponding to a specific subject is detected through an image analysis and the subject information is copied from a similar image file, to enable the user to reference the subject information for a review even when no subject information has been obtained during the photographing operation. In addition, since information indicating the image-plane position of a subject is detected through image analysis, the subject is identified based upon the results of the image analysis and the position information and subject information related to the identified subject is collected from an information source on the Internet or the like, the user is able to reference the subject information for review even when no subject information has been obtained during the photographing operation.
  • In the embodiment described above (see FIG. 97), the electronic camera includes a non-contact IC card (wireless tag) read circuit to execute a non-contact read of subject information from a non-contact IC card (wireless tag) carried by the subject. Thus, the subject only needs to hold a compact and light-weight non-contact IC card which does not require a power source to ensure that his subject information is provided to the electronic camera without having to perform any operation.
  • EXAMPLES OF VARIATIONS
  • The present invention is not limited to the embodiment described above and allows for numerous variations and modifications.
  • In the embodiment described above (see FIG. 56), subject information is obtained through a wireless (electromagnetic wave) communication between the electronic camera and the electronic instrument carried by a subject. During this process, a redundant communication with electronic instruments carried by parties other than the true user should be prevented by keeping the wireless transmission signal output level within a range lower than a predetermined value and thus limiting the communication-enabled distance range based upon the assumption that electronic instruments have the average reception capability. This distance range is determined by taking into consideration the standard manner in which the electronic apparatus is operated, whether or not an object that blocks electromagnetic waves is present between the electronic apparatus and the electronic instrument, the performance of the receiver at the electronic instrument and the like. However, it is desirable to set the distance range to approximately 10 m or less in correspondence to the focal length of the standard photographic lens.
  • In the embodiment described above (see FIG. 56), the wireless communication circuit engages in communication through a short-distance wireless system such as Bluetooth. However, the short-distance wireless communication may be achieved through a system other than Bluetooth, such as wireless LAN (IEEE802.11) or infrared communication (IrDA). In addition, communication may be achieved by simultaneously utilizing a plurality of short-distance wireless systems. In the latter case, an electronic instrument with any type of short-distance wireless function carried by a subject can engage in wireless communication with the electronic camera.
  • In the embodiment described above (see FIG. 56), subject information is transferred through a wireless (electromagnetic wave) communication between the electronic camera and the electronic instrument carried by a subject. However, the subject information may instead be transferred through another non-contact or contact method.
  • In the embodiment described above (see FIG. 60), sound can be recorded with the audio recording circuit 80 during the photographing operation. However, a highly directional microphone may be utilized during a photographing operation to record sound generated by a subject present within the image plane, the recorded audio data may be stored into the image file in correspondence to the image plane subject position information and the recorded audio may be reproduced if the subject in the reproduced image display is specified through the touch tablet or the like in the information display mode, instead. In addition, a plurality of microphones with extreme directivity may be utilized to record sound generated by a plurality of subjects in the image plane at the same time and the sound corresponding to a subject specified through the touch tablet or the like in the reproduced image display in the information display mode may be reproduced.
  • Also, sound generated by a subject or voice commentary related to the subject may be transmitted as subject audio information to the electronic camera from an electronic instrument mounted with a microphone which is carried by the subject or an electronic instrument having pre-stored audio data of the information related to the subject at any time, regardless of whether or not a photographing operation is in progress and regardless of photographing timing. In such a case, subject audio information may be appended to image data obtained by photographing the subject and saved in correspondence to the position of the subject within the photographic image plane indicated by the image-plane position information. Then, if a given subject is specified with a pointing device such as a touch tablet at the display screen while the user is reviewing the image reproduced from the image data in the information display mode, the subject audio information stored in correspondence to the position of this particular subject is reproduced. As a result, it becomes possible to provide the viewer with diverse types of subject information including visual information and audio information.
  • In the embodiment described above (see FIG. 60), a position on the left screen 21 or the right screen 22 is specified through the touch tablet 66. However, another pointing device may be utilized for this purpose. For instance, a track ball or a mouse may be utilized as a means for screen position specification. In addition, if an electronic viewfinder (a viewfinder that enlarges through an optical system a screen display brought up at a compact means for screen display to facilitate user observation) is used, the direction along which the line of sight of the user extends (the sight line position, the visual focus position) may be detected to enable a screen position specification. In addition, when using the touch tablet 66, a member (pen) provided exclusively for position specification may be used instead of the user's finger to ensure that a position is specified with greater accuracy and that unnecessary subject information is not displayed in response to an inadvertent finger contact.
  • In the embodiment described above (see FIG. 64), subject information is transferred through wireless (electromagnetic wave) communication between the electronic camera and the electronic instrument carried by a subject. In addition, a notification acknowledging the reception of the subject information may be issued from the electronic camera to the electronic instrument having transmitted the subject information to the electronic camera, and a message indicating that the subject information has been transferred may be brought up on display at the subject-side electronic instrument as well. Alternatively, the electronic instrument may issue an audio message for the subject person to notify him that the subject information has been transferred by employing a means for sound generation.
  • By adopting the features described above, the subject can verify that the subject information has been transferred. In addition, the subject provided with an audio notification can verify that the subject information has been transferred even if his electronic instrument does not include a means for display, while the electronic instrument is left in a pocket or the like.
  • In the embodiment described above (see FIG. 66), the electronic camera engages in the photographing mode operation, the reproduction mode operation and the information display mode operation. However, image data may be generated through a photographing operation performed in the electronic camera, an image file may be prepared by collecting subject information and image-plane position information through wireless communication and this image file may be transferred on-line or off-line to an image display apparatus other than the electronic camera, which then executes the reproduction mode operation and the information display mode operation, instead.
  • In the embodiment described above (see FIG. 66), the switch over between the reproduction mode and the information display mode is achieved through a manual operation. Instead, the user may specify a given subject at a specific position within the reproduced image display through the touch tablet in the reproduction mode to bring up a display of the subject information related to this particular subject. This allows the user to check the information on the subject immediately as he becomes interested in the subject while reviewing the image data, without having to switch from the reproduction mode to the information display mode.
  • With respect to the reproduction mode and the information display mode in the embodiment described above (see FIG. 66), audio data recorded with the audio recording circuit 80 during a photographing operation may be reproduced with the audio reproduction circuit 81 when reproducing the image data. In addition, a music database on the Internet may be accessed based upon the photographing time point data to download music which was popular at the time of the photographing operation and this downloaded music may be automatically reproduced as background music when reproducing the image data, as well. In this case, the user, who is reminded both visually and audibly of the time when the picture was taken, can fully enjoy the viewing experience.
  • In the embodiment described above (see FIG. 69), the electronic camera does not provide the photographer with a report of the subject information collected through communication with subjects during the photographing operation. However, a list of collected subject information may be brought up on display together with the image data obtained through the photographing operation, instead. Since this allows the photographer to verify exactly what subject information was collected during the photographing operation, he is then able to take further action to collect more information if there is any missing information.
  • In the embodiment described above (see FIG. 70), the electronic camera attempts wireless communication with electronic instruments present in the vicinity of the electronic camera to collect subject information during a photographing operation. The electronic camera may be set in advance so as not to engage in wireless communication with the electronic instrument carried by the photographer operating the electronic camera in this situation. Alternatively, the electronic camera may engage in wireless communication with the electronic instrument carried by the photographer during the photographing operation and record information related to the photographer collected through the wireless communication as photographer information constituting part of the photographing information data.
  • In the embodiment described above (see FIG. 70), the electronic camera attempts wireless communication with electronic instruments present in the vicinity of the electronic camera to collect subject information during a photographing operation. During this process, the transmission signal output level for the wireless communication (by means of e.g. electromagnetic waves, infrared light) may be adjusted in conformance to the subject distance (the output level is raised as the distance to the subject increases). The distance to a subject can be obtained through a detection performed by a distance detection device of the known art, or the photographing distance of the photographic lens, which is manually set, may be used as the subject distance. Through this output level adjustment, the electronic camera is able to achieve communication with the electronic instrument held by a subject with a high degree of possibility, while the likelihood of achieving communication with an electronic instrument held by a party other than the subject is reduced.
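  • A possible form of the output-level adjustment mentioned above is sketched below; the inverse-square scaling, the 10 m cap and the 0.5 m floor are editorial assumptions rather than values taken from the specification.

```python
def transmission_output_fraction(subject_distance_m, max_range_m=10.0, min_range_m=0.5):
    """Return the fraction of the maximum permitted transmission output to use.

    The output is raised as the subject distance increases, but the distance is
    clamped so that the communication-enabled range stays short, reducing the
    chance of reaching instruments unrelated to the subject.
    """
    distance = min(max(subject_distance_m, min_range_m), max_range_m)
    # Free-space attenuation grows with the square of distance, so the output needed
    # for a roughly constant received level is scaled accordingly.
    return (distance / max_range_m) ** 2
```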
  • In the embodiment described above (see FIG. 71), subject information is automatically transferred to the electronic camera through a wireless communication between the electronic camera and the electronic instrument held by a subject during a photographing operation. However, when the electronic instrument receives the communication request 1 or the information request 3 from the electronic camera, the electronic instrument may notify the subject of the reception so as to determine whether or not any further communication is to be carried out in conformance to an instruction from the subject. Since the subject is given the option of refusing further communication as necessary in this manner, uncontrolled circulation of his personal information to the outside is prevented.
  • Alternatively, when the electronic instrument receives the communication request 1 or the information request 3 from an electronic camera, the electronic instrument may compare the identification number of the electronic camera having transmitted the request with the identification numbers of electronic cameras preregistered at the electronic instrument and may refuse any further communication if the electronic camera is not registered. In this case, the extent to which the personal information is allowed to circulate can be automatically controlled.
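  • The two variations above (asking the subject, or checking a preregistered list) reduce, as a sketch, to the following decision function; the request fields and the ask_user callback are editorial assumptions.

```python
def allow_further_communication(request, registered_camera_ids, ask_user=None):
    """Decide whether to answer a communication request 1 or information request 3."""
    if request["camera_id"] in registered_camera_ids:
        return True                      # preregistered camera: allowed automatically
    if ask_user is not None:
        return ask_user(request)         # otherwise defer to an instruction from the subject
    return False                         # refuse further communication
```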
  • In the embodiment described above (see FIG. 71), subject information is automatically transferred to the electronic camera through a wireless communication between the electronic camera and the electronic instrument held by a subject during a photographing operation. Instead, the subject information may be automatically transferred into the electronic camera through a wireless communication between the electronic camera and the electronic instrument held by the subject while a photographing operation is not in progress. In this case, since it is not necessary to engage in wireless communication for each of a plurality of pictures being taken without changing the composition (as in an exposure bracket photographing operation), a series of pictures can be taken quickly, and also, subject information corresponding to the plurality of sets of image data can be collected through a single wireless communication executed either automatically or manually when the photographing operation has been completed or the like.
  • In the embodiment described above (see FIG. 71), subject information is automatically transferred to the electronic camera through a wireless communication between the electronic camera and the electronic instrument held by a subject, which is carried on at all times during a photographing operation. However, the wireless communication operation for subject information collection may be disallowed to prevent a delay in the photographing operation. For instance, when a continuous photographing (continuous shooting) operation is performed with the electronic camera, wireless communication may be automatically disallowed to prevent a significant lag between continuously photographed image frames due to the wireless communication operation.
  • In the embodiment described above (see FIG. 73), information indicating the position of a subject within the image plane is calculated by using the electronic camera attitude information and the positional measurement information for the electronic camera and the electronic instrument obtained through the GPS units. However, the image plane subject position information may be detected through another method. For instance, if the electronic camera and the electronic instrument held by a subject engage in communication with each other by using infrared light, the position of the electronic instrument within the image plane may be specified by detecting the light-receiving position on a two-dimensional infrared light image-capturing element provided at the electronic camera based upon the image data captured at the image-capturing element. Alternatively, the image-plane position information alone may be obtained by detecting the position of a point light source of infrared light originating from the electronic instrument with a two-dimensional infrared light image-capturing element, and the subject information may be obtained through a wireless communication (electromagnetic waves). In such a case, the infrared point light source should be turned on with the timing with which the electronic instrument engages in wireless communication.
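  • For reference, the image-plane position used in the FIG. 73 approach can be obtained by projecting the subject's measured position into the camera's coordinate system. The following pinhole-projection sketch is an editorial simplification: positions are local east/north/up coordinates in metres already derived from the GPS fixes, the attitude is the azimuth and elevation of the optical axis in radians, camera roll is ignored, and a 36 mm x 24 mm image plane is assumed; none of these conventions are taken from the specification.

```python
import math

def image_plane_position(camera_pos, subject_pos, azimuth, elevation,
                         focal_length_mm, sensor_mm=(36.0, 24.0)):
    """Project a subject position into normalised image-plane coordinates, or None."""
    de, dn, du = (s - c for s, c in zip(subject_pos, camera_pos))
    right = de * math.cos(azimuth) - dn * math.sin(azimuth)          # component to the camera's right
    ahead = de * math.sin(azimuth) + dn * math.cos(azimuth)          # horizontal component along the axis
    depth = ahead * math.cos(elevation) + du * math.sin(elevation)   # distance along the optical axis
    up = -ahead * math.sin(elevation) + du * math.cos(elevation)     # component above the optical axis
    if depth <= 0:
        return None                                                  # subject is behind the camera
    u_mm = focal_length_mm * right / depth                           # pinhole projection onto the sensor
    v_mm = focal_length_mm * up / depth
    width, height = sensor_mm
    if abs(u_mm) > width / 2 or abs(v_mm) > height / 2:
        return None                                                  # outside the photographic image plane
    return (u_mm / width + 0.5, 0.5 - v_mm / height)                 # (0..1, 0..1) from the top-left corner
```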
  • In the embodiment described above (see FIG. 80), the subject information corresponding to a position at which the user touches the screen is accessed on the Internet and is brought up on display in the information display mode. However, the Internet does not always need to be accessed, and if sufficient subject information has been collected during the photographing operation, such subject information alone may be brought up on display at the screen. Since this eliminates the need to access the Internet to display subject information, a prompt subject information display is achieved, thereby reducing the stress of the user as he no longer needs to wait a long time for the subject information to come up on display for review.
  • In the embodiment described above (see FIG. 81), image data having been photographed in a predetermined photographic image plane size are reproduced and displayed in a size matching the display screen size and the subject information corresponding to a position on the display screen specified by the user is brought up on display at the display screen in the information display mode. However, the sizes of the photographic image plane and the display screen do not necessarily need to match each other, and the photographic image plane may be displayed either in an enlargement or in a reduction on the display screen, instead. In addition, the image data may be modified, rotated or the like for display at the display screen. If the photographic image plane size and the display screen size do not match, the coordinates of a position specified on the display screen are converted to positional coordinates on the photographic image plane (through enlargement, reduction, shifting, rotation or the like) and the subject information is displayed based upon the positional coordinates resulting from the conversion. Since this eliminates the need to ensure that the shape of the photographic image plane and the shape of the display screen are congruent with each other, it becomes possible to utilize various types of image display apparatus in the information display mode and to specify a desired subject by enlarging the image at the display screen if multiple subjects are present in close proximity to one another in the image.
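  • When the photographic image plane and the display screen differ in size or position, the conversion described above is a straightforward inverse of the display transform. The sketch below covers enlargement/reduction and shifting only (a rotation would add one further inverse step); the scale and offset bookkeeping values are editorial assumptions.

```python
def screen_to_image_plane(touch_xy, scale, offset):
    """Convert a touched display-screen position to photographic image-plane coordinates."""
    (tx, ty), (ox, oy) = touch_xy, offset
    # Undo the shift of the image origin on the screen, then the enlargement/reduction.
    return ((tx - ox) / scale, (ty - oy) / scale)
```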
  • In the embodiment described above (see FIG. 82), the electronic camera includes two display screens, at one of which subject information is displayed with the image data obtained by photographing the subjects displayed (as a thumbnail image) at the other screen in the information display mode. However, if the electronic camera has only one screen, the subject information may be displayed over the entire screen with a superimposed display of the image data containing the subjects brought up over part of the screen. As this allows simultaneous review of the subject information and the corresponding image data, the user can intuitively understand the relationship between the subject information and the image data even when the electronic camera has a single display screen.
  • In the second embodiment described above, information related to a subject, which is made to correlate to a subject position within the image plane, is appended to the electronic image data for storage, and the subject-related information is brought up on display in correspondence to the subject position within the image plane when the electronic image data are reproduced and displayed. As a result, information related to a subject photographed in an image, which interests the user reviewing reproduced electronic image data on display, can be quickly and easily reviewed in connection with the reproduced display of the electronic image data.
  • THIRD EMBODIMENT
  • The third embodiment of the present invention is now explained in reference to the drawings. FIG. 98 is a conceptual diagram of the third embodiment of the present invention. An electronic camera photographs subjects within the photographing range, assigns a file name to image data obtained through the photographing operation, connects with an image server on the Internet and uploads the image data into a specific folder at the image server. In addition, the electronic camera transmits address information (the Internet address, i.e., the URL (Uniform Resource Locator), of the image server, the folder name and the image data file name at the image server) to electronic instruments (such as portable telephones) carried by the subjects in the vicinity of the camera through short-distance wireless communication. The electronic instrument (such as a portable telephone) carried by a subject connects to the image server on the Internet based upon the information received from the electronic camera through short-distance wireless communication and displays at its screen the image data downloaded from the specific folder at the image server.
  • FIG. 99 is a conceptual diagram of an information transmission system achieved by utilizing an electronic camera, an image server and a portable telephone terminal adopting the present invention. In FIG. 99, an electronic camera 100 photographs a subject present within its photographic range and generates electronic image data or digital image data (hereafter referred to as “image data”). The electronic camera 100 having a wireless telephone function is capable of connecting with a public telephone line network 140 via a base station 120 through a wireless portable telephone line 190. The public telephone line network 140 is also connected with the Internet 130 to allow the electronic camera 100 to connect with an image server 150 accessible via the Internet 130. The image server 150 having a large capacity memory for saving image data saves image data uploaded via the Internet 130 into folders set in correspondence to individual users and downloads specified image data via the Internet 130 if a read request is issued from an external partner via the Internet 130. By connecting with the image server 150, the electronic camera 100 uploads and saves image data into a specific folder.
  • In addition, the electronic camera 100 has a short-distance wireless communication function such as Bluetooth (registered trademark) and transfers identification data (URL) of the image server to which image data have been uploaded, folder identification data (folder name) and image data identification data (filename) to a portable telephone terminal 160 through communication achieved via a short-distance wireless line 180 with the portable telephone 160 having a matching short-distance wireless communication function. The portable telephone 160 is assumed to be carried by a person who is the subject of a photographing operation executed with the electronic camera 100, and a UIM card (user identity module card) 170 having stored therein user personal information at which image data can also be stored is loaded at the portable telephone 160.
  • The portable telephone 160 having a wireless telephone function is capable of connecting with the public telephone line network 140 through the wireless portable telephone line 190. The public telephone line network 140 is also connected with the Internet 130 and thus allows the portable telephone 160 to connect with the image server 150 installed on the Internet 130. By connecting with the image server 150, the portable telephone 160 downloads the image data in the specific folder, saves the image data into the UIM card 170, and also displays the downloaded image data at its screen.
  • It is to be noted that Bluetooth refers to a mobile communication technology (for portable electronic instruments) with a low transmission output for short-distance communication, which covers a short range of approximately 10 m by using wireless electromagnetic waves in the 2.45 GHz band and realizes a one-on-one or multiple-partner connection at a transfer rate of 1 Mbps.
  • FIGS. 100 and 101 present external views (a front view and a rear view) of an embodiment of the electronic camera 100 in FIG. 99. As shown in FIG. 100, at the front surface of the electronic camera 100, a photographic lens 10 which forms a subject image, a viewfinder 11 used to check the photographic image plane, a strobe 12 used to illuminate the subject during a photographing operation, a photometering circuit 13 that detects the brightness of the subject and a grip portion 14 that projects out from the camera main body to allow the electronic camera 100 to be held by hand are provided, whereas a shutter release button 16 operated to issue an instruction for a photographing start and a power switch 17 (a momentary switch which is alternately set to on and off each time it is operated) through which the on/off state of the power to the electronic camera 100 is controlled are provided at the upper surface.
  • As shown in FIG. 101, at the rear surface of the electronic camera 100, an eyepiece unit of the viewfinder 11, a left LCD (left screen) 21 having a substantially quadrangular screen for text and image display and a right LCD (right screen) 22 having a substantially quadrangular screen for text and image display are provided, and an up button 23 and a down button 24 for scrolling the image displayed at the left screen 21 are provided in the vicinity of, and to the left of the left screen 21, whereas a photographing mode button 25 operated to set the electronic camera 100 in a photographing mode, a reproduction mode button 26 operated to set the electronic camera 100 in a reproduction mode, a PRINT button 27 operated to print image data, a SEND button 28 operated to transmit image data and a DOWNLOAD button 29 operated to download image data are provided around the right screen 22 and the left screen 21. An upload dial 32 is an operating member rotated to select an image server to which specific image data are to be uploaded and as it is rotated, the image server corresponding to the number at the position of a dot 31 is selected. At a side surface, a memory card slot 30 at which a memory card 77 can be loaded is provided.
  • It is to be noted that the shutter release button 16, the up button 23, the down button 24, the photographing mode button 25, the reproduction mode button 26, the PRINT button 27, the SEND button 28, the DOWNLOAD button 29 and the upload dial 32 are all operating keys operated by the user.
  • It is also to be noted that over the surfaces of the left screen 21 and the right screen 22, so-called touch screens 66 having a function of outputting position data corresponding to a position indicated through a finger contact operation are provided, and they can be used to make a selection from selection items or make an image data selection on the screens. The touch screens 66 are each constituted of a transparent material such as glass or resin so that the user can observe, through the touch screens 66, the images and text displayed beneath them.
  • FIG. 102 presents an example of an external view (front view) of the portable telephone 160 shown in FIG. 99. As shown in FIG. 102, the portable telephone 160 includes a display screen 161, operating keys 162 and a UIM card loading slot 163.
  • In the block diagram in FIG. 103, which presents an example of an internal electrical structure that may be assumed in the electronic camera 100 shown in FIGS. 100 and 101, various components are connected with one another via a data/control bus 51 through which various types of information data and control data are transmitted. The components are divided into the following five primary blocks, i.e., (1) a block, the core of which is a photographing control circuit 60 that executes an image data photographing operation, (2) a block, the core of which is constituted of a wireless communication circuit 71 that communicates with electronic instruments in the vicinity of the electronic camera 100 through short-distance wireless communication (Bluetooth or the like) and a wireless telephone circuit 72 that exchanges data with the outside through a wireless portable telephone line, (3) a block constituted of the memory card 77 in which image files are stored and saved, (4) a block, the core of which is a screen control circuit 92 that executes display of image data and information related to the image data and (5) a block, the core of which is constituted of user interfaces such as operating keys 65 and a CPU 50 that implements integrated control on various control circuits.
  • The CPU 50 (central processing unit), which is a means for implementing overall control on the electronic camera 100, issues various instructions for the photographing control circuit 60, the wireless communication circuit 71, the wireless telephone circuit 72, the screen control circuit 92 and a power control circuit 64 in conformance to information input from the operating keys 65, the touch screen 66, the power switch 17, a timer 74 and the photometering circuit 13. The photometering circuit 13 measures the brightness of the subject and outputs photometric data indicating the results of the measurement to the CPU 50. In conformance to the photometric data, the CPU 50 sets the length of the exposure time and the sensitivity of a CCD 55 through a CCD drive circuit 56, and in addition, it controls the aperture value for an aperture 53 with an aperture control circuit 54 via the photographing control circuit 60 in conformance to the data indicating the settings.
  • In the photographing mode, the CPU 50 controls the photographing operation via the photographing control circuit 60 in response to an operation of the shutter release button 16. In addition, if the photometric data indicate a low subject brightness, the CPU 50 engages the strobe 12 in a light emission operation via a strobe drive circuit 73 during the photographing operation. The timer 74 having an internal clock circuit detects information indicating a current time point and provides the photographing time point information to the CPU 50 during the photographing operation. The CPU 50 controls the various units in conformance to a control program stored in a ROM 67 (read only memory). In an EEPROM 68 (electrically erasable/programmable ROM), which is a nonvolatile memory, the setting information and the like necessary for the operations of the electronic camera 100 are stored. A RAM 70, which is a volatile memory, is used as a work area of the CPU 50. The CPU 50 implements control on a power supply 63 via the power control circuit 64 by detecting the operating state of the power switch 17.
  • The photographing control circuit 60 focuses or zooms the photographic lens 10 through the lens drive circuit 52, controls the exposure quantity at the CCD 55 through control implemented on the aperture 53 by engaging the aperture control circuit 54 and thus controls the operation of the CCD 55 through a CCD drive circuit 56. The photographic lens 10 forms a subject image onto the CCD 55 with a light flux from the subject via the aperture 53 which adjusts the light quantity and this subject image is then captured by the CCD 55. The CCD 55 (charge-coupled device) having a plurality of pixels is a charge storage type image sensor that captures a subject image, and outputs electrical image signals corresponding to the intensity of the subject image formed on the CCD 55 to an analog processing unit 57 in response to a drive pulse supplied by the CCD drive circuit 56.
  • The photographing control circuit 60 repeatedly executes the operation described above and the screen control circuit 92 repeatedly executes an operation in which the digital data sequentially stored into the photographic buffer memory 59 are read out via the data/control bus 51, the digital data thus read out are temporarily stored into a frame memory 69, the digital data are converted to display image data and are restored into the frame memory 69 and the display image data are displayed at the left screen 21. In addition, the screen control circuit 92 obtains text display information from the CPU 50 as necessary, converts it to display text data which are then stored into the frame memory 69 and displays the display text data at the left screen 21 and the right screen 22. Since the image currently captured by the CCD 55 is displayed at the left screen 21 in real time in the photographing mode in this manner, the user is able to set the composition for the photographing operation by using this through screen as a monitor screen. The photographing control circuit 60 detects the state of the focal adjustment at the photographic lens 10 by analyzing the degree of the high-frequency component of the digital data stored in the photographic buffer memory 59 and performs a focal adjustment for the photographic lens 10 with the lens drive circuit 52 based upon the results of the detection.
  • During a photographing operation, upon receiving a photographing instruction from the CPU 50, the photographing control circuit 60 engages the CCD 55 to capture a subject image via the CCD drive circuit 56 and temporarily stores image signals generated through the image-capturing operation as digital data (raw data) into the photographic buffer memory 59 via the analog processing unit 57 and the A/D conversion circuit 58. The photographing control circuit 60 then generates image data by converting or compressing the digital data stored in the photographic buffer memory 59 on a temporary basis in a predetermined recording format (such as JPEG) and stores the image data back into the photographic buffer memory 59.
  • The CPU 50 engages the wireless telephone circuit 72 to wirelessly output the image data through an antenna 76 and thus uploads the image data into the folder at the image server on the Internet. It also engages the wireless communication circuit 71 to wirelessly output the identification data of the image server to which the image data have been uploaded, the folder identification data and the image data identification data through an antenna 75. Subsequently, the CPU 50 converts the image data stored in the photographic buffer memory 59 into reduced image data (hereafter referred to as "thumbnail image data") with a smaller data volume and stores the thumbnail image data, appended with the related information (the image server identification data, the folder identification data and the image data identification data), into the memory card 77.
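  • The camera-side sequence just described (upload through the wireless telephone circuit 72, address transfer through the wireless communication circuit 71, thumbnail storage on the memory card 77) is sketched below for illustration. The camera object, its method names and the folder/file naming are editorial assumptions, not interfaces defined in the specification.

```python
def upload_and_announce(camera, image_data, server_url, folder, filename):
    """Upload the photographed image data and hand their address to nearby instruments."""
    camera.telephone.upload(server_url, folder, filename, image_data)     # wireless telephone line -> image server
    address = {"server": server_url, "folder": folder, "file": filename}  # related information
    camera.short_distance.broadcast(address)                              # Bluetooth or similar, to nearby phones
    thumbnail = camera.make_thumbnail(image_data)                         # reduced image data
    camera.memory_card.store(thumbnail, address)                          # thumbnail appended with the related info
    return address
```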
  • In the reproduction mode, the screen control circuit 92 reads out thumbnail image data specified by the CPU 50 from the memory card 77, temporarily stores the thumbnail image data thus read out into the frame memory 69 and displays the thumbnail image data at the left screen 21. It also stores text data such as reproduction mode instructions into the frame memory 69 and displays the text data at the right screen 22, in response to an instruction issued by the CPU 50. If the PRINT button 27 is operated in the reproduction mode, the CPU 50 executes an operation for having the image data corresponding to the reproduced thumbnail image data printed on a printer. If, on the other hand, the SEND button is operated in the reproduction mode, the CPU 50 executes an operation for having the image data corresponding to the reproduced thumbnail image data transmitted to a recipient electronic instrument.
  • In addition, if the DOWNLOAD button 29 is operated in the reproduction mode, the CPU 50 downloads the image data corresponding to the reproduced thumbnail image data from the image server through the wireless telephone circuit 72 and sets the downloaded image data into the frame memory 69. The screen control circuit 92 converts the image data set in the frame memory 69 to display image data, stores the display image data back into the frame memory 69 and displays the display image data at the left screen 21.
  • In FIG. 104 presenting a block diagram of an example of an electrical structure that may be adopted in the portable telephone 160 shown in FIG. 102, various components are connected with a CPU 151 at their center. The CPU 151 (central processing unit), which is a means for implementing overall control on the portable telephone 160, controls a wireless communication circuit 171, a wireless telephone circuit 172, a display screen 161 and a power source 163 in conformance to information input through the operating keys 162, a touch screen 166, a power switch 117 and a timer 174. In a talk mode, the CPU 151 converts sound input through a microphone 164 to audio data and outputs the audio data in a wireless transmission to the outside through an antenna 176 by engaging the wireless telephone circuit 172 and reproduces through a speaker 165 audio data received from the outside at the wireless telephone circuit 172 via the antenna 176.
  • In an Internet mode, the CPU 151 engages the antenna 175 and the wireless communication circuit 171 to exchange information related to image data with the electronic camera 100 through short-distance wireless communication and, based upon related information (the image server identification data, the folder identification data and the image data identification data) thus obtained, it connects with the image server on the Internet via the wireless telephone circuit 172, downloads the desired image data, displays an image reproduced from the image data at its display screen and saves the image data into a UIM card or a flash memory 168.
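  • The corresponding terminal-side behaviour in the Internet mode can be pictured as follows; again the phone object and its method names are editorial stand-ins for the circuits shown in FIG. 104.

```python
def receive_and_display(phone, address):
    """Download the image data named in the received address, then display and save them."""
    image = phone.telephone.download(address["server"], address["folder"], address["file"])
    phone.display_screen.show(image)             # reproduce the image at the display screen 161
    phone.storage.save(address["file"], image)   # UIM card 170 or flash memory 168
    return image
```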
  • The CPU 151 controls the various components in conformance to a control program stored in a ROM 167 (read only memory). In the flash memory 168 (electrically erasable programmable memory) which is a nonvolatile memory, image data and the like are stored. A RAM 169 is a volatile memory which is used as a work area of the CPU 151. The CPU 151 controls the power source 163 by detecting the operating state of the power switch 117.
  • FIG. 105 shows the relationship between the numbers set at the upload dial 32 of the electronic camera 100 shown in FIG. 101 and the image servers that are upload recipients. As shown in FIG. 105, numbers 1, 2, 3 and 4 respectively correspond to image servers W, X, Y and Z.
  • FIG. 106 shows the structure of the data stored in the memory card 77. As shown in FIG. 106, a plurality of image files are saved in the memory card 77. Each image file is constituted of thumbnail image data and additional information data. The additional information data is constituted of photographing information data (see FIG. 107) indicating the various settings selected for the photographing operation and upload information data obtained when the image data are uploaded to an image server. The upload information data include date/time data indicating the date and time point at which the image data were uploaded to the image server, the image server identification data (upload destination (location) URL), the folder name, the file name, the volume of the image data, the format adopted in recording the image data and the like.
  • In FIG. 107 showing the structure of the photographing information data, the photographing information data are constituted of photographing date/time point data and setting information data related to the photographic lens setting and the camera settings selected for the photographing operation.
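  • The image file layout described in FIGS. 106 and 107 can be pictured with the following minimal sketch; the Python field names (for example server_url or recording_format) are illustrative assumptions and not terms taken from the figures.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class PhotographingInfo:
    # FIG. 107: photographing date/time plus the lens and camera settings.
    taken_at: datetime
    lens_settings: dict      # e.g. focal length, aperture (illustrative)
    camera_settings: dict    # e.g. shutter speed, white balance (illustrative)

@dataclass
class UploadInfo:
    # FIG. 106: data recorded when the image data are uploaded to the image server.
    uploaded_at: datetime
    server_url: str          # upload destination (image server URL)
    folder_name: str
    file_name: str
    data_volume_bytes: int
    recording_format: str    # e.g. "JPEG"

@dataclass
class ImageFile:
    # One entry in the memory card 77: thumbnail plus additional information data.
    thumbnail: bytes
    photographing_info: PhotographingInfo
    upload_info: UploadInfo
```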
  • FIG. 108 is a transition diagram for the state of the electronic camera according to the embodiment of the present invention. The electronic camera 100, which may be set in either of the photographing mode or the reproduction mode, shifts from one mode to the other in response to an operation of one of two operating buttons (the photographing mode button 25 and the reproduction mode button 26). As the power is turned on, the electronic camera first shifts to the photographing mode. In the photographing mode, the electronic camera executes a photographing operation, an upload operation to upload the image data to an image server, an operation for converting the image data to thumbnail image data and then storing the thumbnail image data into the memory card 77, and a short-distance wireless communication operation for communicating with a portable telephone. In the reproduction mode, it executes a reproducing operation for reproducing the thumbnail image data, a download operation for downloading image data from an image server, a transmission operation for transmitting the image data to a portable telephone and a print operation for printing the image data on the printer.
  • FIG. 109 presents a flowchart of the main flow of the operations executed in the electronic camera 100 (the CPU 50) in the embodiment described above. First, as the power switch 17 is operated, the power is turned on in S10, and then, a photographing mode subroutine is executed in S20, thereby setting the electronic camera in a photographing-enabled state. If the shutter release button 16 is operated in the photographing mode, a release interrupt processing subroutine in S30 is executed to perform a photographing operation. In addition, the image data obtained through the photographing operation are uploaded into an image server on the Internet specified with the upload dial 32 through a wireless telephone line, the image data are also converted to thumbnail image data and stored into the memory card 77, and information with regard to the image data upload is automatically transmitted to the portable telephone 160 carried by a subject in the vicinity of the electronic camera 100 through short-distance wireless communication.
  • In addition, if either of the two operating buttons (the photographing mode button 25 and the reproduction mode button 26) is operated while the electronic camera is set in the other operating mode, then mode switching interrupt processing in S50 is started up to switch over to a mode corresponding to the operating button that has been operated.
  • In the reproduction mode in S60, an image file saved in the memory card 77 is read out and the thumbnail image data are reproduced and displayed at the display screen, and as the SEND button 28 is operated, a transmission interrupt processing subroutine in S70 is executed to transmit information with regard to the upload of the image data corresponding to the selected thumbnail image data to the portable telephone 160 carried by the subject present in the vicinity of the electronic camera 100 through short-distance wireless communication. The portable telephone 160, in turn, automatically connects with the image server on the Internet based upon the information received from the electronic camera 100 and downloads the image data which are then displayed at the screen and saved into the UIM card.
  • If the PRINT button 27 is operated in the reproduction mode, a print interrupt processing subroutine in S80 is executed to transmit the information related to the upload of the image data corresponding to the selected thumbnail image data to a printer present in the vicinity of the electronic camera 100 through short-distance wireless communication, and the printer, in turn, automatically connects with the image server on the Internet based upon the information received from the electronic camera 100, downloads the image data and then executes a printing operation.
  • If the DOWNLOAD button 29 is operated in the reproduction mode, a download interrupt processing subroutine S90 is executed to download through a wireless telephone line the image data corresponding to the selected thumbnail image data from the image server at which the image data have been uploaded and the downloaded image data are reproduced and displayed at the display screen.
  • In the detailed flowchart of the photographing mode subroutine presented in FIG. 110, after the subroutine is started up in S20, the processing in S201 is repeatedly executed. In S201, image data sequentially generated by the CCD 55 in conformance to the photographing condition set by the user are displayed at the left screen 21 as shown in FIG. 111 and the condition is also displayed as a list at the right screen 22.
  • In the detailed flowchart of the release interrupt processing subroutine presented in FIG. 112, after the subroutine is started up in S30, a verification is achieved in S301 as to whether or not the camera is currently set in the photographing mode and, if it is decided that the camera is not set in the photographing mode, the operation makes a return in S311. If, on the other hand, it is decided that the camera is set in the photographing mode, an image-capturing operation is executed in conformance to the photographing setting condition selected by the user to generate image data in S302 and then in S303, the image data are converted to thumbnail image data and the thumbnail image data appended with the upload information data are stored into the memory card 77 as an image file. Then, in S304, a wireless communication with an electronic instrument (a portable telephone carried by a subject person) present in the vicinity of the electronic camera 100 is attempted through the means for wireless communication 71. At this time, the signal transmission output for the wireless communication is adjusted in conformance to the focal length of the photographic lens 10. For instance, the signal transmission output level is raised as the focal length increases, since a subject is deduced to be present over a greater distance in such a case. If it is decided in S305 that no communication has been achieved, the operation makes a return in S311. If, on the other hand, it is decided that communication has been achieved, the image data upload information is transmitted to the electronic instrument of the communication partner with whom communication has been achieved in S306. In S307, a display of the image data obtained through the photographing operation is started.
  • In S308, the image data are uploaded into a specific folder (a folder inherent to the photographer or the electronic camera 100) at the image server specified through the upload dial 32. Once the image data upload is completed, a notification of the image data upload completion is transmitted to the electronic instrument in the vicinity of the electronic camera 100 through the means for wireless communication 71 in S309. In S310, the image data display ends and then the operation makes a return in S311 to start a through image display in the photographing mode again.
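  • The release interrupt processing of S301 through S311 may be summarized, under stated assumptions, by the sketch below; the helper names (capture_image, upload_to_server and so on) and the power-scaling constants are hypothetical, and only the ordering of the steps follows the flowchart of FIG. 112.

```python
def focal_length_to_tx_power(focal_length_mm: float) -> float:
    """Raise the short-distance transmission output as the focal length grows,
    since a longer focal length implies a subject at a greater distance."""
    # Illustrative linear scaling clamped to an assumed 0.1..1.0 range.
    return min(1.0, max(0.1, focal_length_mm / 200.0))

def release_interrupt(camera):
    if not camera.in_photographing_mode():                            # S301
        return                                                        # S311
    image = camera.capture_image()                                    # S302
    thumb = camera.make_thumbnail(image)                              # S303
    camera.memory_card.store(thumb, camera.upload_info_for(image))
    power = focal_length_to_tx_power(camera.lens.focal_length_mm)
    partner = camera.short_range.search(tx_power=power)               # S304
    if partner is None:                                               # S305
        return                                                        # S311
    camera.short_range.send(partner, camera.upload_info_for(image))   # S306
    camera.display.show(image)                                        # S307
    camera.upload_to_server(image, camera.upload_dial_server())       # S308
    camera.short_range.send(partner, "upload complete")               # S309
    camera.display.clear()                                            # S310
```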
  • In the sequence diagram presented in FIG. 113 to facilitate an explanation of the wireless communication operation between the electronic camera and an electronic instrument (a portable telephone) executed during the release interrupt processing, a photographing operation is executed in response to a shutter release operation performed at the electronic camera 100, and then the image data are converted to thumbnail image data which are saved. Next, the electronic camera 100 attempts a wireless search communication 1 with an electronic instrument in the vicinity. In the search communication 1, the identification information of the electronic camera 100 indicating a sender is transmitted.
  • An electronic instrument having received the search communication 1 returns a response 2 to the electronic camera 100 in response, through which the identification information of the sender electronic instrument and the identification information of the recipient electronic camera are transmitted.
  • Upon receiving the response 2, the electronic camera 100 verifies the communication partner based upon the electronic instrument identification information, and also executes an information transmission 3 to transmit the upload information to the specific communication partner's electronic instrument. Through the information transmission 3, the identification information of the recipient electronic instrument, the identification information of the sender electronic camera 100, the upload information, the thumbnail image data and the like are transmitted.
  • The electronic instrument corresponding to the electronic instrument identification information transmitted through the information transmission 3 executes an information transmission 4 to the electronic camera 100 in response. Through the information transmission 4, the identification information of the recipient electronic camera 100, the identification information of the sender electronic instrument and related information (which includes specific image server information originating from the electronic instrument) are transmitted.
  • Upon receiving the information 4 transmitted by the electronic instrument, the electronic camera 100 uploads the image data to the image server on the Internet through a wireless telephone line. Once the image data upload is completed, the electronic camera 100 transmits an upload completion notification 5 to the electronic instrument. The electronic instrument having received this notification connects with the image server on the Internet through a wireless telephone line based upon the upload information received through the information transmission 3, and downloads and utilizes the specified image data.
  • It is to be noted that if the response 2 is returned by a plurality of electronic instruments, the electronic camera 100 may either execute the information transmission 3 through a simultaneous one-to-multiple partner communication with the electronic instruments or may individually execute the information transmission 3 through a one-to-one communication with each electronic instrument. In the latter case, the information transmission 4 must be executed a plurality of times.
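  • The messages 1 through 5 exchanged in FIG. 113 can be pictured as simple records passed between the camera and the electronic instrument; the field names below are assumptions introduced for illustration.

```python
# Search communication 1: the camera announces itself to nearby instruments.
search_1 = {"sender": "camera-100"}

# Response 2: each instrument identifies itself and names the camera it answers.
response_2 = {"sender": "phone-160", "recipient": "camera-100"}

# Information transmission 3: the camera sends the upload information and the
# thumbnail image data to the verified partner.
info_3 = {
    "recipient": "phone-160",
    "sender": "camera-100",
    "upload_info": {"server_url": "...", "folder": "...", "file": "..."},
    "thumbnail": b"...",
}

# Information transmission 4: the instrument replies with related information,
# for example an image server specified by the instrument side.
info_4 = {"recipient": "camera-100", "sender": "phone-160",
          "related_info": {"preferred_server": "..."}}

# Upload completion notification 5: sent once the camera has finished the upload;
# the instrument then downloads the image data from the image server.
notify_5 = {"recipient": "phone-160", "sender": "camera-100",
            "status": "upload complete"}
```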
  • In the detailed flowchart of the mode switching interrupt processing subroutine presented in FIG. 114, which is started up in response to an operation of an operating button (the photographing mode button 25 or the reproduction mode button 26), after the subroutine is started up in S50, a verification is achieved in S501 as to whether or not the operated button is the photographing mode button 25, and if it is decided that the photographing mode button 25 has been operated, the operation shifts to the photographing mode subroutine in S20. If, on the other hand, it is decided that the photographing mode button 25 has not been operated, the operation shifts to the reproduction mode subroutine in S60.
  • In the detailed flowchart of the reproduction mode subroutine presented in FIG. 115, after the subroutine is started up in S60, the processing in S601 is repeatedly executed. In S601, thumbnail image data in image files stored in the memory card 77 are sequentially read out starting with the data photographed most recently, and are reproduced and displayed at the left screen 21 while concurrently displaying relevant operational instructions at the right screen 22, as shown in FIG. 116. The thumbnail image data displayed at the left screen 21 are scrolled in response to an operation of the direction button 23 or 24. It is to be noted that in the reproduction mode, one of the plurality of sets of thumbnail image data displayed at the left screen 21 is selected by the user through the touch screen 66, and the thumbnail image data thus selected subsequently undergo a transmission operation, a print operation or a download operation as described later.
  • In the detailed flowchart of the transmission interrupt processing subroutine presented in FIG. 117, after the subroutine is started up in S70, a verification is achieved in S701 as to whether or not the electronic camera is currently set in the reproduction mode, and if it is decided that the electronic camera is not set in the reproduction mode, the operation makes a return in S707. If, on the other hand, it is decided that the electronic camera is set in the reproduction mode, a wireless communication with an electronic instrument in the vicinity of the electronic camera 100 is attempted at the signal transmission output level through the means for wireless communication 71 in S702. If it is decided in S703 that no communication has been achieved, the operation makes a return in S707. If, on the other hand, it is decided that communication has been achieved, the electronic instruments with which communication has been achieved are displayed at the right screen 22 as possible image data recipients in S704 to allow the user of the electronic camera 100 to select the desired recipient through the touch screen 66, as shown in FIG. 118. In S705, the operation waits in standby for the SEND button 28 to be operated again, and if the SEND button 28 is operated, the image data upload information is transmitted to the selected recipient through short-distance wireless communication in S706 before the operation makes a return in S707.
  • In the sequence diagram presented in FIG. 119 to facilitate an explanation of the wireless communication operation between the electronic camera and the electronic instrument (a portable telephone) executed during the transmission interrupt processing, the image data to be transmitted are first selected at the electronic camera 100. Then, as the SEND button 28 is operated, a wireless search communication 1 with electronic instruments present within the vicinity is attempted by the electronic camera 100. Through the search communication 1, the identification information of the sender electronic camera 100 is transmitted.
  • Each electronic instrument having received the search communication 1 returns a response 2 to the electronic camera 100 in response by transmitting the identification information of the sender electronic instrument and the identification information of the recipient electronic camera.
  • The electronic camera 100 having received the response 2 verifies the communication partners with whom the communication has been achieved based upon the electronic instrument identification information and selects a recipient. Next, it reads out the upload information corresponding to the selected image data from the memory card 77. As the SEND button 28 is operated again, the electronic camera 100 executes an information transmission 3 to transmit the upload information to the specific electronic instrument carried by the selected recipient. Through the information transmission 3, the identification information of the selected recipient's electronic instrument, the identification information of the sender electronic camera 100, the upload information, the thumbnail image data and the like are transmitted.
  • The selected recipient's electronic instrument having received the information 3 connects with the image server on the Internet through a wireless telephone line based upon the upload information, and downloads and displays the specified image data.
  • In the detailed flowchart of the print interrupt processing subroutine presented in FIG. 120, after the subroutine is started up in S80, a verification is achieved in S801 as to whether or not the electronic camera is currently set in the reproduction mode, and if it is decided that the electronic camera is not set in the reproduction mode, the operation makes a return in S807. If, on the other hand, it is decided that the electronic camera is set in the reproduction mode, short-distance wireless communication with any printer in the vicinity of the electronic camera 100 is attempted at the signal transmission output level through the means for wireless communication 71 in S802. If it is decided in S803 that no communication has been achieved, the operation makes a return in S807. If, on the other hand, it is decided that communication has been achieved, the printers with which communication has been achieved are displayed at the right screen 22 in S804 to allow the user of the electronic camera 100 to select the desired printer through the touch screen 66, as shown in FIG. 121. In S805, the operation waits, in a standby state, for the PRINT button 27 to be operated again, and if the PRINT button 27 is operated, the image data upload information and a print instruction are transmitted to the selected printer through short-distance wireless communication in S806 before the operation makes a return in S807.
  • In the sequence diagram presented in FIG. 122 to facilitate an explanation of the wireless communication operation between the electronic camera and a printer executed during the print interrupt processing, the image data to be transmitted are first selected at the electronic camera 100. Then, as the PRINT button 27 is operated, a wireless printer search communication 1 with any printer within the vicinity is attempted by the electronic camera 100. Through the printer search communication 1, the identification information of the sender electronic camera 100 and function identification information (“printer” in this case) indicating the function of the electronic device with which the electronic camera side wishes to communicate are transmitted.
  • A printer having received the printer search communication 1 returns a response 2 to the electronic camera 100 in response by transmitting the identification information of the sender printer, the identification information of the recipient electronic camera and printer specification information and the like.
  • The electronic camera 100 having received the response 2 verifies all the communication partners with whom communication has been achieved based upon the printer identification information and selects a desired printer. Next, it reads out the upload information corresponding to the selected image data from the memory card 77. As the PRINT button 27 is operated again, the electronic camera 100 executes an information transmission 3 to transmit the upload information to the specific selected printer. Through the information transmission 3, the identification information of the selected printer, the identification information of the sender electronic camera 100, the upload information, the print instruction information and the like are transmitted.
  • The selected printer having received the information 3 connects with the image server on the Internet through a wireless telephone line based upon the upload information, downloads the specified image data, converts the image data to image data for printing based upon the print instruction information and executes a printing operation.
  • In the conceptual diagram presented in FIG. 123, illustrating a mode of the information transmission among the electronic camera, a printer and the image server executed during the print interrupt processing described above, the electronic camera uploads image data obtained through a photographing operation to the image server on the Internet through a wireless telephone line. Next, the electronic camera transmits the image data upload information (the image server's Internet address, i.e., its URL (Uniform Resource Locator), the folder name and the image data filename at the image server) and control information (the print instruction information) to an electronic device (printer) within the short-distance wireless communication range through short-distance wireless communication. Based upon the information received from the electronic camera through the short-distance wireless communication, the electronic device (printer) connects with the image server on the Internet through a wireless telephone line, downloads the image data from the specific folder at the image server and executes an operation (printing processing) on the downloaded image data based upon the control information (the print instruction information).
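  • A compact way to view the three-party flow of FIG. 123 is the sketch below; the URL layout and the helper functions (upload, download, print_image) are assumptions made only for illustration.

```python
from urllib.parse import urljoin

def upload(server_url, folder, filename, data):   # placeholder for the wireless-line upload
    ...

def download(url):                                 # placeholder for the printer-side download
    ...

def print_image(data):                             # placeholder for the actual print engine
    ...

def camera_side(image_data, server_url, folder, filename, printer):
    # 1. The camera uploads the photographed image data over the wireless telephone line.
    upload(server_url, folder, filename, image_data)
    # 2. The camera passes the upload information and a print instruction to the printer
    #    over short-distance wireless communication.
    printer.receive({"server_url": server_url, "folder": folder,
                     "file": filename, "control": "print"})

def printer_side(message):
    # 3. The printer connects to the image server named in the upload information,
    #    downloads the image data and prints them.
    url = urljoin(message["server_url"] + "/", message["folder"] + "/" + message["file"])
    print_image(download(url))
```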
  • In the detailed flowchart of the download interrupt processing subroutine presented in FIG. 124, after the subroutine is started up in S90, a verification is achieved in S901 as to whether or not the electronic camera is currently set in the reproduction mode and, if it is decided the electronic camera is not set in the reproduction mode, the operation makes a return in S905. If, on the other hand, it is decided that the electronic camera is set in the reproduction mode, the image server on the Internet is connected through the wireless telephone circuit 72 based upon the upload information data appended to the selected thumbnail image data to download the image data corresponding to the selected thumbnail image data in S902. In S903, the downloaded image data are reproduced and displayed at the left screen 21, as shown in FIG. 125. In S904, the operation waits in standby for the DOWNLOAD button 29 to be operated again, and if the DOWNLOAD button 29 is operated, the operation makes a return in S905.
  • FIG. 126 illustrates the structure of the wireless communication circuit 71 in the electronic camera. The wireless communication circuit 71 is enclosed by an electromagnetic shield film 79 along the camera rear surface (in the direction opposite from the direction along which the photographic optical axis extends, i.e., the side on which the photographer is present), the camera lower surface, the camera upper surface and the camera left and right side surfaces. Thus, the wireless communication circuit 71 is allowed to communicate readily with the portable telephone 160 carried by a subject present in the direction along which the photographing operation is performed with the camera, while it cannot readily communicate with a portable telephone or an electronic device carried by a non-subject party who is not present along the photographic optical axis. Thus, the image data recipient can be limited to the portable telephone carried by the subject who has been photographed. FIG. 127 shows the communication directivity of the wireless communication circuit 71; extreme directivity manifests to the front of the electronic camera 100 (along the optical axis 40) and little directivity to the rear.
  • By allowing the wireless communication circuit 71 to manifest extreme directivity along the direction in which the photographing operation is performed with the camera through the structure shown in FIG. 126 as described above, the likelihood of communicating with the electronic instrument carried by the subject is increased. It is to be noted that the directivity may be adjusted by adopting a specific antenna structure instead of by using an electromagnetic shield film. A compact directional antenna that may be adopted is disclosed, for instance, in Japanese Laid-Open Patent Publication No. 2000-278037.
  • In addition, the directivity of the wireless communication circuit 71 may be adjusted in conformance to the focal length of the photographic lens. For instance, the communication directivity should be reduced if the focal length is small since the photographic field angle will be wide in such a case, whereas the communication directivity should be increased if the focal length is larger since the photographic field angle will be narrow in such a case. The directivity of the wireless communication circuit 71 may be adjusted by mechanically moving the electromagnetic shield film or by selectively using one of a plurality of directional antennas with varying directional characteristics. By implementing such an adjustment, the likelihood of the electronic camera 100 achieving communication with an electronic instrument held by a photographed subject is increased without allowing ready communication with an electronic instrument carried by a party other than the subject.
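  • The focal-length-dependent adjustment may be thought of as a simple selection rule along the following lines; the antenna names and thresholds are purely illustrative assumptions.

```python
def select_antenna(focal_length_mm: float) -> str:
    """Pick a narrower (more directional) antenna as the focal length grows,
    because a longer focal length means a narrower photographic field angle."""
    if focal_length_mm < 35:
        return "wide-directivity antenna"    # wide field angle: subject may be off-axis
    elif focal_length_mm < 105:
        return "medium-directivity antenna"
    else:
        return "narrow-directivity antenna"  # narrow field angle: subject is on-axis
```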
  • In the conceptual diagram of the embodiment of the present invention presented in FIG. 92, the electronic camera uploads image data obtained through a photographing operation to an image server on the Internet, and an electronic instrument (such as a portable telephone) carried by a subject connects with the image server on the Internet based upon related information received from the electronic camera through short-distance wireless communication and displays at its screen the image data downloaded from a specific folder at the image server. As a variation, the electronic camera may obtain upload information (indicating an image server on the Internet, a folder name and an image data filename specified by the electronic instrument-side) from the electronic instrument during the short-distance wireless communication with the electronic instrument carried by the subject and upload the image data obtained through a photographing operation, appended with the filename specified by the electronic instrument into the specified folder at the specified image server on the Internet based upon the upload information. In this case, the electronic instrument downloads the image data assigned with the specified file name from the specified folder of the image server specified by itself.
  • This variation relieves the subject from a task of saving the downloaded image data into his own image server.
  • In addition, the location at which an image is to be saved (a specific folder at a specific image server) may be specified by the electronic instrument and a specific image data file name may be assigned by the electronic camera. Alternatively, the electronic camera may specify an image server and a folder, and the electronic instrument may assign a specific image data filename.
  • In addition, as shown in FIG. 128, the electronic camera may obtain upload information (indicating an image server on the Internet, a folder name and an image data file name) specified by the electronic instrument side from an electronic instrument carried by a subject during the short-distance wireless communication with the electronic instrument and transmit the upload information obtained from the electronic instrument together with image data obtained through a photographing operation to a preassigned image server on the Internet so as to enable the preassigned server to transmit the received image data, appended with the specified file name, into the specified folder at the specified image server on the Internet, instead. Also in this case, the electronic instrument can download the image data assigned with the specified file name from the specified folder on the image server specified by itself.
  • In the embodiment described above (see FIGS. 98 and 113), short-distance communication between the electronic camera and a portable instrument carried by a subject is executed while obtaining image data through a photographing operation, or either immediately before or after the photographing operation, to automatically transfer the image data upload information to the portable instrument, which eliminates the need for performing a special operation such as manually writing down the upload information and also ensures that the image data upload information is transmitted to the portable instrument of a person highly likely to be a subject photographed in the image data.
  • In the embodiment described above (see FIGS. 98 and 113), image data are not directly exchanged between the electronic camera used to obtain the image data through a photographing operation and the portable telephone which is the ultimate recipient of the image data and as a result, the electronic camera does not need to adjust the image data in conformance to various image display specifications of different portable telephones or to engage in communication with the individual portable telephones over an extended period of time in order to transfer the image data to multiple users. Thus, the onus placed on the electronic camera is reduced.
  • In the embodiment described above (see FIGS. 98, 123 and 128), image data upload information is exchanged between the electronic camera and an electronic instrument through short-distance wireless communication. As a result, the likelihood of communicating with limited partners such as a portable telephone carried by a subject and a printer in the vicinity of the electronic camera is increased and, at the same time, the likelihood of communicating with an irrelevant electronic instrument present at a large distance from the electronic camera is practically eliminated. In addition, since the signal transmission output level for the wireless communication is adjusted in conformance to the focal length of the photographic lens, a reliable wireless communication with a subject is achieved while reducing the likelihood of wirelessly communicating with a party other than the subject.
  • In the embodiment described above (see FIGS. 98, 123 and 128), image data upload information is exchanged between the electronic camera and an electronic instrument through short-distance wireless communication. Since the short-distance wireless communication is achieved at a lower information transmission rate than that of a wireless portable telephone line, the structure of the short-distance wireless communication circuit can be simplified, thereby making it possible to reduce the cost of the electronic camera and the cost of the portable telephone and also to reduce their sizes.
  • In the embodiment described above (see FIGS. 98, 128), the electronic camera which only needs to upload image data obtained through a photographing operation to an image server on the Internet and does not need to individually transmit the image data to a plurality of electronic instruments is able to start the next photographing operation immediately after the image data upload. In addition, since each portable telephone downloads the image data through the image server with a high transmission capability, the individual electronic instruments can download the image data simultaneously at high speed and the downloaded image data can be promptly reviewed at the individual electronic instruments.
  • In the embodiment described above (see FIGS. 98 and 128), thumbnail image data corresponding to a specific set of image data are transmitted by the electronic camera to the electronic instrument while the image data upload information is exchanged between the electronic camera and the electronic instrument through a wireless communication. Thus, the basic contents of the image data can be checked at the electronic instrument prior to downloading the image data by reviewing the thumbnail image data and, if necessary, an action such as cancellation of the downloading can be taken.
  • In the embodiment described above (see FIGS. 112 and 113), the electronic camera generates thumbnail image data corresponding to image data obtained through a photographing operation, uploads the image data to an image server on the Internet and saves the upload information, appended to the thumbnail image data, in the electronic camera. As a result, desirable image data can be selected and promptly downloaded from the image server for use.
  • In the embodiment described above (see FIGS. 126 and 127), the wireless communication directivity of the wireless communication circuit 71 is restricted so that the directivity manifests in a direction along which a subject is highly likely to be present relative to the electronic camera main unit by using an electromagnetic shield film or the like. Thus, reliable wireless communication with the portable telephone carried by the subject can be achieved to exchange the image data upload information between the electronic camera and the subject's portable telephone through wireless communication and, at the same time, the likelihood of wirelessly communicating with a party other than the subject is reduced.
  • In the embodiment described above (see FIG. 128), the image data are ultimately saved at the image server specified by the subject. As a result, the subject is relieved from the task of saving the image data at his own image server, whereas the electronic camera is able to start the next photographing operation immediately after the image data upload since it only needs to upload the image to a single preassigned image server.
  • EXAMPLES OF VARIATIONS
  • The present invention is not limited to the embodiment described above and allows for numerous variations and modifications.
  • In the embodiment described above (see FIGS. 98, 99, 123 and 128), the electronic camera uploads image data to an image server on the Internet and a portable telephone downloads the image data from this image server. However, the image server does not necessarily need to be one on the Internet, and it may be an image server on a local communication network, instead. For instance, the present invention may be adopted in a system in which an image server set up on a wireless LAN (local area network) conforming to the IEEE 802 standard is accessed by electronic cameras and portable telephones (electronic instruments) to upload and download image data.
  • In reference to the embodiment (see FIGS. 98, 123 and 128), an information transmission system and an information transmission method, through which image data obtained through a photographing operation performed at an electronic camera are delivered to a portable telephone carried by a subject or a printer, are described. However, the present invention is not limited to the application in image data transmission, and it may be adopted in general information transmission between an information sender and an information recipient present in close proximity to each other. For instance, the present invention may be adopted in an audio data transmission between an information sender that is an audio recording apparatus and an information recipient that is a portable telephone, in which audio data recorded by the audio recording apparatus are transmitted to the portable telephone via an audio server on the Internet.
  • In the embodiment described above (see FIGS. 98 and 123), the electronic camera uploads image data to an image server on the Internet and a portable telephone (electronic instrument) downloads the image data from this image server. However, the electronic camera may transmit the electronic instrument identification information of each electronic instrument with which communication has been achieved, together with the image data, to the image server and the image server, in turn, may only allow a download of the image data by the electronic instruments corresponding to the identification information thus received, in order to restrict recipients allowed to download the image data from the image server. In this case, an unauthorized image data download from the image server is prevented for privacy protection.
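  • The download restriction in this variation amounts to the image server keeping, per image, the identification information reported by the electronic camera and refusing every other requester; the following is a minimal server-side sketch under that assumption.

```python
allowed_instruments = {}   # image id -> set of electronic instrument identification data

def register_upload(image_id, instrument_ids):
    # Recorded by the image server when the camera uploads the image data together
    # with the identification information of the instruments it communicated with.
    allowed_instruments[image_id] = set(instrument_ids)

def may_download(image_id, requester_id):
    # Only instruments whose identification information was supplied by the camera
    # are allowed to download the image data.
    return requester_id in allowed_instruments.get(image_id, set())
```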
  • In the embodiment described above (see FIGS. 98 and 123), the electronic camera uploads image data to an image server on the Internet and a portable telephone downloads the image data from this image server. However, the portable telephone may transfer the upload information received from the electronic camera to another electronic device (e.g., a personal computer belonging to the subject) to download the image data by accessing the image server from the other electronic device, instead. In this case, the image data can be reviewed on a screen larger than the portable telephone screen and also a sufficient memory capacity for saving the image data can be assured.
  • In the embodiment described above (see FIGS. 98 and 123), the electronic camera uploads image data to an image server on the Internet and a portable telephone (electronic instrument) downloads the image data from this image server. Instead, the electronic camera may transmit the electronic instrument identification information of each electronic instrument with which communication has been achieved, together with the image data to the image server, and the image server, in turn, may transmit the image data (in a push-type information transmission) to the electronic instrument corresponding to the identification information by issuing a transmission request to the electronic instrument based upon the identification information. In such a case, since the electronic camera and the electronic instrument do not exchange the image data upload information and the image data are transmitted (pushed) to the specific electronic instrument only, privacy is protected and, at the same time, the image data recipient is relieved of the task of reading out the image data.
  • In the embodiment described above (see FIGS. 98 and 123), the electronic camera and a portable telephone (electronic instrument) exchange the image data upload information (the image server address information and the image data identification information), the electronic camera uploads the image data to the image server on the Internet and the portable telephone (electronic instrument) downloads the image data from the image server. However, the configuration described below may be adopted to provide an image service system in which electronic cameras are installed at a plurality of photographing points in a large premises such as a theme park, and a guest having taken a picture at a desired photographing point downloads the image data obtained through the photographing operation from an image server. Namely, each camera does not need to include an internal means for communication to communicate with a portable telephone carried by the guest but is instead paired with a means for communication installed separately from the electronic camera; image data obtained through a photographing operation executed with a given electronic camera by a guest are appended with specific identification information and are saved at a specific image server, and the image server address information and the image data identification information are transmitted to the portable telephone of the guest by the means for communication. In addition, by restricting the photographing position at which a guest can be photographed during a photographing operation and installing the means for communication provided as a separate member from the electronic camera at a position near the guest being photographed, the communication output can be lowered to prevent any interference from an electronic device other than the portable telephone of the guest who is the subject.
  • Furthermore, when offering image services such as those described above in a theme park or the like, a special electronic instrument for exclusive use in conjunction with the image transmission system according to the present invention may be offered to each guest at the time of admission to ensure that image data are transmitted to the guest who is photographed in the image data with a high degree of reliability by preventing interference from other electronic devices. In an image transmission system in which the photographing side, the image recipient side and the image server are fixed as described above, it is not necessary to provide the image recipient side with the image server address information, and the image recipient side only needs to be provided with the image data identification information. In this case, the image server address information is fixed and stored within the electronic cameras, the means for communication and the electronic instrument for exclusive use within the property.
  • Alternatively, the identification information of the special electronic instrument carried by each guest may be transmitted by the electronic instrument to the means for communication so that each set of image data, appended with the corresponding electronic instrument identification information is saved into the image server. In this case, no image data identification information needs to be transmitted to the special electronic instrument carried by the guest. Namely, the electronic instrument identification information is transmitted to the photographing side from the special electronic instrument during the photographing operation and the identification information is appended to the image data and is saved into the image server. The image server, accessed by the special electronic instrument carried by the guest, only needs to download the image data corresponding to the electronic instrument identification information to the electronic instrument. As an alternative, the image server may forcibly transmit the image data to the special electronic instrument corresponding to the identification information appended to the image data. In this case, the volume of data exchanged between the image data photographing side and the image data recipient side and the types of information exchanged between them can both be reduced. In addition, since the communication is conducted by using the special electronic instrument for exclusive use within the property, interference from other electronic devices can be prevented to improve the reliability of the communication.
  • In the embodiment described above (see FIGS. 98 and 123), the electronic camera and a portable telephone (electronic instrument) exchange the image data upload information (the image server address information and the image data identification information), the electronic camera uploads the image data to the image server on the Internet based upon the upload information and the portable telephone (electronic instrument) downloads the image data from the image server based upon the upload information. However, the upload information does not need to include the image data identification information and may be constituted of the image server address information alone, instead. In this case, the portable telephone connects with the image server to which the image data have been uploaded based upon the image server address information, either manually or automatically searches for desired image data based upon the photographing data and the like after the connection is achieved and downloads the image data selected through the search from the image server. Since this eliminates the need for storing in memory identification information for each set of image data at the portable telephone, a means for storage with a small storage capacity can be utilized in the portable telephone.
  • In the embodiment described above (see FIG. 98), the electronic camera and a portable telephone (electronic instrument) exchange the image data upload information (the image server address information and the image data identification information), the electronic camera uploads the image data to the image server on the Internet and the portable telephone (electronic instrument) downloads the image data from the image server. Instead, the electronic camera may transmit the image server address information to the portable telephone and receive the portable telephone identification information (the telephone number or the like) from the portable telephone during the photographing operation, generate identification information for the image data based upon the portable telephone identification information and upload the image data appended with the identification information thus generated to the image server. In this case, when the image server is accessed by the subject's portable telephone, the portable-telephone-identification information is provided to the image server so that the image data appended with the identification information corresponding to the portable telephone identification information are downloaded to the portable telephone. In addition, the electronic camera may generate image data identification information (an image data filename or the like) in combination with other photographing information (e.g., the photographing date/time point (date and time) information) as well as the portable telephone identification information.
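  • Deriving the image data identification information from the portable telephone identification information together with the photographing date/time might look like the following; the exact naming scheme is an assumption.

```python
from datetime import datetime

def make_image_filename(phone_id: str, taken_at: datetime) -> str:
    """Combine the portable telephone identification information with the
    photographing date/time to form an image data filename."""
    return f"{phone_id}_{taken_at:%Y%m%d_%H%M%S}.jpg"

# Example: make_image_filename("0901234567", datetime(2001, 12, 3, 14, 30, 0))
# -> "0901234567_20011203_143000.jpg"
```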
  • In the variation described above, it is not necessary to store in memory the image-data-identification information for each set of image data at the portable telephone and the subject carrying the portable telephone is allowed to download all the image data in which the subject is photographed simply by notifying the image server of the portable telephone number.
  • In addition, the portable telephone may access the image server by specifying a given photographing date/time point to selectively download the image data photographed at the photographing date/time point specified by the subject carrying the portable telephone. Furthermore, the image server may keep track of a download history (portable telephone numbers of download recipients, download date/time points, etc.) for each set of image data so that when a portable telephone accesses the image server by specifying the download history, image data that have never been downloaded to the mobile telephone or image data that have not been downloaded over a predetermined length of time can be selectively downloaded.
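  • Keeping a per-image download history and serving only image data that have never been downloaded, or not downloaded within a set period, could be sketched as follows; the record layout is assumed.

```python
from datetime import datetime, timedelta

download_history = {}   # image id -> list of (portable telephone number, download date/time)

def never_downloaded(image_ids):
    # Image data for which no download record exists.
    return [i for i in image_ids if not download_history.get(i)]

def stale(image_ids, max_age: timedelta, now: datetime):
    # Image data whose most recent download is older than max_age.
    out = []
    for i in image_ids:
        records = download_history.get(i)
        if records and now - max(t for _, t in records) > max_age:
            out.append(i)
    return out
```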
  • In this case, the portable telephone is able to selectively download specific image data without having to store in memory the identification information for each set of image data.
  • In the embodiment described above (see FIG. 98), the image data upload information is exchanged through a wireless (electromagnetic wave) communication between the electronic camera and an electronic instrument carried by a subject. During this process, a redundant communication with electronic instruments carried by parties other than the true subject should be prevented by keeping the wireless-transmission-signal output level within a range lower than a predetermined value and thus limiting the communication-enabled distance range based upon the assumption that electronic instruments have the average reception capability. This distance range is determined by taking into consideration the standard manner in which the electronic camera or electronic apparatus is used, whether or not an object that blocks electromagnetic waves is present between the electronic camera and the electronic instrument, the performance of the receiver at the electronic instrument and the like. It is desirable to set the distance range to approximately 10 m or less in correspondence to the focal length of the standard photographic lens.
  • In the embodiment described above (see FIG. 98, 123 and 128), the wireless communication circuit engages in communication through a short-distance wireless system such as Bluetooth. However, the short-distance wireless communication may be achieved through a system other than Bluetooth, such as a wireless LAN (IEEE802.11) or infrared communication (IrDA). Furthermore, communication may be achieved by simultaneously utilizing a plurality of short-distance wireless systems. In the latter case, an electronic instrument with any type of short-distance wireless function carried by a subject, can engage in wireless communication with the electronic camera.
  • In the embodiment described above (see FIGS. 98, 123 and 128), the image data upload information is exchanged through a wireless (electromagnetic wave) communication between the electronic camera and the electronic instrument carried by the subject. However, the image data upload information may instead be exchanged through another non-contact or contact method.
  • In the embodiment described above (see FIGS. 98, 123 and 128), the electronic camera attempts short-distance wireless communication with electronic instruments in the vicinity of the electronic camera in order to exchange image data upload information. The electronic camera, however, may be preset so as to disallow any wireless communication with an electronic instrument carried by the photographer operating the electronic camera.
  • In the embodiment described above (see FIGS. 98 and 123), an electronic instrument having obtained upload information from the electronic camera through communication between the electronic camera and the electronic instrument accesses the image server to which image data have been uploaded based upon the upload information and downloads desired image data from the image server. Instead, during the communication between the electronic camera and the electronic instrument, the electronic camera may obtain the identification information of the electronic instrument and provide the electronic instrument identification information thus obtained to the image server before the electronic instrument accesses the image server. In this case, the image server enables exclusive access by the electronic instrument corresponding to the identification information received from the electronic camera on a specific condition. The specific condition may be a specific length of time allowed to elapse after the electronic instrument identification information is received from the electronic camera or a specific limit on the number of image data downloads. These measures prevent an electronic instrument that has not communicated with the electronic camera from downloading image data from the image server. In addition, by limiting the time period during which a given set of image data can be downloaded or the number of times the data can be downloaded, the risk of a third-party downloading the image data from the image server by using the same electronic instrument later on is reduced.
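  • The specific condition mentioned above (a limited time window or a limited number of downloads) could be enforced by the image server roughly as sketched below; the 24-hour window and the three-download limit are illustrative values, not figures from the disclosure.

```python
from datetime import datetime, timedelta

class AccessGrant:
    """Access granted to one instrument identification, valid for a limited time
    after registration by the camera and for a limited number of downloads."""
    def __init__(self, registered_at: datetime,
                 valid_for: timedelta = timedelta(hours=24),
                 max_downloads: int = 3):
        self.registered_at = registered_at
        self.valid_for = valid_for
        self.max_downloads = max_downloads
        self.downloads = 0

    def allow(self, now: datetime) -> bool:
        # Refuse once the time window has elapsed or the download quota is used up.
        if now - self.registered_at > self.valid_for:
            return False
        if self.downloads >= self.max_downloads:
            return False
        self.downloads += 1
        return True
```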
  • In the embodiment described above (see FIGS. 98, 123 and 128), information (the image server address information, the image data identification information, the electronic instrument identification information and the like) is exchanged through wireless (electromagnetic wave) communication between the electronic camera and the electronic instrument carried by the subject. However, prior to the information exchange, a public key to be used to encode the information may be transmitted from the information recipient to the information sender, the information sender may transmit the information encoded by using the public key, and the information recipient may decode the received information by using the secret key paired with the public key. In this case, even if the information exchanged between the electronic camera and the electronic instrument is intercepted by another electronic instrument, the information remains secure.
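  • The exchange described above, in which the recipient first publishes a public key, the sender encodes the information with it and the recipient decodes it with the paired secret key, can be sketched with the third-party cryptography package; the choice of RSA with OAEP padding is an assumption, since no algorithm is named.

```python
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# Recipient (the electronic instrument) generates a key pair and hands the
# public key to the sender (the electronic camera) before the exchange.
secret_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = secret_key.public_key()

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# The sender encodes the upload information with the public key ...
ciphertext = public_key.encrypt(b"server_url;folder;file", oaep)

# ... and only the holder of the paired secret key can decode it, so an
# intercepting instrument cannot read the exchanged information.
assert secret_key.decrypt(ciphertext, oaep) == b"server_url;folder;file"
```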
  • In the embodiment described above (see FIG. 98), the electronic camera attempts wireless communication with electronic instruments present in the vicinity of the electronic camera to exchange the image data upload information during a photographing operation. During this process, the transmission signal output level for the wireless communication (electromagnetic waves, infrared light) may be adjusted in conformance to the subject distance (the output level is raised as the distance to the subject increases). The distance to a subject can be obtained through a detection performed by a distance detection device of the known art, or a photographing distance of the photographic lens which is manually set may be used as the subject distance. Through this output level adjustment, the electronic camera is able to achieve communication with the electronic instrument held by a subject with a high degree of reliability and the likelihood of achieving communication with an electronic instrument held by a party other than the subject is reduced.
  • In the embodiment described above (see FIG. 99), the electronic camera uploads image data obtained through a photographing operation to an image server on the Internet through its internal-wireless-telephone circuit. Instead, the electronic camera may first transmit the image data to a portable telephone carried by the photographer operating the electronic camera through short-distance wireless communication and the photographer's portable telephone may upload the image data to the image server on the Internet through a wireless telephone line. Since this eliminates the need to provide an internal wireless telephone circuit at the electronic camera, the electronic camera can be miniaturized and further advantages related to power consumption and cost are also achieved.
  • In the embodiment described above (see FIG. 99), the electronic camera uploads image data obtained through a photographing operation to a selected image server. However, image data may be uploaded to a fixed image server instead. For instance, by setting up a fixed image server to be exclusively used for image data upload, marketing an electronic camera in a package in which image services are offered through the image server and ensuring that all the images photographed with the camera are uploaded to the image server, the purchaser of the electronic camera is relieved from the task of signing up with an image server himself and the camera can be operated with ease even by users not familiar with the Internet or the like. It is to be noted that the address information of the fixed image server is stored in a nonvolatile memory such as an EEPROM in the electronic camera.
  • In the embodiment described above (see FIG. 112), the electronic camera automatically uploads all image data photographed in the photographing mode to the image server (the upload recording mode). However, the electronic camera may be allowed to assume another photographing mode (a memory card recording mode) for recording photographed image data into a memory card loaded at the electronic camera and may be manually or automatically switched between the upload recording mode and the memory card recording mode. For instance, the electronic camera having been operated in the memory card recording mode may be automatically switched to the upload recording mode as the remaining capacity of the memory card becomes low, or the electronic camera having been operated in the upload recording mode may be automatically switched to the memory card recording mode when the electronic camera moves out of the wireless telephone line communication range. Such features allow the photographer to take pictures with the electronic camera without having to worry about the state of the memory card or the state of the wireless telephone line.
  • In addition, image data may be uploaded to the image server and also saved into the memory card at the same time under normal circumstances, and either the upload or the save may be skipped as necessary. This eliminates the need to create an image data back-up later on and thus, the image data can be saved with a higher degree of reliability.
  • In the embodiment described above (see FIGS. 112 and 113), the image data upload information is exchanged between the electronic camera and the electronic instrument through short-distance wireless communication and then the electronic camera uploads the image data to the image server. However, this order may be reversed, i.e., the electronic camera may first upload the image data to the image server and then the image data upload information may be exchanged between the electronic camera and the electronic instrument through a short-distance wireless communication.
  • In the embodiment described above (see FIGS. 112 and 113), a photographing operation is executed in response to an operation of the shutter release button provided at the electronic camera. However, a photographing operation in the electronic camera may be started up through short-distance wireless communication in response to remote control implemented from a portable telephone carried by a person to be photographed. In this case, the short-distance wireless circuit can be effectively utilized and, at the same time, the photographing operation can be executed with the timing desired by the subject.
  • In the embodiment described above (see FIGS. 113 and 119), the short-distance wireless communication between the electronic camera and an electronic instrument in the vicinity is initiated by the electronic camera in order to transmit the image data upload information from the electronic camera to the electronic instrument. However, a communication request may instead be issued first to the electronic camera from an electronic instrument carried by the subject, and the electronic camera may then transmit the image data upload information to the electronic instrument. For instance, a request for the image data upload information may be issued from the portable telephone 160 to an electronic camera in its vicinity in response to a specific operation of an operating key 162 at the portable telephone 160 shown in FIG. 102. Since this enables the subject side to initiate the short-distance wireless communication between the electronic camera and the portable telephone, the electronic camera does not need to engage in short-distance wireless communication each time a photographing operation is performed unless necessary. Thus, the electronic camera can perform the next photographing operation immediately after the photographed image data are uploaded to the image server.
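  The subject-initiated exchange might look like the following sketch on the portable telephone side. The message names, field keys, and the radio object are assumptions made for illustration, not elements defined in the patent.

```python
# Illustrative sketch: the message names and field keys are assumptions.
# The subject's portable telephone asks a nearby camera for the upload
# information, so the camera never has to start the short-distance session.
from typing import Optional


def on_operating_key_pressed(short_range_radio) -> Optional[dict]:
    """Phone side: broadcast a request and wait briefly for a camera's reply
    carrying the image server address and the uploaded image's identifier."""
    short_range_radio.broadcast({"type": "UPLOAD_INFO_REQUEST"})
    reply = short_range_radio.receive(timeout_s=2.0)
    if reply and reply.get("type") == "UPLOAD_INFO_REPLY":
        return {"server": reply["server_url"], "image_id": reply["image_id"]}
    return None
```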
  • In the embodiment described above (see FIGS. 113 and 119), the image data upload information is exchanged through wireless (electromagnetic wave) communication between the electronic camera and the electronic instrument carried by the subject. During this wireless communication, the electronic instrument having received a search message from the electronic camera may display a message indicating that a communication from the electronic camera has been received, or such a message may be provided to the subject audibly via a sound generation means on the subject side.
  • Such measures make it possible for the subject to verify that a communication from the electronic camera has been received. In addition, when provided with an audio message, the subject, i.e., the user, can verify that a communication from the electronic camera has been received even while keeping the electronic instrument, which may not include any display means, in a pocket or the like. Furthermore, the subject having verified that the communication has been received may refuse to download the image data if necessary by cutting off further communication.
  • In the embodiment described above (see FIGS. 113 and 119), the image data upload information is automatically exchanged through wireless communication between the electronic camera and the electronic instrument held by the subject. However, the electronic instrument having received the search communication from the electronic camera may compare the identification number of the electronic camera initiating the search communication with the identification numbers of electronic cameras preregistered at the electronic instrument and may refuse any further communication with the electronic camera if it is determined to be an unregistered camera. In this case, the subject can reject any image data photographed by a stranger.
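  The preregistration check described above might look like the following sketch on the instrument side; the whitelist contents and the message field name are assumptions and do not come from the patent.

```python
# Assumed whitelist contents and message field name; illustration only.
REGISTERED_CAMERA_IDS = {"CAM-000123", "CAM-000456"}  # IDs preregistered by the subject


def accept_search_communication(search_message: dict) -> bool:
    """Instrument side: continue the session only if the searching camera's
    identification number was preregistered; otherwise refuse further
    communication (and thereby a stranger's image data)."""
    return search_message.get("camera_id", "") in REGISTERED_CAMERA_IDS
```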
  • In the embodiment described above (see FIGS. 113, 119 and 122), the image data upload information is exchanged between the electronic camera and an electronic instrument through wireless communication. During the wireless communication, the electronic camera may transmit information on the image data specifications (the data volume, the recording format, the image aspect ratio, the number of pixels, etc.) to the electronic instrument, to allow the electronic instrument side to decide, based upon the received image data specification information, whether or not the image data are to be downloaded in correspondence to a specific purpose of use for the image data. For instance, if the party on the electronic instrument side only wishes to review image data displayed at the screen of his portable telephone, he may choose to download only image data with a small data volume or a small number of pixels.
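  A hedged sketch of the download decision on the instrument side follows; the specification field names and the size and pixel limits are assumed values chosen purely to illustrate the screen-review case.

```python
# Assumed specification field names and limits for on-screen review only.
MAX_BYTES_FOR_PHONE_VIEWING = 200_000
MAX_PIXELS_FOR_PHONE_VIEWING = 640 * 480


def should_download(spec: dict) -> bool:
    """Instrument side: download only images small enough to review on the
    portable telephone's screen, judged from the received specification."""
    return (spec.get("data_volume_bytes", 0) <= MAX_BYTES_FOR_PHONE_VIEWING
            and spec.get("pixel_count", 0) <= MAX_PIXELS_FOR_PHONE_VIEWING)
```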
  • In the embodiment described above (see FIG. 113), the image data upload information is automatically exchanged between the electronic camera and the electronic instrument held by a subject through wireless communication during a photographing operation. Instead, photographed image data may be saved into a RAM or a memory card on a temporary basis, a series of sets of image data may be uploaded to the image server and the image data upload information may be exchanged between the electronic camera and the electronic instrument held by the subject through wireless communication when a series of photographing operations has been completed. Since this eliminates the need to upload the image data and engage in wireless communication each time one of a plurality of pictures taken without changing the composition (as in an exposure bracket photographing operation) is photographed, the series of pictures can be taken quickly. Then, after the series of photographing operations is completed or the like, the plurality of sets of image data can be uploaded in a batch and the upload information with regard to the plurality of sets of image data can be exchanged through a single wireless communication, either automatically or manually.
  • In the embodiment described above (see FIG. 113), image data are uploaded to the image server and the electronic camera and the electronic instrument held by the subject engage in wireless communication every time a photographing operation is performed. However, the image data upload operation and the wireless communication operation may be disallowed if the image data upload and the wireless communication are likely to cause a delay in the photographing operation. For instance, by automatically disallowing the image data upload operation and the wireless communication operation when performing a continuous photographing (continuous shooting) operation, it is possible to prevent any significant lag attributable to the image data upload operation and the wireless communication operation from manifesting between the image frames obtained through the continuous photographing operation. In this case, the image data obtained through the continuous photographing operation should be saved into RAM or a memory card on a temporary basis so that the series of image data can be uploaded in a batch to the image server and the image data upload information can be exchanged between the electronic camera and the electronic instrument held by the subject through wireless communication either automatically or manually after the series of pictures are taken.
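  The deferred, batched handling described in the last two items might be organized as in the following sketch; the class name, the uploader, and the radio interface are hypothetical and are not elements defined in the patent.

```python
# Hypothetical class and collaborator interfaces; illustration only.
class BurstUploader:
    def __init__(self, uploader, short_range_radio):
        self._uploader = uploader        # uploads one image, returns its upload info
        self._radio = short_range_radio  # short-distance link to the subject's instrument
        self._buffer: list[bytes] = []

    def on_frame_captured(self, image: bytes) -> None:
        # During the burst, only buffer (temporary save to RAM / memory card).
        self._buffer.append(image)

    def on_burst_finished(self) -> None:
        # Batch upload after the series, then a single exchange of upload info.
        upload_info = [self._uploader.upload(img) for img in self._buffer]
        self._radio.send({"type": "UPLOAD_INFO", "items": upload_info})
        self._buffer.clear()
```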
  • In the embodiment described above (see FIG. 122), the upload information and the print instruction information corresponding to the image data to be printed are transmitted to the printer from the electronic camera in response to an operation of the PRINT button. The print instruction information transmitted through this process may include information on various printer settings (such as the print size, the print quality and the printing color).
  • In the embodiment described above (see FIG. 122), the upload information and the print instruction information corresponding to the image data to be printed are transmitted to the printer from the electronic camera in response to an operation of the PRINT button. Instead, in response to a PRINT button operation, the electronic camera may obtain from the printer the identification information (an IP address) used to identify the printer on the network and may transmit this identification information to the image server to enable the image server to transmit the image data to be printed to the printer corresponding to the identification information. In this case, it is not necessary to store the image data at the electronic camera and, at the same time, the image data to be printed can be quickly transmitted to the printer to undergo a print operation.
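  One possible shape of the server-pushed printing flow described above is sketched below; the message names, the server API, and the default print settings are assumptions for illustration only.

```python
# Hypothetical message names and server API; the flow only sketches the idea.
from typing import Optional


def print_via_server(short_range_radio, server_api, image_id: str,
                     settings: Optional[dict] = None) -> None:
    """Camera side: fetch the printer's network identification (IP address)
    over the short-distance link, then ask the image server to send the
    already-uploaded image straight to that printer."""
    short_range_radio.send({"type": "PRINTER_ID_REQUEST"})
    reply = short_range_radio.receive(timeout_s=2.0)
    printer_ip = reply["printer_ip"]
    # The print instruction may also carry printer settings such as the
    # print size, print quality and printing color (see the item above).
    server_api.request_print(image_id=image_id, printer_ip=printer_ip,
                             settings=settings or {"size": "L", "quality": "fine"})
```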
  • In the third embodiment described above, information (address information) with regard to an upload of information (an image) to an information server (image server) on the Internet is exchanged through short-distance communication between the information (image) sender side and the information (image) recipient side, the information (image) is first uploaded to the information server on the Internet from the information (image) sender side, and the information (image) recipient side then downloads the information (image) for utilization from the information server (image server) to which it has been uploaded. Thus, an information transmission system (image transmission system) and an information transmission method (image transmission method) are provided that enable speedy transmission of the information (image) to the information (image) recipient side without placing a great onus on the information (image) sender side during the information (image) transmission.
  • In particular, when a group picture is taken with an electronic camera system that handles image data with a large data volume and the image data obtained through the photographing operation need to be delivered to the portable telephones or the like of the plurality of subjects, the electronic camera is only required to transmit the upload information to the individual portable telephones through short-distance communication and to upload the image data to the image server on the Internet. The onus of transmitting the image data itself to each portable telephone through short-distance communication is thus lifted from the electronic camera, and the desired image data can be obtained promptly at each portable telephone as well.
  • The various programs explained in reference to the embodiments can be provided in a recording medium such as a CD-ROM or through data signals on the Internet. For instance, a program executed on the personal computer 140 in FIG. 1 may be read from a CD-ROM drive device (not shown) mounted at the personal computer 140. If a program installed at the electronic camera 100 or the portable telephone 160 can be overwritten, the electronic camera 100 or the portable telephone 160 may connect with the personal computer 140 to download an upgrade program. The programs may also be downloaded via the Internet 130 by adopting the configuration shown in FIG. 1. By setting up an application program server in place of the image server 150 in the configuration, the programs may be downloaded from this server via the Internet. When the programs are provided via the Internet, they are embodied as data signals on a carrier wave and transmitted via a communication line. Thus, each program can be distributed as a computer-readable computer program product assuming any of various modes, such as a recording medium or a carrier wave.

Claims (52)

1. An electronic apparatus having a user identification function for identifying a user comprising:
a wireless communication unit that executes wireless communication with an electronic instrument carried by the user of the electronic apparatus and obtains personal information on the user from the electronic instrument; and
a setting unit that selects operational settings for the electronic apparatus based upon the user personal information obtained by the wireless communication unit.
2. An electronic apparatus having a user identification function for identifying a user comprising:
a wireless communication unit that executes wireless communication with an electronic instrument carried by the user of the electronic apparatus and obtains personal information on the user from the electronic instrument;
a user selection unit that, if a plurality of possible users of the electronic apparatus are determined to be present based upon results of the wireless communication, automatically selects a user most likely to use the electronic apparatus as the electronic apparatus user from the plurality of possible users; and
a setting unit that selects operational settings for the electronic apparatus based upon the personal information on the user selected by the user selection unit.
3. An electronic apparatus having a user identification function according to claim 2, wherein:
the wireless communication unit communicates with electronic instruments by using an electromagnetic wave signal and limits a wireless communication range of the electronic apparatus to within a predetermined distance range relative to the electronic apparatus by restricting a transmission output level of the electromagnetic wave signal.
4. An electronic apparatus having a user identification function according to claim 2, wherein:
the wireless communication unit has wireless communication directivity in a direction in which the user is highly likely to be present relative to the electronic apparatus when the user uses the electronic apparatus.
5. An electronic apparatus having a user identification function according to claim 4, wherein:
the electronic apparatus is a photographing apparatus having a photographic optical system; and
the wireless communication unit has wireless communication directivity toward a side opposite from a subject along an optical axis of the photographic optical system.
6. An electronic apparatus having a user identification function according to claim 2, wherein:
if a user of an electronic instrument with which wireless communication has been achieved is a user registered at the electronic apparatus, the user selection unit decides that the user is allowed to use the electronic apparatus.
7. An electronic apparatus having a user identification function according to claim 2, wherein:
the user selection unit selects a user of an electronic instrument present over a shortest distance to the electronic apparatus as the electronic apparatus user from the plurality of possible users of the electronic apparatus based upon varying distances between the electronic apparatus and the electronic instruments.
8. An electronic apparatus having a user identification function according to claim 7, wherein:
the wireless communication unit attempts communication with the electronic instrument by adjusting a wireless communication range to a plurality of settings to enable the user selection unit to select the electronic apparatus user.
9. An electronic apparatus having a user identification function according to claim 8, wherein:
as an output level of a wireless transmission signal from the wireless communication unit is gradually increased while communicating with the electronic instruments, the user selection unit checks whether or not the user of each electronic instrument is a possible user of the electronic apparatus and selects a possible user of the electronic apparatus who is first detected as the electronic apparatus user.
10. An electronic apparatus having a user identification function according to claim 7, wherein:
the user selection unit selects the electronic apparatus user based upon reception output levels at the electronic instruments receiving a wireless transmission signal transmitted by the wireless communication unit.
11. An electronic apparatus having a user identification function according to claim 10, wherein:
with an output level of the wireless transmission signal from the wireless communication unit sustained at a constant level while communicating with the electronic instruments, the user selection unit obtains output level data from the electronic instruments having received the wireless transmission signal and selects a user of an electronic instrument having returned output level data indicating a highest output level among the output level data returned from the electronic instruments carried by the possible users of the electronic apparatus as the electronic apparatus user.
12. An electronic apparatus having a user identification function according to claim 7, wherein:
the user selection unit selects the electronic apparatus user based upon attenuation rates of the wireless transmission signal transmitted by the electronic instruments.
13. An electronic apparatus having a user identification function according to claim 12, wherein:
with output level data of wireless transmission signals transmitted by the electronic instruments received at the wireless communication unit from the electronic instruments carried by the possible users of the electronic apparatus, the user selection unit detects reception output levels of the wireless transmission signals transmitted by the electronic instruments carried by the possible users of the electronic apparatus, calculates attenuation rates of the wireless transmission signals from the electronic instruments based upon the output level data and the detected reception output levels, and selects a user of an electronic instrument corresponding to a lowest wireless transmission signal attenuation rate among the electronic instruments carried by the possible users of the electronic apparatus as the electronic apparatus user.
14. An electronic apparatus having a user identification function according to claim 7, wherein:
the user selection unit detects distances between the electronic apparatus and the electronic instruments carried by the possible users of the electronic apparatus and selects the electronic apparatus user based upon the detected distances.
15. An electronic apparatus having a user identification function according to claim 14, wherein:
the electronic apparatus and the electronic instruments each include a positional measurement unit that obtains through measurement position data indicating a position thereof; and
with the position data received at the wireless communication unit from the electronic instruments carried by the possible users of the electronic apparatus, the user selection unit calculates distance data indicating distances between the electronic apparatus and the individual electronic instruments based upon the position data of the electronic apparatus and the position data received from the electronic instruments, with each set of data obtained through the measurement by the positional measurement unit, and selects a user of an electronic instrument corresponding to distance data indicating the smallest distance as the electronic apparatus user.
16. An electronic apparatus having a user identification function according to claim 2, wherein:
based upon a predetermined user priority order, the user selection unit selects a user ranked highest in the priority order as the electronic apparatus user from the plurality of possible users of the electronic apparatus.
17. An electronic apparatus having a user identification function according to claim 2, further comprising:
a storage unit that stores in a memory last utilization time points at which the electronic apparatus was last used by individual users, wherein:
based upon the last utilization time points corresponding to the individual users stored in the storage unit, the user selection unit selects a user having used the electronic apparatus most recently among the plurality of possible users of the electronic apparatus as the electronic apparatus user.
18. An electronic apparatus having a user identification function according to claim 2, further comprising:
an operating member operated by the electronic apparatus user; and
a control unit that engages the wireless communication unit and the user selection unit to execute a user identification operation when the operating member is operated.
19. An electronic apparatus having a user identification function according to claim 18, wherein:
the operating member is a power-on operating member operated to turn on power to the electronic apparatus; and
the control unit engages the wireless communication unit and the user selection unit to execute a user identification operation when the power-on operating member is operated and power to the electronic apparatus is turned on.
20. An electronic apparatus having a user identification function according to claim 2, further comprising:
a contact detection unit that detects a user contact with the electronic apparatus; and
a control unit that engages the wireless communication unit and the user selection unit to execute a user identification operation when the contact detection unit detects a user contact.
21. An electronic apparatus having a user identification function according to claim 20, further comprising:
a holding unit hand-held by the user operating the electronic apparatus, wherein:
the contact detection unit is provided near the holding unit.
22. An electronic apparatus having a user identification function according to claim 2, further comprising:
a control unit that engages the wireless communication unit and the user selection unit in executing a user identification operation with a predetermined time cycle.
23. An electronic apparatus having a user identification function according to claim 2, further comprising:
a control unit that disables a user identification operation by the wireless communication unit and the user selection unit, if the electronic apparatus is set in an operating mode in which the user does not operate the electronic apparatus in close proximity to the electronic apparatus.
24. An electronic apparatus having a user identification function according to claim 2, wherein:
the electronic instrument includes an operating member;
when the operating member is operated by the user, the electronic instrument wirelessly issues a notification of a user operation of the operating member to the wireless communication unit of the electronic apparatus; and
the electronic apparatus further comprises a control unit that engages the wireless communication unit and the user selection unit in executing a user identification operation when the notification of a user operation of the operating member from the electronic instrument is received at the wireless communication unit.
25. An electronic apparatus having a user identification function according to claim 2, further comprising:
a switching unit operated to switch over between a mode in which a user identification operation by the wireless communication unit and the user selection unit is enabled and a mode in which the user identification operation by the wireless communication unit and the user selection unit is disabled.
26. An electronic apparatus having a user identification function according to claim 2, further comprising:
a remote control unit that enables remote control of the electronic apparatus, wherein:
the wireless communication unit and the user selection unit are internally provided at the remote control unit.
27. An electronic apparatus having a user identification function according to claim 2, wherein:
the user selection unit selects a preset default user as the user if no possible user of the electronic apparatus is detected.
28. An electronic apparatus having a user identification function according to claim 2, further comprising:
an operating member, wherein:
when the operating member is operated by the user, the setting unit initializes the operational settings of the electronic apparatus based upon the personal information on the user selected by the user selection unit.
29. An electronic apparatus having a user identification function according to claim 2, further comprising:
a manual user change unit operated to manually change the electronic apparatus user.
30. An electronic apparatus having a user identification function according to claim 2, further comprising:
a display unit; and
a control unit that controls the display unit to display the user automatically selected by the user selection unit.
31. An electronic apparatus having a user identification function according to claim 2, further comprising:
a notification unit that issues a notification indicating that the user has been selected by the user selection unit, to the electronic instrument carried by the user.
32. An electronic apparatus having a user identification function according to claim 2, further comprising:
a storage unit that stores in memory a user having last used the electronic apparatus; and
a reporting unit that provides a report on a mismatch if the user selected by the user selection unit does not match with the user stored in a memory at the storage unit.
33. An electronic apparatus having a user identification function according to claim 2, further comprising:
a user characteristics detection unit that detects information indicating physical characteristics of users of the electronic apparatus, wherein:
the user selection unit selects a user whose physical characteristics information detected by the user characteristics detection unit matches user characteristics information received from the electronic instrument through wireless communication as the electronic apparatus user.
34. An electronic apparatus having a user identification function according to claim 2, further comprising:
a display unit, wherein:
the setting unit selects a display setting for the display unit in conformance to a degree of expertise of the user selected by the user selection unit in handling the electronic apparatus, which is indirectly judged based upon general personal information on the user.
35. An electronic apparatus having a user identification function according to claim 34, wherein:
the display setting is a specific size of characters displayed at the display unit or a specific style of text displayed at the display unit.
36. An electronic apparatus having a user identification function according to claim 2, further comprising:
a display unit, wherein:
the setting unit adjusts a display setting for the display unit or the operational settings for the electronic apparatus in conformance to the degree of expertise of the user selected by the user selection unit in handling the electronic apparatus, which is indirectly judged based upon the user's age information.
37. An electronic apparatus having a user identification function according to claim 2, further comprising:
a display unit, wherein:
the setting unit adjusts the size of characters displayed at the display unit based upon personal information indicating the visual acuity of the user selected by the user selection unit.
38. An electronic apparatus having a user identification function according to claim 2, wherein:
the electronic apparatus is a photographing apparatus operated to photograph a subject;
the electronic apparatus further comprises:
a viewfinder through which the subject is observed; and
a diopter adjustment unit that adjusts a diopter of the viewfinder; and
the setting unit automatically adjusts the diopter of the viewfinder via the diopter adjustment unit based upon personal information indicating the diopter of the user selected by the user selection unit.
39. An electronic apparatus having a user identification function according to claim 2, wherein:
the electronic instrument is either embedded inside the user's body or is worn in close contact with a surface of the user's body, and is capable of wirelessly communicating with the electronic apparatus coming into contact with the user through an electrical current or an electromagnetic wave traveling through the user's body;
the electronic apparatus further comprises a holding unit hand-held by the user operating the electronic apparatus; and
the wireless communication unit communicates with the electronic instrument through the hand of the user holding the holding unit.
40. An electronic apparatus having a user identification function for identifying a user, comprising:
a wireless communication unit that wirelessly communicates with electronic instruments present in the vicinity of the electronic apparatus;
a user selection unit that selects an electronic instrument carried by a user other than an operator of the electronic apparatus from electronic instruments with which wireless communication has been achieved and obtains, via the wireless communication unit, personal information on the user from the selected electronic instrument; and
a control unit that controls operations of the electronic apparatus based upon the user personal information obtained by the user selection unit.
41. An electronic apparatus having a user identification function according to claim 40, wherein:
the electronic apparatus is a photographing apparatus that generates image data through a photographing operation;
the personal information is information indicating the color of the irises of the user;
the electronic apparatus further comprises an electronic flash unit that emits flash light to illuminate the user to be photographed; and
the control unit extracts any red-eye portion of the user image from the image data obtained by photographing the user with the electronic flash unit illuminating the user and retouches the red-eye portion by using a color corresponding to the information indicating the iris color.
42. An electronic apparatus having a user identification function according to claim 40, wherein:
the electronic apparatus is a photographing apparatus that performs a photographing operation;
the electronic apparatus further comprises an electronic flash unit that emits flash light to illuminate the user to be photographed;
the personal information is information indicating whether or not the eyes of the user tend to appear red in a photographed image obtained through a flash photographing operation performed by using the electronic flash unit; and
if the eyes of the user are determined to tend to appear red based upon the personal information, the control unit causes the pupils of the user to contract by engaging the electronic flash unit to emit flash light to illuminate the user prior to the flash photographing operation so that the eyes of the user do not readily turn red in the image obtained through the flash photographing operation.
43. An electronic apparatus having a user identification function according to claim 40, wherein:
the electronic apparatus is a photographing apparatus that generates an image through a photographing operation;
the electronic apparatus further comprises an electronic flash unit and an illumination unit that illuminates the user to be photographed;
the personal information is information indicating whether or not the eyes of the user tend to appear red in a photographed image obtained through a flash photographing operation performed by using the electronic flash unit; and
if the eyes of the user are determined to tend to appear red based upon the personal information, the control unit causes the pupils of the user to contract by engaging the illumination unit to illuminate the user prior to the flash photographing operation so that the eyes of the user do not readily turn red in the image obtained through the flash photographing operation.
44. An electronic apparatus having a user identification function for identifying a user, comprising:
a display unit;
a wireless communication unit that executes wireless communication with an electronic instrument carried by the user of the electronic apparatus and obtains personal information on the user from the electronic instrument;
a control unit that, if a plurality of possible users of the electronic apparatus are determined to be present based upon results of the wireless communication, controls the display unit to display the plurality of users at the display unit;
a manual user selection unit operated to manually select a single user from the plurality of possible users displayed at the display unit; and
a setting unit that selects operational settings for the electronic apparatus based upon the personal information on the user selected through the manual user selection unit.
45. An electronic apparatus having a user identification function for identifying a user, comprising:
a wireless communication unit that executes wireless communication with an electronic instrument carried by the user of the electronic apparatus and obtains personal information on the user from the electronic instrument through wireless communication;
a user selection unit that checks whether or not each user of the electronic instruments is a possible user of the electronic apparatus, as an output level of a wireless transmission signal from the wireless communication unit is gradually increased while communicating with the electronic instruments, and selects a possible user of the electronic apparatus who is first detected as the electronic apparatus user; and
a setting unit that selects operational settings for the electronic apparatus based upon the personal information on the user selected by the user selection unit.
46. An identification method for identifying a user of an electronic apparatus available for use by the user through wireless communication between the electronic apparatus and an electronic instrument carried by the user and having stored therein information on the user, comprising:
a wireless communication step in which the electronic apparatus executes wireless communication with the electronic instruments present in the vicinity of the electronic apparatus;
a first identification step in which if the electronic apparatus judges that there is a single possible user of the electronic apparatus based upon results of the wireless communication step, the electronic apparatus selects the user as the user of the electronic apparatus; and
a second identification step in which, if the electronic apparatus judges that there are a plurality of possible users of the electronic apparatus based upon the results of the wireless communication step, the electronic apparatus automatically selects a user most likely to be the user of the electronic apparatus as the electronic apparatus user.
47. A user identification method through which an electronic apparatus available for use by a user identifies the user of the electronic apparatus by wirelessly communicating with an electronic instrument carried by the user and having stored therein information on the user, comprising:
a search step in which the electronic apparatus wirelessly communicates with electronic instruments present in the vicinity of the electronic apparatus while gradually increasing the output level of a wireless transmission signal and searches for possible users of the electronic apparatus; and
an identification step in which the electronic apparatus selects a possible user of the electronic apparatus first detected by the electronic apparatus through the search step as the electronic apparatus user.
48. An identification method for identifying a user of an electronic apparatus available for use by the user through wireless communication between the electronic apparatus and an electronic instrument carried by the user and also having stored therein information on the user, comprising:
a wireless communication step in which the electronic apparatus executes wireless communication with the electronic instruments present in the vicinity of the electronic apparatus;
an identification step in which the electronic apparatus automatically selects a user most likely to be the user of the electronic apparatus as the electronic apparatus user based upon results of the wireless communication step;
a notification step in which the electronic apparatus issues a notification indicating that the user has been selected to the electronic instrument carried by the selected user through wireless communication; and
a reporting step in which the electronic instrument having received the notification reports to the user carrying the electronic instrument that the user has been selected as the electronic apparatus user by the electronic apparatus.
49. A user identification method through which an electronic apparatus having a non-contact IC card reading function and available for use by a user identifies the user of the electronic apparatus by wirelessly communicating with a non-contact IC card which is carried by the user and has stored therein information on the user, comprising:
a wireless communication step in which the electronic apparatus executes wireless communication with a non-contact IC card present in the vicinity of the electronic apparatus;
a first identification step in which, if the electronic apparatus determines that there is a single possible user of the electronic apparatus based upon results of the wireless communication step, the electronic apparatus selects the user as the user of the electronic apparatus; and
a second identification step in which, if the electronic apparatus determines that there are a plurality of possible users of the electronic apparatus based upon the results of the wireless communication step, the electronic apparatus automatically selects a user most likely to be the user of the electronic apparatus as the electronic apparatus user.
50. An identification method through which an electronic instrument carried by a user identifies an electronic apparatus to be utilized by the user by wirelessly communicating with the electronic apparatus to be utilized by the user, comprising:
a wireless communications step in which the electronic instrument carried by the user executes a wireless communication with electronic apparatuses present in the vicinity of the electronic instrument;
a first identification step in which, if the electronic instrument determines that there is a single electronic apparatus available for use by the user carrying the electronic instrument based upon results of the wireless communication step, then the electronic instrument selects the electronic apparatus as the electronic apparatus to be used by the user; and
a second identification step in which, if the electronic instrument determines that there are a plurality of electronic apparatuses available for use by the user carrying the electronic instrument based upon the results of the wireless communications step, then the electronic instrument automatically selects an electronic apparatus that the user most likely intends to use as the electronic apparatus to be used by the user.
51. An identification method through which an electronic instrument carried by a user identifies an electronic apparatus to be utilized by the user by wireless communication with the electronic apparatus to be utilized by the user, comprising:
a wireless communications step in which the electronic instrument carried by the user executes a wireless communication with electronic apparatuses present in the vicinity of the electronic instrument;
an identification step in which the electronic instrument automatically selects an electronic apparatus which is most likely to be used by the user as the electronic apparatus to be used by the user based upon results of the wireless communication step; and
a notification step in which a notification indicating that the user is to use the electronic apparatus selected through the identification step is issued to the selected electronic apparatus by the electronic instrument.
52-179. Cancelled.
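Several of the claims above recite concrete procedures for picking the nearest possible user: claim 9 (gradually increased transmit power), claim 13 (lowest signal attenuation), and claim 15 (smallest measured distance). The following Python sketch illustrates these selection rules in schematic form; the radio interface, the data shapes, and the coordinate handling are assumptions made for illustration and are not claim elements.

```python
# Hedged sketch of three of the claimed selection rules; radio interface,
# data shapes and coordinates are assumptions, not claim elements.
import math


def select_by_increasing_power(radio, power_levels_dbm):
    """Claim 9 style: raise the transmit level step by step and pick the
    first possible user whose instrument responds (i.e. the closest one)."""
    for level in sorted(power_levels_dbm):
        for reply in radio.probe(tx_power_dbm=level):
            if reply.get("is_possible_user"):
                return reply["user_id"]
    return None


def select_by_attenuation(reports):
    """Claim 13 style: each report carries the instrument's transmitted level
    and the level received at the camera; the lowest attenuation wins."""
    if not reports:
        return None
    return min(reports, key=lambda r: r["tx_level_dbm"] - r["rx_level_dbm"])["user_id"]


def select_by_distance(camera_pos, instrument_positions):
    """Claim 15 style: compare measured positions and select the user whose
    instrument is nearest; positions are (x, y) pairs in a shared frame."""
    if not instrument_positions:
        return None

    def dist(item):
        x, y = item[1]
        return math.hypot(x - camera_pos[0], y - camera_pos[1])

    return min(instrument_positions.items(), key=dist)[0]
```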
US10/497,146 2001-12-03 2002-12-03 Electronic apparatus having a user identification function and user identification method Active 2024-07-08 US7382405B2 (en)

Priority Applications (12)

Application Number Priority Date Filing Date Title
US12/149,231 US7864218B2 (en) 2001-12-03 2008-04-29 Electronic camera, electronic instrument, and image transmission system and method, having user identification function
US12/926,564 US8482634B2 (en) 2001-12-03 2010-11-24 Image display apparatus having image-related information displaying function
US13/909,283 US8804006B2 (en) 2001-12-03 2013-06-04 Image display apparatus having image-related information displaying function
US14/327,903 US20140340534A1 (en) 2001-12-03 2014-07-10 Image display apparatus having image-related information displaying function
US14/808,549 US20150370360A1 (en) 2001-12-03 2015-07-24 Image display apparatus having image-related information displaying function
US15/131,395 US9894220B2 (en) 2001-12-03 2016-04-18 Image display apparatus having image-related information displaying function
US15/160,379 US9578186B2 (en) 2001-12-03 2016-05-20 Image display apparatus having image-related information displaying function
US15/213,845 US10015403B2 (en) 2001-12-03 2016-07-19 Image display apparatus having image-related information displaying function
US15/342,266 US9838550B2 (en) 2001-12-03 2016-11-03 Image display apparatus having image-related information displaying function
US15/802,033 US20180069968A1 (en) 2001-12-03 2017-11-02 Image display apparatus having image-related information displaying function
US15/859,966 US20180124259A1 (en) 2001-12-03 2018-01-02 Image display apparatus having image-related information displaying function
US16/270,870 US20190174066A1 (en) 2001-12-03 2019-02-08 Image display apparatus having image-related information displaying function

Applications Claiming Priority (7)

Application Number Priority Date Filing Date Title
JP2001368014A JP4114348B2 (en) 2001-12-03 2001-12-03 Electronic device having user identification function and identification method
JP2001-368014 2001-12-03
JP2001373831A JP4158376B2 (en) 2001-12-07 2001-12-07 Electronic camera, image display apparatus, and image display method
JP2001-373831 2001-12-07
JP2001-375568 2001-12-10
JP2001375568A JP4114350B2 (en) 2001-12-10 2001-12-10 Electronic camera, electronic device, image transmission system, and image transmission method
PCT/JP2002/012625 WO2003049424A1 (en) 2001-12-03 2002-12-03 Electronic apparatus, electronic camera, electronic device, image display apparatus, and image transmission system

Related Parent Applications (2)

Application Number Title Priority Date Filing Date
PCT/JP2002/012625 A-371-Of-International WO2003049424A1 (en) 2001-12-03 2002-12-03 Electronic apparatus, electronic camera, electronic device, image display apparatus, and image transmission system
PCT/JP2002/012625 Continuation WO2003049424A1 (en) 2001-12-03 2002-12-03 Electronic apparatus, electronic camera, electronic device, image display apparatus, and image transmission system

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US12/149,231 Division US7864218B2 (en) 2001-12-03 2008-04-29 Electronic camera, electronic instrument, and image transmission system and method, having user identification function
US12/149,231 Continuation US7864218B2 (en) 2001-12-03 2008-04-29 Electronic camera, electronic instrument, and image transmission system and method, having user identification function

Publications (2)

Publication Number Publication Date
US20050001024A1 true US20050001024A1 (en) 2005-01-06
US7382405B2 US7382405B2 (en) 2008-06-03

Family

ID=27347896

Family Applications (13)

Application Number Title Priority Date Filing Date
US10/497,146 Active 2024-07-08 US7382405B2 (en) 2001-12-03 2002-12-03 Electronic apparatus having a user identification function and user identification method
US12/149,231 Expired - Lifetime US7864218B2 (en) 2001-12-03 2008-04-29 Electronic camera, electronic instrument, and image transmission system and method, having user identification function
US12/926,564 Active 2024-08-10 US8482634B2 (en) 2001-12-03 2010-11-24 Image display apparatus having image-related information displaying function
US13/909,283 Expired - Lifetime US8804006B2 (en) 2001-12-03 2013-06-04 Image display apparatus having image-related information displaying function
US14/327,903 Abandoned US20140340534A1 (en) 2001-12-03 2014-07-10 Image display apparatus having image-related information displaying function
US14/808,549 Abandoned US20150370360A1 (en) 2001-12-03 2015-07-24 Image display apparatus having image-related information displaying function
US15/131,395 Expired - Lifetime US9894220B2 (en) 2001-12-03 2016-04-18 Image display apparatus having image-related information displaying function
US15/160,379 Expired - Lifetime US9578186B2 (en) 2001-12-03 2016-05-20 Image display apparatus having image-related information displaying function
US15/213,845 Expired - Lifetime US10015403B2 (en) 2001-12-03 2016-07-19 Image display apparatus having image-related information displaying function
US15/342,266 Expired - Lifetime US9838550B2 (en) 2001-12-03 2016-11-03 Image display apparatus having image-related information displaying function
US15/802,033 Abandoned US20180069968A1 (en) 2001-12-03 2017-11-02 Image display apparatus having image-related information displaying function
US15/859,966 Abandoned US20180124259A1 (en) 2001-12-03 2018-01-02 Image display apparatus having image-related information displaying function
US16/270,870 Abandoned US20190174066A1 (en) 2001-12-03 2019-02-08 Image display apparatus having image-related information displaying function

Family Applications After (12)

Application Number Title Priority Date Filing Date
US12/149,231 Expired - Lifetime US7864218B2 (en) 2001-12-03 2008-04-29 Electronic camera, electronic instrument, and image transmission system and method, having user identification function
US12/926,564 Active 2024-08-10 US8482634B2 (en) 2001-12-03 2010-11-24 Image display apparatus having image-related information displaying function
US13/909,283 Expired - Lifetime US8804006B2 (en) 2001-12-03 2013-06-04 Image display apparatus having image-related information displaying function
US14/327,903 Abandoned US20140340534A1 (en) 2001-12-03 2014-07-10 Image display apparatus having image-related information displaying function
US14/808,549 Abandoned US20150370360A1 (en) 2001-12-03 2015-07-24 Image display apparatus having image-related information displaying function
US15/131,395 Expired - Lifetime US9894220B2 (en) 2001-12-03 2016-04-18 Image display apparatus having image-related information displaying function
US15/160,379 Expired - Lifetime US9578186B2 (en) 2001-12-03 2016-05-20 Image display apparatus having image-related information displaying function
US15/213,845 Expired - Lifetime US10015403B2 (en) 2001-12-03 2016-07-19 Image display apparatus having image-related information displaying function
US15/342,266 Expired - Lifetime US9838550B2 (en) 2001-12-03 2016-11-03 Image display apparatus having image-related information displaying function
US15/802,033 Abandoned US20180069968A1 (en) 2001-12-03 2017-11-02 Image display apparatus having image-related information displaying function
US15/859,966 Abandoned US20180124259A1 (en) 2001-12-03 2018-01-02 Image display apparatus having image-related information displaying function
US16/270,870 Abandoned US20190174066A1 (en) 2001-12-03 2019-02-08 Image display apparatus having image-related information displaying function

Country Status (3)

Country Link
US (13) US7382405B2 (en)
AU (1) AU2002354181A1 (en)
WO (1) WO2003049424A1 (en)

Cited By (190)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030194203A1 (en) * 2002-04-16 2003-10-16 Samsung Electronics Co., Ltd. Image recording apparatus capable of recording position and direction information of object of shooting
US20040176118A1 (en) * 2003-02-18 2004-09-09 Michael Strittmatter Service attribute based filtering system and method
US20040223063A1 (en) * 1997-10-09 2004-11-11 Deluca Michael J. Detecting red eye filter and apparatus using meta-data
US20050041121A1 (en) * 1997-10-09 2005-02-24 Eran Steinberg Red-eye filter method and apparatus
US20050232580A1 (en) * 2004-03-11 2005-10-20 Interdigital Technology Corporation Control of device operation within an area
US20050258229A1 (en) * 2003-09-22 2005-11-24 Matsushita Electric Industrial Co., Ltd. Secure device and information processing unit
US20060018628A1 (en) * 2003-06-10 2006-01-26 Fujitsu Limited Data transmission system
US20060039221A1 (en) * 2004-08-18 2006-02-23 Sony Corporation Memory card, memory card control method and memory card access control method
US20060080340A1 (en) * 2004-09-13 2006-04-13 Hirokazu Oi Communication system, communication apparatus, and communication method
US20060093212A1 (en) * 2004-10-28 2006-05-04 Eran Steinberg Method and apparatus for red-eye detection in an acquired digital image
US20060120599A1 (en) * 2004-10-28 2006-06-08 Eran Steinberg Method and apparatus for red-eye detection in an acquired digital image
US20060137018A1 (en) * 2004-11-29 2006-06-22 Interdigital Technology Corporation Method and apparatus to provide secured surveillance data to authorized entities
US20060132908A1 (en) * 2004-07-26 2006-06-22 Baun Kenneth W Apparatus and methods for focusing and collimating telescopes
US20060136379A1 (en) * 2004-12-17 2006-06-22 Eastman Kodak Company Image content sharing device and method
US20060148418A1 (en) * 2004-12-06 2006-07-06 Interdigital Technology Corporation Method and apparatus for alerting a target that it is subject to sensing and restricting access to sensed content associated with the target
US20060172063A1 (en) * 2004-12-06 2006-08-03 Interdigital Technology Corporation Method and apparatus for detecting portable electronic device functionality
US20060189349A1 (en) * 2005-02-24 2006-08-24 Memory Matrix, Inc. Systems and methods for automatic uploading of cell phone images
US20060189348A1 (en) * 2005-02-23 2006-08-24 Memory Matrix, Inc. Systems and methods for automatic synchronization of cellular telephones
US20060200564A1 (en) * 2003-04-23 2006-09-07 Canon Kabushiki Kaisha Wireless communication system, and wireless communication device and control method
US20060206592A1 (en) * 2003-04-23 2006-09-14 Canon Kabushiki Kaisha Wireless communication system and wireless communication device and control method
US20060227640A1 (en) * 2004-12-06 2006-10-12 Interdigital Technology Corporation Sensing device with activation and sensing alert functions
US20070003148A1 (en) * 2001-10-02 2007-01-04 Shoji Muramatsu Image processing apparatus and image pickup device
US20070016369A1 (en) * 2005-07-14 2007-01-18 Murata Kikai Kabushiki Kaisha Guided vehicle system and travel route map creation method for guided vehicle system
US20070081805A1 (en) * 2005-08-17 2007-04-12 Nokia Corporation Combined actuator for camera shutter and focus functions
US20070116379A1 (en) * 2005-11-18 2007-05-24 Peter Corcoran Two stage detection for photographic eye artifacts
US20070198509A1 (en) * 2006-02-20 2007-08-23 Sony Ericsson Mobile Information processing apparatus, information processing method, information processing program, and mobile terminal apparatus
US20070217650A1 (en) * 2006-03-20 2007-09-20 Fujifilm Corporation Remote controller, remote control system, and method for displaying detailed information
US20070236327A1 (en) * 2006-03-24 2007-10-11 Fujifilm Corporation Apparatus, method, program and system for remote control
US20070255853A1 (en) * 2006-04-27 2007-11-01 Toutonghi Michael J Sharing digital content via a packet-switched network
US20070287478A1 (en) * 2006-04-18 2007-12-13 Samsung Electronics Co. Ltd. System and method for transferring character between portable communication devices
US20080013798A1 (en) * 2006-06-12 2008-01-17 Fotonation Vision Limited Advances in extending the aam techniques from grayscale to color images
US20080112599A1 (en) * 2006-11-10 2008-05-15 Fotonation Vision Limited method of detecting redeye in a digital image
US20080175481A1 (en) * 2007-01-18 2008-07-24 Stefan Petrescu Color Segmentation
US20080186389A1 (en) * 1997-10-09 2008-08-07 Fotonation Vision Limited Image Modification Based on Red-Eye Filter Analysis
US20080186385A1 (en) * 2007-02-06 2008-08-07 Samsung Electronics Co., Ltd. Photographing apparatus capable of communication with external apparatus and method of controlling the same
US20080200897A1 (en) * 2007-02-19 2008-08-21 Abbott Diabetes Care, Inc. Modular combination of medication infusion and analyte monitoring
US20080219518A1 (en) * 2007-03-05 2008-09-11 Fotonation Vision Limited Red Eye False Positive Filtering Using Face Location and Orientation
US20080242946A1 (en) * 2007-03-30 2008-10-02 Sony Corporation Method and apparatus for transporting images
US20080240555A1 (en) * 2005-11-18 2008-10-02 Florin Nanu Two Stage Detection for Photographic Eye Artifacts
US20080255434A1 (en) * 2007-04-14 2008-10-16 Abbott Diabetes Care, Inc. Method and apparatus for providing data processing and control in medical communication system
US20080256048A1 (en) * 2007-04-14 2008-10-16 Abbott Diabetes Care, Inc. Method and apparatus for providing data processing and control in medical communication system
US20080278332A1 (en) * 2007-05-08 2008-11-13 Abbott Diabetes Care, Inc. Analyte monitoring system and methods
US20080281179A1 (en) * 2007-05-08 2008-11-13 Abbott Diabetes Care, Inc. Analyte monitoring system and methods
US20080288204A1 (en) * 2007-04-14 2008-11-20 Abbott Diabetes Care, Inc. Method and apparatus for providing data processing and control in medical communication system
US20080287762A1 (en) * 2007-05-14 2008-11-20 Abbott Diabetes Care, Inc. Method and apparatus for providing data processing and control in a medical communication system
US20080297409A1 (en) * 2007-05-29 2008-12-04 Research In Motion Limited System and method for selecting a geographic location to associate with an object
US20080319296A1 (en) * 2007-06-21 2008-12-25 Abbott Diabetes Care, Inc. Health monitor
US20090011707A1 (en) * 2007-07-04 2009-01-08 Samsung Electronics Co., Ltd. Method and apparatus for identifying neighboring device
US20090015654A1 (en) * 2003-02-04 2009-01-15 Nec Corporation Operation limiting technique for a camera-equipped mobile communication terminal
US20090036760A1 (en) * 2007-07-31 2009-02-05 Abbott Diabetes Care, Inc. Method and apparatus for providing data processing and control in a medical communication system
US20090054747A1 (en) * 2005-10-31 2009-02-26 Abbott Diabetes Care, Inc. Method and system for providing analyte sensor tester isolation
US20090063402A1 (en) * 2007-08-31 2009-03-05 Abbott Diabetes Care, Inc. Method and System for Providing Medication Level Determination
US20090079845A1 (en) * 2006-04-05 2009-03-26 Olympus Corporation Digital camera system
US20090105571A1 (en) * 2006-06-30 2009-04-23 Abbott Diabetes Care, Inc. Method and System for Providing Data Communication in Data Management Systems
US20090115879A1 (en) * 2006-09-28 2009-05-07 Hideki Nagata Information management method and digital camera
US20090123063A1 (en) * 2007-11-08 2009-05-14 Fotonation Vision Limited Detecting Redeye Defects in Digital Images
US20090138544A1 (en) * 2006-11-22 2009-05-28 Rainer Wegenkittl Method and System for Dynamic Image Processing
US20090164190A1 (en) * 2007-12-19 2009-06-25 Abbott Diabetes Care, Inc. Physiological condition simulation device and method
US20090164239A1 (en) * 2007-12-19 2009-06-25 Abbott Diabetes Care, Inc. Dynamic Display Of Glucose Information
US20090189998A1 (en) * 2008-01-30 2009-07-30 Fotonation Ireland Limited Methods And Apparatuses For Using Image Acquisition Data To Detect And Correct Image Defects
US20090207270A1 (en) * 2006-10-27 2009-08-20 Olympus Corporation Image acquisition system and method of authenticating image acquisition device in the image acquisition system
US20090257432A1 (en) * 2006-03-16 2009-10-15 Tsuyoshi Yamaguchi Terminal
US20090256673A1 (en) * 2006-12-26 2009-10-15 Olympus Corporation Image acquisition system and method of authenticating image acquisition device in the image acquisition system
US20090277470A1 (en) * 2008-05-08 2009-11-12 Mitchell Monique M Artificial nail decorating system utilizing computer technology
US20090290030A1 (en) * 2007-01-29 2009-11-26 Olympus Corporation Image acquisition system and method of authenticating image acquisition device in the image acquisition system
US20090315671A1 (en) * 2007-02-28 2009-12-24 Olympus Corporation Image acquisition system and method of authenticating image acquisition device in the image acquisition system
US20100040284A1 (en) * 2005-11-18 2010-02-18 Fotonation Vision Limited Method and apparatus of correcting hybrid flash artifacts in digital images
US20100039520A1 (en) * 2008-08-14 2010-02-18 Fotonation Ireland Limited In-Camera Based Method of Detecting Defect Eye with High Accuracy
US20100039525A1 (en) * 2003-06-26 2010-02-18 Fotonation Ireland Limited Perfecting of Digital Image Capture Parameters Within Acquisition Devices Using Face Detection
US20100045231A1 (en) * 2006-03-31 2010-02-25 Abbott Diabetes Care Inc. Method and System for Powering an Electronic Device
US20100057042A1 (en) * 2008-08-31 2010-03-04 Abbott Diabetes Care, Inc. Closed Loop Control With Improved Alarm Functions
US20100057044A1 (en) * 2008-08-31 2010-03-04 Abbott Diabetes Care Inc. Robust Closed Loop Control And Methods
US20100053362A1 (en) * 2003-08-05 2010-03-04 Fotonation Ireland Limited Partial face detector red-eye filter method and apparatus
US20100057041A1 (en) * 2008-08-31 2010-03-04 Abbott Diabetes Care, Inc. Closed Loop Control With Reference Measurement And Methods Thereof
US20100053368A1 (en) * 2003-08-05 2010-03-04 Fotonation Ireland Limited Face tracker and partial face tracker for red-eye filter method and apparatus
US20100056992A1 (en) * 2008-08-31 2010-03-04 Abbott Diabetes Care, Inc. Variable Rate Closed Loop Control And Methods
US20100076284A1 (en) * 2007-06-21 2010-03-25 Abbott Diabetes Care Inc. Health Management Devices and Methods
US20100085435A1 (en) * 2008-10-07 2010-04-08 Fuji Xerox Co., Ltd. Information processing apparatus, remote indication system, and computer readable medium
US20100198034A1 (en) * 2009-02-03 2010-08-05 Abbott Diabetes Care Inc. Compact On-Body Physiological Monitoring Devices and Methods Thereof
US20100230285A1 (en) * 2009-02-26 2010-09-16 Abbott Diabetes Care Inc. Analyte Sensors and Methods of Making and Using the Same
US20100257195A1 (en) * 2009-02-20 2010-10-07 Nikon Corporation Mobile information device, image pickup device, and information acquisition system
US20100274220A1 (en) * 2005-11-04 2010-10-28 Abbott Diabetes Care Inc. Method and System for Providing Basal Profile Modification in Analyte Monitoring and Management Systems
US20100274515A1 (en) * 2009-04-28 2010-10-28 Abbott Diabetes Care Inc. Dynamic Analyte Sensor Calibration Based On Sensor Stability Profile
US20100283586A1 (en) * 2007-12-28 2010-11-11 Yoichi Ikeda Communication device, communication system, image presentation method, and program
US20110021889A1 (en) * 2009-07-23 2011-01-27 Abbott Diabetes Care Inc. Continuous Analyte Measurement Systems and Systems and Methods for Implanting Them
US20110029269A1 (en) * 2009-07-31 2011-02-03 Abbott Diabetes Care Inc. Method and Apparatus for Providing Analyte Monitoring System Calibration Accuracy
US20110054282A1 (en) * 2009-08-31 2011-03-03 Abbott Diabetes Care Inc. Analyte Monitoring System and Methods for Managing Power and Noise
US20110060530A1 (en) * 2009-08-31 2011-03-10 Abbott Diabetes Care Inc. Analyte Signal Processing Device and Methods
US20110060836A1 (en) * 2005-06-17 2011-03-10 Tessera Technologies Ireland Limited Method for Establishing a Paired Connection Between Media Devices
US20110063465A1 (en) * 2004-10-28 2011-03-17 Fotonation Ireland Limited Analyzing Partial Face Regions for Red-Eye Detection in Acquired Digital Images
US7916190B1 (en) 1997-10-09 2011-03-29 Tessera Technologies Ireland Limited Red-eye filter method and apparatus
US20110075894A1 (en) * 2003-06-26 2011-03-31 Tessera Technologies Ireland Limited Digital Image Processing Using Face Detection Information
US20110102643A1 (en) * 2004-02-04 2011-05-05 Tessera Technologies Ireland Limited Partial Face Detector Red-Eye Filter Method and Apparatus
US7970182B2 (en) 2005-11-18 2011-06-28 Tessera Technologies Ireland Limited Two stage detection for photographic eye artifacts
US20110163892A1 (en) * 2010-01-07 2011-07-07 Emilcott Associates, Inc. System and method for mobile environmental measurements and displays
US20110193985A1 (en) * 2010-02-08 2011-08-11 Nikon Corporation Imaging device, information acquisition system and program
US20110213225A1 (en) * 2009-08-31 2011-09-01 Abbott Diabetes Care Inc. Medical devices and methods
US20120017181A1 (en) * 2010-07-16 2012-01-19 Canon Kabushiki Kaisha Image processing apparatus control method and program
US8184900B2 (en) 2006-02-14 2012-05-22 DigitalOptics Corporation Europe Limited Automatic detection and correction of non-red eye flash defects
US8391258B2 (en) 2006-10-20 2013-03-05 Canon Kabushiki Kaisha Communication parameter setting method, communicating apparatus, and managing apparatus for managing communication parameters
US8423511B1 (en) * 2011-07-07 2013-04-16 Symantec Corporation Systems and methods for securing data on mobile devices
US8503818B2 (en) 2007-09-25 2013-08-06 DigitalOptics Corporation Europe Limited Eye defect detection in international standards organization images
US20130258121A1 (en) * 2008-06-03 2013-10-03 Canon Kabushiki Kaisha Information processing apparatus, system and control method thereof
US8571808B2 (en) 2007-05-14 2013-10-29 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in a medical communication system
US20130304989A1 (en) * 2012-05-08 2013-11-14 Fujitsu Limited Information processing apparatus
US8612163B2 (en) 2007-05-14 2013-12-17 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in a medical communication system
US8682615B2 (en) 2007-05-14 2014-03-25 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in a medical communication system
US20140085526A1 (en) * 2011-06-24 2014-03-27 Olympus Corporation Imaging device and wireless system
US8698615B2 (en) 2007-04-14 2014-04-15 Abbott Diabetes Care Inc. Method and apparatus for providing dynamic multi-stage signal amplification in a medical device
US20140104443A1 (en) * 2011-06-24 2014-04-17 Olympus Corporation Imaging apparatus and wireless system
US8710993B2 (en) 2011-11-23 2014-04-29 Abbott Diabetes Care Inc. Mitigating single point failure of devices in an analyte monitoring system and methods thereof
US8756305B2 (en) 2003-04-23 2014-06-17 Canon Kabushiki Kaisha Information processing apparatus and connection control method for joining a wireless network and searching for a printer having a predetermined function
US8798934B2 (en) 2009-07-23 2014-08-05 Abbott Diabetes Care Inc. Real time management of data relating to physiological control of glucose levels
US8799609B1 (en) * 2009-06-30 2014-08-05 Emc Corporation Error handling
US20140226029A1 (en) * 2013-02-14 2014-08-14 Olympus Corporation Computer readable recording medium, transmitting device, management server and image transmitting method
US8818182B2 (en) 2005-10-17 2014-08-26 Cutting Edge Vision Llc Pictures using voice commands and automatic upload
US8834366B2 (en) 2007-07-31 2014-09-16 Abbott Diabetes Care Inc. Method and apparatus for providing analyte sensor calibration
US8880138B2 (en) 2005-09-30 2014-11-04 Abbott Diabetes Care Inc. Device for channeling fluid and methods of use
US8932216B2 (en) 2006-08-07 2015-01-13 Abbott Diabetes Care Inc. Method and system for providing data management in integrated analyte monitoring and infusion system
US20150054984A1 (en) * 2012-03-15 2015-02-26 Nec Display Solutions, Ltd. Image display apparatus and image display method
US8986208B2 (en) 2008-09-30 2015-03-24 Abbott Diabetes Care Inc. Analyte sensor sensitivity attenuation mitigation
US9000929B2 (en) 2007-05-08 2015-04-07 Abbott Diabetes Care Inc. Analyte monitoring system and methods
US9031630B2 (en) 2006-02-28 2015-05-12 Abbott Diabetes Care Inc. Analyte sensors and methods of use
US9035767B2 (en) 2007-05-08 2015-05-19 Abbott Diabetes Care Inc. Analyte monitoring system and methods
US9039975B2 (en) 2006-03-31 2015-05-26 Abbott Diabetes Care Inc. Analyte monitoring devices and methods therefor
US9069536B2 (en) 2011-10-31 2015-06-30 Abbott Diabetes Care Inc. Electronic devices having integrated reset systems and methods thereof
US20150189179A1 (en) * 2013-12-30 2015-07-02 Samsung Electronics Co., Ltd. Electronic apparatus and method for controlling the same
US9095290B2 (en) 2007-03-01 2015-08-04 Abbott Diabetes Care Inc. Method and apparatus for providing rolling data in communication systems
US9125548B2 (en) 2007-05-14 2015-09-08 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in a medical communication system
US9177456B2 (en) 2007-05-08 2015-11-03 Abbott Diabetes Care Inc. Analyte monitoring system and methods
US20150332196A1 (en) * 2014-05-15 2015-11-19 Heinz-Werner Stiller Surgical Workflow Support System
JP2015219650A (en) * 2014-05-15 2015-12-07 Canon Inc. Communication device, control method thereof, and program
US9226701B2 (en) 2009-04-28 2016-01-05 Abbott Diabetes Care Inc. Error detection in critical repeating data in a wireless sensor system
US20160049072A1 (en) * 2013-04-24 2016-02-18 The Swatch Group Research And Development Ltd Multi-device system with simplified communication
US9320468B2 (en) 2008-01-31 2016-04-26 Abbott Diabetes Care Inc. Analyte sensor with time lag compensation
US20160117066A1 (en) * 2007-12-14 2016-04-28 Scenera Technologies, Llc Methods, Systems, And Computer Readable Media For Controlling Presentation And Selection Of Objects That Are Digital Images Depicting Subjects
US9326727B2 (en) 2006-01-30 2016-05-03 Abbott Diabetes Care Inc. On-body medical device securement
US9332934B2 (en) 2007-10-23 2016-05-10 Abbott Diabetes Care Inc. Analyte sensor with lag compensation
US9357959B2 (en) 2006-10-02 2016-06-07 Abbott Diabetes Care Inc. Method and system for dynamically updating calibration parameters for an analyte sensor
US9392969B2 (en) 2008-08-31 2016-07-19 Abbott Diabetes Care Inc. Closed loop control and signal attenuation detection
US9408566B2 (en) 2006-08-09 2016-08-09 Abbott Diabetes Care Inc. Method and system for providing calibration of an analyte sensor in an analyte monitoring system
US9439586B2 (en) 2007-10-23 2016-09-13 Abbott Diabetes Care Inc. Assessing measures of glycemic variability
US20160307443A1 (en) * 2015-04-20 2016-10-20 Imagine If, LLC Global positioning system
US9483608B2 (en) 2007-05-14 2016-11-01 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in a medical communication system
US9532737B2 (en) 2011-02-28 2017-01-03 Abbott Diabetes Care Inc. Devices, systems, and methods associated with analyte monitoring devices and devices incorporating the same
US9541556B2 (en) 2008-05-30 2017-01-10 Abbott Diabetes Care Inc. Method and apparatus for providing glycemic control
US9558325B2 (en) 2007-05-14 2017-01-31 Abbott Diabetes Care Inc. Method and system for determining analyte levels
US9574914B2 (en) 2007-05-08 2017-02-21 Abbott Diabetes Care Inc. Method and device for determining elapsed sensor life
US9721063B2 (en) 2011-11-23 2017-08-01 Abbott Diabetes Care Inc. Compatibility mechanisms for devices in a continuous analyte monitoring system and methods thereof
US9730623B2 (en) 2008-03-28 2017-08-15 Abbott Diabetes Care Inc. Analyte sensor calibration management
US9730584B2 (en) 2003-06-10 2017-08-15 Abbott Diabetes Care Inc. Glucose measuring device for use in personal area network
US9782076B2 (en) 2006-02-28 2017-10-10 Abbott Diabetes Care Inc. Smart messages and alerts for an infusion delivery and management system
US9795331B2 (en) 2005-12-28 2017-10-24 Abbott Diabetes Care Inc. Method and apparatus for providing analyte sensor insertion
US9814416B2 (en) 2009-08-31 2017-11-14 Abbott Diabetes Care Inc. Displays for a medical device
US9913600B2 (en) 2007-06-29 2018-03-13 Abbott Diabetes Care Inc. Analyte monitoring and management device and method to analyze the frequency of user interaction with the device
US9931075B2 (en) 2008-05-30 2018-04-03 Abbott Diabetes Care Inc. Method and apparatus for providing glycemic control
US9962091B2 (en) 2002-12-31 2018-05-08 Abbott Diabetes Care Inc. Continuous glucose monitoring system and methods of use
US9968306B2 (en) 2012-09-17 2018-05-15 Abbott Diabetes Care Inc. Methods and apparatuses for providing adverse condition notification with enhanced wireless communication range in analyte monitoring systems
US10002233B2 (en) 2007-05-14 2018-06-19 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in a medical communication system
US20180176433A1 (en) * 2016-12-15 2018-06-21 Fujifilm Corporation Printer, digital camera with printer, and printing method
US10009244B2 (en) 2009-04-15 2018-06-26 Abbott Diabetes Care Inc. Analyte monitoring system having an alert
US20180183994A1 (en) * 2016-12-27 2018-06-28 Canon Kabushiki Kaisha Display control apparatus, electronic apparatus, and control method therefor
US10022499B2 (en) 2007-02-15 2018-07-17 Abbott Diabetes Care Inc. Device and method for automatic data acquisition and/or detection
US10031002B2 (en) 2007-05-14 2018-07-24 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in a medical communication system
US10039881B2 (en) 2002-12-31 2018-08-07 Abbott Diabetes Care Inc. Method and system for providing data communication in continuous glucose monitoring and management system
US10082493B2 (en) 2011-11-25 2018-09-25 Abbott Diabetes Care Inc. Analyte monitoring system and methods of use
US10111608B2 (en) 2007-04-14 2018-10-30 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in medical communication system
US10132793B2 (en) 2012-08-30 2018-11-20 Abbott Diabetes Care Inc. Dropout detection in continuous analyte monitoring data during data excursions
US10142534B2 (en) 2014-11-07 2018-11-27 Bianconero, Inc. Image-capturing and image-distributing system for automatically or manually capturing image of user carrying mobile communication terminal
US10136845B2 (en) 2011-02-28 2018-11-27 Abbott Diabetes Care Inc. Devices, systems, and methods associated with analyte monitoring devices and devices incorporating the same
US10158795B2 (en) 2012-12-27 2018-12-18 Panasonic Intellectual Property Management Co., Ltd. Electronic apparatus for communicating with another apparatus
US10173007B2 (en) 2007-10-23 2019-01-08 Abbott Diabetes Care Inc. Closed loop control system with safety parameters and methods
US10194844B2 (en) 2009-04-29 2019-02-05 Abbott Diabetes Care Inc. Methods and systems for early signal attenuation detection and processing
US10206629B2 (en) 2006-08-07 2019-02-19 Abbott Diabetes Care Inc. Method and system for providing integrated analyte monitoring and infusion system therapy management
US10225467B2 (en) * 2015-07-20 2019-03-05 Motorola Mobility Llc 360° video multi-angle attention-focus recording
US10328201B2 (en) 2008-07-14 2019-06-25 Abbott Diabetes Care Inc. Closed loop control system interface and methods
US10448230B2 (en) 2014-04-25 2019-10-15 Alibaba Group Holding Limited Data transmission
US10816808B2 (en) * 2017-12-08 2020-10-27 Seiko Epson Corporation Head-mounted display apparatus, information processing device, system, and method for controlling use of captured images from head-mounted display apparatus
US10859835B2 (en) * 2018-01-24 2020-12-08 Seiko Epson Corporation Head-mounted display apparatus and method for controlling imaging data of head-mounted display apparatus using release code
US20210295552A1 (en) * 2018-08-02 2021-09-23 Sony Corporation Information processing device, information processing method, and program
US11163356B2 (en) * 2017-05-18 2021-11-02 Guohua Liu Device-facing human-computer interaction method and system
US11213226B2 (en) 2010-10-07 2022-01-04 Abbott Diabetes Care Inc. Analyte monitoring devices and methods
US11298058B2 (en) 2005-12-28 2022-04-12 Abbott Diabetes Care Inc. Method and apparatus for providing analyte sensor insertion
US11303815B2 (en) * 2017-11-29 2022-04-12 Sony Corporation Imaging apparatus and imaging method
US20220284248A1 (en) * 2021-03-04 2022-09-08 Brother Kogyo Kabushiki Kaisha Computer-readable medium, image processing device, and method for reducing time taken from input of print instruction until start of printing
US11509402B2 (en) * 2017-09-29 2022-11-22 Orange Method and system for recognizing a user during a radio communication via the human body
US11553883B2 (en) 2015-07-10 2023-01-17 Abbott Diabetes Care Inc. System, device and method of dynamic glucose profile response to physiological parameters
US11596330B2 (en) 2017-03-21 2023-03-07 Abbott Diabetes Care Inc. Methods, devices and system for providing diabetic condition diagnosis and therapy
US11793936B2 (en) 2009-05-29 2023-10-24 Abbott Diabetes Care Inc. Medical device antenna systems having external antenna configurations
US11896371B2 (en) 2012-09-26 2024-02-13 Abbott Diabetes Care Inc. Method and apparatus for improving lag correction during in vivo measurement of analyte concentration with analyte concentration variability and range data

Families Citing this family (69)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2003049424A1 (en) 2001-12-03 2003-06-12 Nikon Corporation Electronic apparatus, electronic camera, electronic device, image display apparatus, and image transmission system
JP4346925B2 (en) * 2003-02-25 2009-10-21 Canon Inc. Display device, display device control method, program, and computer-readable recording medium
DE602004025811D1 (en) * 2003-08-11 2010-04-15 Sony Corp RADIO COMMUNICATION SYSTEM AND RADIO COMMUNICATION DEVICE
JP2005269604A (en) * 2004-02-20 2005-09-29 Fuji Photo Film Co Ltd Imaging device, imaging method, and imaging program
JP2005322158A (en) * 2004-05-11 2005-11-17 Canon Inc System and method for information processing
JP4708733B2 (en) * 2004-05-21 2011-06-22 Canon Inc. Imaging device
JP2006109296A (en) * 2004-10-08 2006-04-20 Canon Inc Digital camera and control method of digital camera
US10204338B2 (en) * 2004-11-24 2019-02-12 Microsoft Technology Licensing, Llc Synchronizing contents of removable storage devices with a multimedia network
JP4324116B2 (en) * 2005-02-17 2009-09-02 Canon Inc. Image processing apparatus, control method therefor, program, and storage medium
US7791757B2 (en) * 2005-05-27 2010-09-07 Norbert Weig Procedure and system for the production of photo books
JP4756950B2 (en) * 2005-08-08 2011-08-24 Canon Inc. Imaging apparatus and control method thereof
JPWO2008012905A1 (en) * 2006-07-27 2009-12-17 Panasonic Corporation Authentication apparatus and authentication image display method
US7751597B2 (en) * 2006-11-14 2010-07-06 Lctank Llc Apparatus and method for identifying a name corresponding to a face or voice using a database
US7986230B2 (en) * 2006-11-14 2011-07-26 TrackThings LLC Apparatus and method for finding a misplaced object using a database and instructions generated by a portable device
USD609714S1 (en) 2007-03-22 2010-02-09 Fujifilm Corporation Electronic camera
US8154608B2 (en) * 2007-11-13 2012-04-10 Olympus Corporation Digital camera security
TW200937108A (en) * 2008-01-18 2009-09-01 Geotate Bv Camera with satellite positioning system
US20100030872A1 (en) * 2008-08-04 2010-02-04 Serge Caleca System for remote processing, printing, and uploading of digital images to a remote server via wireless connections
KR20100058109A (en) * 2008-11-24 2010-06-03 Samsung Electronics Co., Ltd. A method and apparatus of displaying background image of portable terminal using position information
US20100164685A1 (en) * 2008-12-31 2010-07-01 Trevor Pering Method and apparatus for establishing device connections
US8194136B1 (en) 2009-01-26 2012-06-05 Amazon Technologies, Inc. Systems and methods for lens characterization
US20100203876A1 (en) * 2009-02-11 2010-08-12 Qualcomm Incorporated Inferring user profile properties based upon mobile device usage
US9740921B2 (en) 2009-02-26 2017-08-22 Tko Enterprises, Inc. Image processing sensor systems
US8780198B2 (en) * 2009-02-26 2014-07-15 Tko Enterprises, Inc. Image processing sensor systems
US9277878B2 (en) * 2009-02-26 2016-03-08 Tko Enterprises, Inc. Image processing sensor systems
US8327367B2 (en) 2009-03-05 2012-12-04 Empire Technology Development Llc Information service providing system, information service providing device, and method therefor
US8566060B2 (en) * 2009-03-05 2013-10-22 Empire Technology Development Llc Information service providing system, information service providing device, and method therefor
JP2010211353A (en) * 2009-03-09 2010-09-24 Fujitsu Ltd Virtual access controller
JP4931089B2 (en) * 2009-03-13 2012-05-16 Empire Technology Development LLC HEALTH DIAGNOSIS SYSTEM, HEALTH DIAGNOSIS DEVICE AND METHOD THEREOF
JP4369526B1 (en) * 2009-03-13 2009-11-25 Keeper-Smith LLP Image photographing system, image photographing device and method thereof
US20100287230A1 (en) * 2009-05-05 2010-11-11 Thomas Frederick Stalcup Internet-enabled Data Transceiver
US9112928B2 (en) * 2009-05-29 2015-08-18 Nokia Technologies Oy Method and apparatus for automatic loading of applications
JP2011029924A (en) * 2009-07-24 2011-02-10 Canon Inc Information processing apparatus, imaging apparatus, and shooting mode setting system
KR20110040248A (en) * 2009-10-13 2011-04-20 Samsung Electronics Co., Ltd. Apparatus and method for reducing the energy consumption in digital image processing device
JP5318819B2 (en) * 2010-05-14 2013-10-16 Nippon Telegraph and Telephone Corporation Image providing method and image providing system
JP5318821B2 (en) * 2010-05-14 2013-10-16 Nippon Telegraph and Telephone Corporation Image providing method and image providing system
JP5318820B2 (en) * 2010-05-14 2013-10-16 Nippon Telegraph and Telephone Corporation Image providing method and image providing system
JP5858754B2 (en) * 2011-11-29 2016-02-10 Canon Inc. Imaging apparatus, display method, and program
JP5950151B2 (en) 2012-02-28 2016-07-13 Sony Corporation Electronic device, power supply control method, and program
US9459606B2 (en) * 2012-02-28 2016-10-04 Panasonic Intellectual Property Management Co., Ltd. Display apparatus for control information, method for displaying control information, and system for displaying control information
JP5279930B1 (en) * 2012-03-27 2013-09-04 Toshiba Corporation Server, electronic device, server control method, server control program
US9137428B2 (en) * 2012-06-01 2015-09-15 Microsoft Technology Licensing, Llc Storyboards for capturing images
JP5777649B2 (en) * 2013-01-28 2015-09-09 Kyocera Document Solutions Inc. Information processing device
KR102049855B1 (en) 2013-01-31 2019-11-28 LG Electronics Inc. Mobile terminal and controlling method thereof
US9987184B2 (en) * 2013-02-05 2018-06-05 Valentin Borovinov Systems, methods, and media for providing video of a burial memorial
US10372396B2 (en) * 2013-02-21 2019-08-06 Lenovo (Singapore) Pte. Ltd. Discovery and connection to wireless displays
JP5762470B2 (en) * 2013-06-06 2015-08-12 Sharp Corporation Display system and electronic device
KR102108066B1 (en) * 2013-09-02 2020-05-08 LG Electronics Inc. Head mounted display device and method for controlling the same
US10013564B2 (en) 2013-10-10 2018-07-03 Elwha Llc Methods, systems, and devices for handling image capture devices and captured images
US10346624B2 (en) 2013-10-10 2019-07-09 Elwha Llc Methods, systems, and devices for obscuring entities depicted in captured images
US10102543B2 (en) 2013-10-10 2018-10-16 Elwha Llc Methods, systems, and devices for handling inserted data into captured images
US9799036B2 (en) 2013-10-10 2017-10-24 Elwha Llc Devices, methods, and systems for managing representations of entities through use of privacy indicators
US10185841B2 (en) 2013-10-10 2019-01-22 Elwha Llc Devices, methods, and systems for managing representations of entities through use of privacy beacons
US20150104004A1 (en) 2013-10-10 2015-04-16 Elwha Llc Methods, systems, and devices for delivering image data from captured images to devices
JP6317576B2 (en) * 2013-11-29 2018-04-25 Canon Inc. COMMUNICATION DEVICE, COMMUNICATION DEVICE CONTROL METHOD, AND PROGRAM
DE202014005558U1 (en) 2014-04-26 2014-07-22 Carl Zeiss Microscopy Gmbh Microscope with electrically adjustable microscope components
US9307234B1 (en) * 2014-07-23 2016-04-05 American Express Travel Related Services Company, Inc. Interactive latency control with lossless image optimization
EP3175368A4 (en) * 2014-07-29 2018-03-14 Hewlett-Packard Development Company, L.P. Default calibrated sensor module settings
JP6550736B2 (en) 2014-12-08 2019-07-31 Fuji Xerox Co., Ltd. Shooting target management program and information processing apparatus
JP6824767B2 (en) * 2017-02-03 2021-02-03 Canon Inc. Imaging device and its control method
US10325463B2 (en) * 2017-11-09 2019-06-18 Ademco Inc. Systems and methods for changing an operation of a security system in response to comparing a first unique identifier and a second unique identifier
KR20200094525A (en) * 2019-01-30 2020-08-07 Samsung Electronics Co., Ltd. Electronic device for processing a file including a plurality of related data
US11740132B2 (en) 2019-10-02 2023-08-29 Datacolor, Inc. Method and apparatus for color lookup using a mobile device
EP4038517A4 (en) * 2019-10-02 2023-07-05 Datacolor, Inc. Method and apparatus for color lookup using a mobile device
US11899757B2 (en) * 2019-12-02 2024-02-13 Cox Automotive, Inc. Systems and methods for temporary digital content sharing
CN111629322A (en) * 2020-04-07 2020-09-04 TP-Link Technologies Co., Ltd. WiFi-based vehicle positioning method and system, storage medium and mobile terminal
JP2022012403A (en) * 2020-07-01 2022-01-17 Canon Inc. Program, information processing device, and control method
JP2022096292A (en) * 2020-12-17 2022-06-29 FUJIFILM Corporation Imaging system, server, imaging device, imaging method, program, and recording medium
CN114422711B (en) * 2022-03-29 2022-08-12 Shenzhen Apeman Innovations Technology Co., Ltd. Shooting equipment and software function and shooting parameter self-adaption method thereof

Citations (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5999213A (en) * 1995-06-08 1999-12-07 Sony Corporation Method of and apparatus for setting up electronic device
US20020012048A1 (en) * 2000-05-18 2002-01-31 Yoichi Yamagishi Notification of operating status in image sensing system
US20020047905A1 (en) * 2000-10-20 2002-04-25 Naoto Kinjo Image processing system and ordering system
US20020101519A1 (en) * 2001-01-29 2002-08-01 Myers Jeffrey S. Automatic generation of information identifying an object in a photographic image
US6670933B1 (en) * 1997-11-27 2003-12-30 Fuji Photo Film Co., Ltd. Image display apparatus and camera and image communication system
US20040051787A1 (en) * 2002-08-30 2004-03-18 Yasuo Mutsuro Camera system
US20040087273A1 (en) * 2002-10-31 2004-05-06 Nokia Corporation Method and system for selecting data items for service requests
US20040174435A1 (en) * 1999-06-28 2004-09-09 Olympus Optical Co., Ltd. Information processing system and camera system
US20050179811A1 (en) * 2004-02-12 2005-08-18 Dennis Palatov Video recording system utilizing Miniature Digital Camcorder
US6980234B2 (en) * 2000-10-19 2005-12-27 Canon Kabushiki Kaisha System for changing setup of first device that executes predetermined function by second device and these devices
US20060038896A1 (en) * 2004-08-18 2006-02-23 Fuji Xerox Co., Ltd. Image pickup apparatus and subject ID adding method
US20060044402A1 (en) * 2004-09-02 2006-03-02 Sony Corporation Picture processing apparatus, picture processing method, picture pickup apparatus, and program
US7034684B2 (en) * 2004-01-06 2006-04-25 Matsushita Electric Industrial Co., Ltd. Personal item monitor using radio frequency identification
US20060197842A1 (en) * 2005-03-07 2006-09-07 Konica Minolta Photo Imaging, Inc. Image capturing apparatus
US20070109417A1 (en) * 2005-11-16 2007-05-17 Per Hyttfors Methods, devices and computer program products for remote control of an image capturing device
US7236185B2 (en) * 2001-05-30 2007-06-26 Minolta Co., Ltd. Image capturing system
US20070147815A1 (en) * 2005-10-20 2007-06-28 Fujifilm Corporation Camera system, lens unit, camera body, and wireless communication adapter
US20070164104A1 (en) * 2005-05-06 2007-07-19 James Dulgerian Personal information management system
US20070229670A1 (en) * 2006-03-31 2007-10-04 Takumi Soga Information apparatus system, electronic camera for use therein, and method for controlling information processing apparatus from the electronic camera
US7298269B2 (en) * 2005-01-27 2007-11-20 Konica Minolta Photo Imaging, Inc. Image taking apparatus having selectively enabled IC tags

Family Cites Families (126)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3147358B2 (en) * 1990-02-23 2001-03-19 Minolta Co., Ltd. Camera that can record location data
US5717496A (en) * 1992-11-19 1998-02-10 Olympus Optical Co., Ltd. Electronic imaging apparatus
JPH0743804A (en) * 1993-07-30 1995-02-14 Canon Inc Function selecting device
US5625765A (en) * 1993-09-03 1997-04-29 Criticom Corp. Vision systems including devices and methods for combining images for extended magnification schemes
US6037936A (en) * 1993-09-10 2000-03-14 Criticom Corp. Computer vision system with a graphic user interface and remote camera control
JPH07184089A (en) * 1993-12-21 1995-07-21 Canon Inc Video camera
IL108352A (en) 1994-01-17 2000-02-29 Given Imaging Ltd In vivo video camera system
US6388707B1 (en) * 1994-04-12 2002-05-14 Canon Kabushiki Kaisha Image pickup apparatus having means for appointing an arbitrary position on the display frame and performing a predetermined signal process thereon
US5805215A (en) * 1994-04-21 1998-09-08 Canon Kabushiki Kaisha Information processing method and apparatus for storing additional data about an image
US5682142A (en) * 1994-07-29 1997-10-28 Id Systems Inc. Electronic control system/network
JP3504738B2 (en) * 1994-09-09 2004-03-08 Olympus Corporation Electronic imaging apparatus and electronic imaging system
US6282362B1 (en) * 1995-11-07 2001-08-28 Trimble Navigation Limited Geographical position/image digital recording and display system
US6628325B1 (en) 1998-06-26 2003-09-30 Fotonation Holdings, Llc Camera network communication device
US6188431B1 (en) * 1996-02-17 2001-02-13 Casio Computers Co., Ltd. Electronic still camera and method for communication between electronic still cameras
JP3760510B2 (en) * 1996-06-14 2006-03-29 Nikon Corporation Information processing device
US5768633A (en) * 1996-09-03 1998-06-16 Eastman Kodak Company Tradeshow photographic and data transmission system
AU4258897A (en) * 1996-09-04 1998-03-26 David A. Goldberg Method and system for obtaining person-specific images in a public venue
US6384927B1 (en) 1996-10-11 2002-05-07 Ricoh Company, Ltd. Internet facsimile machine
JPH10150523A (en) * 1996-11-20 1998-06-02 Fuji Photo Film Co Ltd Preserving and utilizing system for photographic image data photographed by digital camera
JPH10161227A (en) * 1996-12-02 1998-06-19 Fuji Photo Film Co Ltd Camera and photographing information input system
JPH10173936A (en) 1996-12-06 1998-06-26 Ricoh Co Ltd Control method of network facsimile equipment
JP3906938B2 (en) * 1997-02-18 2007-04-18 富士フイルム株式会社 Image reproduction method and image data management method
JP2002502499A (en) * 1997-06-03 2002-01-22 ステファン バイド Portable information distribution device
JPH10341388A (en) 1997-06-06 1998-12-22 Nikon Corp Electronic camera having communication function
US20040051785A1 (en) * 1997-06-06 2004-03-18 Nikon Corporation Electronic camera having a communication function
US6642959B1 (en) * 1997-06-30 2003-11-04 Casio Computer Co., Ltd. Electronic camera having picture data output function
JPH1172348A (en) 1997-08-28 1999-03-16 Seiko Epson Corp Camera device and method for obtaining information
US6070167A (en) * 1997-09-29 2000-05-30 Sharp Laboratories Of America, Inc. Hierarchical method and system for object-based audiovisual descriptive tagging of images for information retrieval, editing, and manipulation
US6597392B1 (en) * 1997-10-14 2003-07-22 Healthcare Vision, Inc. Apparatus and method for computerized multi-media data organization and transmission
US6628333B1 (en) * 1997-11-12 2003-09-30 International Business Machines Corporation Digital instant camera having a printer
US6396537B1 (en) * 1997-11-24 2002-05-28 Eastman Kodak Company Photographic system for enabling interactive communication between a camera and an attraction site
JP3305645B2 (en) 1998-02-09 2002-07-24 富士写真フイルム株式会社 Application server in network photo service system
US6167469A (en) * 1998-05-18 2000-12-26 Agilent Technologies, Inc. Digital camera having display device for displaying graphical representation of user input and method for transporting the selected digital images thereof
JPH11355617A (en) * 1998-06-05 1999-12-24 Fuji Photo Film Co Ltd Camera with image display device
JP2000023015A (en) 1998-07-03 2000-01-21 Fuji Photo Film Co Ltd Electronic camera system
US7161619B1 (en) * 1998-07-28 2007-01-09 Canon Kabushiki Kaisha Data communication system, data communication control method and electronic apparatus
US6538698B1 (en) * 1998-08-28 2003-03-25 Flashpoint Technology, Inc. Method and system for sorting images in an image capture unit to ease browsing access
JP2000151498A (en) * 1998-11-11 2000-05-30 Japan Radio Co Ltd Mobile object management system
JP2000278037A (en) 1999-03-25 2000-10-06 Tdk Corp Chip antenna
JP2001016568A (en) * 1999-06-30 2001-01-19 Fuji Photo Film Co Ltd Image communication system
JP4431216B2 (en) * 1999-07-09 2010-03-10 富士フイルム株式会社 Data communication system
US6628899B1 (en) * 1999-10-08 2003-09-30 Fuji Photo Film Co., Ltd. Image photographing system, image processing system, and image providing system connecting them, as well as photographing camera, image editing apparatus, image order sheet for each object and method of ordering images for each object
JP4022683B2 (en) 1999-11-10 2007-12-19 富士フイルム株式会社 Wireless LAN connection destination selection method
JP2001177742A (en) * 1999-12-20 2001-06-29 Minolta Co Ltd Electronic camera
JP4679684B2 (en) 2000-01-11 2011-04-27 富士フイルム株式会社 Wireless communication apparatus and wireless communication control method
JP3893442B2 (en) * 2000-01-14 2007-03-14 富士フイルム株式会社 Print ordering method and apparatus, and printing apparatus
JP4343373B2 (en) * 2000-01-14 2009-10-14 富士フイルム株式会社 Photo service system and digital camera
JP3642561B2 (en) 2000-02-14 2005-04-27 株式会社東芝 Communication method
EP1128284A2 (en) * 2000-02-21 2001-08-29 Hewlett-Packard Company, A Delaware Corporation Associating image and location data
JP2001236504A (en) * 2000-02-22 2001-08-31 Konica Corp Image information acquiring and transmitting device and image information inputting and recording device
FR2805630B1 (en) 2000-02-24 2002-09-13 Eastman Kodak Co METHOD FOR MANAGING A QUICK DISTRIBUTION OF IMAGES
AU2001245426A1 (en) 2000-03-03 2001-09-17 Lawrence R. Jones Picture communications system and associated network services
GB0005337D0 (en) * 2000-03-07 2000-04-26 Hewlett Packard Co Image transfer over mobile radio network
JP4240739B2 (en) 2000-03-21 2009-03-18 富士フイルム株式会社 Electronic camera and information acquisition system
US7414746B2 (en) * 2000-05-23 2008-08-19 Fujifilm Corporation Image data communication method
AU4783300A (en) * 2000-05-30 2001-12-11 Bandai Co., Ltd. Image delivering system and method therefor
JP2001346173A (en) 2000-05-31 2001-12-14 Sony Corp Image data communication system and method, and image pickup device and image data processing method
US6970189B1 (en) * 2000-05-31 2005-11-29 Ipac Acquisition Subsidiary I, Llc Method and system for automatically configuring a hand-held camera using wireless communication to improve image quality
JP3955170B2 (en) * 2000-07-03 2007-08-08 富士フイルム株式会社 Image search system
JP2002024078A (en) 2000-07-06 2002-01-25 Hitachi Information Systems Ltd Image file management server
US6789113B1 (en) * 2000-07-17 2004-09-07 Kabushiki Kaisha Toshiba Information input/output system, information input/output method, recording medium of recording information transmitting/receiving program, and image forming apparatus
JP2002044355A (en) * 2000-07-25 2002-02-08 Omron Corp Photo card transmission method and photo card transmission system
US7298520B2 (en) * 2000-08-17 2007-11-20 Dai Nippon Printing Co., Ltd. Image printing system
JP4378045B2 (en) 2000-09-18 2009-12-02 富士フイルム株式会社 Camera, photographing information acquisition system and method
JP2002112074A (en) 2000-09-29 2002-04-12 Casio Comput Co Ltd Photographic image managing system and recording medium
US7619657B2 (en) * 2000-10-04 2009-11-17 Fujifilm Corp. Recording apparatus, communications apparatus, recording system, communications system, and methods therefor for setting the recording function of the recording apparatus in a restricted state
US6909889B2 (en) * 2000-10-06 2005-06-21 Fuji Photo Film Co., Ltd. Print service system, print order receiving server, image storage service system, image storage server and mobile telephone
JP3913520B2 (en) 2000-10-20 2007-05-09 富士フイルム株式会社 Image processing system and order system
JP2002171434A (en) * 2000-12-04 2002-06-14 Nikon Corp Electronic camera
JP2002176675A (en) 2000-12-06 2002-06-21 Seiko Epson Corp Data transmission system and its method
JP2002183167A (en) * 2000-12-14 2002-06-28 Canon Inc Data communication device and image storage system
US20020076217A1 (en) * 2000-12-15 2002-06-20 Ibm Corporation Methods and apparatus for automatic recording of photograph information into a digital camera or handheld computing device
JP2002183743A (en) 2000-12-18 2002-06-28 Kotobuki Nihachi:Kk Information system and recording medium
JP4189525B2 (en) * 2000-12-27 2008-12-03 富士フイルム株式会社 Communication terminal and communication system
JP4003034B2 (en) * 2001-03-09 2007-11-07 富士フイルム株式会社 Imaging system
JP2002279394A (en) 2001-03-21 2002-09-27 Casio Comput Co Ltd Digital camera and information transmitter, imaging system, image distributing apparatus, and image distribution system
JP2002288314A (en) * 2001-03-23 2002-10-04 Sanyo Electric Co Ltd Server system and its method for managing image
JP2002297753A (en) * 2001-03-30 2002-10-11 Fujitsu Ltd System for providing image data
US7248285B2 (en) * 2001-03-30 2007-07-24 Intel Corporation Method and apparatus for automatic photograph annotation
JP2002320172A (en) * 2001-04-20 2002-10-31 Olympus Optical Co Ltd Photographing system
US20030208409A1 (en) * 2001-04-30 2003-11-06 Mault James R. Method and apparatus for diet control
US20030030731A1 (en) * 2001-05-03 2003-02-13 Colby Steven M. System and method for transferring image data between digital cameras
JP2002344867A (en) 2001-05-18 2002-11-29 Fujitsu Ltd Image data storage system
JP4919545B2 (en) * 2001-06-28 2012-04-18 パナソニック株式会社 Data transmission system and photographing apparatus
US7154535B2 (en) * 2001-07-13 2006-12-26 Matsushita Electric Industrial Co., Ltd. Digital camera capable of directly accessing images recorded on another digital camera
GB0118436D0 (en) * 2001-07-27 2001-09-19 Hewlett Packard Co Synchronised cameras with auto-exchange
US7133070B2 (en) * 2001-09-20 2006-11-07 Eastman Kodak Company System and method for deciding when to correct image-specific defects based on camera, scene, display and demographic data
JP4076057B2 (en) * 2001-09-26 2008-04-16 富士フイルム株式会社 Image data transmission method, digital camera, and program
US20030058110A1 (en) * 2001-09-27 2003-03-27 Rich Michael John Radio frequency patient identification and information system
JP2003110975A (en) * 2001-09-27 2003-04-11 Fuji Photo Film Co Ltd Image recording method and apparatus, image distribution method and apparatus, and program
US6999112B2 (en) * 2001-10-31 2006-02-14 Hewlett-Packard Development Company, L.P. System and method for communicating content information to an image capture device
US7102670B2 (en) * 2001-10-31 2006-09-05 Hewlett-Packard Development Company, L.P. Bookmarking captured digital images at an event to all present devices
WO2003049424A1 (en) 2001-12-03 2003-06-12 Nikon Corporation Electronic apparatus, electronic camera, electronic device, image display apparatus, and image transmission system
US6690883B2 (en) * 2001-12-14 2004-02-10 Koninklijke Philips Electronics N.V. Self-annotating camera
JP3619963B2 (en) 2002-01-08 2005-02-16 富士写真フイルム株式会社 Image printing system
JP2003271650A (en) 2002-03-12 2003-09-26 Nec Corp Web site image retrieval system and method for cellular phone, server and program
US7446800B2 (en) * 2002-10-08 2008-11-04 Lifetouch, Inc. Methods for linking photographs to data related to the subjects of the photographs
US20040125216A1 (en) * 2002-12-31 2004-07-01 Keskar Dhananjay V. Context based tagging used for location based services
JP4058352B2 (en) * 2003-01-07 2008-03-05 キヤノン株式会社 Imaging apparatus and imaging control method
EP1588570A2 (en) * 2003-01-21 2005-10-26 Koninklijke Philips Electronics N.V. Adding metadata to pictures
US7146188B2 (en) * 2003-01-31 2006-12-05 Nokia Corporation Method and system for requesting photographs
EP1450549B1 (en) * 2003-02-18 2011-05-04 Canon Kabushiki Kaisha Photographing apparatus with radio information acquisition means and control method therefor
JP2004260304A (en) * 2003-02-24 2004-09-16 Fuji Photo Film Co Ltd Image management system
GB2399983A (en) 2003-03-24 2004-09-29 Canon Kk Picture storage and retrieval system for telecommunication system
GB2400514B (en) * 2003-04-11 2006-07-26 Hewlett Packard Development Co Image capture method
GB2403099B (en) * 2003-06-20 2007-01-24 Hewlett Packard Development Co Sharing image items
US20050046727A1 (en) * 2003-09-02 2005-03-03 Nikon Corporation Digital camera
US7477841B2 (en) * 2003-09-22 2009-01-13 Fujifilm Corporation Service provision system and automatic photography system
US7248166B2 (en) * 2003-09-29 2007-07-24 Fujifilm Corporation Imaging device, information storage server, article identification apparatus and imaging system
US7636733B1 (en) * 2003-10-03 2009-12-22 Adobe Systems Incorporated Time-based image management
US20050104976A1 (en) * 2003-11-17 2005-05-19 Kevin Currans System and method for applying inference information to digital camera metadata to identify digital picture content
US20050116821A1 (en) * 2003-12-01 2005-06-02 Clifton Labs, Inc. Optical asset tracking system
US7403225B2 (en) * 2004-07-12 2008-07-22 Scenera Technologies, Llc System and method for automatically annotating images in an image-capture device
US7362219B2 (en) * 2004-07-28 2008-04-22 Canon Kabushiki Kaisha Information acquisition apparatus
JP2006059292A (en) * 2004-08-24 2006-03-02 Sony Corp Information processor and information processing method
JP2006060741A (en) * 2004-08-24 2006-03-02 Matsushita Electric Ind Co Ltd Television terminal and server
US20060190812A1 (en) * 2005-02-22 2006-08-24 Geovector Corporation Imaging systems including hyperlink associations
JP2007102575A (en) * 2005-10-05 2007-04-19 Fujifilm Corp Photography system
JP2007251706A (en) * 2006-03-17 2007-09-27 Seiko Epson Corp Title setting of image
JP2007272836A (en) * 2006-03-31 2007-10-18 Toshiba Corp Device, program and method for network retrieval
US20080079984A1 (en) 2006-09-22 2008-04-03 Gallucci Frank V Digital Image Sharing
US8015189B2 (en) * 2006-11-08 2011-09-06 Yahoo! Inc. Customizable connections between media and meta-data via feeds
KR100819812B1 (en) * 2006-12-27 2008-04-08 Samsung Techwin Co., Ltd. Digital image processing apparatus comprising radio frequency identification system, and method of controlling the same
KR101485458B1 (en) * 2007-07-02 2015-01-26 Samsung Electronics Co., Ltd. Method For Creating Image File Including Information of Individual And Apparatus Thereof
US8164426B1 (en) * 2008-04-09 2012-04-24 Chanan Steinhart System and method for producing and distributing digital images including data identifying subjects being photographed
KR20120028491A (en) * 2010-09-15 2012-03-23 Samsung Electronics Co., Ltd. Device and method for managing image data

Patent Citations (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5999213A (en) * 1995-06-08 1999-12-07 Sony Corporation Method of and apparatus for setting up electronic device
US6670933B1 (en) * 1997-11-27 2003-12-30 Fuji Photo Film Co., Ltd. Image display apparatus and camera and image communication system
US20040174435A1 (en) * 1999-06-28 2004-09-09 Olympus Optical Co., Ltd. Information processing system and camera system
US6961540B1 (en) * 1999-06-28 2005-11-01 Olympus Optical Co., Ltd. Information processing system and camera system
US20020012048A1 (en) * 2000-05-18 2002-01-31 Yoichi Yamagishi Notification of operating status in image sensing system
US6980234B2 (en) * 2000-10-19 2005-12-27 Canon Kabushiki Kaisha System for changing setup of first device that executes predetermined function by second device and these devices
US20060007319A1 (en) * 2000-10-19 2006-01-12 Canon Kabushiki Kaisha System for changing setup of first device that executes predetermined function by second device and these devices
US20020047905A1 (en) * 2000-10-20 2002-04-25 Naoto Kinjo Image processing system and ordering system
US20020101519A1 (en) * 2001-01-29 2002-08-01 Myers Jeffrey S. Automatic generation of information identifying an object in a photographic image
US7236185B2 (en) * 2001-05-30 2007-06-26 Minolta Co., Ltd. Image capturing system
US20040051787A1 (en) * 2002-08-30 2004-03-18 Yasuo Mutsuro Camera system
US20040087273A1 (en) * 2002-10-31 2004-05-06 Nokia Corporation Method and system for selecting data items for service requests
US7034684B2 (en) * 2004-01-06 2006-04-25 Matsushita Electric Industrial Co., Ltd. Personal item monitor using radio frequency identification
US20050179811A1 (en) * 2004-02-12 2005-08-18 Dennis Palatov Video recording system utilizing Miniature Digital Camcorder
US20060038896A1 (en) * 2004-08-18 2006-02-23 Fuji Xerox Co., Ltd. Image pickup apparatus and subject ID adding method
US20060044402A1 (en) * 2004-09-02 2006-03-02 Sony Corporation Picture processing apparatus, picture processing method, picture pickup apparatus, and program
US7298269B2 (en) * 2005-01-27 2007-11-20 Konica Minolta Photo Imaging, Inc. Image taking apparatus having selectively enabled IC tags
US20060197842A1 (en) * 2005-03-07 2006-09-07 Konica Minolta Photo Imaging, Inc. Image capturing apparatus
US20070164104A1 (en) * 2005-05-06 2007-07-19 James Dulgerian Personal information management system
US20070147815A1 (en) * 2005-10-20 2007-06-28 Fujifilm Corporation Camera system, lens unit, camera body, and wireless communication adapter
US20070109417A1 (en) * 2005-11-16 2007-05-17 Per Hyttfors Methods, devices and computer program products for remote control of an image capturing device
US20070229670A1 (en) * 2006-03-31 2007-10-04 Takumi Soga Information apparatus system, electronic camera for use therein, and method for controlling information processing apparatus from the electronic camera

Cited By (462)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7847839B2 (en) 1997-10-09 2010-12-07 Fotonation Vision Limited Detecting red eye filter and apparatus using meta-data
US7852384B2 (en) 1997-10-09 2010-12-14 Fotonation Vision Limited Detecting red eye filter and apparatus using meta-data
US7787022B2 (en) 1997-10-09 2010-08-31 Fotonation Vision Limited Red-eye filter method and apparatus
US7804531B2 (en) 1997-10-09 2010-09-28 Fotonation Vision Limited Detecting red eye filter and apparatus using meta-data
US8264575B1 (en) 1997-10-09 2012-09-11 DigitalOptics Corporation Europe Limited Red eye filter method and apparatus
US8203621B2 (en) 1997-10-09 2012-06-19 DigitalOptics Corporation Europe Limited Red-eye filter method and apparatus
US20080316341A1 (en) * 1997-10-09 2008-12-25 Fotonation Vision Limited Detecting red eye filter and apparatus using meta-data
US20110134271A1 (en) * 1997-10-09 2011-06-09 Tessera Technologies Ireland Limited Detecting Red Eye Filter and Apparatus Using Meta-Data
US7738015B2 (en) 1997-10-09 2010-06-15 Fotonation Vision Limited Red-eye filter method and apparatus
US7746385B2 (en) 1997-10-09 2010-06-29 Fotonation Vision Limited Red-eye filter method and apparatus
US20090027520A1 (en) * 1997-10-09 2009-01-29 Fotonation Vision Limited Red-eye filter method and apparatus
US20080186389A1 (en) * 1997-10-09 2008-08-07 Fotonation Vision Limited Image Modification Based on Red-Eye Filter Analysis
US20070263104A1 (en) * 1997-10-09 2007-11-15 Fotonation Vision Limited Detecting Red Eye Filter and Apparatus Using Meta-Data
US20040223063A1 (en) * 1997-10-09 2004-11-11 Deluca Michael J. Detecting red eye filter and apparatus using meta-data
US20050041121A1 (en) * 1997-10-09 2005-02-24 Eran Steinberg Red-eye filter method and apparatus
US20080211937A1 (en) * 1997-10-09 2008-09-04 Fotonation Vision Limited Red-eye filter method and apparatus
US7916190B1 (en) 1997-10-09 2011-03-29 Tessera Technologies Ireland Limited Red-eye filter method and apparatus
US7847840B2 (en) 1997-10-09 2010-12-07 Fotonation Vision Limited Detecting red eye filter and apparatus using meta-data
US20070003148A1 (en) * 2001-10-02 2007-01-04 Shoji Muramatsu Image processing apparatus and image pickup device
US7535415B2 (en) * 2002-04-16 2009-05-19 Samsung Electronics Co., Ltd. Image recording apparatus capable of recording position and direction information of object of shooting
US20030194203A1 (en) * 2002-04-16 2003-10-16 Samsung Electronics Co., Ltd. Image recording apparatus capable of recording position and direction information of object of shooting
US10039881B2 (en) 2002-12-31 2018-08-07 Abbott Diabetes Care Inc. Method and system for providing data communication in continuous glucose monitoring and management system
US9962091B2 (en) 2002-12-31 2018-05-08 Abbott Diabetes Care Inc. Continuous glucose monitoring system and methods of use
US10750952B2 (en) 2002-12-31 2020-08-25 Abbott Diabetes Care Inc. Continuous glucose monitoring system and methods of use
US20090015654A1 (en) * 2003-02-04 2009-01-15 Nec Corporation Operation limiting technique for a camera-equipped mobile communication terminal
US20040176118A1 (en) * 2003-02-18 2004-09-09 Michael Strittmatter Service attribute based filtering system and method
US20060206592A1 (en) * 2003-04-23 2006-09-14 Canon Kabushiki Kaisha Wireless communication system and wireless communication device and control method
US8131859B2 (en) 2003-04-23 2012-03-06 Canon Kabushiki Kaisha Wireless communication system, and wireless communication device and control method
US9167371B2 (en) 2003-04-23 2015-10-20 Canon Kabushiki Kaisha Wireless communication system, and wireless communication device and control method for establishing a connection with another wireless device before an elapsed time period without the intervention of a base station
US20060200564A1 (en) * 2003-04-23 2006-09-07 Canon Kabushiki Kaisha Wireless communication system, and wireless communication device and control method
US8756305B2 (en) 2003-04-23 2014-06-17 Canon Kabushiki Kaisha Information processing apparatus and connection control method for joining a wireless network and searching for a printer having a predetermined function
US9268510B2 (en) 2003-04-23 2016-02-23 Canon Kabushiki Kaisha Information processing apparatus and connection control method for searching for a printer having a predetermined function identified by identification information included in a beacon signal and sending a print request directly to the printer which is operating as an access point without going through an external access point
US20110143789A1 (en) * 2003-04-23 2011-06-16 Canon Kabushiki Kaisha Wireless communication system, and wireless communication device and control method
US10616863B2 (en) 2003-04-23 2020-04-07 Canon Kabushiki Kaisha Wireless communication system, device, and control method for searching multiple communication frequency channels and processing cryptographic communication in an infrastructure mode using a received communication parameter including information of an encrypted key
US7882234B2 (en) * 2003-04-23 2011-02-01 Canon Kabushiki Kaisha Wireless communication system, wireless communication device, and control method for establishing a one-to-one relationship
US8250218B2 (en) 2003-04-23 2012-08-21 Canon Kabushiki Kaisha Wireless communication system, and wireless communication device and control method for establishing a one-to-one relationship between wireless communication devices
US20060018628A1 (en) * 2003-06-10 2006-01-26 Fujitsu Limited Data transmission system
US9730584B2 (en) 2003-06-10 2017-08-15 Abbott Diabetes Care Inc. Glucose measuring device for use in personal area network
US8224108B2 (en) 2003-06-26 2012-07-17 DigitalOptics Corporation Europe Limited Digital image processing using face detection information
US8131016B2 (en) 2003-06-26 2012-03-06 DigitalOptics Corporation Europe Limited Digital image processing using face detection information
US8126208B2 (en) 2003-06-26 2012-02-28 DigitalOptics Corporation Europe Limited Digital image processing using face detection information
US20110075894A1 (en) * 2003-06-26 2011-03-31 Tessera Technologies Ireland Limited Digital Image Processing Using Face Detection Information
US20100039525A1 (en) * 2003-06-26 2010-02-18 Fotonation Ireland Limited Perfecting of Digital Image Capture Parameters Within Acquisition Devices Using Face Detection
US20100271499A1 (en) * 2003-06-26 2010-10-28 Fotonation Ireland Limited Perfecting of Digital Image Capture Parameters Within Acquisition Devices Using Face Detection
US20100053362A1 (en) * 2003-08-05 2010-03-04 Fotonation Ireland Limited Partial face detector red-eye filter method and apparatus
US20100053368A1 (en) * 2003-08-05 2010-03-04 Fotonation Ireland Limited Face tracker and partial face tracker for red-eye filter method and apparatus
US9412007B2 (en) 2003-08-05 2016-08-09 Fotonation Limited Partial face detector red-eye filter method and apparatus
US8520093B2 (en) 2003-08-05 2013-08-27 DigitalOptics Corporation Europe Limited Face tracker and partial face tracker for red-eye filter method and apparatus
US20050258229A1 (en) * 2003-09-22 2005-11-24 Matsushita Electric Industrial Co., Ltd. Secure device and information processing unit
US20070145132A1 (en) * 2003-09-22 2007-06-28 Matsushita Electric Industrial Co., Ltd. Secure device and information processing unit
US20110102643A1 (en) * 2004-02-04 2011-05-05 Tessera Technologies Ireland Limited Partial Face Detector Red-Eye Filter Method and Apparatus
US20050232580A1 (en) * 2004-03-11 2005-10-20 Interdigital Technology Corporation Control of device operation within an area
US20060132908A1 (en) * 2004-07-26 2006-06-22 Baun Kenneth W Apparatus and methods for focusing and collimating telescopes
US7277223B2 (en) * 2004-07-26 2007-10-02 Meade Instruments Corporation Apparatus and methods for focusing and collimating telescopes
US20060039221A1 (en) * 2004-08-18 2006-02-23 Sony Corporation Memory card, memory card control method and memory card access control method
US7769867B2 (en) * 2004-08-18 2010-08-03 Sony Corporation Mountable memory card and method for communicating, controlling, accessing and/or using the same
US20060080340A1 (en) * 2004-09-13 2006-04-13 Hirokazu Oi Communication system, communication apparatus, and communication method
US20060120599A1 (en) * 2004-10-28 2006-06-08 Eran Steinberg Method and apparatus for red-eye detection in an acquired digital image
US20060093212A1 (en) * 2004-10-28 2006-05-04 Eran Steinberg Method and apparatus for red-eye detection in an acquired digital image
US8265388B2 (en) 2004-10-28 2012-09-11 DigitalOptics Corporation Europe Limited Analyzing partial face regions for red-eye detection in acquired digital images
US8036460B2 (en) 2004-10-28 2011-10-11 DigitalOptics Corporation Europe Limited Analyzing partial face regions for red-eye detection in acquired digital images
US20110063465A1 (en) * 2004-10-28 2011-03-17 Fotonation Ireland Limited Analyzing Partial Face Regions for Red-Eye Detection in Acquired Digital Images
US20060137018A1 (en) * 2004-11-29 2006-06-22 Interdigital Technology Corporation Method and apparatus to provide secured surveillance data to authorized entities
US20060172063A1 (en) * 2004-12-06 2006-08-03 Interdigital Technology Corporation Method and apparatus for detecting portable electronic device functionality
US20060148418A1 (en) * 2004-12-06 2006-07-06 Interdigital Technology Corporation Method and apparatus for alerting a target that it is subject to sensing and restricting access to sensed content associated with the target
US20060227640A1 (en) * 2004-12-06 2006-10-12 Interdigital Technology Corporation Sensing device with activation and sensing alert functions
US7574220B2 (en) 2004-12-06 2009-08-11 Interdigital Technology Corporation Method and apparatus for alerting a target that it is subject to sensing and restricting access to sensed content associated with the target
US7948375B2 (en) 2004-12-06 2011-05-24 Interdigital Technology Corporation Method and apparatus for detecting portable electronic device functionality
US20060136379A1 (en) * 2004-12-17 2006-06-22 Eastman Kodak Company Image content sharing device and method
US20060189348A1 (en) * 2005-02-23 2006-08-24 Memory Matrix, Inc. Systems and methods for automatic synchronization of cellular telephones
US20060189349A1 (en) * 2005-02-24 2006-08-24 Memory Matrix, Inc. Systems and methods for automatic uploading of cell phone images
US7962629B2 (en) 2005-06-17 2011-06-14 Tessera Technologies Ireland Limited Method for establishing a paired connection between media devices
US20110060836A1 (en) * 2005-06-17 2011-03-10 Tessera Technologies Ireland Limited Method for Establishing a Paired Connection Between Media Devices
US7680597B2 (en) * 2005-07-14 2010-03-16 Murata Kikai Kabushiki Kaisha Guided vehicle system and travel route map creation method for guided vehicle system
US20070016369A1 (en) * 2005-07-14 2007-01-18 Murata Kikai Kabushiki Kaisha Guided vehicle system and travel route map creation method for guided vehicle system
US20070081805A1 (en) * 2005-08-17 2007-04-12 Nokia Corporation Combined actuator for camera shutter and focus functions
US7480451B2 (en) * 2005-08-17 2009-01-20 Nokia Corporation Combined actuator for camera shutter and focus functions
US8880138B2 (en) 2005-09-30 2014-11-04 Abbott Diabetes Care Inc. Device for channeling fluid and methods of use
US8917982B1 (en) 2005-10-17 2014-12-23 Cutting Edge Vision Llc Pictures using voice commands and automatic upload
US9936116B2 (en) 2005-10-17 2018-04-03 Cutting Edge Vision Llc Pictures using voice commands and automatic upload
US9485403B2 (en) 2005-10-17 2016-11-01 Cutting Edge Vision Llc Wink detecting camera
US8818182B2 (en) 2005-10-17 2014-08-26 Cutting Edge Vision Llc Pictures using voice commands and automatic upload
US8824879B2 (en) 2005-10-17 2014-09-02 Cutting Edge Vision Llc Two words as the same voice command for a camera
US11818458B2 (en) 2005-10-17 2023-11-14 Cutting Edge Vision, LLC Camera touchpad
US8831418B2 (en) * 2005-10-17 2014-09-09 Cutting Edge Vision Llc Automatic upload of pictures from a camera
US8897634B2 (en) 2005-10-17 2014-11-25 Cutting Edge Vision Llc Pictures using voice commands and automatic upload
US8923692B2 (en) 2005-10-17 2014-12-30 Cutting Edge Vision Llc Pictures using voice commands and automatic upload
US11153472B2 (en) * 2005-10-17 2021-10-19 Cutting Edge Vision, LLC Automatic upload of pictures from a camera
US10257401B2 (en) 2005-10-17 2019-04-09 Cutting Edge Vision Llc Pictures using voice commands
US10063761B2 (en) 2005-10-17 2018-08-28 Cutting Edge Vision Llc Automatic upload of pictures from a camera
US20090054747A1 (en) * 2005-10-31 2009-02-26 Abbott Diabetes Care, Inc. Method and system for providing analyte sensor tester isolation
US11538580B2 (en) 2005-11-04 2022-12-27 Abbott Diabetes Care Inc. Method and system for providing basal profile modification in analyte monitoring and management systems
US20100274220A1 (en) * 2005-11-04 2010-10-28 Abbott Diabetes Care Inc. Method and System for Providing Basal Profile Modification in Analyte Monitoring and Management Systems
US9323898B2 (en) 2005-11-04 2016-04-26 Abbott Diabetes Care Inc. Method and system for providing basal profile modification in analyte monitoring and management systems
US9669162B2 (en) 2005-11-04 2017-06-06 Abbott Diabetes Care Inc. Method and system for providing basal profile modification in analyte monitoring and management systems
US8585591B2 (en) 2005-11-04 2013-11-19 Abbott Diabetes Care Inc. Method and system for providing basal profile modification in analyte monitoring and management systems
US7869628B2 (en) 2005-11-18 2011-01-11 Tessera Technologies Ireland Limited Two stage detection for photographic eye artifacts
US8175342B2 (en) 2005-11-18 2012-05-08 DigitalOptics Corporation Europe Limited Two stage detection for photographic eye artifacts
US8126217B2 (en) 2005-11-18 2012-02-28 DigitalOptics Corporation Europe Limited Two stage detection for photographic eye artifacts
US8126218B2 (en) 2005-11-18 2012-02-28 DigitalOptics Corporation Europe Limited Two stage detection for photographic eye artifacts
US20110069182A1 (en) * 2005-11-18 2011-03-24 Tessera Technologies Ireland Limited Two Stage Detection For Photographic Eye Artifacts
US20100182454A1 (en) * 2005-11-18 2010-07-22 Fotonation Ireland Limited Two Stage Detection for Photographic Eye Artifacts
US8131021B2 (en) 2005-11-18 2012-03-06 DigitalOptics Corporation Europe Limited Two stage detection for photographic eye artifacts
US20110115949A1 (en) * 2005-11-18 2011-05-19 Tessera Technologies Ireland Limited Two Stage Detection for Photographic Eye Artifacts
US7953252B2 (en) 2005-11-18 2011-05-31 Tessera Technologies Ireland Limited Two stage detection for photographic eye artifacts
US7689009B2 (en) 2005-11-18 2010-03-30 Fotonation Vision Ltd. Two stage detection for photographic eye artifacts
US8160308B2 (en) 2005-11-18 2012-04-17 DigitalOptics Corporation Europe Limited Two stage detection for photographic eye artifacts
US20110211095A1 (en) * 2005-11-18 2011-09-01 Tessera Technologies Ireland Limited Two Stage Detection For Photographic Eye Artifacts
US7970184B2 (en) 2005-11-18 2011-06-28 Tessera Technologies Ireland Limited Two stage detection for photographic eye artifacts
US20110069208A1 (en) * 2005-11-18 2011-03-24 Tessera Technologies Ireland Limited Two Stage Detection For Photographic Eye Artifacts
US7970182B2 (en) 2005-11-18 2011-06-28 Tessera Technologies Ireland Limited Two stage detection for photographic eye artifacts
US8180115B2 (en) 2005-11-18 2012-05-15 DigitalOptics Corporation Europe Limited Two stage detection for photographic eye artifacts
US7970183B2 (en) 2005-11-18 2011-06-28 Tessera Technologies Ireland Limited Two stage detection for photographic eye artifacts
US20080240555A1 (en) * 2005-11-18 2008-10-02 Florin Nanu Two Stage Detection for Photographic Eye Artifacts
US20070116379A1 (en) * 2005-11-18 2007-05-24 Peter Corcoran Two stage detection for photographic eye artifacts
US7920723B2 (en) 2005-11-18 2011-04-05 Tessera Technologies Ireland Limited Two stage detection for photographic eye artifacts
US7865036B2 (en) 2005-11-18 2011-01-04 Tessera Technologies Ireland Limited Method and apparatus of correcting hybrid flash artifacts in digital images
US20100040284A1 (en) * 2005-11-18 2010-02-18 Fotonation Vision Limited Method and apparatus of correcting hybrid flash artifacts in digital images
US10307091B2 (en) 2005-12-28 2019-06-04 Abbott Diabetes Care Inc. Method and apparatus for providing analyte sensor insertion
US11298058B2 (en) 2005-12-28 2022-04-12 Abbott Diabetes Care Inc. Method and apparatus for providing analyte sensor insertion
US9795331B2 (en) 2005-12-28 2017-10-24 Abbott Diabetes Care Inc. Method and apparatus for providing analyte sensor insertion
US9326727B2 (en) 2006-01-30 2016-05-03 Abbott Diabetes Care Inc. On-body medical device securement
US8184900B2 (en) 2006-02-14 2012-05-22 DigitalOptics Corporation Europe Limited Automatic detection and correction of non-red eye flash defects
US20070198509A1 (en) * 2006-02-20 2007-08-23 Sony Ericsson Mobile Information processing apparatus, information processing method, information processing program, and mobile terminal apparatus
US9844329B2 (en) 2006-02-28 2017-12-19 Abbott Diabetes Care Inc. Analyte sensors and methods of use
US10448834B2 (en) 2006-02-28 2019-10-22 Abbott Diabetes Care Inc. Smart messages and alerts for an infusion delivery and management system
US9031630B2 (en) 2006-02-28 2015-05-12 Abbott Diabetes Care Inc. Analyte sensors and methods of use
US9782076B2 (en) 2006-02-28 2017-10-10 Abbott Diabetes Care Inc. Smart messages and alerts for an infusion delivery and management system
US20090257432A1 (en) * 2006-03-16 2009-10-15 Tsuyoshi Yamaguchi Terminal
US20070217650A1 (en) * 2006-03-20 2007-09-20 Fujifilm Corporation Remote controller, remote control system, and method for displaying detailed information
US8193901B2 (en) 2006-03-20 2012-06-05 Fujifilm Corporation Remote controller, remote control system, and method for displaying detailed information
US20070236327A1 (en) * 2006-03-24 2007-10-11 Fujifilm Corporation Apparatus, method, program and system for remote control
US9039975B2 (en) 2006-03-31 2015-05-26 Abbott Diabetes Care Inc. Analyte monitoring devices and methods therefor
US9743863B2 (en) 2006-03-31 2017-08-29 Abbott Diabetes Care Inc. Method and system for powering an electronic device
US8593109B2 (en) 2006-03-31 2013-11-26 Abbott Diabetes Care Inc. Method and system for powering an electronic device
US20100045231A1 (en) * 2006-03-31 2010-02-25 Abbott Diabetes Care Inc. Method and System for Powering an Electronic Device
US9380971B2 (en) 2006-03-31 2016-07-05 Abbott Diabetes Care Inc. Method and system for powering an electronic device
US8933664B2 (en) 2006-03-31 2015-01-13 Abbott Diabetes Care Inc. Method and system for powering an electronic device
US9625413B2 (en) 2006-03-31 2017-04-18 Abbott Diabetes Care Inc. Analyte monitoring devices and methods therefor
US8081226B2 (en) * 2006-04-05 2011-12-20 Olympus Corporation Digital camera system which allows a digital camera unit to communicate with peripheral devices via a human body
US20090079845A1 (en) * 2006-04-05 2009-03-26 Olympus Corporation Digital camera system
US20070287478A1 (en) * 2006-04-18 2007-12-13 Samsung Electronics Co. Ltd. System and method for transferring character between portable communication devices
US7600031B2 (en) * 2006-04-27 2009-10-06 Microsoft Corporation Sharing digital content via a packet-switched network
US20070255853A1 (en) * 2006-04-27 2007-11-01 Toutonghi Michael J Sharing digital content via a packet-switched network
US7965875B2 (en) 2006-06-12 2011-06-21 Tessera Technologies Ireland Limited Advances in extending the AAM techniques from grayscale to color images
US20080013798A1 (en) * 2006-06-12 2008-01-17 Fotonation Vision Limited Advances in extending the aam techniques from grayscale to color images
US20090105571A1 (en) * 2006-06-30 2009-04-23 Abbott Diabetes Care, Inc. Method and System for Providing Data Communication in Data Management Systems
US11445910B2 (en) 2006-08-07 2022-09-20 Abbott Diabetes Care Inc. Method and system for providing data management in integrated analyte monitoring and infusion system
US11806110B2 (en) 2006-08-07 2023-11-07 Abbott Diabetes Care Inc. Method and system for providing data management in integrated analyte monitoring and infusion system
US8932216B2 (en) 2006-08-07 2015-01-13 Abbott Diabetes Care Inc. Method and system for providing data management in integrated analyte monitoring and infusion system
US10206629B2 (en) 2006-08-07 2019-02-19 Abbott Diabetes Care Inc. Method and system for providing integrated analyte monitoring and infusion system therapy management
US9697332B2 (en) 2006-08-07 2017-07-04 Abbott Diabetes Care Inc. Method and system for providing data management in integrated analyte monitoring and infusion system
US9833181B2 (en) 2006-08-09 2017-12-05 Abbott Diabetes Care Inc. Method and system for providing calibration of an analyte sensor in an analyte monitoring system
US10278630B2 (en) 2006-08-09 2019-05-07 Abbott Diabetes Care Inc. Method and system for providing calibration of an analyte sensor in an analyte monitoring system
US9408566B2 (en) 2006-08-09 2016-08-09 Abbott Diabetes Care Inc. Method and system for providing calibration of an analyte sensor in an analyte monitoring system
US11864894B2 (en) 2006-08-09 2024-01-09 Abbott Diabetes Care Inc. Method and system for providing calibration of an analyte sensor in an analyte monitoring system
US20090115879A1 (en) * 2006-09-28 2009-05-07 Hideki Nagata Information management method and digital camera
US10342469B2 (en) 2006-10-02 2019-07-09 Abbott Diabetes Care Inc. Method and system for dynamically updating calibration parameters for an analyte sensor
US9357959B2 (en) 2006-10-02 2016-06-07 Abbott Diabetes Care Inc. Method and system for dynamically updating calibration parameters for an analyte sensor
US9629578B2 (en) 2006-10-02 2017-04-25 Abbott Diabetes Care Inc. Method and system for dynamically updating calibration parameters for an analyte sensor
US9839383B2 (en) 2006-10-02 2017-12-12 Abbott Diabetes Care Inc. Method and system for dynamically updating calibration parameters for an analyte sensor
US10143024B2 (en) 2006-10-20 2018-11-27 Canon Kabushiki Kaisha Communication parameter setting method, communicating apparatus, and managing apparatus for managing communication parameters
US10750555B2 (en) 2006-10-20 2020-08-18 Canon Kabushiki Kaisha Communication parameter setting method, communicating apparatus, and managing apparatus for managing communication parameters
US8391258B2 (en) 2006-10-20 2013-03-05 Canon Kabushiki Kaisha Communication parameter setting method, communicating apparatus, and managing apparatus for managing communication parameters
US8169493B2 (en) 2006-10-27 2012-05-01 Olympus Corporation Image acquisition system and method of authenticating image acquisition device in the image acquisition system
US20090207270A1 (en) * 2006-10-27 2009-08-20 Olympus Corporation Image acquisition system and method of authenticating image acquisition device in the image acquisition system
US20080112599A1 (en) * 2006-11-10 2008-05-15 Fotonation Vision Limited method of detecting redeye in a digital image
US8170294B2 (en) 2006-11-10 2012-05-01 DigitalOptics Corporation Europe Limited Method of detecting redeye in a digital image
US8793301B2 (en) * 2006-11-22 2014-07-29 Agfa Healthcare Method and system for dynamic image processing
US20090138544A1 (en) * 2006-11-22 2009-05-28 Rainer Wegenkittl Method and System for Dynamic Image Processing
US20090256673A1 (en) * 2006-12-26 2009-10-15 Olympus Corporation Image acquisition system and method of authenticating image acquisition device in the image acquisition system
US8253529B2 (en) 2006-12-26 2012-08-28 Olympus Corporation Image acquisition system and method of authenticating image acquisition device in the image acquisition system
US8055067B2 (en) 2007-01-18 2011-11-08 DigitalOptics Corporation Europe Limited Color segmentation
US20080175481A1 (en) * 2007-01-18 2008-07-24 Stefan Petrescu Color Segmentation
US8264557B2 (en) 2007-01-29 2012-09-11 Olympus Corporation Image acquisition system and method of authenticating image acquisition device in the image acquisition system
US20090290030A1 (en) * 2007-01-29 2009-11-26 Olympus Corporation Image acquisition system and method of authenticating image acquisition device in the image acquisition system
US20080186385A1 (en) * 2007-02-06 2008-08-07 Samsung Electronics Co., Ltd. Photographing apparatus capable of communication with external apparatus and method of controlling the same
US8848085B2 (en) * 2007-02-06 2014-09-30 Samsung Electronics Co., Ltd. Photographing apparatus capable of communication with external apparatus and method of controlling the same
US10022499B2 (en) 2007-02-15 2018-07-17 Abbott Diabetes Care Inc. Device and method for automatic data acquisition and/or detection
US10617823B2 (en) 2007-02-15 2020-04-14 Abbott Diabetes Care Inc. Device and method for automatic data acquisition and/or detection
US20080200897A1 (en) * 2007-02-19 2008-08-21 Abbott Diabetes Care, Inc. Modular combination of medication infusion and analyte monitoring
US9636450B2 (en) 2007-02-19 2017-05-02 Udo Hoss Pump system modular components for delivering medication and analyte sensing at separate insertion sites
US20090315671A1 (en) * 2007-02-28 2009-12-24 Olympus Corporation Image acquisition system and method of authenticating image acquisition device in the image acquisition system
US8742890B2 (en) 2007-02-28 2014-06-03 Olympus Corporation Image acquisition system and method of authenticating image acquisition device in the image acquisition system
US9801545B2 (en) 2007-03-01 2017-10-31 Abbott Diabetes Care Inc. Method and apparatus for providing rolling data in communication systems
US9095290B2 (en) 2007-03-01 2015-08-04 Abbott Diabetes Care Inc. Method and apparatus for providing rolling data in communication systems
US8233674B2 (en) 2007-03-05 2012-07-31 DigitalOptics Corporation Europe Limited Red eye false positive filtering using face location and orientation
US20110222730A1 (en) * 2007-03-05 2011-09-15 Tessera Technologies Ireland Limited Red Eye False Positive Filtering Using Face Location and Orientation
US7995804B2 (en) 2007-03-05 2011-08-09 Tessera Technologies Ireland Limited Red eye false positive filtering using face location and orientation
US20080219518A1 (en) * 2007-03-05 2008-09-11 Fotonation Vision Limited Red Eye False Positive Filtering Using Face Location and Orientation
US20080242946A1 (en) * 2007-03-30 2008-10-02 Sony Corporation Method and apparatus for transporting images
US8144948B2 (en) * 2007-03-30 2012-03-27 Sony Corporation Method and apparatus for transporting images
US20080288204A1 (en) * 2007-04-14 2008-11-20 Abbott Diabetes Care, Inc. Method and apparatus for providing data processing and control in medical communication system
US9008743B2 (en) 2007-04-14 2015-04-14 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in medical communication system
US8937540B2 (en) 2007-04-14 2015-01-20 Abbott Diabetes Care Inc. Method and apparatus for providing dynamic multi-stage signal amplification in a medical device
US10194846B2 (en) 2007-04-14 2019-02-05 Abbott Diabetes Care Inc. Method and apparatus for providing dynamic multi-stage signal amplification in a medical device
US9204827B2 (en) 2007-04-14 2015-12-08 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in medical communication system
US10349877B2 (en) 2007-04-14 2019-07-16 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in medical communication system
US8698615B2 (en) 2007-04-14 2014-04-15 Abbott Diabetes Care Inc. Method and apparatus for providing dynamic multi-stage signal amplification in a medical device
US10111608B2 (en) 2007-04-14 2018-10-30 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in medical communication system
US20080255434A1 (en) * 2007-04-14 2008-10-16 Abbott Diabetes Care, Inc. Method and apparatus for providing data processing and control in medical communication system
US20080256048A1 (en) * 2007-04-14 2008-10-16 Abbott Diabetes Care, Inc. Method and apparatus for providing data processing and control in medical communication system
US9402584B2 (en) 2007-04-14 2016-08-02 Abbott Diabetes Care Inc. Method and apparatus for providing dynamic multi-stage signal amplification in a medical device
US11039767B2 (en) 2007-04-14 2021-06-22 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in medical communication system
US9743866B2 (en) 2007-04-14 2017-08-29 Abbott Diabetes Care Inc. Method and apparatus for providing dynamic multi-stage signal amplification in a medical device
US9615780B2 (en) 2007-04-14 2017-04-11 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in medical communication system
US11696684B2 (en) 2007-05-08 2023-07-11 Abbott Diabetes Care Inc. Analyte monitoring system and methods
US10178954B2 (en) 2007-05-08 2019-01-15 Abbott Diabetes Care Inc. Analyte monitoring system and methods
US10952611B2 (en) 2007-05-08 2021-03-23 Abbott Diabetes Care Inc. Analyte monitoring system and methods
US20080281179A1 (en) * 2007-05-08 2008-11-13 Abbott Diabetes Care, Inc. Analyte monitoring system and methods
US9649057B2 (en) 2007-05-08 2017-05-16 Abbott Diabetes Care Inc. Analyte monitoring system and methods
US9949678B2 (en) 2007-05-08 2018-04-24 Abbott Diabetes Care Inc. Method and device for determining elapsed sensor life
US20080278332A1 (en) * 2007-05-08 2008-11-13 Abbott Diabetes Care, Inc. Analyte monitoring system and methods
US9574914B2 (en) 2007-05-08 2017-02-21 Abbott Diabetes Care Inc. Method and device for determining elapsed sensor life
US10653317B2 (en) 2007-05-08 2020-05-19 Abbott Diabetes Care Inc. Analyte monitoring system and methods
US9314198B2 (en) 2007-05-08 2016-04-19 Abbott Diabetes Care Inc. Analyte monitoring system and methods
US9177456B2 (en) 2007-05-08 2015-11-03 Abbott Diabetes Care Inc. Analyte monitoring system and methods
US20090318792A1 (en) * 2007-05-08 2009-12-24 Fennell Martin J Analyte Monitoring System and Methods
US9035767B2 (en) 2007-05-08 2015-05-19 Abbott Diabetes Care Inc. Analyte monitoring system and methods
US9000929B2 (en) 2007-05-08 2015-04-07 Abbott Diabetes Care Inc. Analyte monitoring system and methods
US10820841B2 (en) 2007-05-14 2020-11-03 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in a medical communication system
US10653344B2 (en) 2007-05-14 2020-05-19 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in a medical communication system
US9483608B2 (en) 2007-05-14 2016-11-01 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in a medical communication system
US11300561B2 (en) 2007-05-14 2022-04-12 Abbott Diabetes Care, Inc. Method and apparatus for providing data processing and control in a medical communication system
US9558325B2 (en) 2007-05-14 2017-01-31 Abbott Diabetes Care Inc. Method and system for determining analyte levels
US10031002B2 (en) 2007-05-14 2018-07-24 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in a medical communication system
US9801571B2 (en) 2007-05-14 2017-10-31 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in medical communication system
US10045720B2 (en) 2007-05-14 2018-08-14 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in a medical communication system
US10002233B2 (en) 2007-05-14 2018-06-19 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in a medical communication system
US9804150B2 (en) 2007-05-14 2017-10-31 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in a medical communication system
US20080287762A1 (en) * 2007-05-14 2008-11-20 Abbott Diabetes Care, Inc. Method and apparatus for providing data processing and control in a medical communication system
US8612163B2 (en) 2007-05-14 2013-12-17 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in a medical communication system
US9060719B2 (en) 2007-05-14 2015-06-23 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in a medical communication system
US11076785B2 (en) 2007-05-14 2021-08-03 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in a medical communication system
US8682615B2 (en) 2007-05-14 2014-03-25 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in a medical communication system
US9737249B2 (en) 2007-05-14 2017-08-22 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in a medical communication system
US10143409B2 (en) 2007-05-14 2018-12-04 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in a medical communication system
US10119956B2 (en) 2007-05-14 2018-11-06 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in a medical communication system
US9125548B2 (en) 2007-05-14 2015-09-08 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in a medical communication system
US9797880B2 (en) 2007-05-14 2017-10-24 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in a medical communication system
US10976304B2 (en) 2007-05-14 2021-04-13 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in a medical communication system
US10261069B2 (en) 2007-05-14 2019-04-16 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in a medical communication system
US10463310B2 (en) 2007-05-14 2019-11-05 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in a medical communication system
US11828748B2 (en) 2007-05-14 2023-11-28 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in a medical communication system
US10991456B2 (en) 2007-05-14 2021-04-27 Abbott Diabetes Care Inc. Method and system for determining analyte levels
US8571808B2 (en) 2007-05-14 2013-10-29 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in a medical communication system
US10634662B2 (en) 2007-05-14 2020-04-28 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in a medical communication system
US11125592B2 (en) 2007-05-14 2021-09-21 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in a medical communication system
US8560038B2 (en) 2007-05-14 2013-10-15 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in a medical communication system
US11119090B2 (en) 2007-05-14 2021-09-14 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in a medical communication system
US8803980B2 (en) * 2007-05-29 2014-08-12 Blackberry Limited System and method for selecting a geographic location to associate with an object
US20080297409A1 (en) * 2007-05-29 2008-12-04 Research In Motion Limited System and method for selecting a geographic location to associate with an object
US11276492B2 (en) 2007-06-21 2022-03-15 Abbott Diabetes Care Inc. Health management devices and methods
US20100076293A1 (en) * 2007-06-21 2010-03-25 Abbott Diabetes Care Inc. Health Monitor
US20080319296A1 (en) * 2007-06-21 2008-12-25 Abbott Diabetes Care, Inc. Health monitor
US20100076291A1 (en) * 2007-06-21 2010-03-25 Abbott Diabetes Care Inc. Health Monitor
US20100076289A1 (en) * 2007-06-21 2010-03-25 Abbott Diabetes Care Inc. Health Monitor
US8617069B2 (en) 2007-06-21 2013-12-31 Abbott Diabetes Care Inc. Health monitor
US8597188B2 (en) 2007-06-21 2013-12-03 Abbott Diabetes Care Inc. Health management devices and methods
US20100076284A1 (en) * 2007-06-21 2010-03-25 Abbott Diabetes Care Inc. Health Management Devices and Methods
US20100076292A1 (en) * 2007-06-21 2010-03-25 Abbott Diabetes Care Inc. Health Monitor
US20100076280A1 (en) * 2007-06-21 2010-03-25 Abbott Diabetes Care Inc. Health Monitor
US20100076290A1 (en) * 2007-06-21 2010-03-25 Abbott Diabetes Care Inc. Health Monitor
US11264133B2 (en) 2007-06-21 2022-03-01 Abbott Diabetes Care Inc. Health management devices and methods
US10856785B2 (en) 2007-06-29 2020-12-08 Abbott Diabetes Care Inc. Analyte monitoring and management device and method to analyze the frequency of user interaction with the device
US9913600B2 (en) 2007-06-29 2018-03-13 Abbott Diabetes Care Inc. Analyte monitoring and management device and method to analyze the frequency of user interaction with the device
US11678821B2 (en) 2007-06-29 2023-06-20 Abbott Diabetes Care Inc. Analyte monitoring and management device and method to analyze the frequency of user interaction with the device
US20090011707A1 (en) * 2007-07-04 2009-01-08 Samsung Electronics Co., Ltd. Method and apparatus for identifying neighboring device
US9002382B2 (en) * 2007-07-04 2015-04-07 Samsung Electronics Co., Ltd. Method and apparatus for identifying neighboring device
US9479893B2 (en) * 2007-07-04 2016-10-25 Samsung Electronics Co., Ltd. Method and apparatus for identifying neighboring device
US9398872B2 (en) 2007-07-31 2016-07-26 Abbott Diabetes Care Inc. Method and apparatus for providing analyte sensor calibration
US8834366B2 (en) 2007-07-31 2014-09-16 Abbott Diabetes Care Inc. Method and apparatus for providing analyte sensor calibration
US20090036760A1 (en) * 2007-07-31 2009-02-05 Abbott Diabetes Care, Inc. Method and apparatus for providing data processing and control in a medical communication system
US20090063402A1 (en) * 2007-08-31 2009-03-05 Abbott Diabetes Care, Inc. Method and System for Providing Medication Level Determination
US8503818B2 (en) 2007-09-25 2013-08-06 DigitalOptics Corporation Europe Limited Eye defect detection in international standards organization images
US9439586B2 (en) 2007-10-23 2016-09-13 Abbott Diabetes Care Inc. Assessing measures of glycemic variability
US9332934B2 (en) 2007-10-23 2016-05-10 Abbott Diabetes Care Inc. Analyte sensor with lag compensation
US9804148B2 (en) 2007-10-23 2017-10-31 Abbott Diabetes Care Inc. Analyte sensor with lag compensation
US9743865B2 (en) 2007-10-23 2017-08-29 Abbott Diabetes Care Inc. Assessing measures of glycemic variability
US11083843B2 (en) 2007-10-23 2021-08-10 Abbott Diabetes Care Inc. Closed loop control system with safety parameters and methods
US10173007B2 (en) 2007-10-23 2019-01-08 Abbott Diabetes Care Inc. Closed loop control system with safety parameters and methods
US20100260414A1 (en) * 2007-11-08 2010-10-14 Tessera Technologies Ireland Limited Detecting redeye defects in digital images
US8000526B2 (en) 2007-11-08 2011-08-16 Tessera Technologies Ireland Limited Detecting redeye defects in digital images
US20090123063A1 (en) * 2007-11-08 2009-05-14 Fotonation Vision Limited Detecting Redeye Defects in Digital Images
US8036458B2 (en) 2007-11-08 2011-10-11 DigitalOptics Corporation Europe Limited Detecting redeye defects in digital images
US20160117066A1 (en) * 2007-12-14 2016-04-28 Scenera Technologies, Llc Methods, Systems, And Computer Readable Media For Controlling Presentation And Selection Of Objects That Are Digital Images Depicting Subjects
US9569072B2 (en) * 2007-12-14 2017-02-14 Scenera Technologies, Llc Methods, systems, and computer readable media for controlling presentation and selection of objects that are digital images depicting subjects
US10685749B2 (en) 2007-12-19 2020-06-16 Abbott Diabetes Care Inc. Insulin delivery apparatuses capable of bluetooth data transmission
US20090164190A1 (en) * 2007-12-19 2009-06-25 Abbott Diabetes Care, Inc. Physiological condition simulation device and method
US20090164239A1 (en) * 2007-12-19 2009-06-25 Abbott Diabetes Care, Inc. Dynamic Display Of Glucose Information
US20090164251A1 (en) * 2007-12-19 2009-06-25 Abbott Diabetes Care, Inc. Method and apparatus for providing treatment profile management
US9477860B2 (en) 2007-12-28 2016-10-25 Panasonic Intellectual Property Corporation Of America Communication device, communication system, image presentation method, and program
US8692905B2 (en) 2007-12-28 2014-04-08 Panasonic Corporation Communication device, communication system, image presentation method, and program
US20100283586A1 (en) * 2007-12-28 2010-11-11 Yoichi Ikeda Communication device, communication system, image presentation method, and program
US8400530B2 (en) * 2007-12-28 2013-03-19 Panasonic Corporation Communication device, communication system, image presentation method, and program
US8212864B2 (en) 2008-01-30 2012-07-03 DigitalOptics Corporation Europe Limited Methods and apparatuses for using image acquisition data to detect and correct image defects
US20090189998A1 (en) * 2008-01-30 2009-07-30 Fotonation Ireland Limited Methods And Apparatuses For Using Image Acquisition Data To Detect And Correct Image Defects
US9320468B2 (en) 2008-01-31 2016-04-26 Abbott Diabetes Care Inc. Analyte sensor with time lag compensation
US9770211B2 (en) 2008-01-31 2017-09-26 Abbott Diabetes Care Inc. Analyte sensor with time lag compensation
US10463288B2 (en) 2008-03-28 2019-11-05 Abbott Diabetes Care Inc. Analyte sensor calibration management
US9730623B2 (en) 2008-03-28 2017-08-15 Abbott Diabetes Care Inc. Analyte sensor calibration management
US11779248B2 (en) 2008-03-28 2023-10-10 Abbott Diabetes Care Inc. Analyte sensor calibration management
US20090277470A1 (en) * 2008-05-08 2009-11-12 Mitchell Monique M Artificial nail decorating system utilizing computer technology
US10327682B2 (en) 2008-05-30 2019-06-25 Abbott Diabetes Care Inc. Method and apparatus for providing glycemic control
US9541556B2 (en) 2008-05-30 2017-01-10 Abbott Diabetes Care Inc. Method and apparatus for providing glycemic control
US9795328B2 (en) 2008-05-30 2017-10-24 Abbott Diabetes Care Inc. Method and apparatus for providing glycemic control
US11735295B2 (en) 2008-05-30 2023-08-22 Abbott Diabetes Care Inc. Method and apparatus for providing glycemic control
US9931075B2 (en) 2008-05-30 2018-04-03 Abbott Diabetes Care Inc. Method and apparatus for providing glycemic control
US9141312B2 (en) * 2008-06-03 2015-09-22 Canon Kabushiki Kaisha Information processing apparatus, system and control method for transmitting data enabling selection of an image to be transmitted
US20130258121A1 (en) * 2008-06-03 2013-10-03 Canon Kabushiki Kaisha Information processing apparatus, system and control method thereof
US10328201B2 (en) 2008-07-14 2019-06-25 Abbott Diabetes Care Inc. Closed loop control system interface and methods
US11621073B2 (en) 2008-07-14 2023-04-04 Abbott Diabetes Care Inc. Closed loop control system interface and methods
US20100039520A1 (en) * 2008-08-14 2010-02-18 Fotonation Ireland Limited In-Camera Based Method of Detecting Defect Eye with High Accuracy
US8081254B2 (en) 2008-08-14 2011-12-20 DigitalOptics Corporation Europe Limited In-camera based method of detecting defect eye with high accuracy
US9610046B2 (en) 2008-08-31 2017-04-04 Abbott Diabetes Care Inc. Closed loop control with improved alarm functions
US11679200B2 (en) 2008-08-31 2023-06-20 Abbott Diabetes Care Inc. Closed loop control and signal attenuation detection
US8622988B2 (en) 2008-08-31 2014-01-07 Abbott Diabetes Care Inc. Variable rate closed loop control and methods
US8795252B2 (en) 2008-08-31 2014-08-05 Abbott Diabetes Care Inc. Robust closed loop control and methods
US9943644B2 (en) 2008-08-31 2018-04-17 Abbott Diabetes Care Inc. Closed loop control with reference measurement and methods thereof
US20100057041A1 (en) * 2008-08-31 2010-03-04 Abbott Diabetes Care, Inc. Closed Loop Control With Reference Measurement And Methods Thereof
US20100057044A1 (en) * 2008-08-31 2010-03-04 Abbott Diabetes Care Inc. Robust Closed Loop Control And Methods
US8734422B2 (en) 2008-08-31 2014-05-27 Abbott Diabetes Care Inc. Closed loop control with improved alarm functions
US20100057042A1 (en) * 2008-08-31 2010-03-04 Abbott Diabetes Care, Inc. Closed Loop Control With Improved Alarm Functions
US10188794B2 (en) 2008-08-31 2019-01-29 Abbott Diabetes Care Inc. Closed loop control and signal attenuation detection
US20100057040A1 (en) * 2008-08-31 2010-03-04 Abbott Diabetes Care, Inc. Robust Closed Loop Control And Methods
US9392969B2 (en) 2008-08-31 2016-07-19 Abbott Diabetes Care Inc. Closed loop control and signal attenuation detection
US20100056992A1 (en) * 2008-08-31 2010-03-04 Abbott Diabetes Care, Inc. Variable Rate Closed Loop Control And Methods
US9572934B2 (en) 2008-08-31 2017-02-21 Abbott Diabetes Care Inc. Robust closed loop control and methods
US10045739B2 (en) 2008-09-30 2018-08-14 Abbott Diabetes Care Inc. Analyte sensor sensitivity attenuation mitigation
US8986208B2 (en) 2008-09-30 2015-03-24 Abbott Diabetes Care Inc. Analyte sensor sensitivity attenuation mitigation
US8125525B2 (en) * 2008-10-07 2012-02-28 Fuji Xerox Co., Ltd. Information processing apparatus, remote indication system, and computer readable medium
US20100085435A1 (en) * 2008-10-07 2010-04-08 Fuji Xerox Co., Ltd. Information processing apparatus, remote indication system, and computer readable medium
US11213229B2 (en) 2009-02-03 2022-01-04 Abbott Diabetes Care Inc. Analyte sensor and apparatus for insertion of the sensor
US11202591B2 (en) 2009-02-03 2021-12-21 Abbott Diabetes Care Inc. Analyte sensor and apparatus for insertion of the sensor
US11006871B2 (en) 2009-02-03 2021-05-18 Abbott Diabetes Care Inc. Analyte sensor and apparatus for insertion of the sensor
US20100198034A1 (en) * 2009-02-03 2010-08-05 Abbott Diabetes Care Inc. Compact On-Body Physiological Monitoring Devices and Methods Thereof
US11006870B2 (en) 2009-02-03 2021-05-18 Abbott Diabetes Care Inc. Analyte sensor and apparatus for insertion of the sensor
US11006872B2 (en) 2009-02-03 2021-05-18 Abbott Diabetes Care Inc. Analyte sensor and apparatus for insertion of the sensor
US11166656B2 (en) 2009-02-03 2021-11-09 Abbott Diabetes Care Inc. Analyte sensor and apparatus for insertion of the sensor
US11836194B2 (en) 2009-02-20 2023-12-05 Nikon Corporation Mobile information device, image pickup device, and information acquisition system
US20100257195A1 (en) * 2009-02-20 2010-10-07 Nikon Corporation Mobile information device, image pickup device, and information acquisition system
US10430471B2 (en) 2009-02-20 2019-10-01 Nikon Corporation Mobile information device, image pickup device, and information acquisition system
US20100230285A1 (en) * 2009-02-26 2010-09-16 Abbott Diabetes Care Inc. Analyte Sensors and Methods of Making and Using the Same
US10009244B2 (en) 2009-04-15 2018-06-26 Abbott Diabetes Care Inc. Analyte monitoring system having an alert
US9226701B2 (en) 2009-04-28 2016-01-05 Abbott Diabetes Care Inc. Error detection in critical repeating data in a wireless sensor system
US20100274515A1 (en) * 2009-04-28 2010-10-28 Abbott Diabetes Care Inc. Dynamic Analyte Sensor Calibration Based On Sensor Stability Profile
US10820842B2 (en) 2009-04-29 2020-11-03 Abbott Diabetes Care Inc. Methods and systems for early signal attenuation detection and processing
US11298056B2 (en) 2009-04-29 2022-04-12 Abbott Diabetes Care Inc. Methods and systems for early signal attenuation detection and processing
US10952653B2 (en) 2009-04-29 2021-03-23 Abbott Diabetes Care Inc. Methods and systems for early signal attenuation detection and processing
US11013431B2 (en) 2009-04-29 2021-05-25 Abbott Diabetes Care Inc. Methods and systems for early signal attenuation detection and processing
US10194844B2 (en) 2009-04-29 2019-02-05 Abbott Diabetes Care Inc. Methods and systems for early signal attenuation detection and processing
US11116431B1 (en) 2009-04-29 2021-09-14 Abbott Diabetes Care Inc. Methods and systems for early signal attenuation detection and processing
US11872370B2 (en) 2009-05-29 2024-01-16 Abbott Diabetes Care Inc. Medical device antenna systems having external antenna configurations
US11793936B2 (en) 2009-05-29 2023-10-24 Abbott Diabetes Care Inc. Medical device antenna systems having external antenna configurations
US8799609B1 (en) * 2009-06-30 2014-08-05 Emc Corporation Error handling
US10872102B2 (en) 2009-07-23 2020-12-22 Abbott Diabetes Care Inc. Real time management of data relating to physiological control of glucose levels
US8798934B2 (en) 2009-07-23 2014-08-05 Abbott Diabetes Care Inc. Real time management of data relating to physiological control of glucose levels
US20110021889A1 (en) * 2009-07-23 2011-01-27 Abbott Diabetes Care Inc. Continuous Analyte Measurement Systems and Systems and Methods for Implanting Them
US9795326B2 (en) 2009-07-23 2017-10-24 Abbott Diabetes Care Inc. Continuous analyte measurement systems and systems and methods for implanting them
US10827954B2 (en) 2009-07-23 2020-11-10 Abbott Diabetes Care Inc. Continuous analyte measurement systems and systems and methods for implanting them
US10660554B2 (en) 2009-07-31 2020-05-26 Abbott Diabetes Care Inc. Methods and devices for analyte monitoring calibration
US8478557B2 (en) 2009-07-31 2013-07-02 Abbott Diabetes Care Inc. Method and apparatus for providing analyte monitoring system calibration accuracy
US11234625B2 (en) 2009-07-31 2022-02-01 Abbott Diabetes Care Inc. Method and apparatus for providing analyte monitoring and therapy management system accuracy
US8718965B2 (en) 2009-07-31 2014-05-06 Abbott Diabetes Care Inc. Method and apparatus for providing analyte monitoring system calibration accuracy
US20110029269A1 (en) * 2009-07-31 2011-02-03 Abbott Diabetes Care Inc. Method and Apparatus for Providing Analyte Monitoring System Calibration Accuracy
US9936910B2 (en) 2009-07-31 2018-04-10 Abbott Diabetes Care Inc. Method and apparatus for providing analyte monitoring and therapy management system accuracy
US9968302B2 (en) 2009-08-31 2018-05-15 Abbott Diabetes Care Inc. Analyte signal processing device and methods
US11241175B2 (en) 2009-08-31 2022-02-08 Abbott Diabetes Care Inc. Displays for a medical device
US11202586B2 (en) 2009-08-31 2021-12-21 Abbott Diabetes Care Inc. Displays for a medical device
US10456091B2 (en) 2009-08-31 2019-10-29 Abbott Diabetes Care Inc. Displays for a medical device
US11045147B2 (en) 2009-08-31 2021-06-29 Abbott Diabetes Care Inc. Analyte signal processing device and methods
US10136816B2 (en) 2009-08-31 2018-11-27 Abbott Diabetes Care Inc. Medical devices and methods
US10492685B2 (en) 2009-08-31 2019-12-03 Abbott Diabetes Care Inc. Medical devices and methods
US20110213225A1 (en) * 2009-08-31 2011-09-01 Abbott Diabetes Care Inc. Medical devices and methods
USRE47315E1 (en) 2009-08-31 2019-03-26 Abbott Diabetes Care Inc. Displays for a medical device
US8993331B2 (en) 2009-08-31 2015-03-31 Abbott Diabetes Care Inc. Analyte monitoring system and methods for managing power and noise
US10918342B1 (en) 2009-08-31 2021-02-16 Abbott Diabetes Care Inc. Displays for a medical device
US10123752B2 (en) 2009-08-31 2018-11-13 Abbott Diabetes Care Inc. Displays for a medical device
US10881355B2 (en) 2009-08-31 2021-01-05 Abbott Diabetes Care Inc. Displays for a medical device
US9814416B2 (en) 2009-08-31 2017-11-14 Abbott Diabetes Care Inc. Displays for a medical device
US20110060530A1 (en) * 2009-08-31 2011-03-10 Abbott Diabetes Care Inc. Analyte Signal Processing Device and Methods
USD1010133S1 (en) 2009-08-31 2024-01-02 Abbott Diabetes Care Inc. Analyte sensor assembly
US10429250B2 (en) 2009-08-31 2019-10-01 Abbott Diabetes Care, Inc. Analyte monitoring system and methods for managing power and noise
US11730429B2 (en) 2009-08-31 2023-08-22 Abbott Diabetes Care Inc. Displays for a medical device
US10772572B2 (en) 2009-08-31 2020-09-15 Abbott Diabetes Care Inc. Displays for a medical device
US9314195B2 (en) 2009-08-31 2016-04-19 Abbott Diabetes Care Inc. Analyte signal processing device and methods
US11150145B2 (en) 2009-08-31 2021-10-19 Abbott Diabetes Care Inc. Analyte monitoring system and methods for managing power and noise
US11635332B2 (en) 2009-08-31 2023-04-25 Abbott Diabetes Care Inc. Analyte monitoring system and methods for managing power and noise
US20110054282A1 (en) * 2009-08-31 2011-03-03 Abbott Diabetes Care Inc. Analyte Monitoring System and Methods for Managing Power and Noise
US20110163892A1 (en) * 2010-01-07 2011-07-07 Emilcott Associates, Inc. System and method for mobile environmental measurements and displays
US8325061B2 (en) * 2010-01-07 2012-12-04 Emilcott Associates, Inc. System and method for mobile environmental measurements and displays
US9420251B2 (en) 2010-02-08 2016-08-16 Nikon Corporation Imaging device and information acquisition system in which an acquired image and associated information are held on a display
US11741706B2 (en) 2010-02-08 2023-08-29 Nikon Corporation Imaging device and information acquisition system in which an acquired image and associated information are held on a display
US20110193985A1 (en) * 2010-02-08 2011-08-11 Nikon Corporation Imaging device, information acquisition system and program
US11048941B2 (en) 2010-02-08 2021-06-29 Nikon Corporation Imaging device and information acquisition system in which an acquired image and associated information are held on a display
US11455798B2 (en) 2010-02-08 2022-09-27 Nikon Corporation Imaging device and information acquisition system in which an acquired image and associated information are held on a display
US10452914B2 (en) 2010-02-08 2019-10-22 Nikon Corporation Imaging device and information acquisition system in which an acquired image and associated information are held on a display
US9756253B2 (en) 2010-02-08 2017-09-05 Nikon Corporation Imaging device and information acquisition system in which an acquired image and associated information are held on a display
US9411826B2 (en) * 2010-07-16 2016-08-09 Canon Kabushiki Kaisha Image processing apparatus control method and program
US20120017181A1 (en) * 2010-07-16 2012-01-19 Canon Kabushiki Kaisha Image processing apparatus control method and program
US11213226B2 (en) 2010-10-07 2022-01-04 Abbott Diabetes Care Inc. Analyte monitoring devices and methods
US11534089B2 (en) 2011-02-28 2022-12-27 Abbott Diabetes Care Inc. Devices, systems, and methods associated with analyte monitoring devices and devices incorporating the same
US11627898B2 (en) 2011-02-28 2023-04-18 Abbott Diabetes Care Inc. Devices, systems, and methods associated with analyte monitoring devices and devices incorporating the same
US10136845B2 (en) 2011-02-28 2018-11-27 Abbott Diabetes Care Inc. Devices, systems, and methods associated with analyte monitoring devices and devices incorporating the same
US9532737B2 (en) 2011-02-28 2017-01-03 Abbott Diabetes Care Inc. Devices, systems, and methods associated with analyte monitoring devices and devices incorporating the same
US20140085526A1 (en) * 2011-06-24 2014-03-27 Olympus Corporation Imaging device and wireless system
US20140104443A1 (en) * 2011-06-24 2014-04-17 Olympus Corporation Imaging apparatus and wireless system
US9092049B2 (en) * 2011-06-24 2015-07-28 Olympus Corporation Imaging apparatus and wireless system
US9094086B2 (en) * 2011-06-24 2015-07-28 Olympus Corporation Imaging device and wireless system
US8423511B1 (en) * 2011-07-07 2013-04-16 Symantec Corporation Systems and methods for securing data on mobile devices
US9465420B2 (en) 2011-10-31 2016-10-11 Abbott Diabetes Care Inc. Electronic devices having integrated reset systems and methods thereof
US9069536B2 (en) 2011-10-31 2015-06-30 Abbott Diabetes Care Inc. Electronic devices having integrated reset systems and methods thereof
US9289179B2 (en) 2011-11-23 2016-03-22 Abbott Diabetes Care Inc. Mitigating single point failure of devices in an analyte monitoring system and methods thereof
US11783941B2 (en) 2011-11-23 2023-10-10 Abbott Diabetes Care Inc. Compatibility mechanisms for devices in a continuous analyte monitoring system and methods thereof
US8710993B2 (en) 2011-11-23 2014-04-29 Abbott Diabetes Care Inc. Mitigating single point failure of devices in an analyte monitoring system and methods thereof
US10939859B2 (en) 2011-11-23 2021-03-09 Abbott Diabetes Care Inc. Mitigating single point failure of devices in an analyte monitoring system and methods thereof
US9743872B2 (en) 2011-11-23 2017-08-29 Abbott Diabetes Care Inc. Mitigating single point failure of devices in an analyte monitoring system and methods thereof
US9721063B2 (en) 2011-11-23 2017-08-01 Abbott Diabetes Care Inc. Compatibility mechanisms for devices in a continuous analyte monitoring system and methods thereof
US11205511B2 (en) 2011-11-23 2021-12-21 Abbott Diabetes Care Inc. Compatibility mechanisms for devices in a continuous analyte monitoring system and methods thereof
US10136847B2 (en) 2011-11-23 2018-11-27 Abbott Diabetes Care Inc. Mitigating single point failure of devices in an analyte monitoring system and methods thereof
US11391723B2 (en) 2011-11-25 2022-07-19 Abbott Diabetes Care Inc. Analyte monitoring system and methods of use
US10082493B2 (en) 2011-11-25 2018-09-25 Abbott Diabetes Care Inc. Analyte monitoring system and methods of use
US20150054984A1 (en) * 2012-03-15 2015-02-26 Nec Display Solutions, Ltd. Image display apparatus and image display method
US20130304989A1 (en) * 2012-05-08 2013-11-14 Fujitsu Limited Information processing apparatus
US9207878B2 (en) * 2012-05-08 2015-12-08 Fujitsu Limited Information processing apparatus
US10132793B2 (en) 2012-08-30 2018-11-20 Abbott Diabetes Care Inc. Dropout detection in continuous analyte monitoring data during data excursions
US10656139B2 (en) 2012-08-30 2020-05-19 Abbott Diabetes Care Inc. Dropout detection in continuous analyte monitoring data during data excursions
US10942164B2 (en) 2012-08-30 2021-03-09 Abbott Diabetes Care Inc. Dropout detection in continuous analyte monitoring data during data excursions
US10345291B2 (en) 2012-08-30 2019-07-09 Abbott Diabetes Care Inc. Dropout detection in continuous analyte monitoring data during data excursions
US9968306B2 (en) 2012-09-17 2018-05-15 Abbott Diabetes Care Inc. Methods and apparatuses for providing adverse condition notification with enhanced wireless communication range in analyte monitoring systems
US11612363B2 (en) 2012-09-17 2023-03-28 Abbott Diabetes Care Inc. Methods and apparatuses for providing adverse condition notification with enhanced wireless communication range in analyte monitoring systems
US11896371B2 (en) 2012-09-26 2024-02-13 Abbott Diabetes Care Inc. Method and apparatus for improving lag correction during in vivo measurement of analyte concentration with analyte concentration variability and range data
US10158795B2 (en) 2012-12-27 2018-12-18 Panasonic Intellectual Property Management Co., Ltd. Electronic apparatus for communicating with another apparatus
US10165178B2 (en) 2013-02-14 2018-12-25 Olympus Corporation Image file management system and imaging device with tag information in a communication network
US9742990B2 (en) * 2013-02-14 2017-08-22 Olympus Corporation Image file communication system with tag information in a communication network
US20140226029A1 (en) * 2013-02-14 2014-08-14 Olympus Corporation Computer readable recording medium, transmitting device, management server and image transmitting method
KR101816374B1 (en) * 2013-04-24 2018-01-08 더 스와치 그룹 리서치 앤 디벨롭먼트 엘티디 System having multiple simplified communication apparatuses
US9997061B2 (en) * 2013-04-24 2018-06-12 The Swatch Group Research And Development Ltd Multi-device system with simplified communication
US20160049072A1 (en) * 2013-04-24 2016-02-18 The Swatch Group Research And Development Ltd Multi-device system with simplified communication
US20150189179A1 (en) * 2013-12-30 2015-07-02 Samsung Electronics Co., Ltd. Electronic apparatus and method for controlling the same
US9661222B2 (en) * 2013-12-30 2017-05-23 Samsung Electronics Co., Ltd. Electronic apparatus and method for controlling the same
US10448230B2 (en) 2014-04-25 2019-10-15 Alibaba Group Holding Limited Data transmission
JP2015219650A (en) * 2014-05-15 2015-12-07 キヤノン株式会社 Communication device, control method thereof, and program
US20150332196A1 (en) * 2014-05-15 2015-11-19 Heinz-Werner Stiller Surgical Workflow Support System
US10142534B2 (en) 2014-11-07 2018-11-27 Bianconero, Inc. Image-capturing and image-distributing system for automatically or manually capturing image of user carrying mobile communication terminal
US20160307443A1 (en) * 2015-04-20 2016-10-20 Imagine If, LLC Global positioning system
US9576487B2 (en) * 2015-04-20 2017-02-21 Intellectual Fortress, LLC Global positioning system
US11553883B2 (en) 2015-07-10 2023-01-17 Abbott Diabetes Care Inc. System, device and method of dynamic glucose profile response to physiological parameters
US10225467B2 (en) * 2015-07-20 2019-03-05 Motorola Mobility Llc 360° video multi-angle attention-focus recording
US10547771B2 (en) * 2016-12-15 2020-01-28 Fujifilm Corporation Printer, digital camera with printer, and printing method
US20180176433A1 (en) * 2016-12-15 2018-06-21 Fujifilm Corporation Printer, digital camera with printer, and printing method
US20180183994A1 (en) * 2016-12-27 2018-06-28 Canon Kabushiki Kaisha Display control apparatus, electronic apparatus, and control method therefor
US10873695B2 (en) * 2016-12-27 2020-12-22 Canon Kabushiki Kaisha Display control apparatus, electronic apparatus, and control method therefor
US11596330B2 (en) 2017-03-21 2023-03-07 Abbott Diabetes Care Inc. Methods, devices and system for providing diabetic condition diagnosis and therapy
US11163356B2 (en) * 2017-05-18 2021-11-02 Guohua Liu Device-facing human-computer interaction method and system
US11509402B2 (en) * 2017-09-29 2022-11-22 Orange Method and system for recognizing a user during a radio communication via the human body
US11303815B2 (en) * 2017-11-29 2022-04-12 Sony Corporation Imaging apparatus and imaging method
US10816808B2 (en) * 2017-12-08 2020-10-27 Seiko Epson Corporation Head-mounted display apparatus, information processing device, system, and method for controlling use of captured images from head-mounted display apparatus
US10859835B2 (en) * 2018-01-24 2020-12-08 Seiko Epson Corporation Head-mounted display apparatus and method for controlling imaging data of head-mounted display apparatus using release code
US20210295552A1 (en) * 2018-08-02 2021-09-23 Sony Corporation Information processing device, information processing method, and program
US11720768B2 (en) * 2021-03-04 2023-08-08 Brother Kogyo Kabushiki Kaisha Computer-readable medium, image processing device, and method for reducing time taken from input of print instruction until start of printing
US20220284248A1 (en) * 2021-03-04 2022-09-08 Brother Kogyo Kabushiki Kaisha Computer-readable medium, image processing device, and method for reducing time taken from input of print instruction until start of printing

Also Published As

Publication number Publication date
US7864218B2 (en) 2011-01-04
US20080239083A1 (en) 2008-10-02
AU2002354181A1 (en) 2003-06-17
WO2003049424A1 (en) 2003-06-12
US20150370360A1 (en) 2015-12-24
US10015403B2 (en) 2018-07-03
US20180069968A1 (en) 2018-03-08
US9578186B2 (en) 2017-02-21
US20160234392A1 (en) 2016-08-11
US20190174066A1 (en) 2019-06-06
US20140340534A1 (en) 2014-11-20
US20160330381A1 (en) 2016-11-10
US20160269571A1 (en) 2016-09-15
US20130271614A1 (en) 2013-10-17
US8482634B2 (en) 2013-07-09
US9838550B2 (en) 2017-12-05
US20110096197A1 (en) 2011-04-28
US8804006B2 (en) 2014-08-12
US20180124259A1 (en) 2018-05-03
US20170054932A1 (en) 2017-02-23
US9894220B2 (en) 2018-02-13
US7382405B2 (en) 2008-06-03

Similar Documents

Publication Publication Date Title
US9838550B2 (en) Image display apparatus having image-related information displaying function
US7764308B2 (en) Image transmission system, image relay apparatus, and electronic image device
JP4158376B2 (en) Electronic camera, image display apparatus, and image display method
CA2491684C (en) Imaging system processing images based on location
US7535492B2 (en) Imaging system providing automated fulfillment of image photofinishing based on location
JP4114350B2 (en) Electronic camera, electronic device, image transmission system, and image transmission method
US20050186946A1 (en) Mobile communication terminal, method for controlling mobile communication terminal, and remote control system using program and email
JP4767453B2 (en) Data transmission method, transmission destination determination device, and photographing terminal
JP4572954B2 (en) Image display device
JP4513867B2 (en) Electronic camera
CN111712807A (en) Portable information terminal, information presentation system, and information presentation method
JP4919545B2 (en) Data transmission system and photographing apparatus
JP2004046313A (en) Image transmission system and image relay device
KR20050046147A (en) Method for creating a name of a photograph in a mobile phone
KR100747426B1 (en) Photographing apparatus and method for providing silhouette information of subject
JP2008017188A (en) Camera
JP2008017189A (en) Camera

Legal Events

Date Code Title Description
AS Assignment

Owner name: NIKON CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KUSAKA, YOSUKE;NAKAMURA, SHOEI;REEL/FRAME:015771/0635;SIGNING DATES FROM 20040524 TO 20040525

Owner name: NIKON CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KUSAKA, YOSUKE;NAKAMURA, SHOEI;REEL/FRAME:015582/0253;SIGNING DATES FROM 20040524 TO 20040525

STCF Information on status: patent grant

Free format text: PATENTED CASE

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY Fee payment

Year of fee payment: 4

FPAY Fee payment

Year of fee payment: 8

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 12TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1553); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 12