EP2829955A2 - Electronic device - Google Patents
- Publication number
- EP2829955A2 (Application EP14176814.3A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- projection
- region
- light
- sensor
- notification
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0425—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
- G06F3/0426—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected tracking fingers with respect to a virtual keyboard projected or printed on the surface
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/74—Projection arrangements for image reproduction, e.g. using eidophor
- H04N5/7408—Direct viewing projectors, e.g. an image displayed on a video CRT or LCD display being projected on a screen
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3129—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM] scanning a light beam on the display screen
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3191—Testing thereof
- H04N9/3194—Testing thereof including sensor feedback
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Physics & Mathematics (AREA)
- Signal Processing (AREA)
- Human Computer Interaction (AREA)
- General Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- Projection Apparatus (AREA)
- Transforming Electric Information Into Light Information (AREA)
Abstract
Description
- The present invention relates to an electronic device and more particularly relates to an electronic device configured to detect a reflected light of a projected light.
- Conventionally, projectors that project a projection image on a projection surface such as a screen, a wall, or a desktop are known, and various devices using such projectors have also been developed, such as that in Patent Document 1.
- Patent Document 1 discloses a projector provided with a virtual user interface (VUI). In this projector, a projection unit, a light projection unit, and a visible light camera are securely disposed in the same enclosure. The projection unit projects a predetermined projection image on a projection surface. The light projection unit projects a light for detecting an object, such as a finger of a user, positioned over a projection region of the projection image. The visible light camera images a light reflected at the object over the projection region. Based on this imaging result, a position of the object over the projection image is detected by the VUI.
- [Patent Document 1] Japanese Unexamined Patent Application Publication No. 2012-108233
- However, in conventional projectors, the projection unit needs to be at or above a certain height, and when the projection unit is installed in the same enclosure as the visible light camera, as in Patent Document 1, the installation position of the projection unit is limited, and the projector is more likely to increase in size. Moreover, when the projection unit is securely disposed in the enclosure, as in Patent Document 1, the size of an image projected onto a projection surface F by the projection unit is fixed by the device, and the location where the projector can be installed is also limited.
- Furthermore, even if the projection unit and the visible light camera are separately disposed, the installation position of the visible light camera may shift from a position where the visible light camera can detect a reflected light.
- One or more embodiments of the present invention provide an electronic device comprising a projection device and a detection device which can detect a reflected light of a light projected from the projection device even if the projection device and the detection device are disposed separately from each other.
- An electronic device according to one or more embodiments of the present invention may comprise a projection device configured to project a light to a projection region on a projection surface and a detection device separately disposed from the projection device. The detection device may comprise a light detector configured to detect a reflected light of the light projected from the projection device, and a notification light emitter configured to emit to the projection surface a notification light for indicating an inclination of the light detector relative to the projection surface. The notification light emitter may be further configured or arranged to emit the notification light in a plurality of directions, for example in directions respectively fixed relative to the detection device, from a plurality of locations in the detection device. The notification light emitter may be further configured to emit the notification light so that each notification light may pass through a same point on the projection surface when the detection device is installed at a predetermined or particular inclination relative to the projection surface.
- According to the above configuration, for example, the electronic device may be made smaller in size because the projection device and the detection device are disposed separately from each other. Moreover, the installation location of the electronic device may also be less restricted than in conventional devices. Moreover, the projection region may be set to a desired size according to the relative position of the projection device with respect to the projection surface because the projection device may be disposed without being restricted by the installation position of the detection device. Moreover, the light detector may be disposed at a predetermined or particular inclination relative to the projection surface based on the notification light emitted from the notification light emitter. Therefore, the detection device may detect the reflected light of the light projected from the projection device even if the projection device and the detection device are disposed separately from each other.
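One way to picture this notification-light arrangement in code (a hypothetical model — the emitter positions, directions, and tolerance below are invented, and the designed pitch is taken as 0 rad): two beams leave the detection device from different heights in directions fixed relative to the enclosure, and their spots on the projection surface coincide only when the device sits at the designed inclination.

```python
import math

def ray_plane_hit(origin, direction):
    """Intersect a ray with the projection surface (the plane z = 0).
    Returns the (x, y) hit point, or None if the ray never descends to it."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    if dz >= 0:
        return None
    t = -oz / dz
    return (ox + dx * t, oy + dy * t)

def pitch_vector(vec, angle):
    """Rotate a direction about the y-axis by `angle` radians (device pitch)."""
    x, y, z = vec
    c, s = math.cos(angle), math.sin(angle)
    return (c * x + s * z, y, -s * x + c * z)

def spots_coincide(pitch, tol=1e-6):
    """True when the two notification spots overlap on the surface.
    The emitters are assumed to sit close to the tilt pivot, so only their
    directions rotate with the device; all values are illustrative."""
    beams = [((0.0, 0.0, 0.02), (1.0, 0.0, -0.2)),   # lower emitter
             ((0.0, 0.0, 0.06), (1.0, 0.0, -0.6))]   # upper emitter
    hits = []
    for origin, direction in beams:
        hit = ray_plane_hit(origin, pitch_vector(direction, pitch))
        if hit is None:
            return False
        hits.append(hit)
    (x1, y1), (x2, y2) = hits
    return math.hypot(x1 - x2, y1 - y2) < tol
```

At `pitch=0` both beams land at the same surface point, so the spots overlap; any other pitch separates them, which is the cue the installer reads off the surface.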
- Furthermore, according to one or more embodiments of the electronic device, the notification light emitter may be configured to emit a notification light so that when the detection device is installed at the predetermined inclination relative to the projection surface, each notification light may project projection images of a spot shape that overlap each other at the same point on the projection surface.
- According to this configuration, for example, indication may be given of whether the detection device is installed at the predetermined inclination relative to the projection surface based on the projection images of the spot shape.
- Furthermore, according to one or more embodiments of the electronic device, the notification light emitter may be configured to emit a notification light so that when the detection device is installed at the predetermined inclination relative to the projection surface, each notification light projects projection images of a line-segment shape where the same point on the projection surface may be an endpoint of each line-segment shape.
- According to one or more embodiments of the above configuration, for example, indication may be given of whether the detection device is installed at the predetermined inclination relative to the projection surface based on the projection images of the line-segment shape.
- Furthermore, according to one or more embodiments of the electronic device, the detection device may further comprise an inclination detector configured to detect an inclination of the detection device, and an inclination determinator configured to determine, based on a detection result of the inclination detector, whether an inclination of the detection device relative to the projection surface is at the predetermined inclination.
- According to one or more embodiments of the above configuration, the inclination detector may detect, for example, a change in the inclination of the detection device, for example a change amount in the inclination of the detection device. Therefore, the inclination determinator may determine, based on a detection result of the inclination detector, whether and to what extent the inclination of the detection device relative to the projection surface has changed from the predetermined inclination.
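As a concrete sketch of such an inclination determinator, one might read a 3-axis accelerometer (one plausible choice of inclination detector) and compare the derived pitch against the predetermined inclination. The 15-degree target and 0.5-degree tolerance are invented values, not figures from this description.

```python
import math

PREDETERMINED_TILT_DEG = 15.0   # assumed designed inclination
TOLERANCE_DEG = 0.5             # assumed acceptance band

def tilt_from_accel(ax, ay, az):
    """Pitch of the device: the angle between the measured gravity vector
    and the device's z-axis (clamped against rounding before acos)."""
    g = math.sqrt(ax * ax + ay * ay + az * az)
    return math.degrees(math.acos(max(-1.0, min(1.0, az / g))))

def check_inclination(ax, ay, az):
    """Return (ok, deviation_deg): whether the device sits at the
    predetermined inclination, and by how many degrees it deviates."""
    deviation = tilt_from_accel(ax, ay, az) - PREDETERMINED_TILT_DEG
    return abs(deviation) <= TOLERANCE_DEG, deviation
```

The returned deviation also answers "to what extent" the inclination drifted, which is exactly what a downstream actuator would consume.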
- Furthermore, according to one or more embodiments of the electronic device, the detection device may further comprise a driver or actuator configured to adjust, based on a determination result of the inclination determinator, an inclination of the light detector relative to the detection device.
- According to one or more embodiments of the above configuration, for example, the orientation of the detection device may be adjusted by the driver or actuator based on the determination result of the inclination determinator. Therefore, the driver or actuator may automatically install the detection device at the predetermined inclination relative to the projection surface without the user reinstalling the detection device by hand. Moreover, the driver or actuator may automatically install the detection device at the predetermined inclination relative to the projection surface even if the inclination of the detection device changes after installation of the detection device.
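A minimal sketch of this closed loop follows; the callable names, the 15-degree target, the unit gain, and the step budget are all assumptions of ours, not the document's values.

```python
def level_detector(read_tilt_deg, actuate_by_deg, target_deg=15.0,
                   tolerance_deg=0.5, max_steps=100):
    """Read the light detector's tilt, command the actuator against the
    error, and repeat until the tilt sits within tolerance of the target.
    Returns True once levelled, False if the step budget runs out."""
    for _ in range(max_steps):
        error = read_tilt_deg() - target_deg
        if abs(error) <= tolerance_deg:
            return True
        actuate_by_deg(-error)   # drive the mount opposite to the error
    return False

# Simulated ideal mount: each command moves the tilt by exactly that amount.
state = {"tilt": 40.0}
levelled = level_detector(lambda: state["tilt"],
                          lambda d: state.update(tilt=state["tilt"] + d))
```

With a perfectly responsive mount the loop converges in a single correction; a real actuator would need gain and backlash handling, which this sketch omits.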
- Furthermore, according to one or more embodiments of the electronic device, the notification light emitter may be configured to emit a notification light so that the notification light projects the projection images of the line-segment shape on the projection surface as projection images for setting, within a detectable region where the light detector detects the reflected light, a position of at least the edge of the periphery (for example the outer periphery) of the projection region nearest to the detection device.
- According to one or more embodiments of the above configuration, for example, the position of the projection region may be set in the detectable region of the detection device by setting the position of at least the edge of the projection region nearest to the detection device based on the projection image of the notification light.
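Since the two line-segment images bound a region on the surface (described later in this document as a triangular virtual region), this setup check reduces to a point-in-triangle test on the endpoints of the projection region's nearest edge. A sketch with invented coordinates and names:

```python
def _cross(o, a, b):
    """z-component of the cross product of vectors o->a and o->b."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def in_triangle(p, t0, t1, t2):
    """Point-in-triangle test: p is inside (or on an edge) when the three
    edge cross products do not carry mixed signs."""
    d0, d1, d2 = _cross(t0, t1, p), _cross(t1, t2, p), _cross(t2, t0, p)
    has_neg = d0 < 0 or d1 < 0 or d2 < 0
    has_pos = d0 > 0 or d1 > 0 or d2 > 0
    return not (has_neg and has_pos)

def nearest_edge_in_region(edge_a, edge_b, apex, line_end_1, line_end_2):
    """Both endpoints of the nearest edge must lie between the two
    notification lines, i.e. inside the triangle they bound together
    with the device position (the apex)."""
    tri = (apex, line_end_1, line_end_2)
    return in_triangle(edge_a, *tri) and in_triangle(edge_b, *tri)
```

The sign-consistency form of the test handles both triangle windings and counts points on a notification line as inside, which suits a visual alignment aid.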
- Furthermore, according to one or more embodiments of the electronic device, the projection device may comprise a light projector configured to project the light, and a regulatory light emitter configured to emit a regulatory light for regulating the projection region.
- According to one or more embodiments of the above configuration, for example, the position of the projection region may be set in the detectable region of the light detector based on the regulatory light for regulating the projection region and the notification light.
- Furthermore, according to one or more embodiments of the electronic device, the detection device may further comprise a region determinator configured to determine, based on a detection result of the light detector, whether a portion or an entirety of the projection region is outside the detectable region.
- For example, each reflected light, such as the light projected from the projection device to the projection region and the regulatory light, may become incident to the light detector. Therefore, the region determinator may be configured to determine whether a portion or the entirety of the projection region is outside the detectable region based on the detection result of the light detector that detects these reflected lights.
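One plausible reading of this determination, sketched with an invented representation: four booleans recording which of the regulatory-light corner reflections the light detector actually picked up.

```python
def region_status(corner_detected):
    """Classify the projection region against the detectable region from
    which of its corner reflections were detected (hypothetical encoding)."""
    seen = sum(1 for detected in corner_detected if detected)
    if seen == len(corner_detected):
        return "inside"
    if seen == 0:
        return "entirely outside"
    return "partially outside"
```

A real implementation would work from timestamps or positions of the detected reflections rather than plain booleans, but the three-way outcome is the same.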
- Furthermore, according to one or more embodiments of the electronic device, the region determinator may be further configured to determine whether both endpoints of at least the edge of the outer periphery of the projection region nearest to the detection device are in the detectable region.
- For example, when the projection region is in the detectable region of the light detector, both endpoints of at least the edge of the projection region nearest to the detection device are in the detectable region. Moreover, in this situation, both endpoints of the edge farthest from the detection device are often also in the detectable region. Conversely, even if both endpoints of the edge farthest from the detection device are in the detectable region, both endpoints of the edge nearest to the detection device may still be outside the detectable region. Therefore, whether the projection region is in the detectable region can be determined easily and reliably by determining whether at least these two endpoints are in the detectable region.
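The detectable region of a surface-mounted light detector is often fan-shaped in plan view; modeling it that way (the reach and half-angle below are invented values) makes the two-endpoint test concrete.

```python
import math

def in_detectable_region(point, max_range=1.5, half_angle_deg=45.0):
    """Detector at the origin facing +x; its detectable region is modeled
    as a fan of the given reach and half-angle (illustrative model)."""
    x, y = point
    r = math.hypot(x, y)
    if r == 0.0 or r > max_range:
        return False
    return abs(math.degrees(math.atan2(y, x))) <= half_angle_deg

def projection_region_ok(nearest_edge_a, nearest_edge_b):
    """Cheap sufficient check mirroring the description: if both endpoints
    of the edge nearest to the detector are in the region, the rest of the
    projection region, lying deeper in the fan, is very likely covered too."""
    return (in_detectable_region(nearest_edge_a)
            and in_detectable_region(nearest_edge_b))
```
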
- Furthermore, according to one or more embodiments of the electronic device, the detection device may further comprise an information output device configured to output information based on a determination result of the region determinator.
- According to one or more embodiments of the above configuration, for example, the user may be informed with various notifications relating to a relationship between the detectable region of the light detector and the projection region based on the determination result of the region determinator.
- Furthermore, according to one or more embodiments of the electronic device, when the region determinator determines that at least a portion of the projection region is outside the detectable region, the information output device may output a correction method for setting the projection region in the detectable region and for correcting at least either an installation position or an orientation of at least one of the projection device and the detection device.
- According to one or more embodiments of the above configuration, for example, the user may be informed with the correction method for setting the projection region in the detectable region when at least a portion of the projection region is determined to be outside the detectable region of the light detector. Therefore, the user may, based on the output correction method, correct at least either the installation position or the orientation of at least one device from among the projection device and the detection device. Therefore, the user may conveniently set the entirety of the projection region in the detectable region.
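The output could be as simple as a textual hint derived from which endpoint failed the check; the function name and message wording below are illustrative, not the document's.

```python
def correction_hint(left_endpoint_ok, right_endpoint_ok):
    """Turn the nearest-edge endpoint check into a human-readable
    correction method for the user."""
    if left_endpoint_ok and right_endpoint_ok:
        return "projection region is inside the detectable region"
    if not (left_endpoint_ok or right_endpoint_ok):
        return ("move the detection device farther back or re-aim it "
                "at the projection region")
    missing = "left" if not left_endpoint_ok else "right"
    return f"rotate the detection device toward the {missing} side of the projection region"
```
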
- Furthermore, according to one or more embodiments of the electronic device, at least one of the projection device and the detection device may further comprise a position determinator configured to determine, based on the detection result of the light detector, a relative position of the projection region relative to the detectable region, and a driver or actuator configured to adjust, based on a determination result of the position determinator, at least either the installation position or the orientation of at least one of the devices.
- According to one or more embodiments of the above configuration, for example, in at least one device from among the projection device and the detection device, the driver or actuator may adjust at least either the installation position or the orientation of at least one of the devices based on the determination result of the position determinator. Therefore, the driver or actuator may automatically set the entirety of the projection region in the detectable region without the user reinstalling the device by hand. Moreover, even if the projection region moves outside the detectable region after installation of each device, the driver or actuator may automatically set the projection region in the detectable region.
- According to one or more embodiments of the electronic device, the electronic device may comprise a detection device detecting the reflected light of the light projected from the projection device even if the projection device and the detection device are disposed separately from each other.
- FIG. 1 is an external perspective view of a projector according to one or more embodiments of a first example of the present invention.
- FIG. 2 is a side view for describing a light path of the projector according to one or more embodiments of the present invention.
- FIG. 3 is a top view for describing the light path of the projector according to one or more embodiments of the present invention.
- FIG. 4A is a side view illustrating a size change of a projection region according to an installation position of a projector according to one or more embodiments of the present invention.
- FIG. 4B is a top view illustrating the size change of the projection region according to the installation position of the projector according to one or more embodiments of the present invention.
- FIG. 5 is a block diagram illustrating a configuration of a projector according to one or more embodiments of the first example of the present invention.
- FIG. 6 is a block diagram illustrating a configuration of a sensor according to one or more embodiments of the first example of the present invention.
- FIG. 7 is a flowchart for describing an installation method of the projector according to one or more embodiments of the present invention.
- FIG. 8A is a side view illustrating a sensitivity distribution of a light detector in one or more embodiments of the first example of the present invention.
- FIG. 8B is a plan view illustrating the sensitivity distribution of the light detector in one or more embodiments of the first example of the present invention.
- FIG. 9A is a side view illustrating a sensor installed at a predetermined inclination relative to a projection surface according to one or more embodiments of the present invention.
- FIG. 9B is a top view illustrating the sensor of FIG. 9A.
- FIG. 10A is a side view illustrating a sensor installed at a first inclination relative to the projection surface according to one or more embodiments of the present invention.
- FIG. 10B is a plan view illustrating the sensor of FIG. 10A.
- FIG. 11A is a side view illustrating a sensor installed at a second inclination relative to the projection surface according to one or more embodiments of the present invention.
- FIG. 11B is a plan view illustrating the sensor of FIG. 11A.
- FIG. 12 is a plan view illustrating a positional relationship between a projection region and notification lines in one or more embodiments of the first example of the present invention.
- FIG. 13 is a side view illustrating a sensor installed in a position nearest to the projection region according to one or more embodiments of the present invention.
- FIG. 14 is a side view illustrating a sensor installed in a position farthest from the projection region according to one or more embodiments of the present invention.
- FIG. 15 is a flowchart for describing a calibration process of an installation position of the sensor according to one or more embodiments of the first example of the present invention.
- FIG. 16 is an external perspective view of a projector according to one or more embodiments of a second example of the present invention.
- FIG. 17 is a block diagram illustrating a configuration of a projector according to one or more embodiments of the second example of the present invention.
- FIG. 18 is a plan view illustrating a positional relationship between a projection region and notification lines according to one or more embodiments of the second example of the present invention.
- FIG. 19 is a flowchart for describing a calibration process of an installation position of a sensor according to one or more embodiments of the second example of the present invention.
- FIG. 20 is a block diagram illustrating a configuration of a sensor according to one or more embodiments of a third example of the present invention.
- FIG. 21 is a block diagram illustrating a configuration of a sensor according to a modified example of one or more embodiments of the third example of the present invention.
- FIG. 22 is a block diagram illustrating a configuration of a projector according to one or more embodiments of a fourth example of the present invention.
- FIG. 23 is an external perspective view of a sensor according to one or more embodiments of a fifth example of the present invention.
- FIG. 24 is a side view illustrating a sensor installed at the predetermined inclination relative to the projection surface according to one or more embodiments of the present invention.
- FIG. 25 is a side view illustrating a sensor installed at the first inclination relative to the projection surface according to one or more embodiments of the present invention.
- FIG. 26 is a side view illustrating a sensor installed at the second inclination relative to the projection surface according to one or more embodiments of the present invention.
- Embodiments of the present invention will be described below with reference to the drawings.
- An example of an electronic device according to one or more embodiments is a projector. FIG. 1 shows an exemplary projector 1, which may be a projector of a laser light scanning type having a virtual user interface (VUI) function. The projector 1 is an example of an electronic device that may be used as an input device because of the VUI function. The VUI function is a virtual input interface where a user may perform operation input on a projection image (such as an image of a keyboard or an operation panel) projected to a projection surface F.
- FIG. 1 is an external perspective view of a projector according to one or more embodiments of a first example of the present invention. FIG. 2 is a side view for describing a light path of the projector, and FIG. 3 is a top view for describing the light path of the projector. Directions parallel to the projection surface F will be described as an X direction and a Y direction, and a normal direction of the projection surface F will be described as a Z direction; the X direction, the Y direction, and the Z direction are mutually orthogonal. In FIGS. 2 and 3, to facilitate understanding of the main portions, illustration is omitted for each reflected light of a scanning laser light Ra reflected at the projection surface F and of a notification light Rb that will be described below. - As illustrated in
FIGS. 1 to 3, the projector 1 may be configured to include a projector device 2 and a sensor 3. The projector device 2 and the sensor 3 are disposed separately and independently of each other. That is, the projector device 2 and the sensor 3 are not housed in a same enclosure but may be housed in separate enclosures. Because of this, the projector 1 may be made smaller than conventional devices. Moreover, the installation location of the projector 1 is not restricted as in conventional devices, and the projector 1 may be installed comparatively freely. - The
projector device 2 may be a projector device that projects the scanning laser light Ra on the projection surface F, which may be an upper surface of a desk, a screen, a wall, or the like, and may be installed, for example, on a ceiling in a room. Alternatively, the projector device 2 may be installed on a stand. As illustrated in FIGS. 1 to 3, the projector device 2 may project a predetermined projection image (e.g. a still image, a moving image, or the like) in a projection region Aa by projecting the scanning laser light Ra to the projection region Aa on the projection surface F. This projection image is not limited in particular. - The
sensor 3 is an example of a detection device. The sensor 3 may be a detection device that detects a light incident to a light incidence surface 30 and may be installed, for example, on the projection surface F. Alternatively, the sensor 3 may be set on a stand. The sensor 3 detects, for example, the reflected light of the scanning laser light Ra reflected at the projection region Aa or at an object U (e.g. a finger of the user, a touch pen, or the like) positioned over the projection region Aa. A light intensity of the reflected light reflected at the projection region Aa differs from a light intensity of the reflected light reflected at the object U positioned over the projection region Aa. The sensor 3 may therefore detect and differentiate the reflected light from the projection region Aa and the reflected light from the object U based on this difference in light intensities. When the sensor 3 detects each reflected light from the projection region Aa and the object U, the sensor 3 detects a relative position of the object U relative to the projection image projected to the projection region Aa by the VUI function and detects an input operation by the object U based on this detection result. - Furthermore, the
sensor 3 may emit the notification light Rb of a wavelength in a visible light region from a notification light emitter 37 and, as illustrated in FIGS. 1 to 3, projects onto the projection surface F two notification lines L for setting positions of the sensor 3 and the projection region Aa. These two notification lines L indicate an inclination of the sensor 3 relative to the projection surface F and serve as reference lines for setting the position of the projection region Aa; in particular, they indicate whether the sensor 3 is set at an appropriate inclination relative to the projection surface F. The notification lines L may be used for regulating a position of an edge Ea1, which is included in an outer periphery of the projection region Aa and is nearest to the sensor 3. The edge Ea1 of the projection region Aa will be referred to hereinbelow as the lower edge. - For example, the
sensor 3 may detect the reflected light from the projection region Aa if the two notification lines L form a virtual region Ab of a triangular shape and if the lower edge Ea1 of the projection region Aa is in the virtual region Ab. Because of this, it is sufficient for the user to install the projector 1 (in particular, the sensor 3 thereof) in an appropriate position while referring to the projection region Aa and the two notification lines L. An exemplary installation method of the projector 1 and a calibration process of an installation position of the sensor 3 will be described below. - Furthermore, the
projector device 2 may be disposed in any position relative to the projection surface F, independently of the sensor 3. Because of this, a size of the projection region Aa (projection image) may be changed according to a relative positional relationship between the projection surface F and the projector device 2. FIGS. 4A and 4B are a side view and a top view illustrating a size change of the projection region Aa according to an installation position of a projector device. In FIGS. 4A and 4B, to facilitate understanding of the main portions, illustration is omitted for the notification light Rb and each reflected light of the scanning laser light Ra reflected at the projection surface F and the notification light Rb. - As illustrated in
FIGS. 4A and 4B, when an installation height of the projector device 2 in the Z direction is lowered, a size of a projection region Aa1 decreases because a distance between the projection region Aa1 on the projection surface F and the projector device 2 decreases. Meanwhile, when the installation height of the projector device 2 in the Z direction is raised, a size of a projection region Aa2 increases because a distance between the projection region Aa2 on the projection surface F and the projector device 2 increases. In this manner, the sensor 3 may detect the reflected light from the projection region Aa, even when the size of the projection region Aa changes, as long as the sensor 3 is installed in an appropriate position relative to the projection region Aa. - Next, a configuration of the
projector device 2 according to one or more embodiments will be described. FIG. 5 is a block diagram illustrating a configuration of a projector device according to one or more embodiments of the first example of the present invention. The projector device 2 may include a laser light projector 21, a first memory 22, a first controller 23, and a first communicator 24. - The
laser light projector 21 may project on the projection surface F the scanning laser light Ra for projecting the projection image to the projection region Aa. This laser light projector 21 may include a laser diode 211, an optical system 212, and a laser light driver 213. The laser diode 211 will be referred to hereinbelow as an LD. - The
LD 211 may be a light source that emits the laser light Ra of a wavelength in a visible light region. The LD 211 may be, for example, a light-emitting element that emits a laser light of a single color or may be configured including a plurality of light-emitting elements. - The
optical system 212 may be an optical system that projects to the projection surface F the laser light Ra emitted from the LD 211. The optical system 212 may be configured including, for example, dichroic mirrors that synthesize the lights emitted from the light-emitting elements; a collimator lens 2123 that converts the laser light Ra into a parallel light; and a scanning mirror 2124 that adjusts an emission direction of the laser light Ra. - The
laser light driver 213 may be a driver that scans the projection region Aa two-dimensionally with the laser light Ra emitted from the LD 211 by driving the LD 211 and the optical system 212. The laser light driver 213 may be configured including, for example, a video processor 2131 that processes a control signal output from the first controller 23; a light source controller 2132 and an LD driver 2133 that control emission and a light intensity of the LD 211 based on the control signal; and a mirror controller 2134 and a mirror driver 2135 that drive and control the scanning mirror 2124 based on the control signal. - For example, the
laser light driver 213 scans the laser light Ra in the Y direction from one end (for example, an end portion on the upper side of FIG. 3) to another end (for example, an end portion on the lower side of FIG. 3) of the projection region Aa based on a horizontal synchronization signal output from the first controller 23. Moreover, once one scan in the Y direction is finished, the laser light driver 213 again scans the laser light Ra in the Y direction after shifting the laser light Ra a predetermined distance in the X direction in FIG. 1 based on a vertical synchronization signal output from the first controller 23. By repeating these operations, the laser light Ra two-dimensionally scans an entire region of the projection region Aa, and the projection image is projected to the projection region Aa. A video signal synchronized with such a horizontal synchronization signal and vertical synchronization signal is supplied to the video processor 2131. The projector device 2 may project an image represented by the video signal supplied to the video processor 2131. - The
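timing of this raster scan may be sketched as follows. This is only an illustration in Python; the unit region, the line count, and the function name are assumptions made for the sketch, not values taken from the present description:

```python
# Illustrative model of the two-dimensional scan: the laser sweeps the
# Y direction once per horizontal synchronization period, then shifts a
# predetermined step in the X direction on the vertical synchronization
# signal, so that the entire projection region Aa is covered.
# All numeric values here are arbitrary assumptions.

def raster_scan(num_lines, samples_per_line):
    """Return (x, y) sample positions, in scan order, over a unit region."""
    positions = []
    for line in range(num_lines):           # X shift per vertical sync
        x = line / (num_lines - 1)
        for s in range(samples_per_line):   # Y sweep per horizontal sync
            y = s / (samples_per_line - 1)
            positions.append((x, y))
    return positions

scan = raster_scan(num_lines=4, samples_per_line=3)
print(len(scan))          # 12 samples cover the whole region
print(scan[0], scan[-1])  # (0.0, 0.0) (1.0, 1.0)
```

Each horizontal synchronization signal corresponds to one Y sweep in the inner loop, and each vertical synchronization signal advances the X shift in the outer loop. - The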
first memory 22 may be a non-volatile storage medium and stores a program, control information, and the like used by the first controller 23. The first memory 22 may be a separate component from the first controller 23, as in FIG. 5, but may be included in the first controller 23. - The
first controller 23 may control each component of the projector device 2 using the program, the control information, and the like stored in the first memory 22. This first controller 23 may be realized as a semiconductor chip such as a microprocessing unit (MPU) or a central processing unit (CPU) or may be realized as an electrical circuit. - The
first communicator 24 may be an interface for communication with the sensor 3. Those skilled in the art will appreciate that various known communication interface options may be used. - Next, a configuration of the
sensor 3 according to one or more embodiments will be described. FIG. 6 is a block diagram illustrating a configuration of a sensor according to one or more embodiments of the first example of the present invention. The sensor 3 may include a light detector 31, a second memory 32, a second controller 33, a second communicator 34, an operator 35, an information output device 36, and the notification light emitter 37. Additionally, a remote controller (not illustrated) with which the user performs operation input and a reception unit (not illustrated) that receives an input signal from the remote controller may be provided. - The
light detector 31 may be a sensor having a light-receiving element such as a photodiode and detects the light incident to the light incidence surface 30 (see FIG. 1) of the sensor 3. The light detector 31 may detect, for example, the reflected light of the scanning laser light Ra reflected at the projection region Aa and at the object U positioned over the projection region Aa. - The
second memory 32 may be a non-volatile storage medium and stores a program, control information, and the like used by the second controller 33. The second memory 32 may be a separate component from the second controller 33, as in FIG. 6, but may be included in the second controller 33. - The
second controller 33 may control each component of the sensor 3 using the program, the control information, and the like stored in the second memory 32. The second controller 33 may be realized as a semiconductor chip such as a microprocessing unit (MPU) or a central processing unit (CPU) or may be realized as an electrical circuit. - The
second controller 33 may include a position calculator 331 and a region determinator 332. The position calculator 331 and the region determinator 332 may each be realized as an electrical circuit or may be functional units of the second controller 33 realized by using the program and the control information stored in the second memory 32. - The
position calculator 331 may calculate a relative position of the object U relative to the projection region Aa based on a detection result (for example, the reflected light of the scanning laser light Ra) of the light detector 31. This relative position is calculated based on a result of the light detector 31 detecting, for example, each reflected light of the scanning laser light Ra reflected at the projection region Aa and at the object U. - The region determinator 332 may determine whether a portion or an entirety of the projection region Aa is outside a detectable region based on the detection result of the
light detector 31. Moreover, once the region determinator 332 determines that at least a portion of the projection region Aa is shifted from the detectable region (not illustrated), the region determinator 332 determines to what extent the projection region Aa is shifted based on the detection result of the light detector 31. - The
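relative-position calculation and the shift determination described above may be illustrated with a small sketch. The rectangular detectable region and the point representation below are simplifying assumptions made for the illustration, not the actual detection geometry:

```python
# Simplified sketch: the position calculator converts a detected
# reflection into coordinates relative to the projection region Aa, and
# the region determinator reports which corner points of the region fall
# outside a (here rectangular) detectable region.  The function names
# and shapes are assumptions for the illustration.

def relative_position(point, region_origin):
    """Position of a detected object relative to the projection region."""
    return (point[0] - region_origin[0], point[1] - region_origin[1])

def endpoints_outside(corners, detectable):
    """Return the corner points lying outside the detectable region."""
    (x0, y0), (x1, y1) = detectable
    return [c for c in corners if not (x0 <= c[0] <= x1 and y0 <= c[1] <= y1)]

corners = [(0, 0), (4, 0), (0, 3), (4, 3)]          # endpoints P1 to P4
print(relative_position((2, 1), (0, 0)))            # (2, 1)
print(endpoints_outside(corners, ((0, 0), (3, 3)))) # [(4, 0), (4, 3)]
```

If the returned list is non-empty, at least a portion of the projection region is shifted from the detectable region. - The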
second communicator 34 may be an interface for communication with the projector device 2. Those skilled in the art will appreciate that various known communication interface options may be used. - The
operator 35 may be an input unit that accepts various operation inputs by the user. - The
information output device 36 may be an external output interface for outputting various information to the user. The information output device 36 may inform the user that a portion or the entirety of the projection region Aa is shifted from the detectable region of the light detector 31 when the region determinator 332 determines that such is the case. Moreover, the user is informed of the calibration method of the installation position (and an orientation) of the sensor 3 to position the entirety of the projection region Aa in the detectable region. The information content and the informing method of the information output device 36 are not limited in particular. The information output device 36 may inform by a voice, using an audio output unit (not illustrated) such as a speaker, or may display the informing content on a display unit (not illustrated) such as a liquid crystal display. Alternatively, the information output device 36 may display a content to be informed to the user (for example, a notification such as "Please move the sensor") on the projection region Aa on the projection surface F by the second communicator 34 communicating with the projector device 2. - The
notification light emitter 37 may be provided in a plurality of different locations in the sensor 3. The notification light emitter 37 may include a light source 371 that emits the notification light Rb of the wavelength in the visible light region in directions of ranges respectively fixed relative to the sensor 3 and may project the two notification lines L (see FIGS. 1 to 3) of a substantially same length on the projection surface F. This light source 371 is not limited in particular: for example, a light-emitting element that emits a light in the visible light region, such as a laser diode element or an LED; a discharge tube; or the like may be used. - Next, an exemplary installation method of the
projector 1 according to one or more embodiments will be described. FIG. 7 is a flowchart for describing an installation method of the projector. - First, the
projector device 2 is installed on the ceiling in the room (S101), and the scanning laser light Ra is projected to the projection surface F (S102). Next, the sensor 3 is mounted on the projection surface (S103), and the notification light Rb is projected to project the notification lines L on the projection surface F (S104). Then, the calibration process of the installation position of the sensor 3 is performed to position the projection region Aa in a detectable region of the sensor 3 (S105). When the calibration process is completed, the sensor 3 stops projecting the notification light Rb, ending the installation of the projector 1. - In the installation method described above, when the calibration process in step S105 finishes, projection of the notification light Rb stops, but the notification light Rb may be projected during the installation process of the
projector 1 or in a period other than during the installation process, or may be projected continuously. Moreover, projection and stopping of projection of the notification light Rb may be switched according to an operation input by the user accepted by the operator 35. - Next, an exemplary installation condition of the
projector 1 according to one or more embodiments of the first example of the present invention will be described. -
FIG. 8A is a side view illustrating a sensitivity distribution of the light detector 31 in one or more embodiments of the first example of the present invention. Moreover, FIG. 8B is a plan view illustrating the sensitivity distribution of the light detector 31 in one or more embodiments of the first example of the present invention. Moreover, FIG. 9A is a side view illustrating a case in which the sensor 3 is installed at a predetermined inclination relative to the projection surface F. Moreover, FIG. 9B is a plan view illustrating the case in which the sensor 3 is installed at the predetermined inclination relative to the projection surface F. Moreover, FIG. 10A is a side view illustrating a case in which the sensor 3 is installed at a first inclination, which is different from the predetermined inclination, relative to the projection surface F. Moreover, FIG. 10B is a plan view illustrating the case in which the sensor 3 is installed at the first inclination relative to the projection surface F. Moreover, FIG. 11A is a side view illustrating a case in which the sensor 3 is installed at a second inclination, which is different from the predetermined inclination, relative to the projection surface F. Moreover, FIG. 11B is a plan view illustrating the case in which the sensor 3 is installed at the second inclination relative to the projection surface F. To facilitate the description, the case in which the sensor 3 is installed at the predetermined inclination relative to the projection surface F is referred to hereinbelow as a case in which the sensor 3 is installed parallel to the projection surface F. - As illustrated in
FIG. 8A, in a side view, the light detector 31 may only detect a light incident to the light incidence surface 30 from a detection region and does not detect a light incident to the light incidence surface 30 from a dead region. Moreover, as illustrated in FIG. 8B, in a plan view as well, the light detector 31 may similarly only detect the light incident to the light incidence surface 30 from the detection region and does not detect the light incident to the light incidence surface 30 from the dead region. In FIGS. 8A and 8B, the two-dot chain line passing through the light incidence surface 30 illustrates a boundary between the detection region and the dead region of the light detector 31. - That the detection region of the
light detector 31 covers the entirety of the projection region Aa at a uniform height may be an installation condition the projector 1 must satisfy in order to appropriately detect the object U positioned over the projection region Aa. The two notification lines L described above are reference lines for setting the position of the sensor 3 and the projection region Aa so that such an installation condition is satisfied. - As illustrated in
FIG. 9A, the boundary (see FIG. 8A) of the detection region in the side view is made to be parallel to the projection surface F when the sensor 3 is installed parallel to the projection surface F. - As illustrated in
FIGS. 9A and 9B, when the sensor 3 is installed parallel to the projection surface F, an emission direction of each notification light Rb is fixed relative to the sensor 3 so that each notification light Rb emitted toward an endpoint of each notification line L near the sensor 3 passes through a same point on the projection surface F. Moreover, when the sensor 3 is installed parallel to the projection surface F, the emission direction of each notification light Rb may be fixed relative to the sensor 3 so that each notification light Rb emitted toward an endpoint of each notification line L away from the sensor 3 passes along a boundary line between the detection region and the dead region. - By the emission direction of each notification light Rb being fixed in this manner, an intersection Cb of the notification lights Rb is positioned on the projection surface F in the case in which the
sensor 3 is installed parallel to the projection surface F. Because of this, two notification lines L of a line-segment shape contacting each other at an endpoint Vb are projected to the projection surface F. Moreover, in the plan view, the virtual region Ab of the triangular shape formed by the two notification lines L is included in the detection region of the sensor 3. - As illustrated in
FIGS. 10A and 10B, in the case in which the sensor 3 is installed at the first inclination, which is not parallel to the projection surface F due to, for example, a light incidence surface 30 side of the sensor 3 being lifted up by a foreign object 4, the intersection Cb of the notification lights Rb is positioned above the projection surface F. Because of this, the two notification lines L separated from each other are projected on the projection surface F. - As illustrated in
FIGS. 11A and 11B, in the case in which the sensor 3 is installed at the second inclination, which is not parallel to the projection surface F due to, for example, a side of the sensor 3 opposite the light incidence surface 30 being lifted up by the foreign object 4, the intersection Cb of the notification lights Rb is positioned below the projection surface F. Because of this, the two notification lines L intersecting each other are projected on the projection surface F. - The user may determine whether the
sensor 3 is disposed parallel to the projection surface F by whether the two notification lines L projected on the projection surface F are seen contacting each other at the endpoints and may adjust the disposition of the sensor 3 so that the sensor 3 becomes parallel to the projection surface F. - Furthermore, description will be continued for exemplary installation conditions of the
projector 1 according to one or more embodiments of the first example of the present invention will be described. The description hereinbelow will refer to a case in which the sensor 3 is installed parallel to the projection surface F (see FIGS. 9A and 9B). -
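As a side-view illustration of the parallelism check described with FIGS. 9A to 11B, the sketch below computes the height of the crossing point Cb of the two notification lights for a given inclination of the sensor body. The emitter spacing, the height, and the rotation pivot are invented values used only for this illustration:

```python
import math

# Side-view model: two notification beams leave fixed points on the
# sensor body at fixed angles.  With the body parallel to the surface,
# the beams cross exactly on the surface (height 0); tilting the body
# moves the crossing point Cb above the surface (the projected lines
# separate) or below it (the projected lines intersect).
# h, b, and the pivot at the near emitter are assumed values.

def crossing_height(tilt_deg, h=0.05, b=0.1):
    """Height of the beams' crossing point Cb for a given body tilt."""
    aim = math.atan(2 * h / b)      # aim so Cb sits on the surface at tilt 0
    t = math.radians(tilt_deg)
    e1 = (0.0, h)                                  # near emitter (pivot)
    e2 = (b * math.cos(t), h + b * math.sin(t))    # far emitter, lifted
    a1 = -aim + t                                  # both emission directions
    a2 = -(math.pi - aim) + t                      # tilt with the body
    d1 = (math.cos(a1), math.sin(a1))
    d2 = (math.cos(a2), math.sin(a2))
    # Intersect the rays e1 + s*d1 and e2 + u*d2.
    s = ((e2[0] - e1[0]) * d2[1] - (e2[1] - e1[1]) * d2[0]) / (
        d1[0] * d2[1] - d1[1] * d2[0])
    return e1[1] + s * d1[1]

for tilt in (0.0, 10.0, -10.0):
    z = crossing_height(tilt)
    state = "on surface" if abs(z) < 1e-9 else ("above" if z > 0 else "below")
    print(f"tilt {tilt:+.0f} deg -> Cb {state}")
```

A crossing point above the surface corresponds to the separated notification lines of FIGS. 10A and 10B, and one below the surface to the intersecting lines of FIGS. 11A and 11B. -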
FIG. 12 is a plan view illustrating a positional relationship between a projection region and notification lines according to one or more embodiments of the first example of the present invention. Moreover, FIG. 13 is a side view illustrating a case in which the sensor is installed in a position nearest to the projection region. Moreover, FIG. 14 is a side view illustrating a case in which the sensor is installed in a position farthest from the projection region. To facilitate understanding of the main components, in FIG. 12, illustration is omitted for the scanning laser light Ra, the notification light Rb, and the reflected lights thereof, and in FIGS. 13 and 14, illustration is omitted for the scanning laser light Ra and each reflected light. - As illustrated in
FIG. 12, the two notification lines L may be projected substantially axisymmetrically relative to a one-dot chain line X-X. This one-dot chain line X-X passes through a midpoint between an endpoint (referred to hereinbelow as the endpoint Vb) nearest to the sensor 3 and an endpoint (referred to hereinbelow as an endpoint Eb) farthest from the sensor 3; each endpoint is included in the outer periphery of the virtual region Ab formed between the two notification lines L. Moreover, viewed from the normal direction (Z direction) of the projection surface F, the one-dot chain line X-X passes through an orthogonal projection point O to the projection surface F of a center point of the light incidence surface 30. - As illustrated in
FIG. 12, the sensor 3 may be installed on the projection surface F so that the lower edge Ea1 of the projection region Aa is positioned in the virtual region Ab between the two notification lines L. At this time, the lower edge Ea1 of the projection region Aa may be substantially parallel to the edge Eb of the virtual region Ab. - Furthermore, as illustrated in
FIG. 13, the sensor 3 may be brought nearer to the projection region Aa until both endpoints of the lower edge Ea1 of the projection region Aa are substantially in a same position as points Q1, Q2, which are points on the notification lines L nearest to each endpoint. Moreover, as illustrated in FIG. 14, the sensor 3 may be distanced from the projection region Aa until the lower edge Ea1 of the projection region Aa is substantially in a same position as the edge Eb of the virtual region Ab. -
- Here, in
formula 1 described above, a gap (that is, a shortest distance in the Y direction) between one endpoint P1 in the lower edge Ea1 of the projection region Aa and the one-dot chain line X-X is defined as wa1, and a gap between another endpoint P2 and the one-dot chain line X-X is defined as wa2. Moreover, a gap between an endpoint of the upper edge Eb of the virtual region Ab between the two notification lines L and the one-dot chain line X-X is defined as Wb. - Furthermore, in
formula 2 described above, a gap (that is, a shortest distance in the X direction) between the one endpoint P1 in the lower edge Ea1 of the projection region Aa and the edge Eb of the virtual region Ab is defined as ha1, and a gap between the other endpoint P2 and the upper edge Eb2 is defined as ha2. Moreover, a distance in the X direction between the endpoint Vb of the virtual region Ab and the orthogonal projection point O is defined as Hb1, and a distance in the X direction between the edge Eb of the virtual region Ab and the orthogonal projection point O is defined as Hb2. However, 0 < Hb1 < Hb2. - Furthermore, in
formula 3 described above, an angle formed between the one-dot chain line X-X and a virtual line passing through the one endpoint P1 in the lower edge Ea1 of the projection region Aa and the endpoint Vb of the virtual region Ab is defined as θa1, and an angle formed between the one-dot chain line X-X and a virtual line passing through the other endpoint P2 and the endpoint Vb of the virtual region Ab is defined as θa2. Moreover, an angle formed between the one-dot chain line X-X and a virtual line passing through the point Q1 on one notification line L nearest to the endpoint P1 and the endpoint Vb of the virtual region Ab is defined as θb1. Moreover, an angle formed between the one-dot chain line X-X and a virtual line passing through the point Q2 on another notification line L nearest to the endpoint P2 and the endpoint Vb of the virtual region Ab is defined as θb2. - Both endpoints P1, P2 of the lower edge Ea1 of the projection region Aa are away from the outer periphery of the virtual region Ab (that is, 0 < wa1, wa2 < Wb in
formula 1 described above; 0 < ha1, ha2 < (Hb2 - Hb1) in formula 2 described above; and θa1 < θb1 and θa2 < θb2 in formula 3 described above), and the lower edge Ea1 may be positioned symmetrically relative to the one-dot chain line X-X (that is, wa1 = wa2 in formula 1 described above and θa1 = θa2 in formula 3 described above). By configuring in this manner, the sensor 3 may reliably detect the reflected light from the entirety of the projection region Aa because the lower edge Ea1 of the projection region Aa may be more accurately positioned in the virtual region Ab. - Next, a calibration process of the installation position of the
sensor 3 according to one or more embodiments will be described. FIG. 15 is a flowchart for describing a calibration process of an installation position of the sensor in one or more embodiments of the first example of the present invention. When the calibration process in FIG. 15 starts, the scanning laser light Ra is being projected to the projection surface F from the projector device 2, and the notification light Rb is being emitted from the sensor 3. - First, the
sensor 3 detects the reflected light of the scanning laser light Ra reflected at the projection region Aa (step S201). Moreover, based on a detection result of the reflected light, the sensor 3 determines whether reflected lights from each endpoint P1 to P4 (see FIG. 12) of the lower edge Ea1 and of an edge Ea2, which is farthest from the sensor 3, included in the outer periphery of the projection region Aa are detected (step S202). The edge Ea2 of the projection region Aa will be referred to hereinbelow as an upper edge. - If all the reflected lights from the four endpoints P1 to P4 are detected (YES at step S202), the user is informed with a predetermined notification (step S203). For example, the user is informed that the
sensor 3 is installed in the appropriate position. Additionally, the user may also be informed with a guide to an installation method whereby the lower edge Ea1 of the projection region Aa becomes substantially parallel to the lower edge Eb1 (or the upper edge Eb2) of the virtual region Ab between the two notification lines. Then, after the user is informed that the calibration process is completed, the calibration process in FIG. 15 ends. - Furthermore, if the reflected light from any of the four endpoints P1 to P4 is not detected (NO at step S202), whether the reflected lights from both endpoints P1, P2 of the lower edge Ea1 of the projection region Aa are detected is determined (step S204).
- If the reflected lights from both of the two endpoints P1, P2 are detected (YES at step S204), the
sensor 3 further determines whether the reflected light from one endpoint P4 on the upper edge Ea2 of the projection region Aa is detected (step S205). - If the reflected light from the one endpoint P4 is detected (YES at step S205), the user is informed to reinstall the orientation of the
sensor 3 in a direction away from the endpoint P4 (step S206). That is, the user is informed to reinstall the orientation of the sensor 3 in, for example, a counterclockwise manner viewed from the normal direction (Z direction) of the projection surface F in FIG. 12. The user thus informed may reinstall the sensor 3 so that it is at the appropriate position. Then, the calibration process returns to step S201. - Furthermore, if the reflected light from the one endpoint P4 is not detected (NO at step S205), the user is informed to reinstall the orientation of the
sensor 3 in a direction nearer to the endpoint P4 (step S207). That is, the user is informed to reinstall the orientation of the sensor 3 in a clockwise manner viewed from the normal direction of the projection surface F in FIG. 12. The user thus informed may reinstall the sensor 3 so that it is at the appropriate position. Then, the calibration process returns to step S201. - Furthermore, if the reflected lights from both endpoints P1, P2 of the lower edge Ea1 are not detected (NO at step S204), whether the reflected light from one of the two endpoints P1, P2 is detected is determined (step S208).
- If the reflected light from one of the endpoints is detected (YES at step S208), the
sensor 3 determines whether the reflected light from the endpoint P1 of the lower edge Ea1 is detected (step S209). If the reflected light from the endpoint P1 of the lower edge Ea1 is detected (YES at step S209), the user is informed to reinstall the orientation of the sensor 3 in the direction away from the endpoint P1 (that is, counterclockwise in FIG. 12) (step S210). The user thus informed may reinstall the sensor 3 so that it is at the appropriate position. Then, the calibration process returns to step S201. - Furthermore, if the reflected light from the endpoint P1 of the lower edge Ea1 is not detected (NO at step S209), the user is informed to reinstall the orientation of the
sensor 3 in the direction nearer to the endpoint P1 (that is, clockwise in FIG. 12) (step S211). The user thus informed may reinstall the sensor 3 so that it is at the appropriate position. Then, the calibration process returns to step S201. - Furthermore, if the reflected lights from both of the two endpoints P1, P2 are not detected at all in step S208 (NO at step S208), the user is informed to reinstall the
sensor 3 away from the projection region Aa in the X direction (step S212). The user thus informed may reinstall the sensor 3 so that it is at the appropriate position. Then, the calibration process returns to step S201. - The
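guidance logic of this calibration flow may be condensed into the following sketch. The set representation of the detected endpoints and the message strings are assumptions made for the illustration; the step numbers refer to FIG. 15 and the rotation directions to the plan view of FIG. 12:

```python
# Condensed sketch of the flow of FIG. 15 (steps S202-S212): from the
# set of endpoint reflections the sensor detected, choose the
# instruction given to the user.  The set-based interface and message
# texts are illustrative assumptions.

def calibration_advice(detected):
    """detected: subset of {'P1', 'P2', 'P3', 'P4'} whose reflections were seen."""
    if {"P1", "P2", "P3", "P4"} <= detected:                       # YES at S202
        return "sensor installed appropriately"                    # S203
    if {"P1", "P2"} <= detected:                                   # YES at S204
        if "P4" in detected:                                       # YES at S205
            return "rotate sensor away from P4 (counterclockwise)" # S206
        return "rotate sensor toward P4 (clockwise)"               # S207
    if "P1" in detected:                                           # YES at S208, S209
        return "rotate sensor away from P1 (counterclockwise)"     # S210
    if "P2" in detected:                                           # NO at S209
        return "rotate sensor toward P1 (clockwise)"               # S211
    return "move sensor away from the projection region"           # S212

print(calibration_advice({"P1", "P2", "P3", "P4"}))
print(calibration_advice({"P1", "P2"}))
print(calibration_advice(set()))
```

In the flowchart, each non-final branch returns the process to step S201 after the user adjusts the sensor. - The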
sensor 3 in one or more embodiments the first example of the present invention was described above. In step S205 of the calibration process described above, whether the reflected light from the endpoint P3, instead of the reflected light from the endpoint P4, is detected may be determined. It is needless to say that in this situation, processes in steps S206 and S207 are interchanged. That is, in step S206, the user is informed to reinstall the orientation of thesensor 3 in the direction away from the endpoint P3 (that is, clockwise inFIG. 12 ). Moreover, in step S207, the user is informed to reinstall the orientation of thesensor 3 in the direction nearer to the endpoint P3 (that is, counterclockwise inFIG. 12 ). - Furthermore, in step S209 of the calibration process described above, whether the reflected light from the endpoint P2, instead of the reflected light from the endpoint P1, is detected may be determined. It is needless to say that in this situation, processes in steps S210 and S211 are interchanged. That is, in step S210, the user is informed to reinstall the orientation of the
sensor 3 in the direction away from the endpoint P2 (that is, clockwise in FIG. 12). Moreover, in step S211, the user is informed to reinstall the orientation of the sensor 3 in the direction nearer to the endpoint P2 (that is, counterclockwise in FIG. 12). - As above, the
projector 1 according to one or more embodiments of the first example of the present invention comprises the projector device 2 that projects the scanning laser light Ra to the projection region Aa on the projection surface F and the sensor 3 disposed away from the projector device 2. The sensor 3 includes the light detector 31 and the notification light emitter 37. The light detector 31 detects the reflected light of the scanning laser light Ra projected from the projector device 2. The notification light emitter 37 emits to the projection surface F the notification light Rb for indicating the inclination of the sensor 3 relative to the projection surface F and for setting the position of the projection region Aa in the detectable region (not illustrated) where the light detector 31 detects the reflected light of the scanning laser light Ra. - According to one or more embodiments of the above configuration, the
projector 1 may further be made smaller in size because the projector device 2 and the sensor 3 are disposed separately from each other. Moreover, the restriction on the installation location of the projector 1 may also be mitigated. Moreover, the projection region Aa may be set to a desired size according to the relative position of the projector device 2 relative to the projection surface F because the projector device 2 may be disposed in any position without being restricted by the installation position of the sensor 3. Moreover, the disposition of the sensor 3 may be adjusted based on the notification light Rb emitted from the notification light emitter 37 to set the projection region Aa in the detectable region (not illustrated) of the light detector 31. Therefore, the sensor 3 may detect the reflected light of the scanning laser light Ra projected from the projector device 2 even when the projector device 2 and the sensor 3 are disposed separately from each other. - Furthermore, in the
projector 1 according to one or more embodiments of the first example of the present invention, the notification light Rb projects the notification lines L on the projection surface F. These notification lines L are projection images for setting the position of at least the lower edge Ea1, which is included in the outer periphery of the projection region Aa and is nearest to the sensor 3, in the detectable region (not illustrated). - By configuring in this manner, the position of the projection region Aa may be set in the detectable region (not illustrated) of the
sensor 3 by setting the position of at least the lower edge Ea1 of the projection region Aa, which is nearest to the sensor 3, based on the projection images (notification lines L) of the notification light Rb. - Furthermore, in the
projector 1 according to one or more embodiments of the first example of the present invention, the sensor 3 further includes the region determinator 332 that determines, based on the detection result of the light detector 31, whether a portion or the entirety of the projection region Aa is outside the detectable region. - The reflected light of the scanning laser light Ra projected from the
projector device 2 to the projection region Aa, for example, is incident to the light detector 31. Therefore, the region determinator 332 may be made to determine whether a portion or the entirety of the projection region Aa is outside the detectable region based on the detection result of the light detector 31 that detects this reflected light. - Furthermore, in the
projector 1 according to one or more embodiments of the first example of the present invention, the region determinator 332 determines whether both endpoints P1, P2 of at least the lower edge Ea1, which is included in the outer periphery of the projection region Aa and is nearest to the sensor 3, are in the detectable region. - When the projection region Aa is in the detectable region of the
light detector 31, both endpoints P1, P2 of at least the lower edge Ea1 of the projection region Aa, which is nearest to the sensor 3, are in the detectable region. Moreover, in this situation, both endpoints P3, P4 of the upper edge Ea2, which is farthest from the sensor 3, are often also in the detectable region. Conversely, even if both endpoints P3, P4 of the upper edge Ea2, which is farthest from the sensor 3, are in the detectable region, there is a high possibility that both endpoints P1, P2 of the lower edge Ea1, which is nearest to the sensor 3, are not in the detectable region. Therefore, whether the projection region Aa is in the detectable region is more reliably and easily determined by determining whether at least both endpoints P1, P2 are in the detectable region. - Furthermore, in the
projector 1 according to one or more embodiments of the first example of the present invention, the sensor 3 further includes the information output device 36 that informs the user based on the determination result of the region determinator 332. - By configuring in this manner, the user may be informed with various notifications relating to the relationship between the detectable region (not illustrated) of the
light detector 31 and the projection region Aa based on the determination result of the region determinator 332. - Furthermore, in the
projector 1 according to one or more embodiments of the first example of the present invention, the information output device 36, when the region determinator 332 determines that at least a portion of the projection region Aa is outside the detectable region, outputs a correction method for setting the projection region Aa in the detectable region. The output correction method is a method for correcting the installation position and/or the orientation of the sensor 3. - By configuring the projector in this manner, the user may be informed with the correction method for setting the projection region Aa in the detectable region when at least a portion of the projection region Aa is determined to be outside the detectable region of the
light detector 31. Therefore, the user may correct the installation position and/or the orientation of the sensor 3 based on the informed correction method. Thus, the user may conveniently set the entirety of the projection region Aa in the detectable region. - Next, one or more embodiments of a second example of the present invention will be described.
-
FIG. 16 is an external perspective view of the projector according to one or more embodiments of the second example of the present invention. In FIG. 16, to facilitate understanding of the main portions, illustration is omitted for the scanning laser light Ra, the notification light Rb, and the reflected lights thereof. - As illustrated in
FIG. 16, in one or more embodiments of the second example of the present invention, a regulatory light Rc for regulating the projection region Aa is projected from the projector device 2, and projection images thereof (spots S1 to S4) are projected to the projection surface F. The sensor 3 detects a reflected light of the regulatory light Rc reflected at each spot S1 to S4 and informs the user based on a detection result thereof. Then, the user installs the projector 1 while referring to the two spots S1, S2 positioned near both ends of the lower edge Ea1 of the projection region Aa and to the two notification lines L (virtual region Ab). Other than this, one or more embodiments of the second example of the present invention are similar to the first example of the present invention. Hereinbelow, same reference numbers will be used in configurations similar to those in the first example, and the description thereof will be omitted. -
FIG. 17 is a block diagram illustrating a configuration of the projector according to one or more embodiments of the second example of the present invention. The projector device 2 may further comprise a regulatory light emitter 25. This regulatory light emitter 25 may include a light source 251 that emits the regulatory light Rc of a wavelength in the visible light region and projects on the projection surface F a projection image configured including the four spots S1 to S4. While the light source 251 of the regulatory light Rc is not particularly limited, examples thereof include a light-emitting element that emits light in the visible light region, such as a laser diode element or an LED, and a discharge tube. - Next, exemplary installation conditions of the
projector 1 according to one or more embodiments of the second example of the present invention will be described. FIG. 18 is a plan view illustrating the positional relationship between the projection region and the notification lines in one or more embodiments of the second example of the present invention. In FIG. 18, to facilitate understanding of the main portions, illustration is omitted for the scanning laser light Ra, the notification light Rb, the regulatory light Rc, and the reflected lights thereof. - As illustrated in
FIG. 18, the sensor 3 may be installed on the projection surface F so that the spots S1, S2 are positioned in the virtual region Ab between the two notification lines L. At this time, a virtual line segment connecting the spots S1, S2 may be substantially parallel to the lower edge Eb1 and the upper edge Eb2 of the virtual region Ab. - Furthermore, as illustrated in
FIG. 18, the four spots S1 to S4 projected by the regulatory light emitter 25 may be respectively projected to regions near the four endpoints P1 to P4 of the projection region Aa. A center point of each spot S1 to S4 may be separated from the corresponding endpoint P1 to P4 of the projection region Aa by a distance m (m ≥ 0). In FIG. 18, the gaps between each center point of the spots S1 to S4 and each endpoint P1 to P4 (that is, the minimum distance between the two) are the same but may be different. Moreover, the shape of each spot S1 to S4 is not limited in particular. Each shape may be, for example, circular, as in FIG. 18; polygonal (n-angle shaped: n is a positive integer not less than 3); or cross-shaped. Moreover, the shape of each spot S1 to S4 may all be the same, or at least one shape may be different. - These spots S1 to S4 are used to set the position of the projection region Aa. In one or more embodiments of the second example of the present invention, the
sensor 3 may detect the reflected light from the projection region Aa if the spots S1, S2 near both endpoints P1, P2 of the lower edge Ea1 of the projection region Aa are in the virtual region Ab between the two notification lines L. Because of this, the user may install the projector 1 (in particular, the sensor 3) in the appropriate position while referring to the two spots S1, S2 and the two notification lines L (virtual region Ab). - These four spots S1 to S4 may be projected only in an installation period of the projector 1 (for example, during the calibration process of the installation position of the sensor 3) but may also be projected in periods other than this or may be continuously projected. Moreover, projection and projection stopping of the four spots S1 to S4 may be switched according to an operation input by the user accepted by the
operator 35. -
- Here, in
formula 4 described above, a gap (that is, a shortest distance in the Y direction) between the center point of the spot S1 and the one-dot chain line X-X is defined as wa3, and a gap between the center point of the spot S2 and the one-dot chain line X-X is defined as wa4. Moreover, the gap between the endpoint of the edge Eb of the virtual region Ab between the two notification lines L and the one-dot chain line X-X is defined as Wb. - Moreover, in formula 5 described above, a gap (that is, a shortest distance in the X direction) between the center point of the spot S1 and the edge Eb of the virtual region Ab is defined as ha3, and a gap between the center point of the spot S2 and the upper edge Eb2 is defined as ha4. Moreover, the distance in the X direction between the endpoint Vb of the virtual region Ab and the orthogonal projection point O is defined as Hb1, and the distance in the X direction between the edge Eb of the virtual region Ab and the orthogonal projection point O is defined as Hb2. However, 0 < Hb1 < Hb2.
- Furthermore, in formula 6 described above, an angle formed by a virtual line passing through the center point of the spot S1 and the endpoint Vb of the virtual region Ab and the one-dot chain line X-X is defined as θa3, and an angle formed by a virtual line passing through the center point of the spot S2 and the endpoint Vb of the virtual region Ab and the one-dot chain line X-X is defined as θa4. Moreover, an angle formed by a virtual line passing through a point Q3 on the notification line L nearest to the spot S1 and the endpoint Vb of the virtual region Ab and the one-dot chain line X-X is defined as θb3. Moreover, an angle formed by a virtual line passing through a point Q4 on the notification line L nearest to the spot S2 and the endpoint Vb of the virtual region Ab and the one-dot chain line X-X is defined as θb4.
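The conditions that formulas 4 to 6 impose on the quantities defined above can be collected into a small check. The following is an illustrative sketch only; the function name and argument layout are assumptions for explanation, not part of the disclosure:

```python
# Illustrative check of the conditions of formulas 4 to 6 described above.
# Variable names mirror the symbols in the text (wa3, wa4, Wb, ha3, ha4,
# Hb1, Hb2, theta_*); the function itself is a hypothetical sketch.

def spots_in_virtual_region(wa3, wa4, Wb, ha3, ha4, Hb1, Hb2,
                            theta_a3, theta_a4, theta_b3, theta_b4):
    """Return True if the spots S1, S2 lie in the virtual region Ab."""
    cond_4 = 0 < wa3 < Wb and 0 < wa4 < Wb                    # formula 4 (Y direction)
    cond_5 = 0 < ha3 < (Hb2 - Hb1) and 0 < ha4 < (Hb2 - Hb1)  # formula 5 (X direction)
    cond_6 = theta_a3 < theta_b3 and theta_a4 < theta_b4      # formula 6 (angles at Vb)
    return cond_4 and cond_5 and cond_6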
- Both endpoints P1, P2 of the lower edge Ea1 of the projection region Aa are away from the outer periphery of the virtual region Ab (that is, 0 < wa3, wa4 < Wb in formula 4 described above; 0 < ha3, ha4 < (Hb2 - Hb1) in formula 5 described above; and θa3 < θb3 and θa4 < θb4 in formula 6 described above). Moreover, the spots S1, S2 may be positioned symmetrically about the one-dot chain line X-X (that is, wa3 = wa4 in formula 4 described above and θa3 = θa4 in formula 6 described above). By configuring in this manner, the sensor 3 may reliably detect the reflected light from the entirety of the projection region Aa because the spots S1, S2 may be more accurately positioned in the virtual region Ab. - Next, a calibration process of the installation position of the
sensor 3 according to one or more embodiments will be specifically described. FIG. 19 is a flowchart for describing the calibration process of the installation position of the sensor in one or more embodiments of the second example of the present invention. When the calibration process in FIG. 19 starts, the regulatory light Rc is being projected to the projection surface F from the projector device 2 and the notification light Rb is being emitted from the sensor 3. - First, the
sensor 3 detects the reflected light of the regulatory light Rc reflected at the projection region Aa (step S301). Moreover, the sensor 3, based on the detection result of the reflected light, determines whether reflected lights from the four spots S1 to S4 are all detected (step S302). - If the reflected lights from the four spots S1 to S4 are all detected (YES at step S302), the user is informed with the predetermined notification (step S303). For example, the user is informed that the
sensor 3 is installed in the appropriate position. Additionally, the user may also be informed with a guide to an installation method where the virtual line segment connecting the spots S1, S2 (that is, the lower edge Ea1 of the projection region Aa) becomes substantially parallel to the upper edge Eb2 of the virtual region Ab between the two notification lines L. Then, after the user is informed that the calibration process is completed, the calibration process in FIG. 19 ends. - Furthermore, if the reflected lights from the four spots S1 to S4 are not all detected (NO at step S302), whether the reflected lights from both spots S1, S2 on a lower edge Ea1 side of the projection region Aa are detected is determined (step S304).
- If the reflected lights from both of the two spots S1, S2 are detected (YES at step S304), the
sensor 3 further determines whether the reflected light from one spot S4 on an upper edge Ea2 side of the projection region Aa is detected (step S305). - If the reflected light from the one spot S4 is detected (YES at step S305), the user is informed to reinstall the orientation of the
sensor 3 in a direction away from the spot S4 (step S306). That is, the user is informed to reinstall the orientation of the sensor 3 in a counterclockwise manner viewed from the normal direction (Z direction) of the projection surface F in FIG. 18. The user thus informed may reinstall so that the position of the sensor 3 is at the appropriate position. Then, the calibration process returns to step S301. - If the reflected light from the one spot S4 is not detected (NO at step S305), the user is informed to reinstall the orientation of the
sensor 3 in a direction nearer to the spot S4 (step S307). That is, the user is informed to reinstall the orientation of the sensor 3 in a clockwise manner viewed from the normal direction of the projection surface F in FIG. 18. The user thus informed may reinstall so that the position of the sensor 3 is at the appropriate position. Then, the calibration process returns to step S301. - Furthermore, if the reflected lights from the two spots S1, S2 on the lower edge Ea1 side are not both detected (NO at step S304), whether the reflected light from either one of the two spots S1, S2 is detected is determined (step S308).
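The determinations of the calibration process in FIG. 19, steps S301 to S312, can be gathered into a single decision function. This is a hedged sketch: the boolean flags stand in for the sensor's per-spot detection results, and the returned strings paraphrase the notifications rather than quote the source:

```python
# Sketch of the branching in FIG. 19. Flags s1..s4 are True when the
# reflected light from the corresponding spot S1..S4 is detected; the
# returned strings paraphrase the notifications of steps S303 to S312.

def calibration_notification(s1, s2, s3, s4):
    if s1 and s2 and s3 and s4:                       # YES at step S302
        return "installed at appropriate position"    # step S303
    if s1 and s2:                                     # YES at step S304
        if s4:                                        # YES at step S305
            return "rotate away from S4 (counterclockwise)"  # step S306
        return "rotate toward S4 (clockwise)"                # step S307
    if s1 or s2:                                      # YES at step S308
        if s1:                                        # YES at step S309
            return "rotate away from S1 (counterclockwise)"  # step S310
        return "rotate toward S1 (clockwise)"                # step S311
    return "move away from projection region in X direction"  # step S312
```

In the actual process, each notification is followed by a return to step S301 until all four spots are detected.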
- If the reflected light from one of the spots is detected (YES at step S308), the
sensor 3 determines whether the reflected light from the spot S1 on the lower edge Ea1 side is detected (step S309). If the reflected light from the spot S1 is detected (YES at step S309), the user is informed to reinstall the orientation of the sensor 3 in the direction away from the spot S1 (that is, counterclockwise in FIG. 18) (step S310). The user thus informed may reinstall so that the position of the sensor 3 is at the appropriate position. Then, the calibration process returns to step S301. - Furthermore, if the reflected light from the spot S1 on the lower edge Ea1 side is not detected (NO at step S309), the user is informed to reinstall the orientation of the
sensor 3 in a direction nearer to the spot S1 (that is, clockwise in FIG. 18) (step S311). The user thus informed may reinstall so that the position of the sensor 3 is at the appropriate position. Then, the calibration process returns to step S301. - Furthermore, if neither of the reflected lights from the two spots S1, S2 is detected in step S308 (NO at step S308), the user is informed to reinstall the
sensor 3 away from the projection region Aa in the X direction (step S312). The user thus informed may reinstall so that the position of the sensor 3 is at the appropriate position. Then, the calibration process returns to step S301. - The calibration process of the installation position of the
sensor 3 in one or more embodiments of the second example of the present invention was described above. In step S305 of the calibration process described above, whether the reflected light from the spot S3, instead of the reflected light from the spot S4, is detected may be determined. It is needless to say that in this situation the processes in steps S306 and S307 are interchanged. That is, in step S306, the user is informed to reinstall the orientation of the sensor 3 in a direction away from the spot S3 (that is, clockwise in FIG. 18). Moreover, in step S307, the user is informed to reinstall the orientation of the sensor 3 in a direction nearer to the spot S3 (that is, counterclockwise in FIG. 18). - Furthermore, in step S309 of the calibration process described above, whether the reflected light from the spot S2, instead of the reflected light from the spot S1, is detected may be determined. It is needless to say that in this situation the processes in steps S310 and S311 are interchanged. That is, in step S310, the user is informed to reinstall the orientation of the
sensor 3 in a direction away from the spot S2 (that is, clockwise in FIG. 18). Moreover, in step S311, the user is informed to reinstall the orientation of the sensor 3 in a direction nearer to the spot S2 (that is, counterclockwise in FIG. 18). - As described above, the
projector 1 according to one or more embodiments of the second example of the present invention may comprise the projector device 2 including the laser light projector 21 that projects the scanning laser light Ra and the regulatory light emitter 25 that emits the regulatory light Rc for regulating the projection region Aa. - By configuring the
projector 1 in this manner, the position of the projection region Aa may be set in the detectable region of the light detector 31 based on the regulatory light Rc (or the spots S1 to S4) for regulating the projection region Aa and the notification light Rb (or the notification lines L). - Next, one or more embodiments of a third example of the present invention will be described. In the third example, the
sensor 3 may comprise a mechanism that automatically adjusts the installation position and/or the orientation of the sensor 3 based on the detection result of the reflected light by the light detector 31. Other than this, the third example may be similar to one or more embodiments of the first or second examples of the present invention. Hereinbelow, the same reference numbers will be used in configurations similar to those of the first and second examples, and the description thereof will be omitted. -
FIG. 20 is a block diagram illustrating the configuration of the sensor according to one or more embodiments of the third example of the present invention. As illustrated in FIG. 20, the sensor 3 may further comprise an actuator 38. Moreover, the second controller 33 may further comprise a position determinator 333. The position determinator 333 may be realized as an electrical circuit or may be a functional unit of the second controller 33 realized by using the program and the control information stored in the second memory 32. - The position determinator 333 determines the relative position of the projection region Aa relative to the detectable region of the
light detector 31 based, for example, on the detection result by the light detector 31 of each reflected light (for example, the reflected lights of the scanning laser light Ra and the regulatory light Rc) from the projection surface F. Moreover, the position determinator 333 may, based on a determination result thereof, calculate position correction amounts of the installation position and/or the orientation of the sensor 3 for positioning the projection region Aa in the detectable region of the light detector 31. In this situation, the position determinator 333 may calculate position correction amounts where the lower edge Ea1 of the projection region Aa is substantially parallel to the upper edge Eb2 in the virtual region Ab between the two notification lines L. - The
actuator 38 may be a driver that automatically adjusts the installation position and/or the orientation of the sensor 3. The actuator 38 drives the sensor 3 in the X direction, the Y direction, the Z direction, and rotation directions with each direction as axes thereof based on the determination result of the position determinator 333. For example, the actuator 38 drives the sensor 3 so that the projection region Aa enters the detectable region based on the relative position of the projection region Aa determined by the position determinator 333. Alternatively, the sensor 3 may be driven based on the position correction amounts calculated by the position determinator 333. By configuring in this manner, there is no need for the user to reinstall the sensor 3 by hand. Therefore, the installation position and/or the orientation of the sensor 3 may be conveniently calibrated. The actuator 38 may drive the sensor 3 according to an operation input by the user accepted by the operator 35. - As described above, in one or more embodiments of the third example of the present invention, the
sensor 3 may further include the position determinator 333 and the actuator 38. The position determinator 333 determines, based on the detection result of the light detector 31, the relative position of the projection region Aa relative to the detectable region. The actuator 38 adjusts the installation position and/or the orientation of the sensor 3 based on the determination result of the position determinator 333. - By configuring in this manner, in the
sensor 3, the installation position and/or the orientation of the sensor 3 may be adjusted by the actuator 38 based on the determination result of the position determinator 333. Therefore, the entirety of the projection region Aa may be automatically set in the detectable region without the user reinstalling the sensor 3 by hand. Moreover, even if the projection region Aa moves outside the detectable region after installation of each device, the projection region Aa may be automatically set in the detectable region again. - Next, a modified example of one or more embodiments of the third example of the present invention will be described. In the modified example of the third example, the
sensor 3 may include a mechanism that detects the inclination of the sensor 3 and automatically adjusts the installation position and/or the orientation of the sensor 3 based on the detection result. Other than this, the modified example of the third example may be similar to one or more embodiments of the first or second examples of the present invention. Hereinbelow, the same reference numbers will be used in configurations similar to those in one or more embodiments of the first or second examples of the present invention, and the description thereof will be omitted. -
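The interplay between the position determinator 333 and the actuator 38 described above amounts to a simple feedback loop. The following sketch is purely illustrative; `measure` and `drive` are hypothetical stand-ins for the determinator and the actuator, not APIs from the disclosure:

```python
# Hedged sketch of the automatic adjustment in the third example.
# `measure()` returns a correction (dx, dy, dtheta) while any part of the
# projection region Aa is outside the detectable region, and None once the
# entire region is detectable; `drive()` applies the correction.

def auto_calibrate(measure, drive, max_steps=100):
    for _ in range(max_steps):
        correction = measure()
        if correction is None:       # projection region fully detectable
            return True
        drive(*correction)           # actuator moves/rotates the sensor
    return False                     # did not converge within max_steps
```

The bound on iterations is a defensive choice of this sketch; the described device simply keeps the region detectable once each device is installed.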
FIG. 21 is a block diagram illustrating the configuration of the sensor according to the modified example of one or more embodiments of the third example of the present invention. As illustrated in FIG. 21, the sensor 3 may include the actuator 38 and an inclination detector 39. Moreover, the second controller 33 may include an inclination determinator 334. The inclination determinator 334 may be realized as an electrical circuit or may be a functional unit of the second controller 33 realized by using the program and the control information stored in the second memory 32. - The
inclination detector 39 may be, for example, a sensor that detects a change in an attitude, such as a gyro sensor, and detects a change amount in the inclination of the sensor 3. - The
inclination determinator 334 determines a shift amount from an origin point of the inclination of the sensor 3 based on a detection result by the inclination detector 39 of the change amount in the inclination, the origin point being the inclination when the sensor 3 is installed parallel to the projection surface F. The inclination determinator 334 may recognize the origin point according to, for example, an operation input by the user accepted by the operator 35. Moreover, the inclination determinator 334 may determine the shift amount by adding up, from the origin point, the change amounts in the inclination detected by the inclination detector 39. Moreover, the inclination determinator 334 may calculate the position correction amounts of the installation position and/or the orientation of the sensor 3 for positioning the sensor 3 parallel to the projection surface F based on the determination result of the shift amount. - The
actuator 38 may automatically adjust the installation position and/or the orientation of the sensor 3. The actuator 38 may drive the sensor 3 in the rotation direction with the axis thereof in the Y direction based on the determination result of the inclination determinator 334. For example, the actuator 38 drives the sensor 3 so that the sensor 3 becomes parallel to the projection surface F based on the shift amount of the inclination of the sensor 3 determined by the inclination determinator 334. Alternatively, the sensor 3 may be driven based on the position correction amounts calculated by the inclination determinator 334. By configuring in this manner, there is no need for the user to reinstall the sensor 3 by hand. Therefore, the inclination of the sensor 3 may be conveniently calibrated. - As described above, in the modified example of one or more embodiments of the third example of the present invention, the
sensor 3 may include the inclination determinator 334, the inclination detector 39, and the actuator 38. The inclination determinator 334 determines the shift amount of the inclination of the sensor 3 from an origin point direction parallel to the projection surface F based on the detection result of the inclination detector 39. The actuator 38 adjusts the orientation of the sensor 3 based on the determination result of the inclination determinator 334. - By configuring in this manner, in the
sensor 3, the orientation of the sensor 3 is adjusted by the actuator 38 based on the determination result of the inclination determinator 334. Therefore, the sensor 3 may automatically be set parallel to the projection surface F without the user reinstalling the sensor 3 by hand. Moreover, the sensor 3 may be installed parallel to the projection surface F even if the inclination of the sensor 3 changes after installation of the sensor 3. - The
actuator 38 is described above as automatically adjusting the installation position and/or the orientation of the sensor 3, but is not limited to this. For example, the actuator 38 may automatically adjust an orientation of the light detector 31 relative to the sensor 3. According to such a configuration, the sensor 3, even when inclined from the projection surface F, may adjust the orientation of the light detector 31 so that the detection region (see FIG. 8A) of the light detector 31 covers the entirety of the projection region Aa at a uniform height. - Next, one or more embodiments of a fourth example of the present invention will be described. In the fourth example, the
projector device 2 may include a mechanism that automatically adjusts an installation position and an orientation of the projector device 2 based on the detection result of the light detector 31. Moreover, the position of the projection region Aa relative to the detectable region (not illustrated) of the sensor 3 may be adjusted by adjusting the installation position and/or the orientation of the projector device 2. Other than this, the fourth example may be similar to the first or second examples. Hereinbelow, the same reference numbers will be used in configurations similar to those in the first or second example, and the description thereof will be omitted. -
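Returning briefly to the inclination adjustment of the modified third example, the shift-amount determination of the inclination determinator 334 reduces to accumulating the detector's change amounts from a stored origin point. The class below is a hypothetical sketch of that bookkeeping, not a structure from the disclosure:

```python
# Sketch of the shift-amount determination described for the inclination
# determinator 334: change amounts reported by the inclination detector 39
# (e.g., a gyro sensor) are summed from the origin point, and the negated
# sum is the rotation correction that makes the sensor parallel again.

class InclinationTracker:
    def __init__(self):
        self.shift = 0.0             # accumulated shift from the origin

    def set_origin(self):
        """Recognize the current attitude as the origin (user operation)."""
        self.shift = 0.0

    def add_change(self, delta):
        """Accumulate a change amount from the inclination detector."""
        self.shift += delta

    def correction(self):
        """Rotation (about the Y axis) needed to return to the origin."""
        return -self.shift
```

The correction value would then be handed to the actuator 38 to drive the sensor parallel to the projection surface F.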
FIG. 22 is a block diagram illustrating the configuration of the projector device according to the fourth example. As illustrated in FIG. 22, the projector device 2 may include an actuator 26 and a regulatory light emitter 25 of a similar configuration to that of the second example. Moreover, the first controller 23 may include a position determinator 231. The position determinator 231 may be realized as an electrical circuit or may be a functional unit of the first controller 23 realized by using the program and the control information stored in the first memory 22. - The position determinator 231 may determine, based on the detection result of the
light detector 31 the first communicator 24 receives from the sensor 3, the relative position of the projection region Aa relative to the detectable region. Moreover, the position determinator 231 may, based on a determination result thereof, calculate the position correction amounts of the installation position and/or the orientation of the projector device 2 for positioning the projection region Aa in the detectable region of the light detector 31. In this situation, the position determinator 231 may calculate the position correction amounts where the lower edge Ea1 of the projection region Aa is substantially parallel to the upper edge Eb2 in the virtual region Ab between the two notification lines L. - The
actuator 26 may automatically adjust the installation position and/or the orientation of the projector device 2. The actuator 26 may drive the projector device 2 in the X direction, the Y direction, the Z direction, and rotation directions with each direction as axes thereof based on the determination result of the position determinator 231. For example, the actuator 26 drives the projector device 2 so that the projection region Aa enters the detectable region based on the relative position of the projection region Aa determined by the position determinator 231. Alternatively, the projector device 2 may be driven based on the position correction amounts calculated by the position determinator 231. By configuring in this manner, there is no need for the user to reinstall the projector device 2 by hand. Therefore, the installation position and/or the orientation of the projector device 2 may be conveniently calibrated. The actuator 26 may drive the projector device 2 according to an operation input by the user accepted by the operator 35. - As described above, in the fourth example, the
projector device 2 may include the position determinator 231 and the actuator 26. The position determinator 231 determines, based on the detection result of the light detector 31, the relative position of the projection region Aa relative to the detectable region. The actuator 26 adjusts the installation position and/or the orientation of the projector device 2 based on the determination result of the position determinator 231. - By configuring in this manner, in the
projector device 2, the installation position and the orientation of the projector device 2 are adjusted by the actuator 26 based on the determination result of the position determinator 231. Therefore, the entirety of the projection region Aa may be automatically set in the detectable region without the user reinstalling the projector device 2 by hand. Moreover, the projection region Aa may be automatically set in the detectable region even if the projection region Aa moves outside the detectable region after installation of each device. - In the fourth example, only the
projector device 2 is of a configuration (see FIG. 22) comprising the position determinator 231 and the actuator 26, but the sensor 3 may also be of a configuration (see FIG. 20) including the position determinator 333 and the actuator 38, similar to the third example. By configuring in this manner, the installation positions and the orientations of both the projector device 2 and the sensor 3 may be automatically adjusted. Moreover, at least either the installation positions or the orientations of the projector device 2 and the sensor 3 may be adjusted. - In this manner, in the modified example of the fourth example, the
projector device 2 may include the position determinator 231 and the actuator 26. Moreover, the sensor 3 may include the position determinator 333 and the actuator 38. Each of the position determinators 231, 333 determines, based on the detection result of the light detector 31, the relative position of the projection region Aa relative to the detectable region. The actuator 26 adjusts at least either the installation position or the orientation of the projector device 2 based on the determination result of the position determinator 231. Moreover, the actuator 38 adjusts at least either the installation position or the orientation of the sensor 3 based on the determination result of the position determinator 333. - By configuring in this manner, at least either the installation positions or the orientations of the
projector device 2 and the sensor 3 are adjusted by the actuators 26, 38 based on the determination results of the position determinators 231, 333. Therefore, the entirety of the projection region Aa may be automatically set in the detectable region without the user reinstalling the projector device 2 and the sensor 3 by hand. Moreover, the projection region Aa may be automatically set in the detectable region even if the projection region Aa moves outside the detectable region after installation of each device. - Next, one or more embodiments of a fifth example of the present invention will be described. In the first to fourth examples, the notification light Rb projects (see
FIG. 1) the notification lines L, which are projection images of a line-segment shape, to the projection surface F, but in the fifth example, the notification light Rb projects notification points S, which are projection images of a spot shape, to the projection surface F. -
FIG. 23 is an external perspective view of the sensor 3 according to one or more embodiments of the fifth example. Moreover, FIG. 24 is a side view illustrating the case in which the sensor 3 is installed at the predetermined inclination relative to the projection surface F. Moreover, FIG. 25 is a side view illustrating the case in which the sensor 3 is installed at the first inclination, which is different from the predetermined inclination, relative to the projection surface F. Moreover, FIG. 26 is a side view illustrating the case in which the sensor 3 is installed at the second inclination, which is different from the predetermined inclination, relative to the projection surface F. To facilitate the description, the case in which the sensor 3 is installed at the predetermined inclination relative to the projection surface F is referred to hereinbelow as the case in which the sensor 3 is installed parallel to the projection surface F. - As illustrated in
FIG. 23, in the fifth example, as one example, the notification light emitters 37 are disposed lined up in the Z direction. Moreover, as illustrated in FIG. 24, in the case in which the sensor 3 is installed parallel to the projection surface F, the emission direction of each notification light Rb is fixed relative to the sensor 3 so that each notification light Rb passes through the same point on the projection surface F. - By the emission direction of each notification light Rb being fixed in this manner, the intersection Cb of the notification lights Rb is positioned on the projection surface F in the case in which the
sensor 3 is installed parallel to the projection surface F. Because of this, the notification points S of the spot shape that overlap each other are projected to the projection surface F. - As illustrated in
FIG. 25, when the sensor 3 is installed at the first inclination that is not parallel to the projection surface F due to, for example, the light incidence surface 30 side of the sensor 3 being lifted up by the foreign object 4, the intersection Cb of the notification lights Rb is positioned above the projection surface F. Because of this, two notification points S separated from each other are projected on the projection surface F. - As illustrated in
FIG. 26, when the sensor 3 is installed at the second inclination that is not parallel to the projection surface F due to, for example, the side of the sensor 3 opposite the light incidence surface 30 being lifted up by the foreign object 4, the intersection Cb of the notification lights Rb is positioned below the projection surface F. Because of this, two notification points S separated from each other are projected on the projection surface F. - The user may determine whether the
sensor 3 is disposed parallel to the projection surface F by whether the two notification points S projected on the projection surface F overlap each other so as to appear to be one point, and may adjust the disposal of the sensor 3 so that the sensor 3 becomes parallel to the projection surface F. - An example where the
notification light emitter 37 is disposed lined up in the Z direction is described above, but, for example, the notification points S may instead be projected to four positions corresponding to both ends of each of the two notification lines L described in the first example by emitting the notification light Rb from the notification light emitter 37 disposed lined up in the Y direction. In the case in which the sensor 3 is disposed parallel to the projection surface F, the endpoints of the notification lines L near the sensor 3 are positioned at the same point on the projection surface F (see FIG. 9A and FIG. 9B). When the disposal of the sensor 3 is adjusted so that the notification points S projected to the positions of these endpoints overlap each other so as to appear to be one point, the overlapping notification points S indicate the endpoint Vb of the virtual region Ab and the other two notification points S indicate both ends of the edge Eb of the virtual region Ab; these notification points S may therefore be used as reference points for regulating the position of the projection region Aa (see FIG. 12). - Embodiments of the present invention were described above. The embodiments described above are examples; various modifications to each component and to combinations of the processes thereof are possible, and such modifications are understood by those skilled in the art to be within the scope of the present invention.
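The parallelism check of FIGS. 24 to 26 can be illustrated with a small geometric sketch: two beams whose directions are fixed relative to the sensor, both aimed at one common surface point, strike the same spot only when the sensor is parallel to the surface. The emitter heights, target distance, and coordinate frame below are illustrative assumptions, not values from the specification.

```python
import math

def spot_positions(h1, h2, d, tilt_deg):
    """x-coordinates where two notification beams hit the projection
    surface (the line y = 0).

    The beams leave the sensor at heights h1 and h2 (h1 != h2) with
    directions fixed relative to the sensor, both aimed at the common
    point (d, 0) for the parallel installation.  Tilting the sensor by
    tilt_deg rotates emitters and beam directions together about the
    sensor's base at the origin.
    """
    t = math.radians(tilt_deg)
    hits = []
    for h in (h1, h2):
        # Emitter (0, h) and beam direction (d, -h), rotated by the tilt.
        ex = -h * math.sin(t)
        ey = h * math.cos(t)
        dx = d * math.cos(t) + h * math.sin(t)
        dy = d * math.sin(t) - h * math.cos(t)
        s = -ey / dy              # ray parameter at which y reaches 0
        hits.append(ex + s * dx)
    return hits

# Parallel installation: the two spots coincide and appear as one point.
a, b = spot_positions(0.2, 0.4, 1.0, 0.0)
assert abs(a - b) < 1e-9

# Tilted installation: the spots separate into two visible points.
a, b = spot_positions(0.2, 0.4, 1.0, 3.0)
assert abs(a - b) > 1e-3
```

The sign of the separation distinguishes the two tilt cases: when the light incidence side is lifted (FIG. 25) the beams cross above the surface, and when the opposite side is lifted (FIG. 26) they cross below it.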
- Furthermore, in the first to fourth examples described above, the shape of the notification lines L is a solid line, but the scope of application of the present invention is not limited to this example. The shape of the notification lines L is not limited in particular and may be, for example, a dashed line, a one-dot chain line, or a two-dot chain line, or may be a dotted line configured by a plurality of spots lined up in series. Moreover, instead of the notification lines L, a triangular projection image corresponding to the virtual region Ab may be projected by projecting the notification light Rb.
- Furthermore, in the first to fourth examples described above, the two notification lines L are used to regulate the position of the lower edge Ea1 of the projection region Aa, but the scope of application of the present invention is not limited to this example. The two notification lines L may be used to regulate a position of the entirety of the projection region Aa. That is, the
sensor 3 may be configured to detect the reflected light from the projection region Aa and the object U positioned thereover when a portion or the entirety of the projection region Aa is positioned in the virtual region Ab between the two notification lines L. - Furthermore, in the first to third examples described above, the installation position and the orientation of the
sensor 3 are calibrated, and in the fourth example described above, the installation position and the orientation of the projector device 2 are calibrated, but the scope of application of the present invention is not limited to these examples. The installation position and the orientation of only one of the projector device 2 and the sensor 3 may be calibrated. Moreover, at least either the installation positions or the orientations of both the projector device 2 and the sensor 3 may be adjusted. - Furthermore, in the first to fifth examples described above, the
information output device 36 may output the correction method for setting the projection region Aa in the detectable region when at least a portion of the projection region Aa is determined to be outside the detectable region. In this situation, the correction method is a method for correcting at least either the installation position or the orientation of at least one device from among the projector device 2 and the sensor 3. With this configuration, the user may be informed of the correction method for setting the projection region Aa in the detectable region when at least a portion of the projection region Aa is determined to be outside the detectable region of the light detector 31. The user may then correct at least either the installation position or the orientation of at least one device from among the projector device 2 and the sensor 3 based on the indicated correction method, and may thus conveniently set the entirety of the projection region Aa in the detectable region. - Furthermore, in the first to fifth examples described above, the
projector device 2 projects the scanning laser light Ra of a wavelength in the visible light region toward the projection surface F, but the scope of application of the present invention is not limited to this example. A scanning laser light Ra of a wavelength outside the visible light region (for example, infrared light or ultraviolet light) may be projected instead. - Furthermore, in the first to fifth examples, the
projector 1 is described as an example of the electronic device, but the electronic device is not limited to this example. The present invention is applicable to any electronic device that includes a projection device and a detection device that may be disposed separately and independently from each other. For example, the present invention may be a device that detects a position of an object positioned over a predetermined region on which a photograph, a painting, or the like is mounted or printed. In this situation, the projection device projects a predetermined light for regulating, as the projection region Aa, the predetermined region on which a picture of a keyboard or the like is mounted or printed. The detection device detects the position of the object U based on the reflected light from the object U positioned over this predetermined region. - Although the disclosure has been described with respect to only a limited number of embodiments, those skilled in the art, having benefit of this disclosure, will appreciate that various other embodiments may be devised without departing from the scope of the present invention. Accordingly, the scope of the invention should be limited only by the attached claims.
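The modification described above, in which the detection device determines whether a portion or the entirety of the projection region Aa is positioned in the virtual region Ab between the two notification lines L, reduces to a point-in-region containment test. A rough sketch follows, modeling the virtual region Ab as a triangle with apex Vb near the sensor and base edge Eb, consistent with the triangular projection image mentioned in the modifications; all coordinates are illustrative assumptions, not values from the specification.

```python
def cross(o, a, b):
    """z-component of the cross product (a - o) x (b - o)."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def in_triangle(p, v0, v1, v2):
    """True if point p lies inside, or on an edge of, triangle v0-v1-v2."""
    d0, d1, d2 = cross(v0, v1, p), cross(v1, v2, p), cross(v2, v0, p)
    has_neg = min(d0, d1, d2) < 0
    has_pos = max(d0, d1, d2) > 0
    return not (has_neg and has_pos)   # inside if all signs agree

# Virtual region Ab modeled as a triangle: apex Vb near the sensor,
# base edge Eb on the far side (illustrative coordinates only).
Vb, Eb_left, Eb_right = (0.0, 0.0), (-0.5, 1.0), (0.5, 1.0)

# Corners of a hypothetical projection region Aa: all inside Ab means
# the entire projection region is detectable.
corners = [(-0.15, 0.4), (0.15, 0.4), (-0.15, 0.8), (0.15, 0.8)]
assert all(in_triangle(c, Vb, Eb_left, Eb_right) for c in corners)

# A corner outside the virtual region would trigger the notification.
assert not in_triangle((0.6, 0.5), Vb, Eb_left, Eb_right)
```

Checking only the corners suffices here because both the projection region and the virtual region are convex.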
- 1: Projector
- 2: Projector device
- 21: Laser light projector
- 211: Laser diode (LD)
- 212: Optical system
- 213: Laser light driver
- 22: First memory
- 23: First controller
- 231: Position determinator
- 24: First communicator
- 25: Regulatory light emitter
- 251: Light source
- 26: Actuator
- 3: Sensor
- 30: Light incidence surface
- 31: Light detector
- 32: Second memory
- 33: Second controller
- 331: Position calculator
- 332: Region determinator
- 333: Position determinator
- 334: Inclination determinator
- 34: Second communicator
- 35: Operator
- 36: Information output device
- 37: Notification light emitter
- 371: Light source
- 38: Actuator
- 39: Inclination detector
- F: Projection surface
- Aa: Projection region
- Ea1: Lower edge
- Ea2: Upper edge
- Ra: Laser light
- Rb: Notification light
- L: Notification line
- Ab: Virtual region
- Vb: Endpoint
- Eb: Edge
- Rc: Regulatory light
- S1 to S4, Sb: Spot
- U: Object
- P1 to P4: Endpoint
Claims (12)
- An electronic device (1), comprising: a projection device (2) configured to project a light to a projection region (Aa) on a projection surface (F); and a detection device (3) separately disposed from the projection device (2) and comprising: a light detector (31) configured to detect a reflected light of the light projected from the projection device (2); and a notification light emitter (37) configured to emit to the projection surface (F) a notification light (Rb) for indicating an inclination of the light detector (31) relative to the projection surface (F), wherein the notification light emitter (37) is further configured to emit the notification light (Rb) in directions respectively fixed relative to the detection device (3) from a plurality of locations in the detection device (3), each notification light (Rb) passing through a same point on the projection surface (F) when the detection device is installed at a predetermined inclination relative to the projection surface (F).
- The electronic device (1) according to claim 1, wherein, when the detection device (3) is installed at the predetermined inclination relative to the projection surface (F), each notification light (Rb) projects projection images of a spotted shape that overlap each other at the same point on the projection surface (F).
- The electronic device (1) according to claim 1, wherein, when the detection device (3) is installed at the predetermined inclination relative to the projection surface (F), each notification light (Rb) projects projection images of a line-segment shape where the same point on the projection surface (F) is an endpoint of each line-segment shape.
- The electronic device (1) according to any of claims 1 to 3, wherein the detection device (3) further comprises: an inclination detector (39) configured to detect an inclination of the detection device (3); and an inclination determinator (334) configured to determine, based on a detection result of the inclination detector (39), whether an inclination of the detection device (3) relative to the projection surface (F) is at the predetermined inclination.
- The electronic device (1) according to claim 4, wherein the detection device (3) further comprises an actuator (38) configured to adjust, based on a determination result of the inclination determinator (334), an inclination of the light detector relative to the detection device (3).
- The electronic device (1) according to claim 3, wherein the notification light (Rb) projects the projection images of the line-segment shape on the projection surface (F) as projection images for setting a position of at least an edge (Ea1) of an outer periphery of the projection region (Aa), nearest to the detection device (3) in a detectable region where the light detector (31) detects the reflected light.
- The electronic device (1) according to claim 6, wherein the projection device (2) comprises: a light projector (21) configured to project the light; and a regulatory light emitter (25) configured to emit a regulatory light (Rc) for regulating the projection region (Aa).
- The electronic device (1) according to claim 6 or 7, wherein the detection device (3) further comprises a region determinator (332) configured to determine, based on a detection result of the light detector (31), whether a portion or an entirety of the projection region (Aa) is outside the detectable region.
- The electronic device (1) according to claim 8, wherein the region determinator (332) is further configured to determine whether both endpoints of at least the edge (Ea1) of the outer periphery of the projection region (Aa), nearest to the detection device (3) are in the detectable region.
- The electronic device (1) according to claim 8 or claim 9, wherein the detection device (3) further comprises an information output device (36) configured to output information based on a determination result of the region determinator (332).
- The electronic device (1) according to claim 10, wherein, when the region determinator (332) determines that at least a portion of the projection region (Aa) is outside the detectable region, the information output device (36) informs a correction method for setting the projection region (Aa) in the detectable region and for correcting at least either an installation position or an orientation of at least one of the projection device (2) and the detection device (3).
- The electronic device (1) according to any of claims 6 to 11, wherein at least one of the projection device (2) and the detection device (3) further comprises: a position determinator (231, 333) configured to determine, based on the detection result of the light detector (31), a relative position of the projection region (Aa) relative to the detectable region; and an actuator (26, 38) configured to adjust, based on a determination result of the position determinator (231, 333), at least either the installation position or the orientation of at least one of the devices (2, 3).
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013155000A JP2015026219A (en) | 2013-07-25 | 2013-07-25 | Electronic device |
Publications (2)
Publication Number | Publication Date |
---|---|
EP2829955A2 true EP2829955A2 (en) | 2015-01-28 |
EP2829955A3 EP2829955A3 (en) | 2015-02-25 |
Family
ID=51176213
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP14176814.3A Withdrawn EP2829955A3 (en) | 2013-07-25 | 2014-07-11 | Electronic device |
Country Status (3)
Country | Link |
---|---|
US (1) | US20150029405A1 (en) |
EP (1) | EP2829955A3 (en) |
JP (1) | JP2015026219A (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014206634A (en) * | 2013-04-12 | 2014-10-30 | 船井電機株式会社 | Electronic apparatus |
JP6555958B2 (en) * | 2015-07-21 | 2019-08-07 | キヤノン株式会社 | Information processing apparatus, control method therefor, program, and storage medium |
US10650621B1 (en) | 2016-09-13 | 2020-05-12 | Iocurrents, Inc. | Interfacing with a vehicular controller area network |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012108233A (en) | 2010-11-16 | 2012-06-07 | Nikon Corp | Electronic device |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7050177B2 (en) * | 2002-05-22 | 2006-05-23 | Canesta, Inc. | Method and apparatus for approximating depth of an object's placement onto a monitored region with applications to virtual interface devices |
US6710770B2 (en) * | 2000-02-11 | 2004-03-23 | Canesta, Inc. | Quasi-three-dimensional method and apparatus to detect and localize interaction of user-object and virtual transfer device |
JP2003029201A (en) * | 2001-07-11 | 2003-01-29 | Canon Inc | Picture projecting device and picture correcting method |
EP1512047A4 (en) * | 2002-06-12 | 2008-03-05 | Silicon Optix Inc | Automatic keystone correction system and method |
KR100631779B1 (en) * | 2005-10-07 | 2006-10-11 | 삼성전자주식회사 | Data input apparatus and method for data input detection using the same |
US20080018591A1 (en) * | 2006-07-20 | 2008-01-24 | Arkady Pittel | User Interfacing |
JP2011099994A (en) * | 2009-11-06 | 2011-05-19 | Seiko Epson Corp | Projection display device with position detecting function |
JP2014206634A (en) * | 2013-04-12 | 2014-10-30 | 船井電機株式会社 | Electronic apparatus |
-
2013
- 2013-07-25 JP JP2013155000A patent/JP2015026219A/en active Pending
-
2014
- 2014-07-11 US US14/328,877 patent/US20150029405A1/en not_active Abandoned
- 2014-07-11 EP EP14176814.3A patent/EP2829955A3/en not_active Withdrawn
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012108233A (en) | 2010-11-16 | 2012-06-07 | Nikon Corp | Electronic device |
Also Published As
Publication number | Publication date |
---|---|
US20150029405A1 (en) | 2015-01-29 |
EP2829955A3 (en) | 2015-02-25 |
JP2015026219A (en) | 2015-02-05 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PUAL | Search report despatched | Free format text: ORIGINAL CODE: 0009013 |
| 17P | Request for examination filed | Effective date: 20140711 |
| AK | Designated contracting states | Kind code of ref document: A2; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
| AX | Request for extension of the european patent | Extension state: BA ME |
| PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012 |
| AK | Designated contracting states | Kind code of ref document: A3; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
| AX | Request for extension of the european patent | Extension state: BA ME |
| RIC1 | Information provided on ipc code assigned before grant | Ipc: G03B 21/00 20060101ALI20150116BHEP; Ipc: G06F 3/042 20060101ALI20150116BHEP; Ipc: H04N 9/00 20060101ALI20150116BHEP; Ipc: G06F 3/041 20060101AFI20150116BHEP |
| R17P | Request for examination filed (corrected) | Effective date: 20150825 |
| RBV | Designated contracting states (corrected) | Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE APPLICATION HAS BEEN WITHDRAWN |
| 18W | Application withdrawn | Effective date: 20160307 |