US20140118629A1 - Video projection system for mobile device - Google Patents
- Publication number
- US20140118629A1
- Authority
- US
- United States
- Prior art keywords
- projection
- projection surface
- mobile device
- image
- component
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B21/00—Projectors or projection-type viewers; Accessories therefor
- G03B21/12—Projectors or projection-type viewers; Accessories therefor adapted for projection of either still pictures or motion pictures
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/66—Transforming electric information into light information
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B21/00—Projectors or projection-type viewers; Accessories therefor
- G03B21/14—Details
- G03B21/28—Reflectors in projection beam
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/4104—Peripherals receiving signals from specially adapted client devices
- H04N21/4122—Peripherals receiving signals from specially adapted client devices additional display device, e.g. video projector
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/74—Projection arrangements for image reproduction, e.g. using eidophor
- H04N5/7475—Constructional details of television projection apparatus
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3141—Constructional details thereof
- H04N9/3147—Multi-projection systems
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3141—Constructional details thereof
- H04N9/317—Convergence or focusing systems
Definitions
- a visual interface is a common type of interface used on modern electronic and computing devices, including robotic devices or “robots.”
- the typical visual interface takes the form of a display, such as a video monitor or touch screen, attached to the device.
- the flat, rectangular display has become an iconic form of visual interface.
- Paradigms as to the methods of device interaction are often associated with devices that include such typical visual interfaces or displays.
- the expected method of interacting with a device having a typical visual interface or display may include the use of a mouse, keyboard, remote control, and, increasingly, touch interfaces.
- the presence of a flat rectangular display screen on a robot similarly implies to a user that such traditional methods of device interaction are to be employed.
- many robots are intended to accept other methods of interaction that may be more efficient than the aforementioned traditional methods of interaction, and therefore the use of a non-typical visual interface on robotic devices may avoid the paradigms of device interaction and thereby enhance the efficiency of interaction with such devices.
- the claimed subject matter, in one embodiment, generally provides a mobile device such as a robot including a processor, a memory storing components executable by the processor, and a projection assembly.
- the projection assembly includes a projector, a lens, a movable mirror, and a first projection surface integral with a surface of the mobile device.
- the components include a projection component and a control component.
- the projection component determines the projection parameters, and projects the image dependent upon the projection parameters.
- the control component causes the movable mirror to reflect the image onto a first projection surface, and causes the projector and the lens to focus the image onto the first projection surface.
- a non-transitory memory storing a plurality of processor-executable components including a projection component and a control component.
- the projection component determines one or more projection parameters and causes a projector to project an image dependent at least in part upon the projection parameters.
- the control component causes the image to be projected onto a target projection surface as a projected image.
- Yet another embodiment of the claimed subject matter relates to a method of projecting an image via an image projection system.
- the method includes providing to a projector the images to be projected and projection parameters.
- the method further includes configuring the projection system to project the images upon a target projection surface.
- the method also includes projecting the image onto the target projection surface dependent at least in part upon the projection parameters.
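The three method steps above can be sketched as follows; this is purely illustrative, and the `Projector` and `ProjectionSystem` classes and their method names are hypothetical stand-ins, not part of the disclosure or claims.

```python
# Illustrative sketch of the claimed projection method. All class and
# method names are hypothetical assumptions, not drawn from the claims.

class Projector:
    """Minimal stand-in for a projector (e.g., projector 310)."""

    def __init__(self):
        self.images = []
        self.parameters = {}

    def load(self, images, parameters):
        # Step 1: provide the projector with the images and parameters.
        self.images = list(images)
        self.parameters = dict(parameters)

    def project(self, target_surface):
        # Step 3: project dependent at least in part upon the parameters.
        return [(image, target_surface, self.parameters)
                for image in self.images]


class ProjectionSystem:
    """Minimal stand-in for the projection system (e.g., system 110)."""

    def __init__(self):
        self.target_surface = None

    def configure(self, target_surface):
        # Step 2: configure the system to project upon the target surface.
        self.target_surface = target_surface


def project_images(projector, system, images, parameters, target_surface):
    """Run the three claimed method steps in order."""
    projector.load(images, parameters)
    system.configure(target_surface)
    return projector.project(system.target_surface)
```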
- FIG. 1 is a block diagram of a robotic device or robot having one embodiment of a video projection system in accordance with the subject innovation;
- FIG. 2 is a block diagram of an environment that facilitates communications between the robot and one or more remote devices;
- FIG. 3A is a block diagram of the robot of FIG. 1, and shows the video projection system of the subject innovation in a first exemplary configuration;
- FIG. 3B is a block diagram of the robot of FIG. 1, showing the video projection system of the subject innovation in a second exemplary configuration;
- FIG. 3C is a block diagram of the robot of FIG. 1, and illustrates the video projection system of the subject innovation in a third exemplary configuration; and
- FIG. 4 is a process flow diagram of one embodiment of a method of projecting an image according to the subject innovation.
- a component can be a process running on a processor, an object, an executable, a program, a function, a library, a subroutine, and/or a computer, or a combination of software and hardware.
- both an application running on a server and the server can be a component.
- One or more components can reside within a process and a component can be localized on one computer and/or distributed between two or more computers.
- the term “processor” is generally understood to refer to a hardware component, such as a processing unit of a computer system.
- the claimed subject matter may be implemented as a method, apparatus, or article of manufacture using standard programming and/or engineering techniques to produce software, firmware, hardware, or any combination thereof to control a computer to implement the disclosed subject matter.
- article of manufacture as used herein is intended to encompass a computer program accessible from any non-transitory computer-readable device, or media.
- Non-transitory computer-readable storage media can include but are not limited to magnetic storage devices (e.g., hard disk, floppy disk, and magnetic strips, among others), optical disks (e.g., compact disk (CD), and digital versatile disk (DVD), among others), smart cards, and flash memory devices (e.g., card, stick, and key drive, among others).
- computer-readable media generally (i.e., not necessarily storage media) may additionally include communication media such as transmission media for wireless signals and the like.
- FIG. 1 is a block diagram of a robotic device or “robot” 100 capable of communicating with a remotely-located computing device by way of a network connection.
- a “robot”, as the term will be used herein, is an electro-mechanical machine that includes computer hardware and software that causes the robot to perform functions independently and without assistance from a user.
- the robot 100 can include a head portion 102 and a body portion 104 , wherein the head portion 102 is movable with respect to the body portion 104 .
- the robot 100 can include a head rotation module 106 that operates to couple the head portion 102 with the body portion 104 , wherein the head rotation module 106 can include one or more motors that can cause the head portion 102 to rotate with respect to the body portion 104 .
- the head rotation module 106 may rotate the head portion 102 with respect to the body portion 104 up to 45° in either direction.
- the head rotation module 106 can allow the head portion 102 to rotate 90° in either direction relative to the body portion 104 .
- the head rotation module 106 can facilitate rotation of the head portion 102 190° in either direction with respect to the body portion 104 .
- the head rotation module 106 can facilitate rotation of the head portion 102 with respect to the body portion 104 in either angular direction.
- the head portion 102 may include an antenna 108 that is configured to receive and transmit wireless signals.
- the antenna 108 can be configured to receive and transmit Wi-Fi signals, Bluetooth signals, infrared (IR) signals, sonar signals, radio frequency (RF) signals, or other suitable signals.
- the antenna 108 can be configured to receive and transmit data to and from a cellular tower, the Internet, or the cloud in a cloud computing environment.
- the robot 100 may communicate with a remotely-located computing device, another robot, a control device such as a handheld, or other devices (not shown) using the antenna 108.
- the robot 100 further includes at least one projection system 110 configured to display information to one or more individuals that are proximate to the robot 100 , which projection system 110 will be more particularly described hereinafter.
- the head portion 102 of the robot 100 includes the projection system 110 and one or more projection surfaces 111 configured to display an image projected by projection system 110 to an individual that is proximate to the robot 100 .
- the robot 100 may be alternately configured with one or more projection systems 110 and projection surfaces 111 included as part of the body portion 104 , included as part of the head portion 102 and part of the body portion 104 , or in other suitable combinations.
- a video camera 112 disposed on the head portion 102 may be configured to capture video of an environment of the robot 100 .
- the video camera 112 can be a high definition video camera that facilitates capturing video and still images that are in, for instance, 720p format, 720i format, 1080p format, 1080i format, or other suitable high definition video format.
- the video camera 112 can be configured to capture relatively low resolution data in a format that is suitable for transmission to the remote computing device by way of the antenna 108 .
- as the video camera 112 is mounted in the head portion 102 of the robot 100, through utilization of the head rotation module 106 the video camera 112 can be configured to capture live video data of a relatively large portion of an environment of the robot 100.
- the robot 100 may further include one or more sensors 114 .
- the sensors 114 may include any type of sensor that can be utilized by the robot 100 in determining conditions and parameters of its environment, and that enables the robot 100 to perform autonomous or semi-autonomous navigation.
- the sensors 114 may include a depth sensor, an infrared sensor, a camera, a cliff sensor that is configured to detect a drop-off in elevation proximate to the robot 100 , a GPS sensor, an accelerometer, a gyroscope, or other suitable sensor type.
- the body portion 104 of the robot 100 may include a battery 116 that is operable to provide power to other modules in the robot 100 .
- the battery 116 may be, for instance, a rechargeable battery.
- the robot 100 may include an interface that allows the robot 100 to be coupled to a power source, such that the battery 116 can be recharged.
- the body portion 104 of the robot 100 can also include one or more non-transitory computer-readable storage media, such as memory 118 .
- Memory 118 may include magnetic storage devices (e.g., hard disk, floppy disk, and magnetic strips, among others), optical disks (e.g., compact disk (CD), and digital versatile disk (DVD), among others), smart cards, flash memory devices (e.g., card, stick, and key drive, among others), or other suitable types of non-transitory computer-readable storage media.
- a number of components 119 or sets of instructions are included within memory 118 .
- a processor 120 such as a microprocessor, may also be included in the body portion 104 .
- the components 119 or sets of instructions are executable by the processor 120 , wherein execution of such components 119 facilitates the subject innovation as well as controlling and/or communicating with one or more of the other components, systems, and modules of the robot.
- the processor 120 can be in communication with the other components, systems and modules of the robot 100 by way of any suitable interface, such as a bus hosted by a motherboard.
- the processor 120 functions as the “brains” of the robot 100 .
- the processor 120 may be utilized to process data and/or commands received from a remote device as well as other systems and modules of the robot 100 , and cause the robot 100 to perform in a manner that is desired by a user of such robot 100 .
- the components 119 further facilitate, for example, autonomous and manual navigation of the robot 100 .
- the body portion 104 of the robot 100 can further include one or more sensors 122 , wherein such sensors 122 can include any suitable sensor that can output data that can be utilized by the robot 100 to determine conditions and parameters of its environment, and that can be utilized in connection with autonomous or semi-autonomous navigation.
- the sensors 122 may include sonar sensors, location sensors, infrared sensors, a camera, a cliff sensor, and/or the like.
- Data that is captured by the sensors 122 and the sensors 114 can be provided to the processor 120 which, by executing one or more of the components 119 stored within memory 118, can process the data and autonomously navigate the robot 100 based at least in part upon the data captured by the sensors.
- a drive motor 124 may be disposed in the body portion 104 of the robot 100 .
- the drive motor 124 may be operable to drive wheels 126 and/or 128 of the robot 100 .
- the wheel 126 can be a driving wheel while the wheel 128 can be a steering wheel that can act to pivot to change the orientation of the robot 100 .
- each of the wheels 126 and 128 can have a steering mechanism to change the orientation of the robot 100 .
- while the drive motor 124 is shown as driving both of the wheels 126 and 128, it is to be understood that the drive motor 124 may drive only one of the wheels 126 or 128 while another drive motor can drive the other of the wheels 126 or 128.
- the wheel 126 may represent 2 drive wheels, driven by 2 independent motors, while the wheel 128 represents a single steering wheel, unpowered by any motor.
- the wheels 126 or 128 represent more than 2 physical wheels, all of which may be driven by one or more motors.
- the wheels 126 and 128 may represent various combinations of the wheel arrangements described above.
- the processor 120 can transmit signals to the head rotation module 106 and/or the drive motor 124 to control orientation of the head portion 102 with respect to the body portion 104 , and/or to control the orientation and position of the robot 100 .
- the body portion 104 of the robot 100 can further include speakers 132 and a microphone 134 .
- Data captured by way of the microphone 134 can be transmitted to the remote computing device by way of the antenna 108 .
- a user at the remote computing device can receive a real-time audio/video feed and may experience the environment of the robot 100 .
- the speakers 132 can be employed to output audio data to one or more individuals that are proximate to the robot 100 .
- This audio information can be a multimedia file that is retained in the memory 118 of the robot 100 , audio files received by the robot 100 from the remote computing device by way of the antenna 108 , real-time audio data from a web-cam or microphone at the remote computing device, etc.
- although the robot 100 has been shown in a particular configuration and with particular modules included therein, it is to be understood that the robot can be configured in a variety of different manners, and these configurations are contemplated and are intended to fall within the scope of the hereto-appended claims.
- the head rotation module 106 can be configured with a tilt motor so that the head portion 102 of the robot 100 can tilt up and down within a vertical plane and pivot about a horizontal axis.
- the robot 100 may not include two separate portions, but may include a single unified body, wherein the entire robot body can be turned to allow the capture of video data by way of the video camera 112 .
- the robot 100 can have a unified body structure, but the video camera 112 can have a motor, such as a servomotor, associated therewith that allows the video camera 112 to alter position to obtain different views of an environment. Modules that are shown to be in the body portion 104 can be placed in the head portion 102 of the robot 100 , and vice versa. It is also to be understood that the robot 100 has been provided as an exemplary mobile device, and solely for the purposes of explanation, and as such is not intended to be limiting as to a particular mobile device or in regard to the scope of the hereto-appended claims.
- FIG. 2 is a block diagram showing an environment 200 that facilitates reception by the robot 100 of commands and/or data from, and the transmission by robot 100 of sensor and other data to, one or more remote devices. More particularly, the environment 200 includes a wireless access point 202 , a network 204 , and a remote device 206 .
- the robot 100 is configured to receive and transmit data wirelessly via the antenna 108 .
- the robot 100 initializes on power up and communicates with the wireless access point 202 and establishes its presence with the wireless access point 202 .
- the robot 100 may then obtain a connection to one or more networks 204 by way of the wireless access point 202 .
- the networks 204 may include a cellular network, the Internet, a proprietary network such as an intranet, or other suitable network.
- the remote device 206 can have applications executing thereon that facilitate communication with the robot 100 by way of the network 204 .
- a communication channel can be established between the remote device 206 and the robot 100 by way of the network 204 through various actions such as handshaking, authentication, and other similar methods.
- the remote device 206 may include a desktop computer, a laptop computer, a mobile telephone or smart phone, a mobile multimedia device, a gaming console, or other suitable remote device.
- the remote device 206 can include or have associated therewith a display or touch screen (not shown) that can present data, images, and other information, and provide a graphical user interface to a user 208 pertaining to navigation, control, and the environment surrounding the robot 100 .
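The channel establishment described above (handshaking, authentication, and similar actions) can be sketched as follows; the `Endpoint` class and its credential-based API are illustrative assumptions only, not part of the disclosure.

```python
# Hypothetical sketch of establishing a communication channel between the
# remote device 206 and the robot 100. Handshaking and authentication are
# named in the description; everything else below is an assumed API.

class Endpoint:
    """Minimal stand-in for the robot 100 or the remote device 206."""

    def __init__(self, name, credentials):
        self.name = name
        self.credentials = credentials
        self._trusted = set()

    def trust(self, credentials):
        self._trusted.add(credentials)

    def authenticate(self, credentials):
        # Accept only credentials this endpoint already trusts.
        return credentials in self._trusted


def establish_channel(remote, robot):
    """Open a channel only after mutual authentication succeeds."""
    if not robot.authenticate(remote.credentials):
        return None  # robot rejects the remote device
    if not remote.authenticate(robot.credentials):
        return None  # remote device rejects the robot
    return (remote.name, robot.name)  # the established channel
```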
- the projection system 110 is disposed within the head portion 102 of the robot 100 , which head portion 102 includes the projection surface 111 , a projection surface shutter 302 , a projection window 304 , a projection window shutter 306 , and a projector 310 .
- the robot 100 may be alternately configured with one or more projection systems 110 , including the projection surface 111 , a projection surface shutter 302 , a projection window 304 , a projection window shutter 306 and the projector 310 included as part of the body portion 104 , included as part of both the head portion 102 and the body portion 104 , or in other suitable combinations.
- the projection surface 111 may be configured, in one embodiment, as a translucent light-diffusing surface, such as a polymer or plastic surface that is coated with, has embedded therein, or otherwise includes, diffusive particles, and which may be curved or otherwise contoured to match the shape and/or contour of the outer surface of the head portion 102 .
- the projection surface 111 may be one portion or region of the external surface, an internal surface, or may form a portion of both the inner and outer surfaces of the robot 100 . When the projection surface 111 is illuminated or otherwise displaying an image, the regions thereof that are not illuminated may appear integrated with the skin or outer surface of the robot 100 .
- the projection surface 111 is configured to conceal the inner components of the robot 100 , including the projection surface shutter 302 , the projection window 304 , the projection window shutter 306 , and the projector 310 , and the other internal components of the projection system 110 and of the robot 100 , from the view of an observer of the robot regardless of whether an image is being projected onto the projection surface 111 .
- the images projected onto the projection surface 111 may appear to float on the outer surface of the robot 100 , rather than appearing as framed within a typical illuminated rectangular display area. In other words, to an observer of the robot 100 the projection surface 111 may not be visually distinguishable from the skin or outer surface of the robot 100 . When no images are being projected onto the projection surface 111 it may appear, when viewed from outside the robot 100 , that the projection surface 111 is integral with and substantially indistinguishable visually from the outer surface of the robot 100 .
- the projection surface 111 may be tinted or otherwise configured to match, for example, the appearance and texture of the skin or outer surface of the robot 100 .
- the projection surface 111 may be tinted black to match and blend in with the outer surface or skin of the robot 100 that is also tinted or otherwise black in color. Further, the images projected onto the projection surface 111 appear as if they were generated from within the body of the robot 100 , thereby effectively unifying the visual expression in the form of the images projected onto the projection surface 111 with the functionality and modes of interaction of the robot 100 .
- the projection surface shutter 302 and the projection window shutter 306 are both operable to selectively prevent or permit the passage of light, and thus the projected image, onto the projection surface 111 and through the projection window 304 , respectively.
- the projection surface shutter 302 and the projection window shutter 306 may be configured as opaque mechanical shutter members that include respective drives or motor mechanisms (not shown) that move or otherwise actuate the projection surface shutter 302 and the projection window shutter 306 between open and closed positions.
- the projection surface shutter 302 and the projection window shutter 306 may be configured as electronic or electro-chemical shutter members that are configured to selectively permit and/or prevent the passage of light, and thus the projected image, onto the projection surface 111 and through projection window 304 , respectively.
- the projection surface shutter 302 and the projection window shutter 306 may be configured as a surface or panel that may be integral or otherwise associated with the corresponding projection surface 111 and the projection window 304 , and which may include, be coated with, or have embedded therein, a material, such as a phosphor, liquid crystals or other suitable material, that in one state permit the passage of light and in another state block or otherwise prevents the passage of light.
- Projection window 304 is transparent to light.
- projection window 304 may be configured as a transparent portion of or window defined by head portion 102 .
- Projection window 304 may be configured as a transparent polymer or plastic window or section of the head portion 102 of the robot 100 .
- the projection surface 111 may be translucent.
- a translucent projection surface may conceal the internal components without a shutter.
- the projection system 110 further includes a projector 310 having a lens 312 , and a movable mirror 320 .
- the projector 310 may be configured, for example, as a digital graphic projector, and in the embodiment shown is disposed within head portion 102 of the robot 100 .
- the projector 310 is configured to project light and images that are focused, at least in part, by the lens 312, which is movable or otherwise configured to focus the projected light and images onto a desired projection surface via the movable mirror 320.
- the movable mirror 320 is configured as a mirror or other reflective element, and reflects or otherwise redirects the light and images projected by the projector 310.
- the movable mirror 320 may comprise a plurality of movable mirrors or reflective elements.
- the movable mirror 320 is selectively movable to a variety of positions or angles to thereby reflect or otherwise redirect to corresponding projection surfaces the light and images projected by the projector 310.
- the movable mirror 320 includes, in one embodiment, a drive or motor mechanism (not shown) that moves, for example by rotation, the movable mirror 320 to a desired position, angle or other orientation, to thereby project the images onto a desired or target projection surface, as is more particularly described hereinafter.
- the robot 100 includes in memory 118 a control component 330 and a projection component 340 .
- the control component 330 when executed by the processor 120 configures the projection system 110, including the projection surface shutter 302, the projection window shutter 306, the projector 310, the lens 312, and the movable mirror 320, to project an image onto a target projection surface, as determined or indicated by the processor 120.
- more particularly, in the exemplary embodiment shown, the control component 330 when executed by the processor 120 causes the projector 310 and the lens 312 to project an image onto the movable mirror 320 with a focal plane on the projection surface 111, and positions or otherwise places the movable mirror 320 in a position to project the image from the projector 310 onto the projection surface 111 of the robot 100 or through the projection window 304 and onto an external projection surface.
- the control component 330 also places the projection surface shutter 302 and the projection window shutter 306 in appropriate positions or states for projecting the image onto the indicated target projection surface.
- the control component 330 configures the projection system 110 to project the image onto an indicated target projection surface.
- the projection component 340 when executed by the processor 120 provides to the projector 310 projection data and the images and information to be projected. Further, the projection component 340 when executed by the processor 120 may be configured to adjust the projection parameters, based at least in part upon the data provided by the sensors 114 and the sensors 122 , as will also be more particularly described hereinafter.
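The sensor-driven adjustment described above can be sketched as follows; the specific parameters (focus distance, brightness) and the scaling threshold are illustrative assumptions, as the specification does not name particular parameters.

```python
# Sketch of how the projection component 340 might adjust projection
# parameters based at least in part upon data from the sensors 114 and
# the sensors 122. The parameter names and thresholds are hypothetical.

def adjust_parameters(parameters, sensor_data):
    """Return a copy of the projection parameters adjusted from sensor data."""
    adjusted = dict(parameters)
    # A depth reading could place the focal plane on the target surface.
    if "depth_m" in sensor_data:
        adjusted["focus_distance_m"] = sensor_data["depth_m"]
    # An ambient-light reading could scale the projector brightness.
    if "ambient_lux" in sensor_data:
        adjusted["brightness"] = min(1.0, sensor_data["ambient_lux"] / 500.0)
    return adjusted
```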
- the projection system 110 is shown in a first exemplary configuration that is suitable for projecting an image onto the projection surface 111 .
- the projection component 340 when executed by the processor 120 provides to the projector 310 the projection data and the images to be projected. Execution of the control component 330 by the processor 120 causes the movable mirror 320 to be placed in a first position 322 from which the movable mirror 320 is configured to reflect the image from the projector 310 onto the projection surface 111 of the robot 100, and causes the projector 310 and the lens 312 to project an image via the movable mirror 320 in the first position 322, onto the projection surface 111.
- the focal plane of the image is on the projection surface 111 .
- the control component 330 may place the movable mirror 320 in the first position 322 dependent at least in part upon the target projection surface indicated by the processor 120.
- the processor 120 may analyze the type of images to be projected or their characteristics and based thereon determine an appropriate target projection surface from the available projection surfaces, or may determine the target projection surface based on input from a user of the robot 100 .
- execution of the control component 330 by the processor 120 further causes the projection surface shutter 302 to open or enter a state that permits the passage of light and, thus, permits the image to be projected onto the projection surface 111 , and may cause the projection window shutter 306 to close or otherwise enter a state that prevents the passage of light and, thus, prevents any images from being projected through the projection window 304 and onto an external surface.
- a user may be prevented from looking through the projection window 304 when it is not in use for external projection.
- the projection system 110 is shown in a second exemplary configuration that is suitable for projecting an image through the projection window 304 and onto a second position or location upon an external projection surface.
- the projection component 340 when executed by the processor 120 provides to the projector 310 the projection data and the images to be projected. Execution of the control component 330 by the processor 120 causes the movable minor 320 to be placed in a second position 324 from which the movable minor 320 is configured to reflect the image from the projector 310 through the projection window 304 and onto the second position of the external projection surface, and causes the projector 310 and the lens 312 to project an image via the movable minor 320 in the second position 324 .
- the image may be focused on the external projection surface.
- the control component 330 may place the movable mirror 320 in the second position 324 dependent at least in part upon the target projection surface indicated by the processor 120.
- execution of the control component 330 by the processor 120 further causes the projection surface shutter 302 to close or enter a state that prevents the passage of light and, thus, prevents any images from being seen through or being projected onto the projection surface 111 , and may cause the projection window shutter 306 to open or otherwise enter a state that permits the passage of light and, thereby, permits the projection of images through the projection window 304 and onto a second position or location upon the external projection surface.
- the projection system 110 is shown in a third exemplary configuration that is suitable for projecting an image through the projection window 304 and onto a third position or location upon the external projection surface.
- the projection component 340, when executed by the processor 120, provides to the projector 310 the projection data and the images to be projected. Execution of the control component 330 by the processor 120 causes the movable mirror 320 to be placed in a third position 326, from which the movable mirror 320 is configured to reflect the image from the projector 310 through the projection window 304 and onto the third position of the external projection surface, and causes the projector 310 and the lens 312 to project an image via the movable mirror 320 in the third position 326.
- the image is focused on the external projection surface.
- the control component 330 may place the movable mirror 320 in the third position 326 dependent at least in part upon the target projection surface indicated by the processor 120.
- execution of the control component 330 by the processor 120 further causes the projection surface shutter 302 to close or enter a state that prevents the passage of light and, thus, prevents any images from being seen or being projected onto the projection surface 111 , and may cause the projection window shutter 306 to open or otherwise enter a state that permits the passage of light and, thereby, permits the projection of images through the projection window 304 and onto the third position or location upon the external projection surface.
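The three exemplary configurations amount to a mapping from target surface to mirror position and shutter states; the sketch below uses hypothetical key and state names (they do not appear in the disclosure) and encodes the rule that exactly one shutter admits light at a time, so the image reaches only the indicated surface:

```python
# Hypothetical mapping, mirroring the three configurations above:
# internal projection surface 111, and the second and third positions
# on the external projection surface via the projection window 304.
CONFIGURATIONS = {
    "internal":   {"mirror": "first",  "surface_shutter": "open",   "window_shutter": "closed"},
    "external_2": {"mirror": "second", "surface_shutter": "closed", "window_shutter": "open"},
    "external_3": {"mirror": "third",  "surface_shutter": "closed", "window_shutter": "open"},
}

def configure(target):
    """Return the mirror position and shutter states for a target surface."""
    cfg = CONFIGURATIONS[target]
    # Sanity check: exactly one shutter is open in any configuration.
    assert (cfg["surface_shutter"] == "open") != (cfg["window_shutter"] == "open")
    return cfg
```

A control component could hand such a record to the mirror and shutter drive mechanisms.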
- the projection component 340, when executed by the processor 120, provides to the projector 310 projection data and the images and information to be projected, which projection data includes and/or determines the projection parameters including, for example, the resolution, focal point and focus, size and orientation of the image to be projected, brightness, contrast, aspect ratio, and other parameters.
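One way to represent such projection data is as a simple record. The field types and default values below are illustrative assumptions only, since the disclosure names the parameters without defining a data structure:

```python
from dataclasses import dataclass

@dataclass
class ProjectionParameters:
    """Parameters named in the text; types and defaults are assumed."""
    resolution: tuple = (1280, 720)    # pixels
    focus_distance_mm: float = 500.0   # distance to the focal plane
    image_size: tuple = (400, 300)     # projected size, arbitrary units
    orientation_deg: float = 0.0       # rotation of the projected image
    brightness: float = 1.0
    contrast: float = 1.0
    aspect_ratio: float = 16 / 9
```

The projection component could then pass one such record to the projector alongside the image data.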
- the projection component 340 may be configured to adjust and otherwise compensate the projection parameters and characteristics of the image to be projected dependent at least in part upon the characteristics of the target projection surface and the projection angle (i.e., the angle of the optical axis).
- the projection component 340, when executed by the processor 120, may be configured to determine, via the sensors 114 and 122, various characteristics of the target projection surface, such as the distance to the projection surface, the color and curvature of the projection surface, the projection angle, and other relevant characteristics. Based at least in part upon the determined characteristics of the target projection surface, the projection component 340 may adjust the projected content, alter the projection parameters and image characteristics of the projector 310, or cause an adjustment to the position of the lens 312 or of the movable mirror 320, to compensate for undesirable image distortion effects that may occur.
- the projection component 340 as executed by the processor 120 may determine based at least in part upon sensor data from the sensors 114 and 122 that the target projection surface is a curved surface, such as the projection surface 111 or a curved external projection surface. The projection component 340 may then, as executed by the processor 120 , make appropriate corrections or adjustments to the projection parameters, such as increasing depth of field by decreasing the projector aperture, and the characteristics of the image to be projected to compensate for any distortion that may result from projecting the image onto the curved projection surface.
- the projection component 340 as executed by the processor 120 may determine that the angle of the optical axis or projection angle exceeds a predetermined threshold that may, if not compensated for, result in the projected image having a type of distortion known as the keystone effect wherein the projected image appears trapezoidal with the top of the image appearing narrower than the bottom.
- the projection component 340 may be further configured to make appropriate corrections or adjustments to the projection parameters and the characteristics of the image to be projected to compensate for the keystone distortion. Further, the projection component 340 may invoke the control component 330 to make certain adjustments, such as adjusting the position or angle of the movable mirror 320 .
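The keystone geometry can be illustrated with a simple flat-screen model; the model and the 5% threshold below are illustrative assumptions, not values from the disclosure:

```python
import math

def keystone_ratio(axis_angle_deg, half_fov_deg=15.0):
    """Ratio of the far edge's width to the near edge's width when the
    optical axis is tilted axis_angle_deg from the screen normal.

    Simple model (an assumption): rays at angle +/- half_fov hit a flat
    screen, and projected width grows with the path length to the
    screen. A ratio of 1.0 means no keystone distortion; the larger the
    ratio, the more trapezoidal the image appears.
    """
    a = math.radians(axis_angle_deg)
    f = math.radians(half_fov_deg)
    return math.cos(a - f) / math.cos(a + f)

def needs_keystone_correction(axis_angle_deg, threshold=1.05):
    """True when distortion exceeds a hypothetical 5% threshold."""
    return keystone_ratio(abs(axis_angle_deg)) > threshold
```

To pre-compensate, the source image can be warped by the inverse trapezoid, scaling the far edge by 1 / keystone_ratio before projection, so the image lands rectangular on the surface.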
- execution of the control component 330 by the processor 120 disposes the movable mirror 320 in one of three predetermined positions, i.e., the first position 322, the second position 324, and the third position 326.
- the control component 330 may be alternately configured to, for example, place the movable mirror 320 in any number of predetermined positions to thereby project images onto any number of corresponding locations or positions on a target projection surface.
- the target projection surface includes the internal projection surface 111, the external projection surface, and any other surface onto which an image is or is desired to be projected.
- the method 400 includes providing 410 to a projector the images or other information to be projected and projection data.
- the projection data may include certain projection parameters including, for example, the resolution, focal point and focus, size and orientation of the images to be projected, brightness, contrast, aspect ratio, and other projection parameters.
- the method 400 further includes configuring 420 the projection system, which includes placing one or more mirrors or other reflective elements in a position that corresponds to the indicated target projection surface and actuating or otherwise disposing projection image shutters into appropriate positions or states dependent at least in part upon the indicated target projection surface and/or the projection data provided in the providing 410 step. More particularly, if shutters are associated with one or more target projection surfaces, the shutters must be open or closed, or placed in corresponding appropriate states, to permit the projected image to pass or not pass, and thereby to project the image onto the target projection surface only while preventing the projection of an image onto any non-indicated projection surface. This may also prevent the light for that image from being seen at any other intermediate location except as a final projected image at the desired projected image location.
- the method 400 further includes projecting 430 the image onto the mirror and, thereby, the target projection surface.
- the projection 430 step projects the images according to the projection parameters.
- the projection 430 process determines the characteristics of the target projection surface, such as by one or more sensors, and dependent at least in part thereon adjusts the projection parameters and characteristics of the image to be projected, or otherwise compensates for the characteristics of the target projection surface.
- the projecting 430 step may determine the distance from the projector to the target projection surface, the color and curvature of the projection surface, and the projection angle, and adjust the projection parameters based on those characteristics.
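Such sensor-driven compensation might be sketched as follows; the heuristics, parameter names, and constants are illustrative assumptions only:

```python
def adjust_for_surface(params, distance_m, surface_color, curvature):
    """Return a copy of the projection parameters adjusted for the
    measured characteristics of the target projection surface.

    params: dict with (assumed) keys 'brightness' and 'aperture_mm'.
    """
    adjusted = dict(params)
    # Illuminance falls off with distance, so scale brightness up for
    # far surfaces (the 1 m reference distance is an assumption).
    adjusted["brightness"] = params["brightness"] * distance_m ** 2
    # A curved surface benefits from more depth of field: decreasing
    # the projector aperture (here, halving its diameter) increases
    # depth of field, as described for the curved-surface case above.
    if curvature > 0:
        adjusted["aperture_mm"] = params["aperture_mm"] / 2
    # A dark surface reflects less light, so boost brightness further.
    if surface_color == "dark":
        adjusted["brightness"] *= 1.5
    return adjusted
```

The projecting step would apply the adjusted parameters before, or while, driving the projector.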
- Projecting 430 the image further includes projecting the image on a target projection surface that may be configured, in one embodiment, as a translucent light-diffusing surface, such as a polymer or plastic surface that is coated with, has embedded therein, or otherwise includes, diffusive particles, and which may be curved or otherwise contoured to match the shape and/or contour of an outer surface of a robot.
- the target projection surface may be one portion or region of the external surface of the robot, an internal surface of the robot, or may form a portion of both the inner and outer surfaces of the robot.
- Projecting 430 the image includes projecting an image that appears to be integrated with the skin or outer surface of the robot, rather than being framed within the confines of a conventional display panel.
- projecting 430 the image further includes concealing the inner components of the robot from view, whether or not an image is being displayed.
- Projecting 430 the image may also include projecting the image upon an external projection surface that is not associated with the robot.
- the terms (including a reference to a “means”) used to describe such components are intended to correspond, unless otherwise indicated, to any component which performs the specified function of the described component (e.g., a functional equivalent), even though not structurally equivalent to the disclosed structure, which performs the function in the herein illustrated exemplary aspects of the claimed subject matter.
- the innovation includes a system as well as a computer-readable storage media having computer-executable instructions for performing the acts and/or events of the various methods of the claimed subject matter.
- one or more components may be combined into a single component providing aggregate functionality or divided into several separate sub-components, and any one or more middle layers, such as a management layer, may be provided to communicatively couple to such sub-components in order to provide integrated functionality.
- Any components described herein may also interact with one or more other components not specifically described herein but generally known by those of skill in the art.
Abstract
Description
- This application is a continuation of co-pending U.S. patent application Ser. No. 13/169,040, filed Jun. 27, 2011 (the entire contents of which are hereby incorporated by reference as though fully set forth herein).
- A visual interface is a common type of interface used on modern electronic and computing devices, including robotic devices or “robots.” The typical visual interface takes the form of a display, such as a video monitor or touch screen, attached to the device. The flat, rectangular display has become an iconic form of visual interface. Paradigms as to the methods of device interaction are often associated with devices that include such typical visual interfaces or displays. For example, the expected method of interacting with a device having a typical visual interface or display may include the use of a mouse, keyboard, remote control, and, increasingly, touch interfaces. The presence of a flat rectangular display screen on a robot similarly implies to a user that such traditional methods of device interaction are to be employed. However, many robots are intended to accept other methods of interaction that may be more efficient than the aforementioned traditional methods of interaction, and therefore the use of a non-typical visual interface on robotic devices may avoid the paradigms of device interaction and thereby enhance the efficiency of interaction with such devices.
- The following is a brief summary of subject matter that is described in greater detail herein. This summary is not intended to be limiting as to the scope of the claims. It is intended to neither identify key or critical elements of the claimed subject matter nor delineate the scope of the subject innovation. Its sole purpose is to present some concepts of the claimed subject matter in a simplified form as a prelude to the more detailed description that is presented later.
- The claimed subject matter, in one embodiment, generally provides a mobile device such as a robot including a processor, a memory storing components executable by the processor, and a projection assembly. The projection assembly includes a projector, a lens, a movable mirror, and a first projection surface integral with a surface of the mobile device. The components include a projection component and a control component. The projection component determines the projection parameters, and projects the image dependent upon the projection parameters. The control component causes the movable mirror to reflect the image onto a first projection surface, and causes the projector and the lens to focus the image onto the first projection surface.
- Another embodiment of the claimed subject matter relates to a non-transitory memory storing a plurality of processor-executable components including a projection component and a control component. The projection component determines one or more projection parameters and causes a projector to project an image dependent at least in part upon the projection parameters. The control component causes the image to be projected onto a target projection surface as a projected image.
- Yet another embodiment of the claimed subject matter relates to a method of projecting an image via an image projection system. The method includes providing to a projector the images to be projected and projection parameters. The method further includes configuring the projection system to project the images upon a target projection surface. The method also includes projecting the image onto the target projection surface dependent at least in part upon the projection parameters.
-
FIG. 1 is a block diagram of a robotic device or robot having one embodiment of a video projection system in accordance with the subject innovation;
- FIG. 2 is a block diagram of an environment that facilitates communications between the robot and one or more remote devices;
- FIG. 3A is a block diagram of the robot of FIG. 1, and shows the video projection system of the subject innovation in a first exemplary configuration;
- FIG. 3B is a block diagram of the robot of FIG. 1, showing the video projection system of the subject innovation in a second exemplary configuration;
- FIG. 3C is a block diagram of the robot of FIG. 1, and illustrates the video projection system of the subject innovation in a third exemplary configuration; and
- FIG. 4 is a process flow diagram of one embodiment of a method of projecting an image according to the subject innovation.
- The claimed subject matter is described with reference to the drawings, wherein like reference numerals are used to refer to like elements throughout. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the subject innovation. It may be evident, however, that the claimed subject matter may be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form in order to facilitate describing the subject innovation.
- As utilized herein, terms “component,” “system,” “client” and the like are intended to refer to a computer-related entity, either hardware, software (e.g., in execution), and/or firmware, or a combination thereof. For example, a component can be a process running on a processor, an object, an executable, a program, a function, a library, a subroutine, and/or a computer or a combination of software and hardware.
- By way of illustration, both an application running on a server and the server can be a component. One or more components can reside within a process and a component can be localized on one computer and/or distributed between two or more computers. The term “processor” is generally understood to refer to a hardware component, such as a processing unit of a computer system.
- Furthermore, the claimed subject matter may be implemented as a method, apparatus, or article of manufacture using standard programming and/or engineering techniques to produce software, firmware, hardware, or any combination thereof to control a computer to implement the disclosed subject matter. The term “article of manufacture” as used herein is intended to encompass a computer program accessible from any non-transitory computer-readable device, or media.
- Non-transitory computer-readable storage media can include but are not limited to magnetic storage devices (e.g., hard disk, floppy disk, and magnetic strips, among others), optical disks (e.g., compact disk (CD), and digital versatile disk (DVD), among others), smart cards, and flash memory devices (e.g., card, stick, and key drive, among others). In contrast, computer-readable media generally (i.e., not necessarily storage media) may additionally include communication media such as transmission media for wireless signals and the like.
- Of course, those skilled in the art will recognize many modifications may be made to this configuration without departing from the scope or spirit of the claimed subject matter. Moreover, the word “exemplary” is used herein to mean serving as an example, instance, or illustration. Any aspect or design described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other aspects or designs.
-
FIG. 1 is a block diagram of a robotic device or “robot” 100 capable of communicating with a remotely-located computing device by way of a network connection. A “robot”, as the term will be used herein, is an electro-mechanical machine that includes computer hardware and software that causes the robot to perform functions independently and without assistance from a user. The robot 100 can include a head portion 102 and a body portion 104, wherein the head portion 102 is movable with respect to the body portion 104. The robot 100 can include a head rotation module 106 that operates to couple the head portion 102 with the body portion 104, wherein the head rotation module 106 can include one or more motors that can cause the head portion 102 to rotate with respect to the body portion 104. As an example, the head rotation module 106 may rotate the head portion 102 with respect to the body portion 104 up to 45° in either direction. In another example, the head rotation module 106 can allow the head portion 102 to rotate 90° in either direction relative to the body portion 104. In still yet another example, the head rotation module 106 can facilitate rotation of the head portion 102 180° in either direction with respect to the body portion 104. The head rotation module 106 can facilitate rotation of the head portion 102 with respect to the body portion 104 in either angular direction. - The
head portion 102 may include an antenna 108 that is configured to receive and transmit wireless signals. For instance, the antenna 108 can be configured to receive and transmit Wi-Fi signals, Bluetooth signals, infrared (IR) signals, sonar signals, radio frequency (RF) signals, or other suitable signals. The antenna 108 can be configured to receive and transmit data to and from a cellular tower, the Internet, or the cloud in a cloud computing environment. Further, the robot 100 may communicate with a remotely-located computing device, another robot, a control device such as a handheld, or other devices (not shown) using the antenna 108. - The
robot 100 further includes at least one projection system 110 configured to display information to one or more individuals that are proximate to the robot 100, which projection system 110 will be more particularly described hereinafter. In the embodiment of FIG. 1, the head portion 102 of the robot 100 includes the projection system 110 and one or more projection surfaces 111 configured to display an image projected by the projection system 110 to an individual that is proximate to the robot 100. In other embodiments, the robot 100 may be alternately configured with one or more projection systems 110 and projection surfaces 111 included as part of the body portion 104, included as part of the head portion 102 and part of the body portion 104, or in other suitable combinations. - A
video camera 112 disposed on the head portion 102 may be configured to capture video of an environment of the robot 100. In an example, the video camera 112 can be a high definition video camera that facilitates capturing video and still images that are in, for instance, 720p format, 720i format, 1080p format, 1080i format, or other suitable high definition video format. The video camera 112 can be configured to capture relatively low resolution data in a format that is suitable for transmission to the remote computing device by way of the antenna 108. As the video camera 112 is mounted in the head portion 102 of the robot 100, through utilization of the head rotation module 106, the video camera 112 can be configured to capture live video data of a relatively large portion of an environment of the robot 100. - The
robot 100 may further include one or more sensors 114. The sensors 114 may include any type of sensor that can be utilized by the robot 100 in determining conditions and parameters of its environment, and enable the robot 100 to perform autonomous or semi-autonomous navigation. For example, the sensors 114 may include a depth sensor, an infrared sensor, a camera, a cliff sensor that is configured to detect a drop-off in elevation proximate to the robot 100, a GPS sensor, an accelerometer, a gyroscope, or other suitable sensor type. - The
body portion 104 of the robot 100 may include a battery 116 that is operable to provide power to other modules in the robot 100. The battery 116 may be, for instance, a rechargeable battery. In such a case, the robot 100 may include an interface that allows the robot 100 to be coupled to a power source, such that the battery 116 can be recharged. - The
body portion 104 of the robot 100 can also include one or more non-transitory computer-readable storage media, such as memory 118. Memory 118 may include magnetic storage devices (e.g., hard disk, floppy disk, and magnetic strips, among others), optical disks (e.g., compact disk (CD), and digital versatile disk (DVD), among others), smart cards, flash memory devices (e.g., card, stick, and key drive, among others), or other suitable types of non-transitory computer-readable storage media. A number of components 119 or sets of instructions are included within memory 118. - A
processor 120, such as a microprocessor, may also be included in the body portion 104. As will be described in greater detail below, the components 119 or sets of instructions are executable by the processor 120, wherein execution of such components 119 facilitates the subject innovation as well as controlling and/or communicating with one or more of the other components, systems, and modules of the robot. The processor 120 can be in communication with the other components, systems, and modules of the robot 100 by way of any suitable interface, such as a bus hosted by a motherboard. In an embodiment, the processor 120 functions as the “brains” of the robot 100. For instance, the processor 120 may be utilized to process data and/or commands received from a remote device as well as other systems and modules of the robot 100, and cause the robot 100 to perform in a manner that is desired by a user of such robot 100. The components 119 further facilitate, for example, autonomous and manual navigation of the robot 100. - The
body portion 104 of the robot 100 can further include one or more sensors 122, wherein such sensors 122 can include any suitable sensor that can output data that can be utilized by the robot 100 to determine conditions and parameters of its environment, and that can be utilized in connection with autonomous or semi-autonomous navigation. For example, the sensors 122 may include sonar sensors, location sensors, infrared sensors, a camera, a cliff sensor, and/or the like. Data that is captured by the sensors 122 and the sensors 114 can be provided to the processor 120 which, by executing one or more of the components 119 stored within memory 118, can process the data and autonomously navigate the robot 100 based at least in part upon the data captured by the sensors. - A
drive motor 124 may be disposed in the body portion 104 of the robot 100. The drive motor 124 may be operable to drive wheels 126 and/or 128 of the robot 100. For example, the wheel 126 can be a driving wheel while the wheel 128 can be a steering wheel that can act to pivot to change the orientation of the robot 100. Additionally, each of the wheels 126 and/or 128 can act to change the orientation of the robot 100. Furthermore, while the drive motor 124 is shown as driving both of the wheels 126 and 128, the drive motor 124 may drive only one of the wheels 126 or 128 while another motor drives the other of the wheels 126 or 128. In one embodiment, the wheel 126 represents 2 drive wheels, driven by 2 independent motors, and a single steering wheel, unpowered by any motor. In such an embodiment, the wheel 128 represents the steering wheel. In another embodiment, the wheels 126 and 128 may be alternately driven and steered. - Upon receipt of data from the
sensors 114 and 122, the processor 120 can transmit signals to the head rotation module 106 and/or the drive motor 124 to control orientation of the head portion 102 with respect to the body portion 104, and/or to control the orientation and position of the robot 100. - The
body portion 104 of the robot 100 can further include speakers 132 and a microphone 134. Data captured by way of the microphone 134 can be transmitted to the remote computing device by way of the antenna 108. Accordingly, a user at the remote computing device can receive a real-time audio/video feed and may experience the environment of the robot 100. The speakers 132 can be employed to output audio data to one or more individuals that are proximate to the robot 100. This audio information can be a multimedia file that is retained in the memory 118 of the robot 100, audio files received by the robot 100 from the remote computing device by way of the antenna 108, real-time audio data from a web-cam or microphone at the remote computing device, etc. - While the
robot 100 has been shown in a particular configuration and with particular modules included therein, it is to be understood that the robot can be configured in a variety of different manners, and these configurations are contemplated and are intended to fall within the scope of the hereto-appended claims. For instance, the head rotation module 106 can be configured with a tilt motor so that the head portion 102 of the robot 100 can tilt up and down within a vertical plane and pivot about a horizontal axis. Alternatively, the robot 100 may not include two separate portions, but may include a single unified body, wherein the entire robot body can be turned to allow the capture of video data by way of the video camera 112. In still yet another embodiment, the robot 100 can have a unified body structure, but the video camera 112 can have a motor, such as a servomotor, associated therewith that allows the video camera 112 to alter position to obtain different views of an environment. Modules that are shown to be in the body portion 104 can be placed in the head portion 102 of the robot 100, and vice versa. It is also to be understood that the robot 100 has been provided as an exemplary mobile device, and solely for the purposes of explanation, and as such is not intended to be limiting as to a particular mobile device or in regard to the scope of the hereto-appended claims. -
FIG. 2 is a block diagram showing an environment 200 that facilitates reception by the robot 100 of commands and/or data from, and the transmission by the robot 100 of sensor and other data to, one or more remote devices. More particularly, the environment 200 includes a wireless access point 202, a network 204, and a remote device 206. The robot 100 is configured to receive and transmit data wirelessly via the antenna 108. In an exemplary embodiment, the robot 100 initializes on power up, communicates with the wireless access point 202, and establishes its presence with the wireless access point 202. The robot 100 may then obtain a connection to one or more networks 204 by way of the wireless access point 202. For example, the networks 204 may include a cellular network, the Internet, a proprietary network such as an intranet, or other suitable network. - The
remote device 206 can have applications executing thereon that facilitate communication with the robot 100 by way of the network 204. For example, and as will be understood by one of ordinary skill in the art, a communication channel can be established between the remote device 206 and the robot 100 by way of the network 204 through various actions such as handshaking, authentication, and other similar methods. The remote device 206 may include a desktop computer, a laptop computer, a mobile telephone or smart phone, a mobile multimedia device, a gaming console, or other suitable remote device. The remote device 206 can include or have associated therewith a display or touch screen (not shown) that can present data, images, and other information, and provide a graphical user interface to a user 208 pertaining to navigation, control, and the environment surrounding the robot 100. - With reference to
FIG. 3A, one embodiment of a projection system 110 of the subject innovation is illustrated. As shown in FIG. 3A, the projection system 110 is disposed within the head portion 102 of the robot 100, which head portion 102 includes the projection surface 111, a projection surface shutter 302, a projection window 304, a projection window shutter 306, and a projector 310. In alternate embodiments, the robot 100 may be alternately configured with one or more projection systems 110, including the projection surface 111, a projection surface shutter 302, a projection window 304, a projection window shutter 306, and the projector 310, included as part of the body portion 104, included as part of both the head portion 102 and the body portion 104, or in other suitable combinations. - The
projection surface 111 may be configured, in one embodiment, as a translucent light-diffusing surface, such as a polymer or plastic surface that is coated with, has embedded therein, or otherwise includes, diffusive particles, and which may be curved or otherwise contoured to match the shape and/or contour of the outer surface of the head portion 102. The projection surface 111 may be one portion or region of the external surface, an internal surface, or may form a portion of both the inner and outer surfaces of the robot 100. When the projection surface 111 is illuminated or otherwise displaying an image, the regions thereof that are not illuminated may appear integrated with the skin or outer surface of the robot 100. Additionally, the projection surface 111 is configured to conceal the inner components of the robot 100, including the projection surface shutter 302, the projection window 304, the projection window shutter 306, the projector 310, and the other internal components of the projection system 110 and of the robot 100, from the view of an observer of the robot regardless of whether an image is being projected onto the projection surface 111. - In contrast to a conventional display, the images projected onto the
projection surface 111 may appear to float on the outer surface of the robot 100, rather than appearing as framed within a typical illuminated rectangular display area. In other words, to an observer of the robot 100 the projection surface 111 may not be visually distinguishable from the skin or outer surface of the robot 100. When no images are being projected onto the projection surface 111 it may appear, when viewed from outside the robot 100, that the projection surface 111 is integral with and substantially indistinguishable visually from the outer surface of the robot 100. The projection surface 111 may be tinted or otherwise configured to match, for example, the appearance and texture of the skin or outer surface of the robot 100. For example, the projection surface 111 may be tinted black to match and blend in with the outer surface or skin of the robot 100 that is also tinted or otherwise black in color. Further, the images projected onto the projection surface 111 appear as if they were generated from within the body of the robot 100, thereby effectively unifying the visual expression in the form of the images projected onto the projection surface 111 with the functionality and modes of interaction of the robot 100. - The
projection surface shutter 302 and the projection window shutter 306 are both operable to selectively prevent or permit the passage of light, and thus the projected image, onto the projection surface 111 and through the projection window 304, respectively. In one embodiment, the projection surface shutter 302 and the projection window shutter 306 may be configured as opaque mechanical shutter members that include respective drives or motor mechanisms (not shown) that move or otherwise actuate the projection surface shutter 302 and the projection window shutter 306 between open and closed positions. In another embodiment, the projection surface shutter 302 and the projection window shutter 306 may be configured as electronic or electro-chemical shutter members that are configured to selectively permit and/or prevent the passage of light, and thus the projected image, onto the projection surface 111 and through the projection window 304, respectively. In that embodiment, the projection surface shutter 302 and the projection window shutter 306 may each be configured as a surface or panel that may be integral with or otherwise associated with the corresponding projection surface 111 and projection window 304, and which may include, be coated with, or have embedded therein, a material, such as a phosphor, liquid crystals or other suitable material, that in one state permits the passage of light and in another state blocks or otherwise prevents the passage of light. -
Projection window 304 is transparent to light and, in the embodiment shown, may be configured as a transparent portion of, or a window defined by, the head portion 102, such as a transparent polymer or plastic window or section of the head portion 102 of the robot 100. - In one embodiment, there is no
shutter 302 present. In such an embodiment, the translucency of the projection surface 111 may itself be sufficient to conceal the internal components of the robot 100 from view without a shutter. - The
projection system 110 further includes a projector 310 having a lens 312, and a movable mirror 320. The projector 310 may be configured, for example, as a digital graphic projector, and in the embodiment shown is disposed within the head portion 102 of the robot 100. The projector 310 is configured to project light and images that are focused, at least in part, by the lens 312, which is movable or otherwise configured to focus the projected light and images onto a desired projection surface via the movable mirror 320. The movable mirror 320 is configured as a mirror or other reflective element, and reflects or otherwise redirects the light and images projected by the projector 310. In an alternate embodiment, the movable mirror 320 may comprise a plurality of movable mirrors or reflective elements. The movable mirror 320 is selectively movable to a variety of positions or angles to thereby reflect or otherwise redirect to corresponding projection surfaces the light and images projected by the projector 310. The movable mirror 320 includes, in one embodiment, a drive or motor mechanism (not shown) that moves, for example by rotation, the movable mirror 320 to a desired position, angle or other orientation, to thereby project the images onto a desired or target projection surface, as is more particularly described hereinafter. - As shown in
FIG. 3A, the robot 100 includes in memory 118 a control component 330 and a projection component 340. Generally, the control component 330, when executed by the processor 120, configures the projection system 110, including the projection surface shutter 302, the projection window shutter 306, the projector 310, the lens 312, and the movable mirror 320, to project an image onto a target projection surface, as determined or indicated by the processor 120. More particularly, in the exemplary embodiment shown in FIG. 3A, the control component 330 when executed by the processor 120 causes the projector 310 and the lens 312 to project an image onto the movable mirror 320 with a focal plane on the indicated target projection surface, and positions or otherwise places the movable mirror 320 in a position to project the image from the projector 310 onto the projection surface 111 of the robot 100 or through the projection window 304 and onto an external projection surface. The control component 330 also places the projection surface shutter 302 and the projection window shutter 306 in appropriate positions or states for projecting the image onto the indicated target projection surface. Thus, the control component 330 configures the projection system 110 to project the image onto an indicated target projection surface. - The
projection component 340 when executed by the processor 120 provides to the projector 310 projection data and the images and information to be projected. Further, the projection component 340 when executed by the processor 120 may be configured to adjust the projection parameters, based at least in part upon the data provided by the sensors 114 and the sensors 122, as will also be more particularly described hereinafter. - With continuing reference to the exemplary embodiment of
FIG. 3A, the projection system 110 is shown in a first exemplary configuration that is suitable for projecting an image onto the projection surface 111. The projection component 340 when executed by the processor 120 provides to the projector 310 the projection data and the images to be projected. Execution of the control component 330 by the processor 120 causes the movable mirror 320 to be placed in a first position 322, from which the movable mirror 320 is configured to reflect the image from the projector 310 onto the projection surface 111 of the robot 100, and causes the projector 310 and the lens 312 to project an image, via the movable mirror 320 in the first position 322, onto the projection surface 111. The focal plane of the image is on the projection surface 111. - The
control component 330 may place the movable mirror 320 in the first position 322 dependent at least in part upon the target projection surface indicated by the processor 120. For example, the processor 120 may analyze the type of images to be projected or their characteristics and, based thereon, determine an appropriate target projection surface from the available projection surfaces, or may determine the target projection surface based on input from a user of the robot 100. - In the exemplary embodiment of
FIG. 3A, execution of the control component 330 by the processor 120 further causes the projection surface shutter 302 to open or enter a state that permits the passage of light and, thus, permits the image to be projected onto the projection surface 111, and may cause the projection window shutter 306 to close or otherwise enter a state that prevents the passage of light and, thus, prevents any images from being projected through the projection window 304 and onto an external surface. In one embodiment, closing the projection window shutter 306 also prevents a user from looking through the projection window 304 when it is not in use for external projection. - With reference to
FIG. 3B, the projection system 110 is shown in a second exemplary configuration that is suitable for projecting an image through the projection window 304 and onto a second position or location upon an external projection surface. The projection component 340 when executed by the processor 120 provides to the projector 310 the projection data and the images to be projected. Execution of the control component 330 by the processor 120 causes the movable mirror 320 to be placed in a second position 324, from which the movable mirror 320 is configured to reflect the image from the projector 310 through the projection window 304 and onto the second position of the external projection surface, and causes the projector 310 and the lens 312 to project an image via the movable mirror 320 in the second position 324. The image is focused on the external projection surface. The control component 330 may place the movable mirror 320 in the second position 324 dependent at least in part upon the target projection surface indicated by the processor 120. - In the exemplary embodiment of
FIG. 3B, execution of the control component 330 by the processor 120 further causes the projection surface shutter 302 to close or enter a state that prevents the passage of light and, thus, prevents any images from being seen through or being projected onto the projection surface 111, and may cause the projection window shutter 306 to open or otherwise enter a state that permits the passage of light and, thereby, permits the projection of images through the projection window 304 and onto the second position or location upon the external projection surface. - With reference to
FIG. 3C, the projection system 110 is shown in a third exemplary configuration that is suitable for projecting an image through the projection window 304 and onto a third position or location upon the external projection surface. The projection component 340 when executed by the processor 120 provides to the projector 310 the projection data and the images to be projected. Execution of the control component 330 by the processor 120 causes the movable mirror 320 to be placed in a third position 326, from which the movable mirror 320 is configured to reflect the image from the projector 310 through the projection window 304 and onto the third position of the external projection surface, and causes the projector 310 and the lens 312 to project an image via the movable mirror 320 in the third position 326. The image is focused on the external projection surface. The control component 330 may place the movable mirror 320 in the third position 326 dependent at least in part upon the target projection surface indicated by the processor 120. - In the exemplary embodiment of
FIG. 3C, execution of the control component 330 by the processor 120 further causes the projection surface shutter 302 to close or enter a state that prevents the passage of light and, thus, prevents any images from being seen through or being projected onto the projection surface 111, and may cause the projection window shutter 306 to open or otherwise enter a state that permits the passage of light and, thereby, permits the projection of images through the projection window 304 and onto the third position or location upon the external projection surface. - The
projection component 340 when executed by the processor 120 provides to the projector 310 projection data and the images and information to be projected, which projection data includes and/or determines the projection parameters including, for example, the resolution, focal point and focus, size and orientation of the image to be projected, brightness, contrast, aspect ratio, and other parameters. In one embodiment, the projection component 340 may be configured to adjust and otherwise compensate the projection parameters and characteristics of the image to be projected dependent at least in part upon the characteristics of the target projection surface and the projection angle (i.e., the angle of the optical axis). More particularly, the projection component 340 when executed by the processor 120 may be configured to determine, via the sensors 114 and 122, the characteristics of the target projection surface and the projection angle. Dependent at least in part thereon, the projection component 340 may adjust the projected content, or alter the projection parameters and image characteristics, or cause an adjustment to the position of the lens 312 or of the movable mirror 320, to compensate for undesirable image distortion effects that may occur. - For example, the
projection component 340 as executed by the processor 120 may determine, based at least in part upon sensor data from the sensors, that the target projection surface is curved, such as the curved projection surface 111 or a curved external projection surface. The projection component 340 may then, as executed by the processor 120, make appropriate corrections or adjustments to the projection parameters, such as increasing depth of field by decreasing the projector aperture, and to the characteristics of the image to be projected, to compensate for any distortion that may result from projecting the image onto the curved projection surface. As a further example, the projection component 340 as executed by the processor 120 may determine that the angle of the optical axis or projection angle exceeds a predetermined threshold that may, if not compensated for, result in the projected image having a type of distortion known as the keystone effect, wherein the projected image appears trapezoidal, for example with the top of the image appearing wider or narrower than the bottom depending on the direction of tilt. The projection component 340 may be further configured to make appropriate corrections or adjustments to the projection parameters and the characteristics of the image to be projected to compensate for the keystone distortion. Further, the projection component 340 may invoke the control component 330 to make certain adjustments, such as adjusting the position or angle of the movable mirror 320. - In the embodiments shown, execution of the
control component 330 by the processor 120 disposes the movable mirror 320 in one of three predetermined positions, i.e., the first position 322, the second position 324 and the third position 326. However, it is to be understood that the control component 330 may be alternately configured to, for example, place the movable mirror 320 in any number of predetermined positions to thereby project images onto any number of corresponding locations or positions on a target projection surface. As used herein, the term target projection surface includes the internal projection surface 111, the external projection surface, and any other surface onto which an image is or is desired to be projected. - With reference now to
FIG. 4, a method of projecting an image is illustrated. The method 400 includes providing 410 to a projector the images or other information to be projected and projection data. The projection data may include certain projection parameters including, for example, the resolution, focal point and focus, size and orientation of the images to be projected, brightness, contrast, aspect ratio, and other projection parameters. - The
method 400 further includes configuring 420 the projection system, which includes placing one or more mirrors or other reflective elements in a position that corresponds to the indicated target projection surface, and actuating or otherwise disposing projection image shutters into appropriate positions or states dependent at least in part upon the indicated target projection surface and/or the projection data provided in the providing 410 step. More particularly, if shutters are associated with one or more target projection surfaces, the shutters must be opened or closed, or placed in corresponding appropriate states, so that the projected image passes only to the indicated target projection surface and is prevented from reaching any non-indicated projection surface. This may also prevent the light for that image from being seen at any intermediate location, such that it is visible only as the final projected image at the desired location. - The
method 400 further includes projecting 430 the image onto the mirror and, thereby, onto the target projection surface. The projecting 430 step projects the images according to the projection parameters. In one embodiment, the projecting 430 step determines the characteristics of the target projection surface, such as by one or more sensors, and dependent at least in part thereon adjusts the projection parameters and characteristics of the image to be projected, or otherwise compensates for the characteristics of the target projection surface. For example, the projecting 430 step may determine the distance from the projector to the target projection surface, the color and curvature of the projection surface, and the projection angle, and adjust the projection parameters based on those characteristics. Projecting 430 the image further includes projecting the image on a target projection surface that may be configured, in one embodiment, as a translucent light-diffusing surface, such as a polymer or plastic surface that is coated with, has embedded therein, or otherwise includes, diffusive particles, and which may be curved or otherwise contoured to match the shape and/or contour of an outer surface of a robot. The target projection surface may be one portion or region of the external surface of the robot, an internal surface of the robot, or may form a portion of both the inner and outer surfaces of the robot. Projecting 430 the image includes projecting an image that appears to be integrated with the skin or outer surface of the robot, rather than being framed within the confines of a conventional display panel. In this embodiment, projecting 430 the image further includes concealing the inner components of the robot from view, whether or not an image is being displayed. Projecting 430 the image may also include projecting the image upon an external projection surface that is not associated with the robot.
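The configure-and-project flow of method 400 described above can be sketched in outline as follows. This is an illustrative sketch only, not the patented implementation: all function names, position labels, the simple pinhole keystone model, and the threshold values are hypothetical and do not appear in the specification.

```python
import math

# Hypothetical position labels for the movable mirror (FIGS. 3A-3C).
MIRROR_POSITIONS = {"internal": "first", "external-2": "second", "external-3": "third"}

def configure_420(target):
    """Step 420 (sketch): place the mirror and set the shutters so that
    light reaches only the indicated target projection surface."""
    return {
        "mirror": MIRROR_POSITIONS[target],
        # Surface shutter open only for on-body projection; window
        # shutter open only for external projection.
        "surface_shutter_open": target == "internal",
        "window_shutter_open": target != "internal",
    }

def keystone_stretch(axis_deg, half_fov_deg):
    """Ratio of local image scale at the far edge versus the near edge
    when the optical axis is tilted axis_deg from the surface normal.
    Simple pinhole model: on a flat surface the local scale grows as
    the squared secant of the ray angle."""
    far = math.radians(axis_deg + half_fov_deg)
    near = math.radians(axis_deg - half_fov_deg)
    return (math.cos(near) / math.cos(far)) ** 2

def project_430(target, axis_deg, half_fov_deg=15.0, threshold=1.05):
    """Step 430 (sketch): configure the system, then flag keystone
    pre-compensation when the projection angle makes the trapezoidal
    stretch exceed a tolerance."""
    state = configure_420(target)
    state["keystone_corrected"] = keystone_stretch(axis_deg, half_fov_deg) > threshold
    return state
```

For an on-axis projection onto the on-body surface (`axis_deg = 0`) the stretch is 1.0 and no correction is flagged, while a steep external throw through the window triggers the pre-warp flag and closes the surface shutter.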
- While the systems, methods and flow diagrams above have been described with respect to robots, it is to be understood that various other devices that utilize or include display technology can utilize aspects described herein. For instance, various industrial equipment, automobile displays, consumer products, and the like may apply the inventive concepts disclosed herein.
- What has been described above includes examples of the subject innovation. It is, of course, not possible to describe every conceivable combination of components or methodologies for purposes of describing the claimed subject matter, but one of ordinary skill in the art may recognize that many further combinations and permutations of the subject innovation are possible. Accordingly, the claimed subject matter is intended to embrace all such alterations, modifications, and variations that fall within the spirit and scope of the appended claims.
- In particular and in regard to the various functions performed by the above described components, devices, circuits, systems and the like, the terms (including a reference to a “means”) used to describe such components are intended to correspond, unless otherwise indicated, to any component which performs the specified function of the described component (e.g., a functional equivalent), even though not structurally equivalent to the disclosed structure, which performs the function in the herein illustrated exemplary aspects of the claimed subject matter. In this regard, it will also be recognized that the innovation includes a system as well as a computer-readable storage media having computer-executable instructions for performing the acts and/or events of the various methods of the claimed subject matter.
- There are multiple ways of implementing the subject innovation, e.g., an appropriate API, tool kit, driver code, operating system, control, standalone or downloadable software object, etc., which enables applications and services to use the techniques described herein. The claimed subject matter contemplates the use from the standpoint of an API (or other software object), as well as from a software or hardware object that operates according to the techniques set forth herein. Thus, various implementations of the subject innovation described herein may have aspects that are wholly in hardware, partly in hardware and partly in software, as well as in software.
- The aforementioned systems have been described with respect to interaction between several components. It can be appreciated that such systems and components can include those components or specified sub-components, some of the specified components or sub-components, and/or additional components, and according to various permutations and combinations of the foregoing. Sub-components can also be implemented as components communicatively coupled to other components rather than included within parent components (hierarchical).
- Additionally, it can be noted that one or more components may be combined into a single component providing aggregate functionality or divided into several separate sub-components, and any one or more middle layers, such as a management layer, may be provided to communicatively couple to such sub-components in order to provide integrated functionality. Any components described herein may also interact with one or more other components not specifically described herein but generally known by those of skill in the art.
- In addition, while a particular feature of the subject innovation may have been disclosed with respect to only one of several implementations, such features may be combined with one or more other features of the other implementations as may be desired and advantageous for any given or particular application. Furthermore, to the extent that the terms “includes,” “including,” “has,” “contains,” variants thereof, and other similar words are used in either the detailed description or the claims, these terms are intended to be inclusive in a manner similar to the term “comprising” as an open transition word without precluding any additional or other elements.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/147,081 US20140118629A1 (en) | 2011-06-27 | 2014-01-03 | Video projection system for mobile device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/169,040 US8690358B2 (en) | 2011-06-27 | 2011-06-27 | Video projection device for mobile device having a first projection surface integral with a surface of the mobile device |
US14/147,081 US20140118629A1 (en) | 2011-06-27 | 2014-01-03 | Video projection system for mobile device |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/169,040 Continuation US8690358B2 (en) | 2011-06-27 | 2011-06-27 | Video projection device for mobile device having a first projection surface integral with a surface of the mobile device |
Publications (1)
Publication Number | Publication Date |
---|---|
US20140118629A1 true US20140118629A1 (en) | 2014-05-01 |
Family
ID=47361515
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/169,040 Expired - Fee Related US8690358B2 (en) | 2011-06-27 | 2011-06-27 | Video projection device for mobile device having a first projection surface integral with a surface of the mobile device |
US14/147,081 Abandoned US20140118629A1 (en) | 2011-06-27 | 2014-01-03 | Video projection system for mobile device |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/169,040 Expired - Fee Related US8690358B2 (en) | 2011-06-27 | 2011-06-27 | Video projection device for mobile device having a first projection surface integral with a surface of the mobile device |
Country Status (1)
Country | Link |
---|---|
US (2) | US8690358B2 (en) |
Families Citing this family (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8930019B2 (en) * | 2010-12-30 | 2015-01-06 | Irobot Corporation | Mobile human interface robot |
US8836730B1 (en) * | 2011-08-19 | 2014-09-16 | Google Inc. | Methods and systems for modifying a display of a field of view of a robotic device to include zoomed-in and zoomed-out views |
US8955989B2 (en) * | 2012-01-23 | 2015-02-17 | Spectra Logic Corporation | Systems for and methods of creating a visual display associated with a data storage library robot |
US8977396B2 (en) * | 2012-03-20 | 2015-03-10 | Sony Corporation | Mobile robotic assistant for multipurpose applications |
CN103793133A (en) * | 2013-12-19 | 2014-05-14 | 弗徕威智能机器人科技(上海)有限公司 | Multi-screen interaction system and multi-screen interaction method applied to intelligent service robots |
US9436068B2 (en) * | 2014-09-17 | 2016-09-06 | The Boeing Company | Portable constant resolution visual system (CRVS) with defined throw distance |
CN105721842B (en) * | 2016-04-08 | 2018-08-24 | 海尔优家智能科技(北京)有限公司 | Projecting method, device and the robot of robot with projecting function |
CN106896829B (en) * | 2017-03-17 | 2023-05-16 | 中国南方电网有限责任公司超高压输电公司曲靖局 | Inspection system and method based on full coverage of inspection robot substation meter |
CN107030733B (en) * | 2017-06-19 | 2023-08-04 | 合肥虹慧达科技有限公司 | Wheeled robot |
US11219837B2 (en) * | 2017-09-29 | 2022-01-11 | Sony Interactive Entertainment Inc. | Robot utility and interface device |
DE102017218162A1 (en) * | 2017-10-11 | 2019-04-11 | BSH Hausgeräte GmbH | Household assistant with projector |
JP7124624B2 (en) * | 2018-10-12 | 2022-08-24 | トヨタ自動車株式会社 | entertainment systems and programs |
CN213186348U (en) * | 2020-08-27 | 2021-05-11 | 广景视睿科技(深圳)有限公司 | Projection equipment |
KR20230022717A (en) * | 2021-08-09 | 2023-02-16 | 삼성전자주식회사 | Electronic apparatus and controlling method thereof |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6031587A (en) * | 1996-01-31 | 2000-02-29 | Canon Kabushiki Kaisha | Optical member, optical instrument including the same, and method of manufacturing the optical member |
US6211976B1 (en) * | 1998-09-14 | 2001-04-03 | Digilens, Inc. | Holographic projection system |
US20100265473A1 (en) * | 2007-11-28 | 2010-10-21 | Kyocera Corporation | Projection Device |
US20100302516A1 (en) * | 2008-02-25 | 2010-12-02 | ORSAM Gesellschaft mit beschraenkter Haftung | Projector for projecting an image and corresponding method |
US8262232B2 (en) * | 2008-09-17 | 2012-09-11 | Kabushiki Kaisha Toshiba | Display device emitting a light flux and mobile apparatus including the display device |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6489934B1 (en) | 2000-07-07 | 2002-12-03 | Judah Klausner | Cellular phone with built in optical projector for display of data |
JP3733889B2 (en) * | 2001-10-05 | 2006-01-11 | 日産自動車株式会社 | Display device |
US7281807B2 (en) | 2003-07-16 | 2007-10-16 | Honeywood Technologies, Llc | Positionable projection display devices |
US6953252B2 (en) | 2003-08-28 | 2005-10-11 | Hewlett-Packard Development Company, L.P. | Projector having a plurality of retro-reflectors |
AU2003263656A1 (en) | 2003-09-23 | 2005-04-11 | Everest Barjau Delgado | 3d image projection system |
WO2006098402A1 (en) * | 2005-03-17 | 2006-09-21 | Brother Kogyo Kabushiki Kaisha | Projector |
GB0615433D0 (en) * | 2006-08-04 | 2006-09-13 | Univ York | Display systems |
US7670012B2 (en) | 2006-09-05 | 2010-03-02 | Hewlett-Packard Development Company, L.P. | Projection system and methods |
KR20090022053A (en) * | 2007-08-29 | 2009-03-04 | 삼성전자주식회사 | Apparatus and method for beam-projecting and adjusting beam-projected image automatically |
JP5460341B2 (en) * | 2010-01-06 | 2014-04-02 | キヤノン株式会社 | Three-dimensional measuring apparatus and control method thereof |
US8414349B2 (en) * | 2011-06-01 | 2013-04-09 | Nintendo Co., Ltd. | Remotely controlled mobile device control system |
- 2011-06-27 US US13/169,040 patent/US8690358B2/en not_active Expired - Fee Related
- 2014-01-03 US US14/147,081 patent/US20140118629A1/en not_active Abandoned
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10126636B1 (en) * | 2015-06-18 | 2018-11-13 | Steven Glenn Heppler | Image projection system for a drum |
US11422443B2 (en) * | 2019-07-30 | 2022-08-23 | Toyota Research Institute, Inc. | Guided mobile platform with light projector |
US20210195149A1 (en) * | 2019-12-20 | 2021-06-24 | Everseen Limited | System and method for displaying video data in a target environment |
US11146765B2 (en) * | 2019-12-20 | 2021-10-12 | Everseen Limited | System and method for displaying video data in a target environment |
AU2020407533B2 (en) * | 2019-12-20 | 2023-10-05 | Everseen Limited | System and method for displaying video in a target environment |
Also Published As
Publication number | Publication date |
---|---|
US8690358B2 (en) | 2014-04-08 |
US20120327315A1 (en) | 2012-12-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8690358B2 (en) | Video projection device for mobile device having a first projection surface integral with a surface of the mobile device | |
US8936366B2 (en) | Illuminated skin robot display | |
JP5845783B2 (en) | Display device, display control method, and program | |
TWI755762B (en) | Target tracking method, intelligent mobile device and storage medium thereof | |
KR101966127B1 (en) | robot cleaner system and a control method of the same | |
KR20180126707A (en) | Apparatus and method for controlling display of hologram, vehicle system | |
CN106846410B (en) | Driving environment imaging method and device based on three dimensions | |
US9703288B1 (en) | System and method for aerial system control | |
KR102211341B1 (en) | Display apparatus | |
EP3145170B1 (en) | Method and apparatus for controlling positioning of camera device, camera device and terminal device | |
CN107102804B (en) | Control device, control method, and recording medium | |
EP2929511B1 (en) | Annular view for panorama image | |
US9001190B2 (en) | Computer vision system and method using a depth sensor | |
EP3618408B1 (en) | Electronic device | |
US10704730B2 (en) | Small-sized camera gimbal and electronic device having same | |
US20140300688A1 (en) | Imaging apparatus and method of controlling the same | |
US20200167996A1 (en) | Periphery monitoring device | |
US10931926B2 (en) | Method and apparatus for information display, and display device | |
CN108701440B (en) | Information processing apparatus, information processing method, and program | |
KR20160022856A (en) | remote controller for a robot cleaner and a control method of the same | |
KR101666500B1 (en) | Method for controlling display of hologram image in mobile terminal and apparatus for displaying hologram image using the same | |
US20190258245A1 (en) | Vehicle remote operation device, vehicle remote operation system and vehicle remote operation method | |
US20210191680A1 (en) | Display device and method for providing vr image | |
US9811160B2 (en) | Mobile terminal and method for controlling the same | |
KR20080075640A (en) | Projector with four direction |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: MICROSOFT CORPORATION, WASHINGTON Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LARSEN, GLEN C.;SANCHEZ, RUSSELL;SIGNING DATES FROM 20140104 TO 20140107;REEL/FRAME:031917/0746 |
|
AS | Assignment |
Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:034747/0417 Effective date: 20141014 Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:039025/0454 Effective date: 20141014 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE |