US20140298271A1 - Electronic device including projector and method for controlling the electronic device


Info

Publication number
US20140298271A1
Authority
US
United States
Prior art keywords
electronic device
pointer
touch screen
gesture
detected
Prior art date
Legal status
Granted
Application number
US14/152,238
Other versions
US9569065B2
Inventor
Antoni JAKUBIAK
Pawel ZBOROWSKI
Adam STRUPCZEWSKI
Magda TALAREK
Current Assignee
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Assigned to SAMSUNG ELECTRONICS CO., LTD. Assignors: Antoni Jakubiak, Adam Strupczewski, Magda Talarek, Pawel Zborowski
Publication of US20140298271A1
Application granted
Publication of US9569065B2
Legal status: Active (current)
Expiration: Adjusted

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04812Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0412Digitisers structurally integrated in a display
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers
    • G06F3/04164Connections between sensors and controllers, e.g. routing lines between electrodes and connection pads
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/048023D-info-object: information is displayed on the internal or external surface of a three dimensional manipulable object, e.g. on the faces of a cube that can be rotated by the user
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04808Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2250/00Details of telephonic subscriber devices
    • H04M2250/22Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2250/00Details of telephonic subscriber devices
    • H04M2250/54Details of telephonic subscriber devices including functional features of a projector or beamer module assembly

Definitions

  • the present invention relates to an electronic device including a projector and a method for controlling the electronic device. More particularly, the present invention relates to an electronic device for displaying a rendered screen on a screen of the electronic device or for projecting and displaying the rendered screen through a projector and a method for controlling the electronic device.
  • the smart phones or the tablet PCs typically include touch screens, and users may manipulate the smart phones or the tablet PCs by inputting a predetermined gesture onto the touch screens.
  • the smart phones or the tablet PCs emphasize portability, and thus the size of the touch screen of the smart phone or the tablet PC is limited.
  • the user may experience a difficulty in viewing the moving image or the still image due to the limited-size touch screen.
  • a smart phone or tablet PC may project and display an image onto a screen by using a projector module.
  • the user views the projected image rather than viewing the image displayed on the limited-size touch screen.
  • in order to input a particular command, the user inputs a particular gesture onto the touch screen.
  • the user needs to input the particular gesture while observing the touch screen, but cannot check this process on a projected image.
  • because the user-input gesture is input using a hand or a stylus pen, and a smart phone or tablet PC according to the related art does not display the position of the hand or stylus pen on the touch screen, the user is unable to check (e.g., confirm) the desired input to the touch screen.
  • the smart phone or tablet PC merely projects and displays rendered data and cannot project and display a gesture that the user physically inputs onto the touch screen.
  • aspects of the present invention are to address at least the above-mentioned problems and/or disadvantages and to provide at least the advantages described below. Accordingly, aspects of the present invention provide an electronic device including a projector, which adds a pointer corresponding to a gesture sensed on a touch screen to a projected image and displays the pointer on the projected image, and a method for controlling the electronic device.
  • a method for controlling an electronic device that executes and displays an application includes displaying an execution screen of the application on a touch screen, projecting a projection image corresponding to the execution screen of the application, displaying a pointer on the projection image, and moving and displaying the pointer on the projection image to correspond to a gesture detected on the touch screen.
  • an electronic device in accordance with another aspect of the present invention, includes a touch screen for displaying an execution screen of an application, a projector module for projecting and displaying a projection image corresponding to the execution screen of the application, and a controller for displaying a pointer on the projection image, and for moving and displaying the pointer on the projection image to correspond to a gesture detected on the touch screen.
  • FIG. 1 is a schematic block diagram illustrating an electronic device according to an exemplary embodiment of the present invention.
  • FIG. 2 is a flowchart illustrating a method for controlling an electronic device including a projector module according to an exemplary embodiment of the present invention.
  • FIGS. 3A through 3I are conceptual diagrams illustrating an electronic device according to exemplary embodiments of the present invention.
  • FIGS. 4A and 4B are flowcharts illustrating a method for controlling an electronic device according to exemplary embodiments of the present invention.
  • FIG. 5 is a flowchart illustrating a method for controlling an electronic device according to an exemplary embodiment of the present invention.
  • FIGS. 6A and 6B are conceptual diagrams illustrating an electronic device according to an exemplary embodiment of the present invention.
  • FIGS. 7A and 7B are flowcharts illustrating a method for controlling an electronic device according to exemplary embodiments of the present invention.
  • FIGS. 8A and 8B are conceptual diagrams illustrating an electronic device according to exemplary embodiments of the present invention.
  • FIG. 9 is a flowchart illustrating a method for controlling an electronic device according to an exemplary embodiment of the present invention.
  • FIG. 10 is a screen illustrating an electronic device's touch screen according to an exemplary embodiment of the present invention.
  • Exemplary embodiments of the present invention will be provided to achieve the above-described technical aspects of the present invention.
  • defined entities may have the same names, to which the present invention is not limited.
  • exemplary embodiments of the present invention can be implemented, with the same or slight modifications, in a system having a similar technical background.
  • a device (e.g., an electronic device) may refer to mobile devices such as a cellular phone, a Personal Digital Assistant (PDA), a digital camera, a portable game console, a digital audio player, a Portable/Personal Multimedia Player (PMP), a handheld e-book, a portable lap-top Personal Computer (PC), a Global Positioning System (GPS) navigation device, and the like, capable of wireless communication or network communication consistent with that disclosed herein.
  • FIG. 1 is a schematic block diagram illustrating an electronic device according to an exemplary embodiment of the present invention.
  • an electronic device 100 may be connected with an external device (not illustrated) by using a mobile communication module 120 , a sub communication module 130 , and a connector 165 .
  • the “external device” may include another device (not illustrated), a cellular phone (not illustrated), a smart phone (not illustrated), a tablet PC (not illustrated), a server (not illustrated), and the like.
  • the electronic device 100 may include a controller 110 , a mobile communication module 120 , a sub communication module 130 , a multimedia module 140 , a camera module 150 , a GPS module 155 , an input/output module 160 , a sensor module 170 , a storage unit 175 , a power supply unit 180 , a touch screen 190 and a touch screen controller 195 .
  • the sub communication module 130 may include at least one of a Wireless Local Area Network (WLAN) module 131 and a short-range communication module 132 (e.g., a Near Field Communications (NFC) module).
  • the multimedia module 140 may include at least one of a broadcast communication module 141 , an audio playback module 142 , and a video playback module 143 .
  • the camera module 150 may include at least one of a first camera 151 and a second camera 152 .
  • the input/output module 160 may include at least one of a button 161 , a microphone 162 , a speaker 163 , a vibration motor 164 , a connector 165 , and a keypad 166 .
  • the controller 110 may include a Central Processing Unit (CPU) 111 , a Read Only Memory (ROM) 112 in which a control program for controlling the electronic device 100 may be stored, and a Random Access Memory (RAM) 113 which stores a signal or data input from the electronic device 100 or which is used as a memory region for a task performed in the electronic device 100 .
  • the CPU 111 may include a varying number of cores (e.g., the CPU may be a single-core, dual-core, triple-core, or quad-core processor).
  • the CPU 111 , the ROM 112 , and the RAM 113 may be interconnected through an internal bus.
  • the controller 110 controls the mobile communication module 120, the multimedia module 140, the camera module 150, the GPS module 155, the input/output module 160, the sensor module 170, the storage unit 175, the power supply unit 180, the touch screen 190, and the touch screen controller 195.
  • the mobile communication module 120 enables the electronic device 100 to be connected with an external device through mobile communication by using at least one antenna or a plurality of antennas (not illustrated) under control of the controller 110.
  • the mobile communication module 120 transmits/receives a wireless signal for voice call, video call, a text message (e.g., Short Messaging Service (SMS)), a multimedia message (e.g., Multi Media Service (MMS)), and the like with a cellular phone (not illustrated), a smart phone (not illustrated), a tablet PC, or another electronic device (not illustrated) which has a phone number input to the electronic device 100 .
  • the sub communication module 130 may include at least one of the WLAN module 131 and the short-range communication module 132 .
  • the sub communication module 130 may include either the WLAN module 131 or the short-range communication module 132 or both of the WLAN module 131 and the short-range communication module 132 .
  • the WLAN module 131 may be connected to the Internet through a wireless Access Point (AP) (not illustrated) under control of the controller 110 .
  • the WLAN module 131 supports the WLAN standard IEEE802.11x of the Institute of Electrical and Electronics Engineers (IEEE).
  • the short-range communication module 132 may wirelessly perform short-range communication between the electronic device 100 and an image forming apparatus (not illustrated) under control of the controller 110 .
  • the short-range communication may include Bluetooth, Infrared Data Association (IrDA), or the like.
  • the electronic device 100 may include at least one of the mobile communication module 120 , the WLAN module 131 , and the short-range communication module 132 .
  • the electronic device 100 may include a combination of the mobile communication module 120 , the WLAN module 131 , and the short-range communication module 132 .
  • the multimedia module 140 may include the broadcast communication module 141 , the audio playback module 142 , or the video playback module 143 .
  • the broadcast communication module 141 receives a broadcast signal (e.g., a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and the like) and broadcast additional information (e.g., an Electronic Program Guide (EPG), an Electronic Service Guide (ESG), and the like) transmitted from a broadcasting station (not illustrated) via a broadcast communication antenna (not illustrated) under control of the controller 110.
  • the audio playback module 142 may play a digital audio file (e.g., a file having a file extension such as ‘mp3’, ‘wma’, ‘ogg’, ‘wav’, and the like) stored or received under control of the controller 110 .
  • the video playback module 143 may play a digital video file (e.g., a file having a file extension such as ‘mpeg’, ‘mpg’, ‘mp4’, ‘avi’, ‘mov’, ‘mkv’, and the like) stored or received under control of the controller 110 .
  • the video playback module 143 may also play a digital audio file.
  • the multimedia module 140 may include the audio playback module 142 and the video playback module 143 without the broadcast communication module 141. According to exemplary embodiments of the present invention, the audio playback module 142 and/or the video playback module 143 of the multimedia module 140 may be included in the controller 110.
  • the camera module 150 may include at least one of the first camera 151 and the second camera 152 which capture a still image or a video under control of the controller 110 .
  • the first camera 151 or the second camera 152 may include an auxiliary light source (e.g., a flash (not illustrated)) which provides light of an amount necessary for photographing.
  • first camera 151 may be positioned on the front surface of the electronic device 100
  • the second camera 152 may be positioned on the rear surface of the electronic device 100 .
  • the first camera 151 and the second camera 152 may be positioned adjacent to each other (e.g., a space between the first camera 151 and the second camera 152 is greater than 1 cm and less than 8 cm) to capture a 3-Dimensional (3D) still image or a 3D moving image.
  • the GPS module 155 receives radio waves from a plurality of GPS satellites (not illustrated) in Earth orbit, and calculates a location of the electronic device 100 by using the time of arrival of the radio waves from the GPS satellites (not illustrated) to the electronic device 100.
  • the input/output module 160 may include at least one of the button 161 , the microphone 162 , the speaker 163 , the vibration motor 164 , the connector 165 , and the keypad 166 .
  • the button 161 may be formed on at least one of a front surface, a side surface, and a rear surface of a housing (or case) of the electronic device 100 , and may include at least one of a power/lock button, a volume button, a menu button, a home button, a back button, and a search button.
  • the microphone 162 receives voice or sound and generates a corresponding electric signal under control of the controller 110 .
  • the speaker 163 outputs sound corresponding to various signals (e.g., wireless data, broadcast data, a digital audio file, a digital video file, a captured image, or the like) of the mobile communication module 120 , the sub communication module 130 , the multimedia module 140 , or the camera module 150 to the outside of the electronic device 100 under control of the controller 110 .
  • the speaker 163 may output sound corresponding to a function executed by the electronic device 100 (e.g., button manipulation sound corresponding to a phone call or a ring back tone).
  • One or more speakers 163 may be formed in a proper position or proper positions of a housing of the electronic device 100 .
  • the vibration motor 164 may convert an electrical signal into mechanical vibration under control of the controller 110 .
  • for example, when the electronic device 100 set to a vibration mode receives a voice call from another device, the vibration motor operates.
  • a single vibration motor or multiple vibration motors may be formed in the housing of the electronic device 100 .
  • the vibration motor may operate in response to a user's touch on the touch screen 190 and continuous movement of the touch on the touch screen 190 .
  • the connector 165 may be used as an interface for connecting the electronic device 100 with an external device (not illustrated) or a power source (not illustrated). Under control of the controller 110, through a wired cable connected to the connector 165, data stored in the storage unit 175 of the electronic device 100 may be transmitted to the external device (not illustrated) or data may be received from the external device (not illustrated). For example, through the wired cable connected to the connector 165, power may be input from the power source (not illustrated) or a battery (not illustrated) may be charged.
  • the keypad 166 may receive a key input from the user for control of the electronic device 100 .
  • the keypad 166 may include a physical keypad (not illustrated) formed in the electronic device 100 or a virtual keypad (not illustrated) displayed on the touch screen 190 .
  • the physical keypad (not illustrated) formed in the electronic device 100 may be excluded depending on the performance or structure of the electronic device 100 .
  • the sensor module 170 may include at least one sensor for detecting a state of the electronic device 100 .
  • the sensor module 170 may include a proximity sensor for detecting user's proximity to the electronic device 100 , an illumination sensor (not illustrated) for detecting an amount of light around the electronic device 100 , or a motion sensor (not illustrated), such as a geo-magnetic sensor or an acceleration sensor, for detecting a motion of the electronic device 100 (e.g., a rotation of the electronic device 100 , or an acceleration or a vibration applied to the electronic device 100 ).
  • the at least one sensor may detect a state of the electronic device 100 , generate a signal corresponding to detection, and transmit the generated signal to the controller 110 .
  • the at least one sensor of the sensor module 170 may be added or removed according to implementation of the electronic device 100 .
  • the storage unit 175 stores a signal or data which is input/output corresponding to operations of the mobile communication module 120 , the sub communication module 130 , the multimedia module 140 , the camera module 150 , the GPS module 155 , the input/output module 160 , the sensor module 170 , or the touch screen 190 , under control of the controller 110 .
  • the storage unit 175 may also store a control program and applications for control of the electronic device 100 and/or the controller 110 .
  • the term “storage unit” may include the storage unit 175, the ROM 112 and the RAM 113 in the controller 110, a memory card (not illustrated) mounted in the electronic device 100 (e.g., a Secure Digital (SD) card, a memory stick), and the like.
  • the storage unit 175 may include a non-volatile memory, a volatile memory, a Hard Disk Drive (HDD), a Solid State Drive (SSD), and the like.
  • a projector module 177 projects and displays a rendered image.
  • the projector module 177 may include a light source for emitting light to be used in projection, a light-modulator for modulating light incident from the light source according to an image signal, and a lens unit for projecting the light incident from the light-modulator onto a screen.
  • the power supply unit 180 may supply power to at least one battery (not illustrated) disposed in the housing of the electronic device 100 under control of the controller 110 .
  • the at least one battery (not illustrated) supplies power to the electronic device 100.
  • the power supply unit 180 may supply power input from an external power source (not illustrated) through a wired cable connected with the connector 165 to the electronic device 100 .
  • the touch screen 190 may provide a user interface corresponding to various services (e.g., call, data transmission, broadcasting, picture/moving image capturing, and the like) to the user.
  • the touch screen 190 may transmit an analog signal corresponding to at least one touch input to the user interface to the touch screen controller 195 .
  • the touch screen 190 may sense (e.g., detect) at least one touch through a user's body (e.g., a finger including a thumb) or a touch-possible input means (e.g., a stylus pen).
  • the touch screen 190 may sense (e.g., detect) continuous movement of one of the at least one touch.
  • the touch screen 190 may transmit an analog signal corresponding to continuous movement of the sensed (e.g., detected) touch to the touch screen controller 195 .
  • the touch may include a contactless (e.g., a detectable distance between the touch screen 190 and the user's body or the touch-possible input means) touch as well as a contact between the touch screen 190 and the user's body or the touch-possible input means.
  • the detectable distance may vary depending on the performance or structure of the electronic device 100 .
  • the touch screen 190 may include, for example, a first touch panel 190 a and a second touch panel 190 b .
  • the first touch panel 190 a may measure a touch or approach of a part of the user's body.
  • the first touch panel 190 a may be of a resistive type, a capacitive type, an infrared type, an acoustic wave type, or the like.
  • the second touch panel 190 b may measure a touch or approach of a device such as a stylus pen.
  • the second touch panel 190 b may be of an Electromagnetic Resonance (EMR) measurement type.
  • the touch screen controller 195 converts an analog signal received from the touch screen 190 into a digital signal (e.g., X and Y coordinates) for transmission to the controller 110.
  • the controller 110 then controls the touch screen 190 by using the digital signal received from the touch screen controller 195 .
  • the controller 110 may select or execute a shortcut icon (not illustrated) displayed on the touch screen 190 in response to a touch.
  • touch screen controller 195 may be included in the controller 110 .
  • the touch screen controller 195 may include a first touch panel controller 195 a for controlling the first touch panel 190 a and a second touch panel controller 195 b for controlling the second touch panel 190 b.
  • the controller 110 detects various user inputs received through the camera module 150 , the input/output module 160 , the sensor module 170 , and the touch screen 190 .
  • the user inputs may include not only touches, but also various forms of information input to the electronic device 100, such as the user's gestures, voice, eye movements, and biomedical signals.
  • the controller 110 may control the overall operation of the electronic device 100 to perform a predetermined operation or function corresponding to the detected user input.
  • FIG. 2 is a flowchart illustrating a method for controlling an electronic device including a projector module according to an exemplary embodiment of the present invention.
  • FIGS. 3A through 3I are conceptual diagrams illustrating an electronic device according to exemplary embodiments of the present invention.
  • FIG. 10 is a screen illustrating an electronic device's touch screen according to an exemplary embodiment of the present invention.
  • the control method illustrated in FIG. 2 will be described in more detail with reference to FIGS. 3A through 3I.
  • in step S201, the electronic device 100 executes and displays an application.
  • the electronic device 100 including the projector module 177 may display a launcher application on the touch screen 190 .
  • the launcher application displays icons 301 , 302 , 303 , and 304 corresponding to applications installed on the electronic device 100 on the touch screen 190 .
  • a user 10 selects a map application icon 301 which corresponds to one of the icons 301 , 302 , 303 , and 304 displayed on the launcher application.
  • according to exemplary embodiments of the present invention, such as the example illustrated in FIG. 3A, the user 10 inputs a gesture of touching the map application icon 301 by using a part of the body (e.g., a finger) to execute the application.
  • the user 10 may also select the map application icon 301 by using a coordinate designating device such as a stylus pen.
  • the user 10 may also select the map application icon 301 by inputting various gestures as well as the touch gesture.
  • An application execution method may be easily changed and carried out, and an artisan will understand and appreciate that the scope of the present invention is not limited by the application execution method.
  • the controller 110 executes an application and displays an application execution screen 310 as illustrated in FIG. 3B.
  • the application execution screen 310 corresponding to the map application may include a map indicating the positions of “A Theater”, “B Sports”, and “C Hair shop”, a nearby street, and a plurality of function keys 311, 312, 313, and 314.
  • the first function key 311 may be a function key for displaying a map
  • the second function key 312 may be a function key for displaying an aerial photograph
  • the third function key 313 may be a function key for enlarging the map
  • the fourth function key 314 may be a function key for reducing the map.
  • in step S203, the electronic device 100 executes a projection display mode.
  • in step S205, the electronic device 100 projects and displays an application execution screen through the projector module 177.
  • the controller 110 controls the projector module 177 to project the same image as the application execution screen 310 displayed on the touch screen 190 .
  • the projector module 177 generates and projects a projection image 320 which is the same as the application execution screen 310 .
  • the projection image 320 includes first through fourth projection function keys 321 through 324 corresponding to the first through fourth function keys 311 through 314 .
  • the controller 110 controls the touch screen 190 to display a screen including a check box for selecting one of a normal mode and a touch screen pointer mode on the application execution screen 310.
  • the normal mode may be a mode for dispatching a gesture sensed (e.g., detected) on the touch screen 190 to the application.
  • the touch screen pointer mode may be a mode for using the gesture sensed on the touch screen 190 for movement of the pointer on the projection image 320 , instead of dispatching the sensed gesture to the application.
  • a projector setting screen 380 includes a check box for selecting a touch pad 381 and a check box for selecting a racing mode 383.
  • the projector setting screen 380 also includes an adjust focus menu 385, a rotate projection menu 387, and an enable quick pad menu 389.
  • the check box for selecting the touch pad 381 is for switching between the normal mode and the touch screen pointer mode.
  • the check box for selecting the racing mode 383 is for activating a new mode dedicated to racing games.
  • the adjust focus menu 385 is for adjusting the focus.
  • the adjust focus menu 385 includes a scroll bar and an auto menu.
  • the check box for the auto focus mode is for activating an auto focus function.
  • the rotate projection menu 387 is for selecting a landscape or portrait projection mode.
  • the enable quick pad menu 389 is for activating a presentation mode.
  • the controller 110 displays a toggling switch 315 for determining (e.g., for selecting) one of a normal mode and a touch screen pointer mode on the application execution screen 310.
  • in step S207, the controller 110 determines whether the electronic device 100 is operating in the normal mode or in the touch screen pointer mode. For example, in FIG. 3C, if the toggling switch 315 is positioned to the left, the controller 110 may set the normal mode; if the toggling switch 315 is positioned to the right, the controller 110 may set the touch screen pointer mode.
  • if the controller 110 determines that the electronic device 100 is not operating in the touch screen pointer mode in step S207, then the controller 110 proceeds to step S209, in which the controller 110 operatively outputs an event corresponding to a gesture input to the touch screen 190. Thereafter, the controller 110 proceeds to step S205.
  • the toggling switch 315 is set to be positioned to the left and thus the normal mode is set.
  • the controller 110 may send the continuous touch gesture 331 sensed (e.g., detected) on the touch screen 190 to the map application.
  • the map application may be set to output a map corresponding to a region located to the right of the currently displayed region (e.g., the map may operatively pan left).
  • the controller 110 displays an application execution screen 340 including the map corresponding to the right region, which is the output of the map application.
  • the application execution screen 340 includes “A Theater”, “B Sports”, and “D Food” and geographical features of a street nearby.
  • “A Theater” and “B Sports” displayed to the right on the application execution screen 320 are displayed to the left on the application execution screen 340 .
  • the application execution screen 340 displays the map corresponding to the right region of the application execution screen 320.
  • the gesture sensed on the touch screen 190 is sent to the application.
  • the application outputs an event corresponding to the sensed (e.g., detected) gesture in step S209.
  • if the event corresponding to the sensed (e.g., detected) gesture is a change in the display screen, the electronic device 100 may display the changed screen 350 on the touch screen 190.
  • the electronic device 100 may also project the changed screen.
  • the display screen change as the event corresponding to the sensed (e.g., detected) gesture is merely an example, and the electronic device 100 may also output events in various forms, such as voice, light, vibration, or the like, to correspond to the sensed (e.g., detected) gesture.
  • if the controller 110 determines that the electronic device 100 is operating in the touch screen pointer mode in step S207, then the controller 110 proceeds to step S211. For example, referring to FIG. 2, if the controller 110 sets the touch screen pointer mode, the controller 110 proceeds to display the pointer corresponding to the gesture sensed (e.g., detected) on the touch screen 190.
  • FIG. 3F is a conceptual diagram illustrating a case in which the toggling switch 315 is set to be positioned to the right according to an exemplary embodiment of the present invention.
  • the touch screen pointer mode is set.
  • the controller 110 displays a pointer 361 on the application execution screen 310 .
  • the projector module 177 displays a projected pointer 362 corresponding to the pointer 361 on the projection image 320 .
  • the controller 110 intercepts the leftward continuous touch gesture 371 sensed (e.g., detected) on the touch screen 190 and thus does not dispatch the continuous touch gesture 371 to the map application.
  • the controller 110 moves and displays the pointer 361 to correspond to the leftward continuous touch gesture 371 .
  • the controller 110 moves the pointer 361 according to the leftward continuous gesture 371 .
  • the controller 110 controls the touch screen 190 to move the pointer 361 to the left and displays the pointer 361 ′ to correspond to the leftward continuous touch gesture 371 .
  • the controller 110 controls the projector module 177 to move the projected pointer 362 to the left and displays the projected pointer 362 ′.
  • the application execution screen 310 and the projection image 320 do not change.
  • the touch screen pointer mode may be set to move the pointer 361 .
  • the toggling switch 315 may be moved to the left to set the normal mode.
  • the user 10 may perform manipulation for displaying the map corresponding to the right region by inputting the leftward continuous touch gesture in the normal mode.
  • the user 10 may also move the toggling switch 315 back to the right (e.g., to set the electronic device 100 to the touch screen pointer mode) to generate the pointer 361 and may perform manipulation for moving the pointer 361 to a desired point.
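  • For illustration only, the mode branch of FIG. 2 (steps S207 through S211) might be realized on an Android-style touch pipeline as in the following sketch; the class and method names (PointerModeRouter, dispatchToApplication, movePointer) are hypothetical and not taken from the patent.

```java
// Illustrative sketch of the mode branch of FIG. 2 (steps S207-S211).
// All names are hypothetical; the patent does not specify an API.
public class PointerModeRouter {
    private boolean touchScreenPointerMode; // toggled by the switch 315

    public void setTouchScreenPointerMode(boolean enabled) {
        touchScreenPointerMode = enabled;
    }

    /** Returns true if the event was consumed (intercepted) by the pointer layer. */
    public boolean onTouchEvent(float x, float y, int action) {
        if (!touchScreenPointerMode) {
            // Normal mode (S209): forward the gesture to the running application.
            dispatchToApplication(x, y, action);
            return false;
        }
        // Touch screen pointer mode (S211): intercept the gesture and use it
        // to move the pointer on the projection image instead.
        movePointer(x, y, action);
        return true;
    }

    private void dispatchToApplication(float x, float y, int action) { /* deliver to app */ }
    private void movePointer(float x, float y, int action) { /* update pointer 361/362 */ }
}
```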
  • FIG. 3I is a conceptual diagram illustrating the electronic device according to an exemplary embodiment of the present invention.
  • when the touch screen pointer mode is set, only the toggling switch 315 may be displayed on the touch screen 190. Also, if the touch screen pointer mode is activated, the touch screen may be inactivated to save battery power.
  • the projector module 177 may display the application execution screen 320 , while the touch screen 190 does not display a corresponding application execution screen. Rather, the touch screen 190 may display the toggling switch 315 and/or another screen.
  • FIGS. 4A and 4B are flowcharts illustrating a method for controlling an electronic device according to exemplary embodiments of the present invention.
  • in step S401, the electronic device executes the touch screen pointer mode.
  • in step S403, the electronic device senses (e.g., detects) a continuous touch gesture which is input to the touch screen.
  • in step S405, the electronic device determines the distance and direction of the sensed (e.g., detected) continuous touch gesture.
  • in step S407, the electronic device determines the moving distance and direction of the pointer based on the determined distance and direction. For example, the electronic device determines a position to which the pointer is to be moved (e.g., the pointer's position after movement).
  • in step S409, the electronic device moves the pointer to the determined position and displays the pointer.
  • referring to FIG. 4B, the electronic device executes the touch screen pointer mode in step S401.
  • in step S403, the electronic device senses (e.g., detects) a continuous touch gesture which is input to the touch screen.
  • in step S411, the electronic device determines coordinates on the touch screen which correspond to a start point and an end point of the continuous touch gesture.
  • the coordinates of the start point on the touch screen may correspond to (tx1, ty1) and the coordinates of the end point on the touch screen may correspond to (tx2, ty2).
  • tx1 and tx2 correspond to horizontal positions
  • ty1 and ty2 correspond to vertical positions
  • each value is determined on a pixel basis.
  • the start point of the continuous touch gesture may be a point at which a down gesture is input.
  • the down gesture may be a touch on the touch screen by a user's finger or a coordinate designating device.
  • the end point of the continuous touch gesture may be a point corresponding to an up gesture.
  • the up gesture may be removal of the touch of the user's finger or the coordinate designating device from the touch screen.
  • a gesture for moving the coordinates of the touch point between the down gesture and the up gesture may be called a move gesture.
  • in step S413, the controller 110 calculates a horizontal distance dX and a vertical distance dY based on the coordinates of the start point and the end point of the continuous touch gesture.
  • the controller 110 may calculate the horizontal distance dX and the vertical distance dY by using Equation (1):

  dX = tx2 - tx1
  dY = ty2 - ty1 … (1)
  • in step S415, the controller 110 scales the distance based on a scale factor.
  • a horizontal scale factor may correspond to stX and a vertical scale factor may correspond to stY.
  • the controller 110 may scale the pointer's moving distance by using Equation (2):

  Pdx = dX × stX
  Pdy = dY × stY … (2)
  • in Equation (2), Pdx indicates the scaled horizontal moving distance of the pointer and Pdy indicates the scaled vertical moving distance of the pointer.
  • stX is a result of dividing a width of a projector screen by a width of a touch screen
  • stY is a result of dividing a height of the projector screen by a height of the touch screen.
  • in step S417, the controller 110 calculates the position to which the pointer is to be moved.
  • the controller 110 may calculate the pointer's position after movement by using Equation (3):

  pX2 = pX1 + Pdx
  pY2 = pY1 + Pdy … (3)
  • in Equation (3), pX1 and pY1 indicate the horizontal and vertical coordinates of the position of the pointer prior to movement.
  • pX2 and pY2 indicate horizontal and vertical coordinates of the position of the pointer after movement.
  • in step S419, the controller 110 processes the calculated pointer position (pX2, pY2) so that the position (pX2, pY2) does not fall outside a boundary of the projection image.
  • the controller 110 may process the calculated pointer position (pX2, pY2) so that it remains within the boundary of the projection image by using Equation (4), reconstructed here as a clamp of each coordinate to the projection image, since the text states only that the position is kept inside the boundary:

  pX2 = min(max(pX2, 0), projection image width)
  pY2 = min(max(pY2, 0), projection image height) … (4)
  • the controller 110 may process the pointer's position after movement such that the position is in the boundary of the projection image.
  • the controller 110 determines the pointer's position (pX2, pY2) according to the foregoing process, and controls the projector module 177 to display the pointer in the determined position on the projection image.
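  • As a minimal sketch, the arithmetic of steps S413 through S419 can be collected into a few lines of code. The class below is illustrative; it assumes Equation (4) is a simple clamp to the projection bounds, since the patent text states only that the position is kept inside the boundary.

```java
// Sketch of the pointer-position computation of FIG. 4B (steps S413-S419).
// Field and parameter names are illustrative, not from the patent.
public final class PointerMath {
    private final int touchW, touchH; // touch screen size in pixels
    private final int projW, projH;   // projection image size in pixels

    public PointerMath(int touchW, int touchH, int projW, int projH) {
        this.touchW = touchW; this.touchH = touchH;
        this.projW = projW; this.projH = projH;
    }

    /** Returns the pointer position after a gesture from (tx1, ty1) to (tx2, ty2). */
    public int[] move(int pX1, int pY1, int tx1, int ty1, int tx2, int ty2) {
        // Equation (1): gesture distance on the touch screen.
        int dX = tx2 - tx1;
        int dY = ty2 - ty1;
        // Scale factors: projection size divided by touch screen size.
        double stX = (double) projW / touchW;
        double stY = (double) projH / touchH;
        // Equation (2): scaled moving distance of the pointer.
        double pdX = dX * stX;
        double pdY = dY * stY;
        // Equation (3): pointer position after movement.
        int pX2 = (int) Math.round(pX1 + pdX);
        int pY2 = (int) Math.round(pY1 + pdY);
        // Equation (4), assumed: clamp so the pointer stays on the projection image.
        pX2 = Math.max(0, Math.min(pX2, projW - 1));
        pY2 = Math.max(0, Math.min(pY2, projH - 1));
        return new int[] { pX2, pY2 };
    }
}
```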
  • FIG. 5 is a flowchart illustrating a method for controlling an electronic device according to an exemplary embodiment of the present invention.
  • FIGS. 6A and 6B are conceptual diagrams illustrating an electronic device according to an exemplary embodiment of the present invention.
  • in step S501, the electronic device executes the touch screen pointer mode.
  • in step S503, the electronic device determines whether a pointer move command is input.
  • if the electronic device determines that a pointer move command is input (e.g., if the electronic device senses (e.g., detects) a pointer move command that is input to the touch screen) in step S503, then the electronic device proceeds to step S505.
  • for example, if a continuous touch gesture is sensed (e.g., detected), the electronic device may determine that a pointer move command is input.
  • in step S505, the electronic device moves the pointer and displays the pointer on the projection image in response to the pointer move command, without sending the pointer move command to the currently executed application.
  • in step S507, the electronic device determines whether a pointer execution command is input. If the electronic device determines that a pointer execution command is input in step S507 (e.g., if the electronic device senses a pointer execution command on the touch screen), then the electronic device proceeds to step S509.
  • for example, the electronic device may sense (e.g., detect) a touch gesture 601 input by the user 10 on the touch screen.
  • in step S509, the electronic device dispatches the pointer execution command to the currently executed application to input the execution command at a position corresponding to the pointer (e.g., a fourth projection function key 324).
  • the pointer execution command may be sent to the currently executed application.
  • the user 10 may experience the inconvenience of performing manipulation for setting the normal mode and inputting the execution command.
  • the electronic device may be set to send the pointer execution command to the currently executed application if the pointer execution command is sensed (e.g., detected).
  • the pointer 362 is positioned over the fourth projection function key 324 . Accordingly, if a pointer execution command is detected, the electronic device executes a command corresponding to the fourth projection function key.
  • in step S511, the electronic device outputs an event corresponding to the execution command.
  • the electronic device may output an event of a fourth function key 314 corresponding to the fourth projection function key 324 .
  • the electronic device may reduce and display a map to correspond to the fourth function key 314 .
  • the reduced map 380 including “A Theater”, “B Sports”, “C Hair Shop”, and “D Food” is displayed.
  • the pointer execution command is dispatched to the currently executed application, thus maximizing user convenience.
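  • On an Android-style device, step S509 could be realized by synthesizing a tap (a down event followed by an up event) at the pointer's position and dispatching it to the application's view hierarchy. The sketch below is one possible implementation under that assumption; the class name and the appRootView field are hypothetical.

```java
import android.os.SystemClock;
import android.view.MotionEvent;
import android.view.View;

// Sketch of step S509: a pointer execution command is emulated as a tap
// at the pointer's position and dispatched to the running application.
final class PointerExecutor {
    private final View appRootView; // assumed: root view of the executed application

    PointerExecutor(View appRootView) {
        this.appRootView = appRootView;
    }

    void execute(float pointerX, float pointerY) {
        long t = SystemClock.uptimeMillis();
        MotionEvent down = MotionEvent.obtain(
                t, t, MotionEvent.ACTION_DOWN, pointerX, pointerY, 0);
        MotionEvent up = MotionEvent.obtain(
                t, t + 50, MotionEvent.ACTION_UP, pointerX, pointerY, 0);
        appRootView.dispatchTouchEvent(down); // the application sees a tap ...
        appRootView.dispatchTouchEvent(up);   // ... at the pointer's position
        down.recycle();
        up.recycle();
    }
}
```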
  • FIGS. 7A and 7B are flowcharts illustrating a method for controlling an electronic device according to exemplary embodiments of the present invention.
  • in step S701, the electronic device executes an application.
  • in step S703, the electronic device determines whether the electronic device is set to the normal mode.
  • if the electronic device determines that the electronic device is set to the normal mode in step S703, then the electronic device proceeds to step S705, in which the electronic device dispatches a gesture sensed (e.g., detected) on the touch screen to the currently executed application.
  • if the electronic device determines that the electronic device is not set to the normal mode in step S703, then the electronic device proceeds to step S707, in which the electronic device determines whether the electronic device is set to the touch screen pointer mode.
  • if the electronic device determines that the electronic device is set to the touch screen pointer mode in step S707, then the electronic device proceeds to step S709, in which the electronic device moves and displays the pointer to correspond to the gesture sensed (e.g., detected) on the touch screen.
  • the electronic device may terminate the procedure.
  • referring to FIG. 7B, in step S701, the electronic device executes the application. Steps S701 through S707 have been described with reference to FIG. 7A, and thus will not be described again here.
  • in step S711, the electronic device may determine whether a sensed (e.g., detected) command corresponds to a pointer move command or a pointer execution command. For example, if a continuous touch gesture is sensed (e.g., detected), the electronic device may determine that the sensed (e.g., detected) command corresponds to the pointer move command. If a touch gesture is sensed (e.g., detected), the electronic device may determine that the sensed (e.g., detected) command corresponds to the pointer execution command.
  • if the electronic device determines that a sensed (e.g., detected) command corresponds to the pointer move command in step S711, then the electronic device proceeds to step S709, in which the electronic device does not send the pointer move command to the application but instead moves and displays the pointer.
  • if the electronic device determines that a sensed (e.g., detected) command does not correspond to the pointer move command in step S711, then the electronic device proceeds to step S713, in which the electronic device determines whether the sensed (e.g., detected) command corresponds to a pointer execution command.
  • if the electronic device determines that the sensed (e.g., detected) command corresponds to the pointer execution command in step S713, then the electronic device proceeds to step S705, in which the electronic device sends the detected pointer execution command to the application.
  • the electronic device may input the execution command to a portion of the touch screen corresponding to a point in which the pointer is positioned.
  • the electronic device may terminate the procedure.
  • FIGS. 8A and 8B are conceptual diagrams illustrating the electronic device according to exemplary embodiments of the present invention.
  • the electronic device 100 performs communication with an external device 800 .
  • the electronic device 100 may perform data transmission and reception with the external device 800, for example, based on Bluetooth, infrared communication, ZigBee communication, and/or the like.
  • the electronic device 100 transmits a displayed screen 310 to the external device 800 .
  • the external device 800 displays a received screen 320 .
  • the user may input the pointer move command, such as a continuous touch gesture, to the touch screen 190 of the electronic device 100 .
  • the electronic device 100 may determine the pointer's position based on the sensed (e.g., detected) pointer move command.
  • the electronic device 100 displays the pointer on the screen 310 and transmits the pointer-displayed screen 310 to the external device 800 .
  • the external device 800 displays the received pointer-displayed screen 320 .
  • the electronic device 100 may detect a gesture and based on the detected gesture, may determine the pointer's position, and may send, to the external device 800 , information corresponding to a displayed screen and applicable pointer position information.
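  • The patent does not specify a wire format for the pointer position information sent to the external device 800. As a hedged sketch, the coordinates could be serialized as two integers over a socket, as shown below; the port, transport, and encoding are assumptions for illustration only.

```java
import java.io.DataOutputStream;
import java.io.IOException;
import java.net.Socket;

// Sketch of one way the electronic device 100 might send pointer position
// information to the external device 800 (FIGS. 8A and 8B). The wire format
// (two big-endian ints) is an assumption, not from the patent.
final class PointerPositionSender {
    private final DataOutputStream out;

    PointerPositionSender(String host, int port) throws IOException {
        out = new DataOutputStream(new Socket(host, port).getOutputStream());
    }

    void send(int pointerX, int pointerY) throws IOException {
        out.writeInt(pointerX);
        out.writeInt(pointerY);
        out.flush();
    }
}
```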
  • the electronic device 100 may also transmit and receive data with the external device 800 through relay of a server 810 .
  • the electronic device 100 may transmit the pointer-displayed screen to the external device 800 in response to the pointer move command sensed (e.g., detected) on the touch screen 190 .
  • FIG. 9 is a flowchart illustrating a method for controlling an electronic device according to an exemplary embodiment of the present invention.
  • the input gesture is emulated according to the position of the pointer.
  • the input gesture corresponds to touch screen events.
  • there are three types of touch screen events: down, move, and up events.
  • the down event means the beginning of a gesture,
  • the move event means a change in the position of the finger during the gesture,
  • and the up event means the end of the gesture.
  • the gesture can be further interpreted as a tap gesture, a long tap gesture, or a drag gesture.
  • in step S901, the electronic device acquires an input first gesture.
  • in step S903, the electronic device determines whether the input first gesture corresponds to a down gesture.
  • if the electronic device determines that the input first gesture does not correspond to a down gesture in step S903, then the electronic device returns to step S901.
  • if the electronic device determines that the input first gesture corresponds to a down gesture in step S903, then the electronic device proceeds to step S905, in which the electronic device executes a timer. Thereafter, the electronic device proceeds to step S907.
  • in step S907, the electronic device determines whether a second gesture is input within a preset time.
  • if the electronic device determines that a second gesture is not input within the preset time (e.g., if the timer is executed and a second gesture is not input within the preset time) in step S907, then the electronic device proceeds to step S909, in which the electronic device sends the sensed (e.g., detected) first gesture to the currently executed application. Then, the pointer execution command (S713) is activated. For example, when a user inputs a long tap gesture onto the touch screen, the electronic device intercepts the touch screen events and sends the event that was held by interception to the application. Then, the application performs the action corresponding to the long tap gesture.
  • the electronic device determines that a second gesture is not input within a preset time (e.g., if the timer is executed and a second gesture is not input within a preset time) in step S 905 , then the electronic device proceeds to step S 909 in which the electronic device sends the sensed
  • the electronic device changes touch coordinates into pointer coordinates. Then, the pointer execution command is activated. Additionally, the electronic device may generate a vibration through the vibration motor 164 to inform the user of whether the pointer execution command is activated.
  • step S 905 If the electronic device determines that a second gesture is input within a preset time (e.g., if the timer is executed and the second gesture is input within the preset time) in step S 905 , then the electronic device proceeds to step S 911 in which the electronic device determines whether the sensed first gesture and second gesture form a move gesture together.
  • a preset time e.g., if the timer is executed and the second gesture is input within the preset time
  • step S 911 the electronic device determines that the first gesture and the second gesture form the move gesture together in step S 911 . If the electronic device determines that the first gesture and the second gesture form the move gesture together in step S 911 , then the electronic device proceeds to step S 915 in which the electronic device may move and display the pointer to correspond to the move gesture. The electronic device may further move and display the pointer to correspond to subsequently input gestures. If the electronic device determines that the first gesture and the second gesture do not form the move gesture together in step S 911 , then the electronic device proceeds to step S 913 in which the electronic device determines that the sensed (e.g., detected) second gesture is an up gesture, and sends the second gesture to the currently executed application.
  • the sensed (e.g., detected) second gesture is an up gesture
  • an electronic device including a projector for adding and displaying a pointer corresponding to a gesture sensed on a touch screen to a projected image and a method for controlling the electronic device may be provided.
  • a user may easily manipulate a smart phone or a tablet PC while observing the projected image, such that user convenience may be maximized.
  • the above-described methods according to exemplary embodiments of the present invention can be implemented in hardware, firmware or as software or computer code that is stored on a non-transitory machine readable medium such as a Compact Disc (CD) ROM, a RAM, a floppy disk, a hard disk, or a magneto-optical disk or computer code downloaded over a network originally stored on a remote recording medium or a non-transitory machine readable medium and stored on a local non-transitory recording medium, so that the methods described herein are loaded into hardware such as a general purpose computer, or a special processor or in programmable or dedicated hardware, such as an Application-Specific Integrated Circuit (ASIC) or Field-Programmable Gate Array (FPGA).
  • ASIC Application-Specific Integrated Circuit
  • FPGA Field-Programmable Gate Array
  • the computer, the processor, microprocessor controller or the programmable hardware include memory components (e.g., RAM, ROM, Flash, and the like) that may store or receive software or computer code that when accessed and executed by the computer, processor or hardware implement the processing methods described herein.
  • memory components e.g., RAM, ROM, Flash, and the like
  • the execution of the code transforms the general purpose computer into a special purpose computer for executing the processing shown herein.
  • a “processor” or “microprocessor” constitutes hardware. Under the broadest reasonable interpretation, the appended claims constitute statutory subject matter in compliance with 35 U.S.C. ⁇ 101 and none of the elements correspond to software per se.

Abstract

A method for controlling an electronic device that executes and displays an application is provided. The method includes displaying an execution screen of the application on a touch screen, projecting and displaying a projection image corresponding to the execution screen of the application, displaying a pointer on the projection image, and moving and displaying the pointer on the projection image to correspond to a gesture detected on the touch screen.

Description

    PRIORITY
  • This application claims the benefit under 35 U.S.C. §119(a) of a Korean patent application filed on Mar. 28, 2013 in the Korean Intellectual Property Office and assigned Serial No. 10-2013-0033598, the entire disclosure of which is hereby incorporated by reference.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The present invention relates to an electronic device including a projector and a method for controlling the electronic device. More particularly, the present invention relates to an electronic device for displaying a rendered screen on a screen of the electronic device or for projecting and displaying the rendered screen through a projector and a method for controlling the electronic device.
  • 2. Description of the Related Art
  • Recently, smart phones or tablet Personal Computers (PCs) have become popular among consumers. As a result, applications using the smart phones or the tablet PCs are being actively developed. According to the related art, the smart phones or the tablet PCs typically include touch screens, and users may manipulate the smart phones or the tablet PCs by inputting a predetermined gesture onto the touch screens.
  • However, the smart phones or the tablet PCs emphasize portability, and thus the size of the touch screen of the smart phone or the tablet PC is limited. In particular, when the user desires to view a moving image or a still image by using the smart phone or the tablet PC, the user may experience a difficulty in viewing the moving image or the still image due to the limited-size touch screen.
  • According to the related art, a smart phone or tablet PC may project and display an image onto a screen by using a projector module. The user views the projected image rather than viewing the image displayed on the limited-size touch screen.
  • According to the related art, in order to input a particular command, the user inputs a particular gesture onto the touch screen. For example, the user needs to input the particular gesture while observing the touch screen, but cannot check this process on a projected image. For example, because the user-input gesture is input using a hand or a stylus pen and a smart phone or tablet PC according to the related art does not display a position of the hand or stylus pen on the touch screen, the user is unable to check (e.g., confirm) the desired input to the touch screen. The smart phone or tablet PC merely projects and displays rendered data and cannot project and display a gesture that the user physically inputs onto the touch screen.
  • Therefore, a need exists for a technique which allows a user to easily manipulate a smart phone or a tablet PC while observing a projected image.
  • The above information is presented as background information only to assist with an understanding of the present disclosure. No determination has been made, and no assertion is made, as to whether any of the above might be applicable as prior art with regard to the present invention.
  • SUMMARY OF THE INVENTION
  • Aspects of the present invention are to address at least the above-mentioned problems and/or disadvantages and to provide at least the advantages described below. Accordingly, aspects of the present invention provide an electronic device including a projector which adds a pointer corresponding to a gesture sensed on a touch screen to a projected image and displays the pointer on the projected image, and a method for controlling the electronic device.
  • In accordance with an aspect of the present invention, a method for controlling an electronic device that executes and displays an application is provided. The method includes displaying an execution screen of the application on a touch screen, projecting a projection image corresponding to the execution screen of the application, displaying a pointer on the projection image, and moving and displaying the pointer on the projection image to correspond to a gesture detected on the touch screen.
  • In accordance with another aspect of the present invention, an electronic device is provided. The electronic device includes a touch screen for displaying an execution screen of an application, a projector module for projecting and displaying a projection image corresponding to the execution screen of the application, and a controller for displaying a pointer on the projection image, and for moving and displaying the pointer on the projection image to correspond to a gesture detected on the touch screen.
  • Other aspects, advantages, and salient features of the invention will become apparent to those skilled in the art from the following detailed description, which, taken in conjunction with the annexed drawings, discloses exemplary embodiments of the invention.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The above and other features and advantages of exemplary embodiments of the present invention will be more apparent from the following detailed description taken in conjunction with the accompanying drawings, in which:
  • FIG. 1 is a schematic block diagram illustrating an electronic device according to an exemplary embodiment of the present invention;
  • FIG. 2 is a flowchart illustrating a method for controlling an electronic device including a projector module according to an exemplary embodiment of the present invention;
  • FIGS. 3A through 3I are conceptual diagrams illustrating an electronic device according to exemplary embodiments of the present invention;
  • FIGS. 4A and 4B are flowcharts illustrating a method for controlling an electronic device according to exemplary embodiments of the present invention;
  • FIG. 5 is a flowchart illustrating a method for controlling an electronic device according to an exemplary embodiment of the present invention;
  • FIGS. 6A and 6B are conceptual diagrams illustrating an electronic device according to an exemplary embodiment of the present invention;
  • FIGS. 7A and 7B are flowcharts illustrating a method for controlling an electronic device according to exemplary embodiments of the present invention;
  • FIGS. 8A and 8B are conceptual diagrams illustrating an electronic device according to exemplary embodiments of the present invention;
  • FIG. 9 is a flowchart illustrating a method for controlling an electronic device according to an exemplary embodiment of the present invention; and
  • FIG. 10 is a screen illustrating an electronic device's touch screen according to an exemplary embodiment of the present invention.
  • Throughout the drawings, like reference numerals will be understood to refer to like parts, components, and structures.
  • DETAILED DESCRIPTION OF EXEMPLARY EMBODIMENTS
  • The following description with reference to the accompanying drawings is provided to assist in a comprehensive understanding of exemplary embodiments of the invention as defined by the claims and their equivalents. It includes various specific details to assist in that understanding but these are to be regarded as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the invention. In addition, descriptions of well-known functions and constructions may be omitted for clarity and conciseness.
  • The terms and words used in the following description and claims are not limited to the bibliographical meanings, but, are merely used by the inventor to enable a clear and consistent understanding of the invention. Accordingly, it should be apparent to those skilled in the art that the following description of exemplary embodiments of the present invention is provided for illustration purpose only and not for the purpose of limiting the invention as defined by the appended claims and their equivalents.
  • It is to be understood that the singular forms “a,” “an,” and “the” include plural referents unless the context clearly dictates otherwise. Thus, for example, reference to “a component surface” includes reference to one or more of such surfaces.
  • By the term “substantially” it is meant that the recited characteristic, parameter, or value need not be achieved exactly, but that deviations or variations, including for example, tolerances, measurement error, measurement accuracy limitations and other factors known to those of skill in the art, may occur in amounts that do not preclude the effect the characteristic was intended to provide.
  • Exemplary embodiments of the present invention will be provided to achieve the above-described technical aspects of the present invention. In an exemplary implementation, defined entities may have the same names, to which the present invention is not limited. Thus, exemplary embodiments of the present invention can be implemented with the same or readily-made modifications in a system having a similar technical background.
  • As a non-exhaustive illustration only, a device (e.g., electronic device) described herein may refer to mobile devices such as a cellular phone, a Personal Digital Assistant (PDA), a digital camera, a portable game console, a digital audio player, a Portable/Personal Multimedia Player (PMP), a handheld e-book, a portable laptop Personal Computer (PC), a Global Positioning System (GPS) navigation device, and the like capable of wireless communication or network communication consistent with that disclosed herein.
  • FIG. 1 is a schematic block diagram illustrating an electronic device according to an exemplary embodiment of the present invention.
  • Referring to FIG. 1, an electronic device 100 may be connected with an external device (not illustrated) by using a mobile communication module 120, a sub communication module 130, and a connector 165. The “external device” may include another device (not illustrated), a cellular phone (not illustrated), a smart phone (not illustrated), a tablet PC (not illustrated), a server (not illustrated), and the like.
  • The electronic device 100 may include a controller 110, a mobile communication module 120, a sub communication module 130, a multimedia module 140, a camera module 150, a GPS module 155, an input/output module 160, a sensor module 170, a storage unit 175, a power supply unit 180, a touch screen 190 and a touch screen controller 195.
  • The sub communication module 130 may include at least one of a Wireless Local Area Network (WLAN) module 131 and a short-range communication module 132 (e.g., a Near Field Communications (NFC) module).
  • The multimedia module 140 may include at least one of a broadcast communication module 141, an audio playback module 142, and a video playback module 143.
  • The camera module 150 may include at least one of a first camera 151 and a second camera 152.
  • The input/output module 160 may include at least one of a button 161, a microphone 162, a speaker 163, a vibration motor 164, a connector 165, and a keypad 166.
  • The controller 110 may include a Central Processing Unit (CPU) 111, a Read Only Memory (ROM) 112 in which a control program for controlling the electronic device 100 may be stored, and a Random Access Memory (RAM) 113 which stores a signal or data input from the electronic device 100 or which is used as a memory region for a task performed in the electronic device 100. The CPU 111 may include various numbers of cores (e.g., the CPU may be a single-core, dual-core, triple-core, or quad-core processor). The CPU 111, the ROM 112, and the RAM 113 may be interconnected through an internal bus.
  • The controller 110 controls the mobile communication module 120, the multimedia module 140, the camera module 150, the GPS module 155, the input/output module 160, the sensor module 170, the storage unit 175, the power supply unit 180, the touch screen 190, and the touch screen controller 195.
  • The mobile communication module 120 enables the electronic device 100 to be connected with an external device through mobile communication by using at least one antenna or a plurality of antennas (not illustrated) under control of the controller 110. The mobile communication module 120 transmits/receives a wireless signal for voice call, video call, a text message (e.g., Short Messaging Service (SMS)), a multimedia message (e.g., Multi Media Service (MMS)), and the like with a cellular phone (not illustrated), a smart phone (not illustrated), a tablet PC, or another electronic device (not illustrated) which has a phone number input to the electronic device 100.
  • The sub communication module 130 may include at least one of the WLAN module 131 and the short-range communication module 132. For example, the sub communication module 130 may include either the WLAN module 131 or the short-range communication module 132 or both of the WLAN module 131 and the short-range communication module 132.
  • The WLAN module 131 may be connected to the Internet through a wireless Access Point (AP) (not illustrated) under control of the controller 110. The WLAN module 131 supports the WLAN standard IEEE 802.11x of the Institute of Electrical and Electronics Engineers (IEEE). The short-range communication module 132 may wirelessly perform short-range communication between the electronic device 100 and an image forming apparatus (not illustrated) under control of the controller 110. The short-range communication may include Bluetooth, Infrared Data Association (IrDA), or the like.
  • The electronic device 100 may include at least one of the mobile communication module 120, the WLAN module 131, and the short-range communication module 132. For example, the electronic device 100 may include a combination of the mobile communication module 120, the WLAN module 131, and the short-range communication module 132.
  • The multimedia module 140 may include the broadcast communication module 141, the audio playback module 142, or the video playback module 143. The broadcast communication module 141 receives a broadcast signal (e.g., a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and the like) and broadcast additional information (e.g., Electronic Program Guide (EPG), Electronic Service Guide (ESG), and the like) transmitted from a broadcasting station (not shown) via a broadcast communication antenna (not illustrated) under control of the controller 110. The audio playback module 142 may play a digital audio file (e.g., a file having a file extension such as ‘mp3’, ‘wma’, ‘ogg’, ‘wav’, and the like) stored or received under control of the controller 110. The video playback module 143 may play a digital video file (e.g., a file having a file extension such as ‘mpeg’, ‘mpg’, ‘mp4’, ‘avi’, ‘mov’, ‘mkv’, and the like) stored or received under control of the controller 110. The video playback module 143 may also play a digital audio file.
  • The multimedia module 140 may include the audio playback module 142 and the video playback module 143, except for the broadcast communication module 141. According to exemplary embodiments of the present invention, the audio playback module 142 and/or the video playback module 143 of the multimedia module 140 may be included in the controller 110.
  • The camera module 150 may include at least one of the first camera 151 and the second camera 152 which capture a still image or a video under control of the controller 110. The first camera 151 or the second camera 152 may include an auxiliary light source (e.g., a flash (not illustrated)) which provides light of an amount necessary for photographing. According to an exemplary embodiment of the present invention, the first camera 151 may be positioned on the front surface of the electronic device 100, and the second camera 152 may be positioned on the rear surface of the electronic device 100. According to another exemplary embodiment of the present invention, the first camera 151 and the second camera 152 may be positioned adjacent to each other (e.g., a space between the first camera 151 and the second camera 152 is greater than 1 cm and less than 8 cm) to capture a 3-Dimensional (3D) still image or a 3D moving image.
  • The GPS module 155 receives radio waves from a plurality of GPS satellites (not illustrated) in the Earth's orbit, and calculates a location of the electronic device 100 by using a time of arrival from each GPS satellite (not illustrated) to the electronic device 100.
  • The input/output module 160 may include at least one of the button 161, the microphone 162, the speaker 163, the vibration motor 164, the connector 165, and the keypad 166.
  • The button 161 may be formed on at least one of a front surface, a side surface, and a rear surface of a housing (or case) of the electronic device 100, and may include at least one of a power/lock button, a volume button, a menu button, a home button, a back button, and a search button.
  • The microphone 162 receives voice or sound and generates a corresponding electric signal under control of the controller 110.
  • The speaker 163 outputs sound corresponding to various signals (e.g., wireless data, broadcast data, a digital audio file, a digital video file, a captured image, or the like) of the mobile communication module 120, the sub communication module 130, the multimedia module 140, or the camera module 150 to the outside of the electronic device 100 under control of the controller 110. The speaker 163 may output sound corresponding to a function executed by the electronic device 100 (e.g., button manipulation sound corresponding to a phone call or a ring back tone). One or more speakers 163 may be formed in a proper position or proper positions of a housing of the electronic device 100.
  • The vibration motor 164 may convert an electrical signal into mechanical vibration under control of the controller 110. For example, when the electronic device 100 in a vibration mode receives a voice call from another device (not illustrated), the vibration motor 164 operates. A single vibration motor or multiple vibration motors may be formed in the housing of the electronic device 100. The vibration motor 164 may operate in response to a user's touch on the touch screen 190 and continuous movement of the touch on the touch screen 190.
  • The connector 165 may be used as an interface for connecting the electronic device 100 with an external device (not illustrated) or a power source (not illustrated). Under control of the controller 110, through a wired cable connected to the connector 165, data stored in the storage unit 175 of the electronic device 100 may be transmitted to the external device (not illustrated) or data may be received from the external device (not illustrated). For example, through the wired cable connected to the connector 165, power may be input from the power source (not illustrated) or a battery (not illustrated) may be charged.
  • The keypad 166 may receive a key input from the user for control of the electronic device 100. The keypad 166 may include a physical keypad (not illustrated) formed in the electronic device 100 or a virtual keypad (not illustrated) displayed on the touch screen 190. The physical keypad (not illustrated) formed in the electronic device 100 may be excluded depending on the performance or structure of the electronic device 100.
  • The sensor module 170 may include at least one sensor for detecting a state of the electronic device 100. For example, the sensor module 170 may include a proximity sensor for detecting user's proximity to the electronic device 100, an illumination sensor (not illustrated) for detecting an amount of light around the electronic device 100, or a motion sensor (not illustrated), such as a geo-magnetic sensor or an acceleration sensor, for detecting a motion of the electronic device 100 (e.g., a rotation of the electronic device 100, or an acceleration or a vibration applied to the electronic device 100). The at least one sensor may detect a state of the electronic device 100, generate a signal corresponding to detection, and transmit the generated signal to the controller 110. The at least one sensor of the sensor module 170 may be added or removed according to implementation of the electronic device 100.
  • The storage unit 175 stores a signal or data which is input/output corresponding to operations of the mobile communication module 120, the sub communication module 130, the multimedia module 140, the camera module 150, the GPS module 155, the input/output module 160, the sensor module 170, or the touch screen 190, under control of the controller 110. The storage unit 175 may also store a control program and applications for control of the electronic device 100 and/or the controller 110.
  • The term “storage unit” may include the storage unit 175, the ROM 112 and the RAM 113 in the controller 110, a memory card (not illustrated) mounted in the electronic device 100 (e.g., a Secure Digital (SD) card, a memory stick), and the like. The storage unit 175 may include a non-volatile memory, a volatile memory, a Hard Disk Drive (HDD), a Solid State Drive (SSD), and the like.
  • A projector module 177 projects and displays a rendered image. For example, the projector module 177 may include a light source for emitting light to be used in projection, a light-modulator for modulating light incident from the light source according to an image signal, and a lens unit for projecting the light incident from the light-modulator onto a screen.
  • The power supply unit 180 may supply power to at least one battery (not illustrated) disposed in the housing of the electronic device 100 under control of the controller 110. The at least one battery (not illustrated) supplies power to the electronic device 100. The power supply unit 180 may supply power input from an external power source (not illustrated) through a wired cable connected with the connector 165 to the electronic device 100.
  • The touch screen 190 may provide a user interface corresponding to various services (e.g., call, data transmission, broadcasting, picture/moving image capturing, and the like) to the user. The touch screen 190 may transmit an analog signal corresponding to at least one touch input to the user interface to the touch screen controller 195. The touch screen 190 may sense (e.g., detect) at least one touch through a user's body (e.g., a finger including a thumb) or a touch-possible input means (e.g., a stylus pen). The touch screen 190 may sense (e.g., detect) continuous movement of one of the at least one touch. The touch screen 190 may transmit an analog signal corresponding to continuous movement of the sensed (e.g., detected) touch to the touch screen controller 195.
  • According to exemplary embodiments of the present invention, the touch may include a contactless (e.g., a detectable distance between the touch screen 190 and the user's body or the touch-possible input means) touch as well as a contact between the touch screen 190 and the user's body or the touch-possible input means. In the touch screen 190, the detectable distance may vary depending on the performance or structure of the electronic device 100.
  • The touch screen 190 may include, for example, a first touch panel 190 a and a second touch panel 190 b. The first touch panel 190 a may measure a touch or approach of a part of the user's body. For example, the first touch panel 190 a may be of a resistive type, a capacitive type, an infrared type, an acoustic wave type, or the like.
  • The second touch panel 190 b may measure a touch or approach of a device such as a stylus pen. For example, the second touch panel 190 b may be of an Electromagnetic Resonance (EMR) measurement type.
  • The touch screen controller 195 converts an analog signal received from the touch screen 190 into a digital signal (e.g., X and Y coordinates) for transmission to the controller 110. The controller 110 then controls the touch screen 190 by using the digital signal received from the touch screen controller 195. For example, the controller 110 may select or execute a shortcut icon (not illustrated) displayed on the touch screen 190 in response to a touch. According to exemplary embodiments of the present invention, the touch screen controller 195 may be included in the controller 110. The touch screen controller 195 may include a first touch panel controller 195 a for controlling the first touch panel 190 a and a second touch panel controller 195 b for controlling the second touch panel 190 b.
  • The controller 110 detects various user inputs received through the camera module 150, the input/output module 160, the sensor module 170, and the touch screen 190. The user inputs may include not only touches, but also various forms of information input to the electronic device 100, such as user's gesture, voice, eye movement, and biomedical signal. The controller 110 may control the overall operation of the electronic device 100 to perform a predetermined operation or function corresponding to the detected user input.
  • FIG. 2 is a flowchart illustrating a method for controlling an electronic device including a projector module according to an exemplary embodiment of the present invention. FIGS. 3A through 3I are conceptual diagrams illustrating an electronic device according to exemplary embodiments of the present invention. FIG. 10 is a screen illustrating an electronic device's touch screen according to an exemplary embodiment of the present invention.
  • The control method illustrated in FIG. 2 will be described in more detail with reference to FIGS. 3A through 3I.
  • Referring to FIGS. 2 and 3A through 3I, in step S201, the electronic device 100 executes and displays an application. For example, as illustrated in FIG. 3A, the electronic device 100 including the projector module 177 may display a launcher application on the touch screen 190. The launcher application displays icons 301, 302, 303, and 304 corresponding to applications installed on the electronic device 100 on the touch screen 190. A user 10 selects a map application icon 301 which corresponds to one of the icons 301, 302, 303, and 304 displayed on the launcher application. According to exemplary embodiments of the present invention such as the example illustrated in FIG. 3A, the user 10 inputs a gesture of touching the map application icon 301 by using a part of the body (e.g., a finger), to execute the application. As an example, the user 10 may also select the map application icon 301 by using a coordinate designating device such as a stylus pen. The user 10 may also select the map application icon 301 by inputting various gestures as well as the touch gesture. An application execution method may be easily changed and carried out, and an artisan will understand and appreciate that the scope of the present invention is not limited by the application execution method.
  • The controller 110 executes an application and displays an application execution screen 310 as illustrated in FIG. 3B. The application execution screen 310 corresponding to the map application may include a map indicating the positions of “A Theater”, “B Sports”, and “C Hair shop” and a nearby street, as well as a plurality of function keys 311, 312, 313, and 314. For example, the first function key 311 may be a function key for displaying a map, the second function key 312 may be a function key for displaying an aerial photograph, the third function key 313 may be a function key for enlarging the map, and the fourth function key 314 may be a function key for reducing the map.
  • Referring to FIG. 2, in step S203, the electronic device 100 executes a projection display mode.
  • In step S205, the electronic device 100 projects and displays the application execution screen through the projector module 177. For example, as illustrated in FIG. 3C, the controller 110 controls the projector module 177 to project the same image as the application execution screen 310 displayed on the touch screen 190. The projector module 177 generates and projects a projection image 320 which is the same as the application execution screen 310. The projection image 320 includes first through fourth projection function keys 321 through 324 corresponding to the first through fourth function keys 311 through 314.
  • When the electronic device 100 executes the projection display mode in step S203, the controller 110 controls the touch screen 190 to display, on the application execution screen 310, a screen including a check box for selecting one of a normal mode and a touch screen pointer mode. The normal mode may be a mode for dispatching a gesture sensed (e.g., detected) on the touch screen 190 to the application. In contrast, the touch screen pointer mode may be a mode for using the gesture sensed on the touch screen 190 to move the pointer on the projection image 320, instead of dispatching the sensed gesture to the application. Referring to FIG. 10, a projector setting screen 380 includes a check box for selecting a touch pad 381 and a check box for selecting a racing mode 383. The projector setting screen 380 also includes an adjust focus menu 385, a rotate projection menu 387, and an enable quick pad menu 389. The touch pad check box 381 is for switching between the normal mode and the touch screen pointer mode. The racing mode check box 383 is for activating a mode dedicated to racing games. The adjust focus menu 385 is for adjusting the focus and includes a scroll bar and an auto menu; the auto menu check box is for activating an auto focus function. The rotate projection menu 387 is for selecting a landscape or portrait projection mode. The enable quick pad menu 389 is for activating a presentation mode.
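  • As an illustration only, the options on the projector setting screen 380 might be modeled as in the following minimal Python sketch (all names and default values are hypothetical; the patent does not specify an implementation):

```python
from dataclasses import dataclass
from enum import Enum

class Rotation(Enum):
    LANDSCAPE = "landscape"
    PORTRAIT = "portrait"

@dataclass
class ProjectorSettings:
    """Hypothetical model of the projector setting screen 380."""
    touch_pad: bool = False      # check box 381: touch screen pointer mode vs. normal mode
    racing_mode: bool = False    # check box 383: mode dedicated to racing games
    focus: float = 0.5           # scroll bar of the adjust focus menu 385
    auto_focus: bool = False     # auto menu: activates the auto focus function
    rotation: Rotation = Rotation.LANDSCAPE  # rotate projection menu 387
    quick_pad: bool = False      # enable quick pad menu 389: presentation mode
```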
  • Also, when the electronic device 100 executes the projection display mode in step S203, the controller 110 displays, on the application execution screen 310, a toggling switch 315 for selecting one of the normal mode and the touch screen pointer mode.
  • In step S207, the controller 110 determines whether the electronic device 100 is operating in the normal mode or the touch screen pointer mode. For example, in FIG. 3C, if the toggling switch 315 is set to be positioned to the left, the controller 110 may set the normal mode; if the toggling switch 315 is set to be positioned to the right, the controller 110 may set the touch screen pointer mode. For example, in step S207, the controller 110 determines whether the electronic device 100 is operating in a touch screen pointer mode.
  • If the controller 110 determines that the electronic device 100 is not operating in the touch screen pointer mode in step S207, then the controller 110 proceeds to step S209 in which the controller 110 operatively outputs an event corresponding to a gesture input to the touch screen 190. Thereafter, the controller 110 proceeds to step S205.
  • For example, as illustrated in FIG. 3C, the toggling switch 315 is set to be positioned to the left and thus the normal mode is set. Hence, if the user 10 inputs a leftward continuous touch gesture (or drag gesture) 331 as illustrated in FIG. 3D, the controller 110 may send the continuous touch gesture 331 sensed (e.g., detected) on the touch screen 190 to the map application.
  • If the leftward continuous touch gesture 331 is sensed, the map application may be set to output a map corresponding to a region located to the right of the currently displayed region (e.g., the map may operatively pan left). As illustrated in FIG. 3E, the controller 110 displays an application execution screen 340 including the map corresponding to the right region, which is the output of the map application. The application execution screen 340 includes “A Theater”, “B Sports”, and “D Food” and geographical features of a nearby street. In particular, “A Theater” and “B Sports” displayed to the right on the application execution screen 320 are displayed to the left on the application execution screen 340. In other words, the application execution screen 340 displays the map region located to the right of the region displayed on the application execution screen 320.
  • In the normal mode, the gesture sensed on the touch screen 190 is sent to the application. The application outputs an event corresponding to the sensed (e.g., detected) gesture in step S209. For example, if the event corresponding to the sensed (e.g., detected) gesture is a change in the display screen, the electronic device 100 may display the changed screen 350 on the touch screen 190. Moreover, the electronic device 100 may also project the changed screen. According to exemplary embodiments of the present invention, the display screen change as the event corresponding to the sensed (e.g., detected) gesture is merely an example, and the electronic device 100 may also output events in various forms, such as voice, light, vibration, or the like, to correspond to the sensed (e.g., detected) gesture.
  • If the controller 110 determines that the electronic device 100 is operating in the touch screen pointer mode in step S207, then the controller 110 proceeds to step S211. For example, referring to FIG. 2, if the controller 110 sets the touch screen pointer mode, the controller 110 proceeds to display the pointer corresponding to the gesture sensed (e.g., detected) on the touch screen 190.
  • FIG. 3F is a conceptual diagram illustrating a case in which the toggling switch 315 is set to be positioned to the right according to an exemplary embodiment of the present invention.
  • Referring to FIG. 3F, when the toggling switch 315 is set to be positioned to the right, the touch screen pointer mode is set. Once the touch screen pointer mode is set, the controller 110 displays a pointer 361 on the application execution screen 310. The projector module 177 displays a projected pointer 362 corresponding to the pointer 361 on the projection image 320.
  • Referring to FIG. 3G, if the user 10 inputs a leftward continuous touch gesture 371 while the electronic device is operating in the touch screen pointer mode, then the controller 110 intercepts the leftward continuous touch gesture 371 sensed (e.g., detected) on the touch screen 190 and thus does not dispatch the continuous touch gesture 371 to the map application. The controller 110 moves and displays the pointer 361 to correspond to the leftward continuous touch gesture 371. For example, whereas the map displayed by the map application would pan according to a leftward continuous touch gesture 371 when the electronic device 100 is operating in the normal mode, when the electronic device 100 is operating in the touch screen pointer mode, the controller 110 instead moves the pointer 361 according to the leftward continuous touch gesture 371.
  • As illustrated in FIG. 3H, the controller 110 controls the touch screen 190 to move the pointer 361 to the left and displays the pointer 361′ to correspond to the leftward continuous touch gesture 371. The controller 110 controls the projector module 177 to move the projected pointer 362 to the left and displays the projected pointer 362′. In particular, because the leftward continuous touch gesture 371 is not dispatched to the map application, the application execution screen 310 and the projection image 320 do not change.
  • Thus, if the user 10 desires to select a particular point on the map and inform another user of the selected point during a presentation, the touch screen pointer mode may be set to move the pointer 361. When the user 10 desires to move a point displayed on the map to the right, the toggling switch 315 may be moved to the left to set the normal mode. The user 10 may then display the map corresponding to the right region by inputting the leftward continuous touch gesture in the normal mode. The user 10 may also move the toggling switch 315 back to the right (e.g., to set the electronic device 100 to the touch screen pointer mode) to generate the pointer 361 and may then move the pointer 361 to a desired point.
  • FIG. 3I is a conceptual diagram illustrating the electronic device according to an exemplary embodiment of the present invention.
  • Referring to FIG. 3I, when the touch screen pointer mode is set, only the toggling switch 315 may be displayed on the touch screen 190. Also, if the touch screen pointer mode is activated, then the touch screen may be deactivated to save battery power. For example, according to exemplary embodiments of the present invention, the projector module 177 may display the application execution screen 320, while the touch screen 190 does not display a corresponding application execution screen. Rather, the touch screen 190 may display the toggling switch 315 and/or another screen.
  • FIGS. 4A and 4B are flowcharts illustrating a method for controlling an electronic device according to exemplary embodiments of the present invention.
  • Referring to FIG. 4A, in step S401, the electronic device executes the touch screen pointer mode.
  • In step S403, the electronic device senses (e.g., detects) a continuous touch gesture which is input to the touch screen.
  • In step S405, the electronic device determines the distance and direction of the sensed (e.g., detected) continuous touch gesture.
  • In step S407, the electronic device determines the moving distance and direction of the pointer based on the determined distance and direction. For example, the electronic device determines a position to which the pointer is to be moved (or the pointer's position after movement).
  • In step S409, the electronic device moves and displays the pointer to the determined position.
  • Referring to FIG. 4B, in step S401, the electronic device executes the touch screen pointer mode.
  • In step S403, the electronic device senses (e.g., detects) a continuous touch gesture which is input to the touch screen.
  • In step S411, the electronic device determines coordinates on the touch screen, which correspond to a start point and an end point of the continuous touch gesture.
  • For example, the coordinates of the start point on the touch screen may correspond to (tx1, ty1) and the coordinates of the end point on the touch screen may correspond to (tx2, ty2). Herein, tx1 and tx2 correspond to horizontal positions, ty1 and ty2 correspond to vertical positions, and each value is determined on a pixel basis.
  • The start point of the continuous touch gesture may be a point at which a down gesture is input. The down gesture may be a touch on the touch screen by a user's finger or a coordinate designating device. The end point of the continuous touch gesture may be a point corresponding to an up gesture. The up gesture may be removal of the touch of the user's finger or the coordinate designating device from the touch screen. A gesture for moving the coordinates of the touch point between the up gesture and the down gesture may be called a move gesture.
  • In step S413, the controller 110 calculates a horizontal distance dX and a vertical distance dY based on the coordinates of the start point and the end point of the continuous touch gesture. The controller 110 may calculate the horizontal distance dX and the vertical distance dY by using Equation (1):

  • dX=tx1−tx2

  • dY=ty1−ty2  Equation (1)
  • In step S415, the controller 110 scales a distance based on a scale factor. For example, a horizontal scale factor may correspond to stX and a vertical scale factor may correspond to stY. According to exemplary embodiments of the present invention, the controller 110 may scale a pointer's moving distance by using Equation (2):

  • PdX=dX*stX

  • PdY=dY*stY  Equation (2)
  • In Equation (2), PdX indicates the scaled horizontal moving distance of the pointer and PdY indicates the scaled vertical moving distance of the pointer. stX is a result of dividing the width of the projector screen by the width of the touch screen, and stY is a result of dividing the height of the projector screen by the height of the touch screen.
  • In step S417, the controller 110 calculates the position to which the pointer is to be moved. For example, the controller 110 may calculate the pointer's position after movement by using Equation (3):

  • pX2=pX1+PdX

  • pY2=pY1+PdY  Equation (3)
  • In Equation (3), pX1 and pY1 indicate horizontal and vertical coordinates of the position of the pointer prior to movement. pX2 and pY2 indicate horizontal and vertical coordinates of the position of the pointer after movement.
  • In step S419, the controller 110 processes the calculated pointer's position (pX2, pY2) so that the position (pX2, pY2) does not fall outside a boundary of the projection image. For example, it is assumed that the horizontal direction coordinates of the projection image are set to bx0 and bx1 and the vertical direction coordinates of the projection image are set to by0 and by1. The controller 110 may keep the calculated pointer's position (pX2, pY2) within the boundary of the projection image by using Equation (4):

  • If pX2<bx0 then pX2=bx0

  • If pY2<by0 then pY2=by0

  • If pX2>bx1 then pX2=bx1

  • If pY2>by1 then pY2=by1  Equation (4)
  • For example, if the pointer's position after movement is determined to be outside the boundary of the projection image, the controller 110 may process the pointer's position after movement such that the position is in the boundary of the projection image.
  • The controller 110 determines the pointer's position (pX2, pY2) according to the foregoing process, and controls the projector module 177 to display the pointer in the determined position on the projection image.
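  • As an illustration only, the following Python sketch implements Equations (1) through (4) end to end (the function and parameter names are hypothetical; the signs of the gesture distances follow the patent's Equation (1)):

```python
def move_pointer(touch_start, touch_end, pointer, touch_size, projector_size, bounds):
    """Compute the pointer's position after a continuous touch gesture.

    touch_start, touch_end: (tx1, ty1) and (tx2, ty2) in touch screen pixels
    pointer: (pX1, pY1), the pointer position prior to movement
    touch_size, projector_size: (width, height) of each screen in pixels
    bounds: (bx0, by0, bx1, by1), the boundary of the projection image
    """
    tx1, ty1 = touch_start
    tx2, ty2 = touch_end
    pX1, pY1 = pointer

    # Equation (1): horizontal and vertical gesture distances.
    dX = tx1 - tx2
    dY = ty1 - ty2

    # Scale factors: projector screen size divided by touch screen size.
    stX = projector_size[0] / touch_size[0]
    stY = projector_size[1] / touch_size[1]

    # Equation (2): scaled moving distance of the pointer.
    PdX = dX * stX
    PdY = dY * stY

    # Equation (3): pointer position after movement.
    pX2 = pX1 + PdX
    pY2 = pY1 + PdY

    # Equation (4): clamp the position to the boundary of the projection image.
    bx0, by0, bx1, by1 = bounds
    pX2 = min(max(pX2, bx0), bx1)
    pY2 = min(max(pY2, by0), by1)
    return pX2, pY2
```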
  • FIG. 5 is a flowchart illustrating a method for controlling an electronic device according to an exemplary embodiment of the present invention. FIGS. 6A and 6B are conceptual diagrams illustrating an electronic device according to an exemplary embodiment of the present invention.
  • Referring to FIG. 5, the control method will be described in detail with reference to FIGS. 6A and 6B.
  • In step S501, the electronic device executes the touch screen pointer mode.
  • In step S503, the electronic device determines whether a pointer move command is input.
  • If the electronic device determines that a pointer move command is input (e.g., if the electronic device senses (e.g., detects) a pointer move command that is input to the touch screen) in step S503, then the electronic device proceeds to step S505. As an example, if the electronic device senses (e.g., detects) a continuous touch gesture from a first point to a second point on the touch screen, then the electronic device may determine that a pointer move command is input. In step S505, the electronic device moves and displays the pointer on the projection image in response to the pointer move command, without sending the pointer move command to the currently executed application.
  • In contrast, if the electronic device determines that a pointer move command is not input in step S503, then the electronic device proceeds to step S507 in which the electronic device determines whether a pointer execution command is input. If the electronic device determines that a pointer execution command is input in step S507 (e.g., if the electronic device senses a pointer execution command on the touch screen), then the electronic device proceeds to step S509. For example, as illustrated in FIG. 6A, the electronic device may sense a touch gesture 601 input by the user 10 on the touch screen.
  • In step S509, the electronic device dispatches the pointer execution command to the currently executed application to input the execution command at a position corresponding to the pointer (e.g., a fourth projection function key 324). In other words, even if the touch screen pointer mode is set, when the pointer execution command is sensed (e.g., detected), the pointer execution command may be sent to the currently executed application. Otherwise, to input the execution command to a particular application, the user 10 would experience the inconvenience of switching to the normal mode before inputting the execution command. Hence, the electronic device may be set to send the pointer execution command to the currently executed application when the pointer execution command is sensed (e.g., detected). As illustrated in FIG. 6A, the pointer 362 is positioned over the fourth projection function key 324. Accordingly, if a pointer execution command is detected, the electronic device executes a command corresponding to the fourth projection function key.
  • In step S511, the electronic device outputs an event corresponding to the execution command. For example, the electronic device may output an event of a fourth function key 314 corresponding to the fourth projection function key 324. For example, as illustrated in FIG. 6B, the electronic device may reduce and display a map to correspond to the fourth function key 314. In FIG. 6B, the reduced map 380 including “A Theater”, “B Sports”, “C Hair Shop”, and “D Food” is displayed.
  • As discussed above, even if the touch screen pointer mode is set, when the pointer execution command is sensed (e.g., detected), the pointer execution command is dispatched to the currently executed application, thus maximizing user convenience.
  • FIGS. 7A and 7B are flowcharts illustrating a method for controlling an electronic device according to exemplary embodiments of the present invention.
  • Referring to FIG. 7A, in step S701, the electronic device executes an application.
  • In step S703, the electronic device determines whether the electronic device is set to the normal mode.
  • If the electronic device determines that the electronic device is set to the normal mode in step S703, then the electronic device proceeds to step S705 in which the electronic device dispatches a gesture sensed (e.g., detected) on the touch screen to the currently executed application.
  • In contrast, if the electronic device determines that the electronic device is not set to the normal mode in step S703, then the electronic device proceeds to step S707 in which the electronic device determines whether the electronic device is set to the touch screen pointer mode.
  • If the electronic device determines that the electronic device is set to the touch screen pointer mode in step S707, then the electronic device proceeds to step S709 in which the electronic device moves and displays the pointer to correspond to the gesture sensed on the touch screen.
  • In contrast, if the electronic device determines that the electronic device is not set to the touch screen pointer mode in step S707, then the electronic device may terminate the procedure.
  • Referring to FIG. 7B, in step S701, the electronic device executes the application. Steps S701 through S707 have been described with reference to FIG. 7A, and thus will not be described again.
  • If the electronic device determines that the electronic device is set to the touch screen pointer mode in step S707, then the electronic device proceeds to step S711 in which the electronic device may determine whether a sensed (e.g., detected) command corresponds to a pointer move command or a pointer execution command. For example, if a continuous touch gesture is sensed (e.g., detected), the electronic device may determine that the sensed (e.g., detected) command corresponds to the pointer move command. If the touch gesture is sensed (e.g., detected), the electronic device may determine that the sensed (e.g., detected) command corresponds to the pointer execution command.
  • If the electronic device determines that a sensed (e.g., detected) command corresponds to the pointer move command in step S711, then the electronic device proceeds to step S709 in which the electronic device does not send the sensed (e.g., detected) pointer move command to the application and instead moves and displays the pointer.
  • In contrast, if the electronic device determines that a sensed (e.g., detected) command does not correspond to the pointer move command in step S711, then the electronic device proceeds to step S713 in which the electronic device determines whether a sensed (e.g., detected) command corresponds to a pointer execution command.
  • If the electronic device determines that the sensed (e.g., detected) command corresponds to the pointer execution command in step S713, then the electronic device proceeds to step S705 in which the electronic device sends the detected pointer execution command to the application. The electronic device may input the execution command to a portion of the touch screen corresponding to the point at which the pointer is positioned.
  • In contrast, if the electronic device determines that the sensed (e.g., detected) command does not correspond to the pointer execution command in step S713, then the electronic device may terminate the procedure.
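  • As an illustration only, the mode-dependent routing of FIG. 7B can be summarized in the following Python sketch (the mode strings and the `app`/`pointer` handlers are assumptions made for illustration, not the patent's implementation):

```python
def route_command(mode, command, app, pointer):
    """Route a sensed command per FIG. 7B.

    mode: "normal" or "pointer" (the touch screen pointer mode)
    command: "move" (continuous touch gesture) or "execute" (touch gesture)
    app: the currently executed application; pointer: the pointer state
    """
    if mode == "normal":
        # Step S705: in the normal mode, the gesture is dispatched to the application.
        app.dispatch(command)
    elif mode == "pointer":
        if command == "move":
            # Step S709: the pointer move command is not sent to the application;
            # the pointer is moved and displayed instead.
            pointer.move()
        elif command == "execute":
            # Steps S713 -> S705: the pointer execution command is sent to the
            # application at the point where the pointer is positioned.
            app.dispatch_at(pointer.position)
```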
  • FIGS. 8A and 8B are conceptual diagrams illustrating the electronic device according to exemplary embodiments of the present invention.
  • Referring to FIG. 8A, the electronic device 100 performs communication with an external device 800. The electronic device 100 may perform data transmission and reception with the external device 800, for example, based on Bluetooth, infrared communication, Zig-bee communication, and/or the like. The electronic device 100 transmits a displayed screen 310 to the external device 800. The external device 800 displays a received screen 320. The user may input the pointer move command, such as a continuous touch gesture, to the touch screen 190 of the electronic device 100. The electronic device 100 may determine the pointer's position based on the sensed (e.g., detected) pointer move command. The electronic device 100 displays the pointer on the screen 310 and transmits the pointer-displayed screen 310 to the external device 800. The external device 800 displays the received pointer-displayed screen 320. For example, the electronic device 100 may detect a gesture and based on the detected gesture, may determine the pointer's position, and may send, to the external device 800, information corresponding to a displayed screen and applicable pointer position information.
  • Referring to FIG. 8B, the electronic device 100 may also transmit and receive data with the external device 800 through relay of a server 810. Thus, even when the electronic device 100 is located remotely from the external device 800, the electronic device 100 may transmit the pointer-displayed screen to the external device 800 in response to the pointer move command sensed (e.g., detected) on the touch screen 190.
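  • As a rough illustration of the data flow in FIGS. 8A and 8B, the sketch below frames the pointer-displayed screen together with the pointer position for transmission to the external device 800, optionally via the relay server 810 (the message layout and the `send` transport are assumptions made for illustration, not the patent's protocol):

```python
import json

def share_pointer_screen(send, screen_image: bytes, pointer_pos, via_server=False):
    """Transmit the pointer-displayed screen and pointer position (FIGS. 8A/8B).

    send: a callable that transmits bytes to the external device 800 over
          Bluetooth, infrared communication, Zig-bee, or a network link
    screen_image: the encoded screen 310 as displayed on the electronic device
    pointer_pos: (x, y), the pointer position determined from the move command
    via_server: True to route the payload through the relay server 810
    """
    header = json.dumps({
        "pointer": {"x": pointer_pos[0], "y": pointer_pos[1]},
        "relay": via_server,
        "screen_bytes": len(screen_image),
    }).encode("utf-8")
    # Simple length-prefixed framing: 4-byte header length, header, then image.
    send(len(header).to_bytes(4, "big") + header + screen_image)
```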
  • FIG. 9 is a flowchart illustrating a method for controlling an electronic device according to an exemplary embodiment of the present invention. When the electronic device acquires an input gesture, the input gesture is emulated according to the position of the pointer. The input gesture corresponds to touch screen events. There are three types of touch screen events: down, move, and up events. The down event marks the beginning of a gesture, the move event indicates a change in the finger's position during the gesture, and the up event marks the end of the gesture. The gesture can be further interpreted as a tap gesture, a long tap gesture, or a drag gesture.
  • Referring to FIG. 9, in step S901, the electronic device acquires an input first gesture.
  • In step S903, the electronic device determines whether the input first gesture corresponds to a down gesture.
  • If the electronic device determines that the input first gesture does not correspond to a down gesture in step S903, then the electronic device proceeds to step S901.
  • If the electronic device determines that the input first gesture corresponds to a down gesture in step S903, then the electronic device proceeds to step S905 in which the electronic device executes a timer. Thereafter, the electronic device proceeds to step S907.
  • In step S907, the electronic device determines whether a second gesture is input within a preset time.
  • If the electronic device determines that a second gesture is not input within a preset time (e.g., if the timer is executed and a second gesture is not input within the preset time) in step S907, then the electronic device proceeds to step S909, in which the electronic device sends the sensed (e.g., detected) first gesture to a currently executed application, and the pointer execution command (S713) is activated. For example, when a user inputs a long tap gesture onto the touch screen, the electronic device intercepts the touch screen events and sends the event that was held by the interception to the application. The application then performs the action corresponding to the long tap gesture. Also, when a user inputs a long tap gesture onto the touch screen, the electronic device converts the touch coordinates into pointer coordinates, and the pointer execution command is activated. Additionally, the electronic device may generate a vibration through the vibration motor 164 to inform the user that the pointer execution command is activated.
  • If the electronic device determines that a second gesture is input within the preset time (e.g., if the timer is executed and the second gesture is input within the preset time) in step S907, then the electronic device proceeds to step S911, in which the electronic device determines whether the sensed first gesture and second gesture together form a move gesture.
  • If the electronic device determines that the first gesture and the second gesture form the move gesture together in step S911, then the electronic device proceeds to step S915 in which the electronic device may move and display the pointer to correspond to the move gesture. The electronic device may further move and display the pointer to correspond to subsequently input gestures. If the electronic device determines that the first gesture and the second gesture do not form the move gesture together in step S911, then the electronic device proceeds to step S913 in which the electronic device determines that the sensed (e.g., detected) second gesture is an up gesture, and sends the second gesture to the currently executed application.
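  • Taken together, steps S901 through S915 behave as a small state machine: a down event arms a timer (S905); a move arriving within the preset time drives the pointer instead of the application (S915); an up within the preset time is released to the application (S913); and expiry with no second gesture releases the held down event as a long tap (S909). The Java sketch below is a hedged illustration; all class, method, and callback names are hypothetical, and java.util.Timer merely stands in for whatever timer the device actually uses.

```java
import java.util.Timer;
import java.util.TimerTask;

// Illustrative state machine for the flow of FIG. 9 (steps S901-S915).
// All names are hypothetical; only the down/move/up flow comes from the disclosure.
public final class GestureEmulator {

    public interface Callbacks {
        void forwardToApplication(String heldEvent); // S909/S913: release an intercepted event
        void movePointer(float dx, float dy);        // S915: move and display the pointer
    }

    private final Callbacks cb;
    private final long presetTimeMillis;             // the "preset time" of step S907
    private Timer timer;
    private boolean downPending;                     // a down event is held by interception
    private boolean moved;                           // the gesture became a move gesture
    private float lastX, lastY;

    public GestureEmulator(Callbacks cb, long presetTimeMillis) {
        this.cb = cb;
        this.presetTimeMillis = presetTimeMillis;
    }

    /** S903/S905: a down gesture starts the timer. */
    public synchronized void onDown(float x, float y) {
        downPending = true;
        moved = false;
        lastX = x; lastY = y;
        timer = new Timer(true);
        timer.schedule(new TimerTask() {
            @Override public void run() { onTimeout(); }  // "no" branch of S907
        }, presetTimeMillis);
    }

    /** S911/S915: a move within the preset time drives the pointer, not the app. */
    public synchronized void onMove(float x, float y) {
        if (!downPending) return;
        timer.cancel();
        moved = true;
        cb.movePointer(x - lastX, y - lastY);
        lastX = x; lastY = y;
    }

    /** S913: an up that did not form a move gesture is sent to the application. */
    public synchronized void onUp() {
        if (!downPending) return;
        timer.cancel();
        downPending = false;
        if (!moved) cb.forwardToApplication("up");
    }

    /** S909: no second gesture arrived in time; release the held down as a long tap. */
    private synchronized void onTimeout() {
        if (downPending && !moved) {
            downPending = false;
            cb.forwardToApplication("down (long tap)");
        }
    }
}
```

  • In this sketch the methods are synchronized because java.util.Timer fires its task on a separate thread, so the timeout path and the touch-event path must not race over the shared interception state.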
  • According to various exemplary embodiments of the present invention, an electronic device including a projector for adding and displaying a pointer corresponding to a gesture sensed on a touch screen to a projected image and a method for controlling the electronic device may be provided.
  • Therefore, a user may easily manipulate a smart phone or a tablet PC while observing the projected image, such that user convenience may be maximized.
  • The above-described methods according to exemplary embodiments of the present invention can be implemented in hardware, firmware or as software or computer code that is stored on a non-transitory machine readable medium such as a Compact Disc (CD) ROM, a RAM, a floppy disk, a hard disk, or a magneto-optical disk or computer code downloaded over a network originally stored on a remote recording medium or a non-transitory machine readable medium and stored on a local non-transitory recording medium, so that the methods described herein are loaded into hardware such as a general purpose computer, or a special processor or in programmable or dedicated hardware, such as an Application-Specific Integrated Circuit (ASIC) or Field-Programmable Gate Array (FPGA). As would be understood in the art, the computer, the processor, microprocessor controller or the programmable hardware include memory components (e.g., RAM, ROM, Flash, and the like) that may store or receive software or computer code that when accessed and executed by the computer, processor or hardware implement the processing methods described herein. In addition, it would be recognized that when a general purpose computer accesses code for implementing the processing shown herein, the execution of the code transforms the general purpose computer into a special purpose computer for executing the processing shown herein. In addition, an artisan understands and appreciates that a “processor” or “microprocessor” constitutes hardware. Under the broadest reasonable interpretation, the appended claims constitute statutory subject matter in compliance with 35 U.S.C. §101 and none of the elements correspond to software per se.
  • The terms “unit” or “module” as may be used herein are to be understood as constituting hardware such as a processor or microprocessor configured for a certain desired functionality in accordance with statutory subject matter under 35 U.S.C. §101 and do not constitute software per se.
  • The electronic device may receive and store a program including machine executable code that is loaded into hardware such as a processor and executed to configure the hardware, and the machine executable code may be provided from an external device connected in a wired or wireless manner. The device providing the machine executable code can include a non-transitory memory for storing the machine executable code that, when executed by a processor, will instruct the electronic device to execute the claimed method for controlling the electronic device, along with information necessary for the method for controlling the electronic device, and so forth; a communication unit for performing wired or wireless communication with the electronic device; and a controller for transmitting a corresponding program to the electronic device at the request of the electronic device or automatically.
  • While the invention has been shown and described with reference to exemplary embodiments thereof, it will be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the invention as defined by the appended claims and their equivalents.

Claims (21)

What is claimed is:
1. A method for controlling an electronic device which executes and displays an application, the method comprising:
displaying an execution screen of the application on a touch screen;
projecting and displaying a projection image corresponding to the execution screen of the application;
displaying a pointer on the projection image; and
moving and displaying the pointer on the projection image to correspond to a gesture detected on the touch screen.
2. The method of claim 1, wherein the moving and displaying of the pointer comprises:
intercepting the detected gesture so as to not to send the detected gesture to the application.
3. The method of claim 1, wherein the detected gesture corresponds to a continuous touch gesture from a first point to a second point on the touch screen.
4. The method of claim 3, wherein the moving and displaying of the pointer comprises:
determining a distance and a direction of the continuous touch gesture; and
determining a position of the pointer after movement based on the determined distance and direction.
5. The method of claim 4, wherein the moving and displaying of the pointer further comprises:
determining a distance from the first point to the second point on the touch screen.
6. The method of claim 5, wherein the moving and displaying of the pointer further comprises:
scaling the determined distance based on a scale factor associated with a ratio between the touch screen and the projection image.
7. The method of claim 6, wherein the moving and displaying of the pointer further comprises:
calculating a position of the pointer after movement, based on the scaled distance.
8. The method of claim 1, further comprising:
displaying a toggle switch for toggling between a touch screen pointer mode and a normal mode,
wherein the touch screen pointer mode corresponds to a mode in which the gesture detected on the touch screen is used for movement of the pointer, and
wherein the normal mode corresponds to a mode in which the gesture detected on the touch screen is sent to the application.
9. The method of claim 8, wherein if the toggle switch is set to the normal mode, the detected gesture is not intercepted and the detected gesture is sent to the application.
10. The method of claim 1, wherein if the gesture detected on the touch screen is a pointer execution command, an execution command is input to a portion corresponding to the pointer on the projection image.
11. A non-transitory computer-readable storage medium storing instructions that, when executed, cause at least one processor to perform the method of claim 1.
12. An electronic device comprising:
a touch screen for displaying an execution screen of an application;
a projector module for projecting and displaying a projection image corresponding to the execution screen of the application; and
a controller for displaying a pointer on the projection image, and for moving and displaying the pointer on the projection image to correspond to a gesture detected on the touch screen.
13. The electronic device of claim 12, wherein the controller intercepts the detected gesture such that the detected gesture is not sent to the application.
14. The electronic device of claim 12, wherein the detected gesture corresponds to a continuous touch gesture from a first point to a second point on the touch screen.
15. The electronic device of claim 14, wherein the controller determines a distance and a direction of the continuous touch gesture, and determines a position of the pointer after movement based on the determined distance and direction.
16. The electronic device of claim 15, wherein the controller determines a distance from the first point to the second point on the touch screen.
17. The electronic device of claim 16, wherein the controller scales the determined distance based on a scale factor associated with a ratio between the touch screen and the projection image.
18. The electronic device of claim 17, wherein the controller calculates a position of the pointer after movement, based on the scaled distance.
19. The electronic device of claim 12, further comprising:
a toggle switch for toggling between a touch screen pointer mode and a normal mode,
wherein the touch screen pointer mode corresponds to a mode in which the gesture detected on the touch screen is used for movement of the pointer, and
wherein the normal mode corresponds to a mode in which the gesture detected on the touch screen is sent to the application.
20. The electronic device of claim 19, wherein if the toggle switch is set to the normal mode, the controller does not intercept the detected gesture and sends the detected gesture to the application.
21. The electronic device of claim 12, wherein if the gesture detected on the touch screen is a pointer execution command, the controller inputs an execution command to a portion corresponding to the pointer on the projection image.
US14/152,238 2013-03-28 2014-01-10 Electronic device including projector and method for controlling the electronic device Active 2034-12-14 US9569065B2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020130033598A KR102097452B1 (en) 2013-03-28 2013-03-28 Electro device comprising projector and method for controlling thereof
KR10-2013-0033598 2013-03-28

Publications (2)

Publication Number Publication Date
US20140298271A1 true US20140298271A1 (en) 2014-10-02
US9569065B2 US9569065B2 (en) 2017-02-14

Family

ID=49213749

Family Applications (1)

Application Number Title Priority Date Filing Date
US14/152,238 Active 2034-12-14 US9569065B2 (en) 2013-03-28 2014-01-10 Electronic device including projector and method for controlling the electronic device

Country Status (4)

Country Link
US (1) US9569065B2 (en)
EP (1) EP2784655B1 (en)
KR (1) KR102097452B1 (en)
CN (1) CN104077074A (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104881109B (en) * 2014-02-28 2018-08-10 联想(北京)有限公司 A kind of action identification method, device and electronic equipment
CN104980723A (en) * 2014-04-03 2015-10-14 洪水和 Tablet Projecting System with Modular External Connection
CN105335018A (en) * 2014-08-07 2016-02-17 联想(北京)有限公司 Control method and control system for electronic apparatus with touch function
CN105975150A (en) * 2016-04-27 2016-09-28 广东欧珀移动通信有限公司 User terminal control method and user terminal
US11349976B2 (en) 2019-09-12 2022-05-31 Lenovo (Beijing) Co., Ltd. Information processing method, file transmission method, electronic apparatus, and computing apparatus
CN110620845B (en) * 2019-09-12 2021-01-15 联想(北京)有限公司 Information processing method, electronic device and computing device
KR20230102557A (en) * 2021-12-30 2023-07-07 주식회사 브이터치 Method, system and non-transitory computer-readable recording medium for supporting user input

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4044255B2 (en) * 1999-10-14 2008-02-06 富士通株式会社 Information processing apparatus and screen display method
US6750803B2 (en) 2001-02-23 2004-06-15 Interlink Electronics, Inc. Transformer remote control
KR20100039024A (en) * 2008-10-07 2010-04-15 엘지전자 주식회사 Mobile terminal and method for controlling display thereof
KR101520689B1 (en) 2008-10-22 2015-05-21 엘지전자 주식회사 a mobile telecommunication device and a method of scrolling a screen using the same
EP2343632A4 (en) 2008-10-24 2013-05-01 Nec Corp Touch panel device operating as if in the equivalent mode even when detected region is smaller than display region of display device
KR101631103B1 (en) 2009-02-05 2016-06-16 삼성전자주식회사 Method for controlling mobile terminal having projection function by using headset
KR20110046941A (en) 2009-10-29 2011-05-06 삼성전자주식회사 A mobile terminal with image projector and a method of control thereof
KR20110069526A (en) 2009-12-17 2011-06-23 삼성전자주식회사 Method and apparatus for controlling external output of a portable terminal
FR2959839A1 (en) * 2010-05-06 2011-11-11 France Telecom TERMINAL INCORPORATING A VIDEOPROJECTOR AND A SCREEN WITH A ZONE ALLOWING THE CONTROL OF A REMOTE POINTER PROJECTED BY THIS VIDEOPROJECTOR

Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050120079A1 (en) * 2003-09-10 2005-06-02 Anderson Jon J. High data rate interface
US20050213593A1 (en) * 2004-03-10 2005-09-29 Anderson Jon J High data rate interface apparatus and method
US20080018591A1 (en) * 2006-07-20 2008-01-24 Arkady Pittel User Interfacing
US20090184924A1 (en) * 2006-09-29 2009-07-23 Brother Kogyo Kabushiki Kaisha Projection Device, Computer Readable Recording Medium Which Records Program, Projection Method and Projection System
US20090167700A1 (en) * 2007-12-27 2009-07-02 Apple Inc. Insertion marker placement on touch sensitive display
US20090178008A1 (en) * 2008-01-06 2009-07-09 Scott Herz Portable Multifunction Device with Interface Reconfiguration Mode
US20100188428A1 (en) * 2008-10-15 2010-07-29 Lg Electronics Inc. Mobile terminal with image projection
US20100099457A1 (en) * 2008-10-16 2010-04-22 Lg Electronics Inc. Mobile communication terminal and power saving method thereof
US20100105428A1 (en) * 2008-10-24 2010-04-29 Lg Electronics Inc. Mobile terminal and method of controlling the same
US8152308B2 (en) * 2008-11-05 2012-04-10 Samsung Electronics Co., Ltd Mobile terminal having projector and method of controlling display unit in the mobile terminal
US20110039606A1 (en) * 2009-08-12 2011-02-17 Lg Electronics Inc. Mobile terminal and power source controlling method thereof
US20110070920A1 (en) * 2009-09-24 2011-03-24 Saied Aasim M Method for a phone with content projector
US20110148789A1 (en) * 2009-12-18 2011-06-23 Samsung Electronics Co. Ltd. Mobile device having projector module and method for operating the same
US20140223132A1 (en) * 2013-02-07 2014-08-07 Ricoh Company, Ltd. Information processing device

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
USD760209S1 (en) * 2013-04-11 2016-06-28 Huizhou Tcl Mobile Communication Co., Ltd. Mobile phone sheath embedded with projection device
US9692860B2 (en) * 2014-05-15 2017-06-27 Apple Inc. One layer metal trace strain gauge
US11144173B2 (en) * 2015-11-05 2021-10-12 Samsung Electronics Co., Ltd Electronic device and method for providing object recommendation
WO2017114255A1 (en) * 2015-12-31 2017-07-06 青岛海尔股份有限公司 Touch method and device for projected image
US10223057B2 (en) 2017-03-17 2019-03-05 Dell Products L.P. Information handling system management of virtual input device interactions
US10228892B2 (en) * 2017-03-17 2019-03-12 Dell Products L.P. Information handling system management of virtual input device interactions
US20180357942A1 (en) * 2017-05-24 2018-12-13 Compal Electronics, Inc. Display device and display method
US10621897B2 (en) * 2017-05-24 2020-04-14 Compal Electronics, Inc. Display device with projection function and display method thereof
TWI709075B (en) * 2017-05-24 2020-11-01 仁寶電腦工業股份有限公司 Display device and display method
US20190361500A1 (en) * 2018-05-24 2019-11-28 Fu Tai Hua Industry (Shenzhen) Co., Ltd. Electronic device and method for controlling user interface therein
US10969827B2 (en) * 2018-05-24 2021-04-06 Fu Tai Hua Industry (Shenzhen) Co., Ltd. Electronic device and method for controlling user interface therein
US20220082661A1 (en) * 2019-05-31 2022-03-17 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Controlling method for electronic device, and electronic device
US11947045B2 (en) * 2019-05-31 2024-04-02 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Controlling method for electronic device, and electronic device
CN112905136A (en) * 2021-03-11 2021-06-04 北京小米移动软件有限公司 Screen projection control method and device and storage medium

Also Published As

Publication number Publication date
EP2784655B1 (en) 2019-07-17
KR20130086574A (en) 2013-08-02
EP2784655A3 (en) 2017-04-26
EP2784655A2 (en) 2014-10-01
CN104077074A (en) 2014-10-01
KR102097452B1 (en) 2020-04-07
US9569065B2 (en) 2017-02-14

Similar Documents

Publication Publication Date Title
US9569065B2 (en) Electronic device including projector and method for controlling the electronic device
US9454850B2 (en) Mobile communication terminal for providing augmented reality service and method of changing into augmented reality service screen
US11048373B2 (en) User interface display method and apparatus therefor
KR102016975B1 (en) Display apparatus and method for controlling thereof
US9582168B2 (en) Apparatus, method and computer readable recording medium for displaying thumbnail image of panoramic photo
KR101726790B1 (en) Mobile terminal and control method for mobile terminal
KR102015347B1 (en) Method and apparatus for providing mouse function using touch device
US9514512B2 (en) Method and apparatus for laying out image using image recognition
KR101657234B1 (en) Method, device, program and storage medium for displaying picture
KR102156729B1 (en) Method for adjusting magnification of screen images in electronic device, machine-readable storage medium and electronic device
US20180329598A1 (en) Method and apparatus for dynamic display box management
US20140337769A1 (en) Method and apparatus for using electronic device
KR20110018589A (en) Mobile and method for controlling the same
KR20180046681A (en) Image display apparatus, mobile device and operating method for the same
US20150007036A1 (en) Electronic device for sharing question message and method of controlling the electronic device
US20140258923A1 (en) Apparatus and method for displaying screen image
KR20110022217A (en) Mobile and method for controlling the same
KR20140113032A (en) Method and apparatus for displaying screen in a portable terminal
KR102482630B1 (en) Method and apparatus for displaying user interface
KR102187856B1 (en) Method and apparatus for displaying user interface
KR20110131910A (en) Mobile terminal and method for controlling the same
KR20150057721A (en) Mobile terminal and a method for controling the mobile terminal

Legal Events

Date Code Title Description
AS Assignment

Owner name: SAMSUNG ELECTRONICS CO., LTD., KOREA, REPUBLIC OF

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:JAKUBIAK, ANTONI;ZBOROWSKI, PAWEL;STRUPCZEWSKI, ADAM;AND OTHERS;REEL/FRAME:031939/0977

Effective date: 20140107

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4