WO2015053451A1 - Mobile terminal and operating method thereof - Google Patents

Mobile terminal and operating method thereof

Info

Publication number
WO2015053451A1
Authority
WO
WIPO (PCT)
Prior art keywords
gesture
mobile terminal
movement
infrared light
image
Application number
PCT/KR2014/003214
Other languages
French (fr)
Inventor
Manhyung LEE
Kyungchan PARK
Sungmin Kim
Sungdu Kwon
Original Assignee
LG Electronics Inc.
Application filed by LG Electronics Inc.
Publication of WO2015053451A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04BTRANSMISSION
    • H04B1/00Details of transmission systems, not covered by a single one of groups H04B3/00 - H04B13/00; Details of transmission systems not characterised by the medium used for transmission
    • H04B1/38Transceivers, i.e. devices in which transmitter and receiver form a structural unit and in which at least one part is used for functions of transmitting and receiving
    • H04B1/40Circuits
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/1686Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being an integrated camera
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser

Definitions

  • the present disclosure relates to a mobile terminal and an operating method thereof.
  • Terminals may be classified into mobile/ portable terminals and stationary terminals depending on the mobility.
  • the mobile terminals may be classified into handheld terminals and vehicle mount terminals depending on direct portability by a user.
  • the terminals are being implemented as multimedia players having multiple functions including taking a picture, recording a video, playing music or video files, gaming, and receiving broadcasts.
  • recent mobile terminals are employing gesture recognition modules that recognize user gestures and perform corresponding operations.
  • since gesture recognition modules installed in typical mobile terminals may sense only a gesture movement in the horizontal (x-axis) direction, there is a limitation in controlling various operations of the mobile terminals.
  • typical gesture recognition modules using infrared signals reflected from human bodies may detect user gestures only when users make large movements, and they may detect a user's hand movements only when there is a sufficient distance between the hand and the rest of the body, since the reflected signal is affected by body parts other than the hand.
  • Embodiments provide a mobile terminal that recognizes gesture movements in the x-axis, y-axis, and z-axis directions and enables various operations, and an operating method thereof.
  • a method of operating a mobile terminal including a gesture recognition module includes externally radiating an infrared light and recognizing a user gesture based on an infrared light reflected by the user gesture; extracting an infrared signal variation and detecting a movement of the gesture; and performing a function of the mobile terminal corresponding to the detected movement of the gesture.
  • in another embodiment, a mobile terminal includes a gesture recognition module externally radiating an infrared light, the gesture recognition module recognizing a user gesture based on an infrared light reflected by the user gesture; and a control unit extracting an infrared signal variation and detecting a movement of the gesture, the control unit performing a function of the mobile terminal corresponding to the detected movement of the gesture.
  • the mobile terminal may use an inexpensive gesture recognition module and effectively recognize a user gesture.
  • the mobile terminal may also easily detect a gesture movement in the y-axis and z-axis directions in addition to the x-axis direction and perform a corresponding operation.
  • Fig. 1 is a block diagram of a mobile terminal according to an embodiment.
  • Fig. 2 is a block diagram of an additional component of a mobile terminal according to an embodiment.
  • Fig. 3 is a flow chart of a method of operating a mobile terminal according to an embodiment.
  • Fig. 4 is an example of an initial gesture according to an embodiment.
  • Fig. 5 is an example of detecting an optical signal and obtaining a gesture image according to an embodiment.
  • Fig. 6 is a diagram for explaining processes of detecting variations in optical signals on the x-axis and y-axis and recognizing a gesture movement according to an embodiment.
  • Fig. 7 is a diagram for explaining processes of detecting a variation in an optical signal on the z-axis and recognizing a gesture movement according to an embodiment.
  • FIGS. 8 to 13 are diagrams for explaining the operation of a mobile terminal corresponding to a gesture movement according to an embodiment.
  • Mobile terminals described in the present disclosure may include cellular phones, smart phones, laptop computers, digital broadcasting terminals, personal digital assistants (PDAs), portable multimedia players (PMPs), and navigation devices.
  • a structure of a mobile terminal according to an embodiment is described below with reference to Fig. 1.
  • Fig. 1 is a block diagram of a mobile terminal according to an embodiment.
  • a mobile terminal 100 may include a wireless communication unit 110, an audio/video (AV) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a control unit 180, and a power supply unit 190. Since the components shown in Fig. 1 are not essential, a mobile terminal having more or fewer components may also be implemented.
  • the wireless communication unit 110 may include one or more modules that enable wireless communication between the mobile terminal 100 and a wireless communication system or between the mobile terminal 100 and a network in which the mobile terminal 100 is located.
  • the wireless communication unit 110 may include a broadcasting receiving module 111, a mobile communication module 112, a wireless internet module 113, a short range communication module 114, and a position information module 115.
  • the broadcasting receiving module 111 receives a broadcasting signal and/or broadcasting related information from an external broadcasting management server through a broadcasting channel.
  • the broadcasting channel may include a satellite channel or a terrestrial channel.
  • the broadcasting management server may mean a server that generates and transmits a broadcasting signal and/or broadcasting related information, or a server that receives a pre-generated broadcasting signal and/or broadcasting related information and transmits them to a terminal.
  • the broadcasting signal may include a broadcasting signal formed by combining a TV broadcasting signal with a radio broadcasting signal, in addition to a TV broadcasting signal, a radio broadcasting signal, and a data broadcasting signal.
  • the broadcasting related information may mean information on a broadcasting channel, a broadcasting program or a broadcasting service provider.
  • the broadcasting related information may also be provided through a mobile communication network. In this case, the information may be received by the mobile communication module 112.
  • the broadcasting related information may exist in various formats. For example, it may exist in the format of an electronic program guide (EPG) of digital multimedia broadcasting (DMB) or an electronic service guide (ESG) of digital video broadcast-handheld (DVB-H).
  • the broadcasting receiving module 111 may receive a digital broadcasting signal by using a digital broadcasting system such as a digital multimedia broadcasting-terrestrial (DMB-T) format, a digital multimedia broadcasting-satellite (DMB-S) format, a media forward link only (MediaFLO) format, a digital video broadcast-handheld (DVB-H) format, or an integrated services digital broadcast-terrestrial (ISDB-T) format.
  • a broadcasting signal and/or broadcasting related information received through the broadcasting receiving module 111 may be stored in the memory 160.
  • the mobile communication module 112 transmits and receives a wireless signal to and from at least one of a base station, an external terminal and a server on a mobile communication network.
  • the wireless signal may include various types of data depending on transmitting and receiving a voice call signal, a video call signal or a text/ multimedia message.
  • the wireless internet module 113 indicates a module for a wireless internet connection and may be built in the mobile terminal 100 or provided separately therefrom.
  • as a wireless internet technology, wireless LAN (WLAN, Wi-Fi), wireless broadband (Wibro), world interoperability for microwave access (Wimax), and high speed downlink packet access (HSDPA) may be used.
  • the short range communication module 114 indicates a module for short range communication.
  • as the short range communication technology, Bluetooth, radio frequency identification (RFID), infrared data association (IrDA), ultra wideband (UWB), and ZigBee may be used.
  • the position information module 115 is a module for obtaining the position of a mobile terminal and includes a global position system (GPS) module as a typical example.
  • the AV input unit 120 is used for an audio signal or video signal input and may include a camera 121 and a microphone 122.
  • the camera 121 processes a picture frame of a still image or a video obtained by an image sensor in a video call mode or in an imaging mode.
  • the processed picture frame may be displayed on the display unit 151.
  • the picture frame processed by the camera 121 may be stored in the memory 160 or externally transmitted through the wireless communication unit 110.
  • the camera 121 may be arranged in plurality depending on the usage environment.
  • the microphone 122 receives an external sound signal in a call mode, a recording mode, or a voice recognition mode and changes the signal to electrical voice data.
  • the voice data may be converted and output into a format that may be transmitted to a mobile communication base station through the mobile communication module 112.
  • Various noise removing algorithms for removing noise generated in a process of receiving an external sound signal may be implemented in the microphone 122.
  • the user input unit 130 generates input data for the operation control of a user terminal.
  • the user input unit 130 may include a key pad, a dome switch, a (static pressure/ capacitive) touch pad, a jog wheel, and a jog switch.
  • the sensing unit 140 senses the current states of the mobile terminal 100 such as an open/close state of the mobile terminal 100, a position of the mobile terminal 100, absence/presence of a user contact, an orientation of the mobile terminal, and acceleration/deceleration of the mobile terminal and generates a sensing signal for controlling the operation of the mobile terminal 100.
  • when the mobile terminal 100 is of a slide phone type, it is possible to sense whether the slide phone is open or closed.
  • the sensing unit 140 may include a proximity sensor 141.
  • the output unit 150 is used for generating a visual, auditory or tactile output and may include the display unit 151, a sound output module 152, an alarm unit 153, and a haptic module 154.
  • the display unit 151 displays (outputs) information processed by the mobile terminal 100. For example, when the mobile terminal is in a call mode, the display unit 151 displays user interface (UI) or graphic user interface (GUI) related to a call. When the mobile terminal 100 is in a video call mode or in an imaging mode, the display unit 151 displays an imaged and/or received image, the UI, or the GUI.
  • the display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT LCD), an organic light-emitting diode (OLED), a flexible display, and a 3D display.
  • some of the displays may be configured as transparent or light-transmissive to enable viewing of the outside.
  • such displays may be referred to as transparent displays, which include a transparent OLED (TOLED) as a typical example.
  • the back structure of the display unit 151 may also be configured as light-transmissive. Due to such a structure, a user may see objects located on the back side of a terminal body through the region of the terminal body which the display unit 151 occupies.
  • the display unit 151 may exist in plurality.
  • a plurality of display units may be arranged on one surface of the mobile terminal 100 to be spaced apart from one another or integrally or may also be respectively arranged on different surfaces thereof.
  • when the display unit 151 and a sensor sensing a touch operation (hereinafter, referred to as a ‘touch sensor’) form a mutually layered structure (hereinafter, referred to as a ‘touch screen’),
  • the display unit 151 may also be used as an input device in addition to the output device.
  • the touch sensor may have forms such as a touch film, a touch sheet, and a touch pad.
  • the touch sensor may be configured to convert a change in pressure applied to a specific part of the display unit 151, or in capacitance generated at a specific part of the display unit 151, into an electrical input signal.
  • the touch sensor may be configured to be able to detect pressure when a touch is performed, in addition to a touched part and area.
  • a corresponding signal(s) is transmitted to a touch controller.
  • the touch controller processes the signal(s) and then transmits corresponding data to the control unit 180. Accordingly, the control unit 180 may be aware of which region of the display unit 151 is touched.
  • the proximity sensor 141 may be arranged in the internal region of the mobile terminal surrounded by the touch screen or near the touch screen.
  • the proximity sensor 141 indicates a sensor that detects the absence and presence of an object approaching or near a certain subject surface without mechanical contact by using the force of an electromagnetic field or infrared light.
  • the lifetime of the proximity sensor 141 is longer than that of a contact sensor, and the utilization of the proximity sensor 141 is also higher than that of the contact sensor.
  • Examples of the proximity sensor 141 include a transmissive photoelectric sensor, a direct reflective photoelectric sensor, a mirror reflective photoelectric sensor, a high-frequency oscillating proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor.
  • when the touch screen is of a capacitive type, it is configured to detect the proximity of a pointer by using a change in electric field due to the proximity of the pointer.
  • the touch screen may be classified as a proximity sensor.
  • a “proximity touch” is an action recognizing that the pointer approaches the touch screen and is located over the touch screen without a contact.
  • a “contact touch” is an action made when the pointer is in actual contact with the touch screen.
  • the position where the proximity touch is made with the pointer over the touch screen means the position where the pointer is perpendicular to the touch screen when the pointer makes the proximity touch.
  • the proximity sensor senses the proximity touch and proximity touch patterns (e.g., a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, and a proximity touch change state). Information corresponding to the sensed proximity touch operation and proximity touch patterns may be output through the touch screen.
  • the sound output module 152 may output audio data received from the wireless communication unit 110 in a call signal receiving mode, a call mode, a recording mode, a voice recognition mode, and a broadcasting receiving mode or stored in the memory 160.
  • the sound output module 152 also outputs a sound signal related to a function that is performed by the mobile terminal 100 (e.g., call signal receiving sound or message receiving sound).
  • the sound output module 152 may include a receiver, a speaker and a buzzer.
  • the alarm unit 153 outputs a signal for providing a notice of the event generation of the mobile terminal 100. Examples of an event generated by the mobile terminal include call signal reception, message reception, a key signal input and a touch input.
  • the alarm unit 153 may also output a signal for providing a notice of event generation by using other forms except for a video signal or audio signal, such as vibration.
  • the video signal or audio signal may also be output through the display unit 151 or the sound output module 152 and thus the components 151 and 152 may be classified as portions of the alarm unit 153.
  • the haptic module 154 generates various tactile effects that a user may feel.
  • a typical example of a tactile effect generated by the haptic module 154 is vibration. It is possible to control the intensity and patterns of the vibration generated by the haptic module 154. For example, different vibrations may be synthesized and output or may be sequentially output.
  • the haptic module 154 may generate various tactile effects such as effects due to a pin arrangement making a motion perpendicular to a contact skin surface, a jet force or suction force of air through a jet hole or a suction hole, an effect of grazing a skin surface, an electrode contact, a stimulus including an electrostatic force, and an effect due to realizing coldness and warmness by using a heat absorbing or emitting device.
  • the haptic module 154 may be implemented to be able to deliver a tactile effect through a direct contact and to enable a user to feel a tactile effect through the muscle sense of a finger or an arm. Depending on the configuration of the mobile terminal 100, the haptic module 154 may exist in plurality.
  • the memory 160 may store programs for the operation of the control unit 180 and temporarily store data (e.g., a phone book, a message, a still image, and a video) that is input and output.
  • the memory 160 may store data on sound and vibrations having various patterns that are output when there is a touch input on the touch screen.
  • the memory 160 may include as a storage medium, at least one of memories including flash memory type, hard disk type, micro type and card type (for example, secure digital (SD) card or extreme digital (XD) card), and other types of memories including a random access memory (RAM), a static RAM (SRAM), a read-only memory (ROM), a programmable ROM (PROM), an electrically erasable PROM (EEPROM), a magnetic RAM (MRAM), a magnetic disk and an optical disk type memory.
  • the mobile terminal 100 may also operate along with a web storage that performs a storage function of the memory 160 over the internet.
  • the interface unit 170 serves as a passage to all external devices connected to the mobile terminal 100.
  • the interface unit 170 receives data from external devices, receives power and transmits the data to each component of the mobile terminal 100, or transmits data from the mobile terminal 100 to the external devices.
  • the interface unit 170 may include a wired/ wireless headset port, an external charger port, a wired/ wireless data port, a memory card port, a port connecting a device that includes an identification module, an audio input and output (I/O) port, and an earphone port.
  • the identification module is a chip storing various pieces of information for identifying the authority of using the mobile terminal 100 and may include a user identity module (UIM), a subscriber identity module (SIM), and a universal subscriber identity module (USIM).
  • a device that includes the identification module (hereinafter, referred to as an “identification device”) may be manufactured in a smart card type. Thus, the identification device may be connected to the terminal 100 via a port.
  • when the mobile terminal 100 is connected to an external cradle, the interface unit may be a passage through which power from the cradle is supplied to the mobile terminal or through which various command signals input from the cradle by a user are transmitted to the mobile terminal.
  • the power or the various command signals input from the cradle may also operate as a signal for recognizing that the mobile terminal is correctly installed on the cradle.
  • the controller 180 typically controls the overall operations of the mobile terminal. For example, the controller 180 performs the control and processing associated with voice calls, data communications, and video calls.
  • the controller 180 may also include a multimedia module 181 for providing multimedia playback functions.
  • the multimedia module 181 may be implemented in the controller 180, or may be implemented separately from the controller 180.
  • the controller 180 may perform pattern recognition processing through which handwriting input or a drawing input performed on the touch screen may be recognized as a character and image.
  • the controller 180 may obtain an image of a subject based on an optical signal projected onto the image sensor 270 and detect a movement of the subject by using the obtained image.
  • the controller 180 may control the mobile terminal 100 so that the mobile terminal 100 performs its function according to the detected movement of the subject.
  • the operations of the controller 180 associated therewith are described below in detail.
  • the power supply unit 190 receives internal power or external power under the control of the controller 180 and provides power required for operating each of components.
  • Various embodiments described herein may be implemented in a recording medium that may be read with a computer or a similar device by using software, hardware or a combination thereof.
  • the embodiments described herein may be implemented by using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and other electrical units for performing functions.
  • the embodiments may be implemented by the controller 180.
  • embodiments associated with procedures or functions may be implemented along with a separate software module that performs at least one function or operation.
  • Software codes may be implemented by software applications that are written in proper programming languages. The software codes may be stored in the memory 160 and may be executed by the controller 180.
  • Fig. 2 is a block diagram of an additional component of a mobile terminal according to an embodiment.
  • the mobile terminal 100 further includes a gesture recognition module 200.
  • when the mobile terminal 100 further includes the gesture recognition module 200, the mobile terminal 100 may not include the proximity sensor described in Fig. 1.
  • the gesture recognition module 200 may be arranged on the upper part of the touch screen of the mobile terminal 100 and may be arranged adjacent to a front camera.
  • the gesture recognition module 200 may obtain an image of a subject and recognize a movement of the subject through the obtained image of the subject.
  • the subject may be a human body and in particular, a hand.
  • the gesture recognition module 200 may be a module used for recognizing a hand movement (gesture).
  • the controller 180 of the mobile terminal 100 may recognize a gesture through the gesture recognition module 200 and may control the mobile terminal 100 so that a function corresponding to the identified gesture is performed.
  • the gesture recognition module 200 may include an infrared radiating unit 210, a lens 230, an optical filter 250, and an image sensor 270.
  • the infrared radiating unit 210 may externally radiate infrared light.
  • the infrared radiating unit 210 may include infrared light-emitting diode (LED) lighting.
  • the lens 230 may collect light incoming from the outside.
  • the optical filter 250 may filter only the infrared light of the incoming lights.
  • the image sensor 270 may use the filtered infrared light to obtain an image of a subject.
  • the image sensor 270 may include a low-resolution complementary metal-oxide-semiconductor (CMOS) image sensor.
  • each component of the gesture recognition module 200 is described below in detail.
  • Fig. 3 is a flow chart of a method of operating a mobile terminal according to an embodiment.
  • referring to Fig. 3, the method of operating the mobile terminal according to an embodiment is described together with Figs. 1 and 2.
  • the infrared radiating unit 210 of the gesture recognition module 200 installed in the mobile terminal 100 externally radiates infrared light in step S101. That is, the infrared radiating unit 210 of the mobile terminal 100 may radiate infrared light belonging to an infrared wavelength band in order to obtain an image of a subject.
  • the infrared radiating unit 210 may include infrared LED lighting.
  • the lens 230 of the gesture recognition module 200 focuses light incoming from the outside in step S103.
  • the lights incoming from the outside may include infrared lights reflected from a subject and lights belonging to other wavelength bands excluding the infrared wavelength band.
  • the lens 230 may use infrared lights reflected from the subject to focus the incoming lights to obtain an image of the subject.
  • the subject may be a human body and in particular, a hand that is a portion of the human body.
  • the optical filter 250 of the gesture recognition module 200 filters only infrared lights of the incoming lights in step S105.
  • the optical filter 250 may pass only infrared lights of the incoming lights belonging to an infrared wavelength band and block lights belonging to other wavelength bands excluding the infrared wavelength band.
  • the optical filter 250 may include a band pass filter that may pass only lights belonging to the infrared wavelength band.
  • the image sensor 270 of the gesture recognition module 200 uses the filtered infrared lights to obtain an image of a subject in step S107.
  • the image sensor 270 may include a low-resolution CMOS image sensor in order to accurately obtain the shape of the subject.
  • the CMOS image sensor is an image sensor that uses a switching technique, in which a plurality of transistors corresponds respectively to a plurality of pixels and the plurality of transistors is used to sequentially detect outputs in the x-axis and y-axis directions.
  • Each of the plurality of pixels of the CMOS image sensor may include one photodiode and four MOS transistors.
  • the CMOS image sensor may put the filtered infrared lights into the pixels, convert them into electrical energy, i.e., digital signals, and obtain an image of a subject.
  • the controller 180 of the mobile terminal 100 checks whether the obtained image of the subject is an initial gesture, in step S109.
  • the initial gesture may be a gesture that should be initially recognized to activate the gesture recognition module 200. That is, the initial gesture may be a gesture that should be initially recognized so that the gesture recognition module 200 recognizes a user gesture to operate the mobile terminal 100.
  • the initial gesture is used for preventing a gesture not intended by a user from being recognized and triggering an operation of the mobile terminal 100.
  • the initial gesture may be set by a user of the mobile terminal 100 or may be set by a manufacturer when manufacturing the mobile terminal 100.
  • the initial gesture may correspond to various hand movements. Related descriptions are provided with reference to Fig. 4.
  • Fig. 4 is an example of an initial gesture according to an embodiment.
  • the initial gesture may correspond to any one of a hand movement 301 made when a user opens his/her hand, a hand movement 302 made when a user stretches and gathers only the thumbs and index fingers of both hands, a hand movement 303 made when a user stretches only his or her index finger, and a hand movement 304 made when a user stretches only his or her index finger and middle finger.
  • the initial gesture need not be limited thereto, and may also include any hand movement, such as a hand movement made when a user stretches only his/her thumb.
  • the gesture recognition module 200 of the mobile terminal 100 may recognize any one of gestures shown in Fig. 4 as the initial gesture and then recognize the following gestures of a user.
  • if the obtained image of the subject is identified as not being the initial gesture, the process returns to step S101.
  • the controller 180 of the mobile terminal 100 increases the frame rate of the image sensor 270 in order to recognize gestures following the initial gesture in step S111.
  • the controller 180 of the mobile terminal 100 may adjust the frame rate of the image sensor 270 in order to recognize the initial gesture and gestures following the initial gesture in step S111.
  • the controller 180 may maintain the frame rate of the image sensor 270 at a low speed in order to recognize the initial gesture, and after the initial gesture is recognized, it may maintain the frame rate of the image sensor 270 at a high speed in order to recognize the following gestures. Since the gestures following the initial gesture may include a gesture movement, the controller 180 may adjust the frame rate of the image sensor 270 in order to detect a gesture movement more accurately.
  • the controller 180 may maintain the frame rate of the image sensor 270 at 6 fps in order to recognize the initial gesture, and after the initial gesture is recognized, it may maintain the frame rate of the image sensor 270 at 400 fps to 5000 fps. Accordingly, by adjusting the frame rate depending on whether the initial gesture is recognized, unlike a typical gesture recognition module that always maintains the frame rate at a high speed, it is possible to prevent an unnecessary waste of memory (an illustrative sketch of this adaptive pipeline appears after this list).
  • the controller 180 of the mobile terminal 100 increases the frame rate, then detects a variation in an infrared signal projected into the image sensor 270 in step S113, and recognizes a gesture based on the detected variation in the infrared signal in step S115.
  • the controller 180 may increase the frame rate of the image sensor in order to recognize user gestures following the initial gesture and then detect a variation in x-axis and a variation in y-axis of an infrared signal projected into the image sensor 270.
  • the controller 180 may also detect a variation in the z-axis of an infrared signal projected into the image sensor 270 and recognize a gesture changing in the z-axis direction.
  • Fig. 5 is an example of detecting an optical signal and obtaining a gesture image according to an embodiment.
  • Fig. 6 is a diagram for explaining processes of detecting variations in optical signals on the x-axis and y-axis and recognizing a gesture movement according to an embodiment.
  • Fig. 7 is a diagram for explaining processes of detecting a variation in an optical signal on the z-axis and recognizing a gesture movement according to an embodiment.
  • Fig. 5 is a diagram for explaining a method of detecting a variation in optical signal and recognizing a gesture when a CMOS image sensor is used as the image sensor 270.
  • an image obtaining unit 271 of the image sensor 270 may use an infrared signal on the x-axis and an infrared signal on the y-axis to obtain a 2D gesture image A.
  • the image obtaining unit 271 may include a plurality of pixels.
  • the gesture image A projected onto the plurality of pixels is not an actual image of a user’s hand movement but is assumed here to represent a user’s hand movement.
  • the image sensor 270 may combine an x-axis infrared signal of the x-axis direction with a y-axis infrared signal of the y-axis direction to obtain the 2D gesture image A.
  • the x-axis infrared signal may move in the left or right direction and the image obtaining unit 271 of the image sensor 270 may obtain a gesture image that is formed by reflecting the infrared signal moving in the x-axis direction.
  • the controller 180 may detect a variation in the x-axis infrared signal moving in the x-axis direction and recognize based on the detected variation in the x-axis infrared signal that a gesture has moved in the x-axis direction.
  • for example, when a gesture moves to the right, the x-axis infrared signal X1 may also move to the right side and thus the gesture image A obtained from the image sensor 270 may also move in the x-axis direction.
  • the controller 180 may sense a movement variation of the x-axis infrared light and thus detect a movement of the gesture (an illustrative centroid-tracking sketch appears after this list).
  • although Fig. 6 describes only a movement of a gesture in the x-axis direction as an example, there is no need to be limited thereto, and the controller 180 may also detect a movement of the gesture in the y-axis direction and simultaneous movements in the x-axis and y-axis directions.
  • Fig. 7 is a diagram for explaining a process of recognizing a movement of a gesture moving in the z-axis direction according to an embodiment.
  • the controller 180 may detect based on the brightness of a gesture image whether the distance between the gesture recognition module 200 and a user becomes longer or shorter. For example, when it is assumed that the brightness of the gesture image C is lower than that of the gesture image B, it may mean that the distance between a gesture corresponding to the gesture image C and the gesture recognition module 200 is shorter than that between a gesture corresponding to the gesture image B and the gesture recognition module 200.
  • an image obtained by the image obtaining unit 271 of the image sensor 270 may be clearer as the distance between the gesture recognition module 200 and a user gesture is shorter. That is, the darker the color of the obtained image is, the shorter the distance between the gesture recognition module 200 and a user gesture may be.
  • the controller 180 may use the absolute magnitude of an infrared signal projected into the image sensor 270 to obtain a gesture image and may detect the distance between the gesture recognition module 200 and a gesture based on the brightness of the obtained gesture image. The controller 180 may determine based on the detected distance whether a gesture has moved in the z-axis direction, and may control the operation of the mobile terminal 100 according to a determination result (an illustrative brightness-based sketch appears after this list).
  • the controller 180 of the mobile terminal 100 controls the mobile terminal 100 so that an operation corresponding to a movement of the recognized gesture is performed, in step S117.
  • a plurality of gesture movements may correspond respectively to a plurality of operations of the mobile terminal 100 (an illustrative dispatch-table sketch appears after this list).
  • for example, the mobile terminal 100 may change the screen of the mobile terminal 100 to another screen in response to a gesture movement in the x-axis direction.
  • FIGS. 8 to 13 are diagrams for explaining the operation of a mobile terminal corresponding to a gesture movement according to an embodiment.
  • referring to Figs. 8 and 9, if a user gesture moves in the x-axis direction as shown in Fig. 8, the gesture recognition module 200 recognizes a movement of a gesture moving in the x-axis direction.
  • the mobile terminal 100 may change a user screen to another screen as shown in Fig. 9, in response to the gesture movement of the x-axis direction recognized through the gesture recognition module 200.
  • the screen change of the mobile terminal 100 corresponding to the gesture movement of the x-axis direction is merely an example.
  • Figs. 10 and 11 are described.
  • as shown in Fig. 10, if a user gesture moves in the x-axis and y-axis directions, the gesture recognition module 200 recognizes a movement of a gesture moving in the x-axis and y-axis directions.
  • the mobile terminal 100 may change a user screen to a setting screen for setting the functions of the mobile terminal 100 as shown in Fig. 11, in response to the gesture movement of the x-axis and y-axis directions recognized through the gesture recognition module 200.
  • Figs. 12 and 13 are described.
  • as shown in Fig. 12, if a user gesture moves in the z-axis direction, the gesture recognition module 200 recognizes a movement of a gesture moving in the z-axis direction.
  • the mobile terminal 100 may highlight an icon K of a specific application displayed on the screen of the mobile terminal 100 to indicate that the corresponding icon K is selected, as shown in Fig. 13.
  • the mobile terminal 100 may use an inexpensive gesture recognition module 200 to effectively recognize a user gesture.
  • the mobile terminal 100 may also easily detect a gesture movement in the y-axis and z-axis directions in addition to the x-axis direction and perform a corresponding operation.
  • Embodiments may also be applied to a stationary terminal such as a TV set in addition to a mobile terminal.
  • the above-described method may also be embodied as processor readable codes on a program-recorded medium.
  • examples of the processor readable medium include a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and a carrier wave (such as data transmission through the Internet).
  • the above-described mobile terminal is not limited to the configuration and method of the above-described embodiments, and some or all of the embodiments may also be selectively combined so that various variations may be made.
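
The bullets above describe the full flow of Fig. 3 (steps S101 to S117): infrared light is radiated, filtered frames are captured at a low frame rate until the initial gesture is recognized, the frame rate is then raised to track the following gestures, and each detected movement triggers a terminal function. The sketch below is a minimal illustration of that flow only; the sensor, recognizer, and dispatch objects are hypothetical stand-ins, since the disclosure describes hardware rather than a software API.

```python
# Minimal sketch of the Fig. 3 flow; all names are illustrative stand-ins.

IDLE_FPS = 6        # low rate while waiting for the initial gesture (S109)
TRACKING_FPS = 400  # raised rate once the initial gesture is seen (S111)

def run_gesture_pipeline(ir_led, sensor, recognizer, dispatch):
    ir_led.on()                                   # S101: radiate infrared light
    sensor.set_frame_rate(IDLE_FPS)
    while True:
        frame = sensor.capture()                  # S103-S107: focus, filter, image
        if recognizer.is_initial_gesture(frame):  # S109: gate on the initial gesture
            sensor.set_frame_rate(TRACKING_FPS)   # S111: raise the frame rate
            track_following_gestures(sensor, recognizer, dispatch)
            sensor.set_frame_rate(IDLE_FPS)       # gesture ended; save memory again

def track_following_gestures(sensor, recognizer, dispatch):
    prev = sensor.capture()
    while True:
        cur = sensor.capture()
        movement = recognizer.detect_movement(prev, cur)  # S113-S115
        if movement is None:                      # no further gesture detected
            return
        dispatch(movement)                        # S117: perform the mapped function
        prev = cur
```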
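
One plausible way to implement the x-axis and y-axis detection explained with Fig. 6 is to track the centroid of the bright (infrared-reflecting) pixels between consecutive frames. The disclosure does not prescribe an algorithm, so the threshold values and direction labels below are assumptions.

```python
import numpy as np

def centroid(frame: np.ndarray, threshold: int = 128):
    """Centroid (x, y) of pixels brighter than the threshold, or None."""
    ys, xs = np.nonzero(frame > threshold)
    if xs.size == 0:
        return None
    return xs.mean(), ys.mean()

def xy_movement(prev: np.ndarray, cur: np.ndarray, min_shift: float = 3.0):
    """Classify an in-plane gesture movement from the centroid shift."""
    p, c = centroid(prev), centroid(cur)
    if p is None or c is None:
        return None
    dx, dy = c[0] - p[0], c[1] - p[1]
    if max(abs(dx), abs(dy)) < min_shift:  # shift too small to count
        return "still"
    if abs(dx) >= abs(dy):                 # report the dominant axis
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"      # image rows grow downward
```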
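
For the z-axis, the bullets above state that a darker gesture image indicates a shorter distance between the gesture and the gesture recognition module 200. The sketch below follows that stated convention; the sensitivity threshold is an assumption.

```python
import numpy as np

def z_movement(prev: np.ndarray, cur: np.ndarray, min_change: float = 10.0):
    """Infer z-axis motion from the change in mean image brightness.

    Follows the document's convention that a darker image means a shorter
    distance, so a drop in brightness is read as the gesture approaching.
    """
    diff = float(cur.mean()) - float(prev.mean())
    if abs(diff) < min_change:  # change too small to count as z motion
        return "still"
    return "approach" if diff < 0 else "recede"
```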
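
Finally, step S117 maps each recognized movement to a terminal operation. The pairings below follow the examples of Figs. 8 to 13 (an x-axis movement changes the screen, a combined x- and y-axis movement opens the setting screen, and a z-axis movement selects an icon); the function bodies are placeholders, not disclosed behavior.

```python
def change_screen():
    print("changing to another home screen")   # Figs. 8 and 9

def open_settings():
    print("opening the setting screen")        # Figs. 10 and 11

def select_icon():
    print("highlighting the selected icon K")  # Figs. 12 and 13

ACTIONS = {"x": change_screen, "xy": open_settings, "z": select_icon}

def dispatch(movement: str) -> None:
    """Run the operation bound to a recognized movement, if any."""
    ACTIONS.get(movement, lambda: None)()      # unmapped movements are ignored

dispatch("x")  # prints: changing to another home screen
```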

Abstract

A method of operating a mobile terminal including a gesture recognition module is provided. The method includes externally radiating an infrared light; recognizing a user gesture based on a reflected infrared light; extracting an infrared signal variation and detecting a movement of the gesture based on the infrared signal variation; and performing a function of the mobile terminal corresponding to the detected movement of the gesture.

Description

MOBILE TERMINAL AND OPERATING METHOD THEREOF
The present disclosure relates to a mobile terminal and an operating method thereof.
Terminals may be classified into mobile/ portable terminals and stationary terminals depending on the mobility. The mobile terminals may be classified into handheld terminals and vehicle mount terminals depending on direct portability by a user.
As the functions of the terminals are diversified, the terminals are being implemented as multimedia players having multiple functions including taking a picture, recording a video, playing music or video files, gaming, and receiving broadcasts.
In order to support and increase the functions of the terminals, improving the structural parts and/or software parts of the terminals may be considered.
In particular, recent mobile terminals are employing gesture recognition modules that recognize user gestures and perform corresponding operations.
However, since gesture recognition modules installed in typical mobile terminals may sense only a gesture movement in the horizontal (x-axis) direction, there is a limitation in controlling various operations of the mobile terminals.
Moreover, there are limitations in that typical gesture recognition modules using infrared signals reflected from human bodies may detect user gestures only when users make large movements, and they may detect a user's hand movements only when there is a sufficient distance between the hand and the rest of the body, since the reflected signal is affected by body parts other than the hand.
Embodiments provide a mobile terminal that recognizes gesture movements in the x-axis, y-axis, and z-axis directions and enables various operations, and an operating method thereof.
In one embodiment, a method of operating a mobile terminal including a gesture recognition module includes externally radiating an infrared light and recognizing a user gesture based on an infrared light reflected by the user gesture; extracting an infrared signal variation and detecting a movement of the gesture; and performing a function of the mobile terminal corresponding to the detected movement of the gesture.
In another embodiment, a mobile terminal includes a gesture recognition module externally radiating an infrared light, the gesture recognition module recognizing a user gesture based on an infrared light reflected by the user gesture; and a control unit extracting an infrared signal variation and detecting a movement of the gesture, the control unit performing a function of the mobile terminal corresponding to the detected movement of the gesture.
The mobile terminal according to various embodiments may use an inexpensive gesture recognition module and effectively recognize a user gesture.
Moreover, the mobile terminal according to various embodiments may also easily detect a gesture movement in the y-axis and z-axis directions in addition to the x-axis direction and perform a corresponding operation.
Fig. 1 is a block diagram of a mobile terminal according to an embodiment.
Fig. 2 is a block diagram of an additional component of a mobile terminal according to an embodiment.
Fig. 3 is a flow chart of a method of operating a mobile terminal according to an embodiment.
Fig. 4 is an example of an initial gesture according to an embodiment.
Fig. 5 is an example of detecting an optical signal and obtaining a gesture image according to an embodiment.
Fig. 6 is a diagram for explaining processes of detecting variations in optical signals on the x-axis and y-axis and recognizing a gesture movement according to an embodiment.
Fig. 7 is a diagram for explaining processes of detecting a variation in an optical signal on the z-axis and recognizing a gesture movement according to an embodiment.
FIGS. 8 to 13 are diagrams for explaining the operation of a mobile terminal corresponding to a gesture movement according to an embodiment.
Some embodiments are described below in more detail with reference to the accompanying drawings. In the following description, the suffixes “module” and “unit” for components are given or interchanged only for ease of description, and they do not have distinct meanings or functions by themselves.
Mobile terminals described in the present disclosure may include cellular phones, smart phones, laptop computers, digital broadcasting terminals, personal digital assistants (PDAs), portable multimedia players (PMPs), and navigation devices. However, a person skilled in the art will readily appreciate that configurations according to embodiments described in the present disclosure may also be applied to stationary terminals such as digital TVs and desktop computers, except when the configurations may be applied only to mobile terminals. A structure of a mobile terminal according to an embodiment is described below with reference to Fig. 1.
Fig. 1 is a block diagram of a mobile terminal according to an embodiment.
A mobile terminal 100 may include a wireless communication unit 110, an audio/video (AV) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a control unit 180, and a power supply unit 190. Since the components shown in Fig. 1 are not essential, a mobile terminal having more or fewer components may also be implemented.
In the following, the components above are described one by one.
The wireless communication unit 110 may include one or more modules that enable wireless communication between the mobile terminal 100 and a wireless communication system or between the mobile terminal 100 and a network in which the mobile terminal 100 is located. For example, the wireless communication unit 110 may include a broadcasting receiving module 111, a mobile communication module 112, a wireless internet module 113, a short range communication module 114, and a position information module 115.
The broadcasting receiving module 111 receives a broadcasting signal and/or broadcasting related information from an external broadcasting management server through a broadcasting channel.
The broadcasting channel may include a satellite channel or a terrestrial channel. The broadcasting management server may mean a server that generates and transmits a broadcasting signal and/or broadcasting related information, or a server that receives a pre-generated broadcasting signal and/or broadcasting related information and transmits them to a terminal. The broadcasting signal may include a broadcasting signal formed by combining a TV broadcasting signal with a radio broadcasting signal, in addition to a TV broadcasting signal, a radio broadcasting signal, and a data broadcasting signal.
The broadcasting related information may mean information on a broadcasting channel, a broadcasting program or a broadcasting service provider. The broadcasting related information may also be provided through a mobile communication network. In this case, the information may be received by the mobile communication module 112.
The broadcasting related information may exist in various formats. For example, it may exist in the format of an electronic program guide (EPG) of digital multimedia broadcasting (DMB) or an electronic service guide (ESG) of digital video broadcast-handheld (DVB-H).
The broadcasting receiving module 111 may receive a digital broadcasting signal by using a digital broadcasting system such as a digital multimedia broadcasting-terrestrial (DMB-T) format, a digital multimedia broadcasting-satellite (DMB-S) format, a media forward link only (MediaFLO) format, a digital video broadcast-handheld (DVB-H) format, or an integrated services digital broadcast-terrestrial (ISDB-T) format. The broadcasting receiving module 111 may also be configured to be suitable for other broadcasting systems in addition to the above-described digital broadcasting system.
A broadcasting signal and/or broadcasting related information received through the broadcasting receiving module 111 may be stored in the memory 160.
The mobile communication module 112 transmits and receives a wireless signal to and from at least one of a base station, an external terminal and a server on a mobile communication network. The wireless signal may include various types of data depending on transmitting and receiving a voice call signal, a video call signal or a text/ multimedia message.
The wireless internet module 113 indicates a module for a wireless internet connection and may be built in the mobile terminal 100 or provided separately therefrom. As a wireless internet technology, wireless LAN (WLAN, Wi-Fi), wireless broadband (Wibro), world interoperability for microwave access (Wimax), and high speed downlink packet access (HSDPA) may be used.
The short range communication module 114 indicates a module for short range communication. As the short range communication technology, Bluetooth, radio frequency identification (RFID), infrared data association (IrDA), ultra wideband (UWB), and ZigBee may be used.
The position information module 115 is a module for obtaining the position of a mobile terminal and includes a global position system (GPS) module as a typical example.
Referring to Fig. 1, the AV input unit 120 is used for an audio signal or video signal input and may include a camera 121 and a microphone 122. The camera 121 processes a picture frame of a still image or a video obtained by an image sensor in a video call mode or in an imaging mode. The processed picture frame may be displayed on the display unit 151.
The picture frame processed by the camera 121 may be stored in the memory 160 or externally transmitted through the wireless communication unit 110. The camera 121 may be arranged in plurality depending on the usage environment.
The microphone 122 receives an external sound signal in a call mode, a recording mode, or a voice recognition mode and changes the signal to electrical voice data. In the call mode, the voice data may be converted and output into a format that may be transmitted to a mobile communication base station through the mobile communication module 112. Various noise removing algorithms for removing noise generated in a process of receiving an external sound signal may be implemented in the microphone 122.
The user input unit 130 generates input data for the operation control of a user terminal. The user input unit 130 may include a key pad, a dome switch, a (static pressure/ capacitive) touch pad, a jog wheel, and a jog switch.
The sensing unit 140 senses the current states of the mobile terminal 100 such as an open/close state of the mobile terminal 100, a position of the mobile terminal 100, absence/presence of a user contact, an orientation of the mobile terminal, and acceleration/deceleration of the mobile terminal and generates a sensing signal for controlling the operation of the mobile terminal 100. For example, when the mobile terminal 100 is of a slide phone type, it is possible to sense whether the slide phone is open or closed. Moreover, it is also possible to sense whether power is supplied by the power supply unit 190 or whether the interface unit 170 is coupled to an external device. The sensing unit 140 may include a proximity sensor 141.
The output unit 150 is used for generating a visual, auditory or tactile output and may include the display unit 151, a sound output module 152, an alarm unit 153, and a haptic module 154.
The display unit 151 displays (outputs) information processed by the mobile terminal 100. For example, when the mobile terminal is in a call mode, the display unit 151 displays user interface (UI) or graphic user interface (GUI) related to a call. When the mobile terminal 100 is in a video call mode or in an imaging mode, the display unit 151 displays an imaged and/or received image, the UI, or the GUI.
The display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT LCD), an organic light-emitting diode (OLED), a flexible display, and a 3D display.
Some of the displays may be configured as transparent or light-transmissive to enable viewing of the outside. Such displays may be referred to as transparent displays, which include a transparent OLED (TOLED) as a typical example. The back structure of the display unit 151 may also be configured as light-transmissive. Due to such a structure, a user may see objects located on the back side of a terminal body through the region of the terminal body which the display unit 151 occupies.
Depending on the implementation of the mobile terminal 100, the display unit 151 may exist in plurality. For example, a plurality of display units may be arranged on one surface of the mobile terminal 100 to be spaced apart from one another or integrally or may also be respectively arranged on different surfaces thereof.
When the display unit 151 and a sensor sensing a touch operation (hereinafter, referred to as a ‘touch sensor’) form a mutually layered structure (hereinafter, referred to as a ‘touch screen’), the display unit 151 may also be used as an input device in addition to an output device. The touch sensor may have forms such as a touch film, a touch sheet, and a touch pad.
The touch sensor may be configured to convert a change in pressure applied to a specific part of the display unit 151, or a change in capacitance generated at a specific part of the display unit 151, into an electrical input signal. The touch sensor may also be configured to detect the pressure of a touch in addition to the touched position and area.
When there is a touch input to the touch sensor, a corresponding signal(s) is transmitted to a touch controller. The touch controller processes the signal(s) and then transmits corresponding data to the control unit 180. Accordingly, the control unit 180 can determine which region of the display unit 151 has been touched.
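As a rough illustration of this data flow (not part of the patent), the sketch below models a touch controller that reduces a raw capacitance grid to a single touch event and forwards it to the control unit; the names TouchEvent and TouchController and the 0.5 detection threshold are hypothetical.

```python
# Hypothetical sketch of the touch-sensor-to-controller data flow.
from dataclasses import dataclass

@dataclass
class TouchEvent:
    x: int           # touched column on the sensor grid
    y: int           # touched row on the sensor grid
    pressure: float  # signal strength, if the sensor reports pressure

class TouchController:
    def __init__(self, on_touch):
        self.on_touch = on_touch  # callback into the control unit 180

    def process_raw_signal(self, grid):
        # Treat the cell with the largest capacitance change as the touch point.
        x, y, value = max(
            ((x, y, v) for y, row in enumerate(grid) for x, v in enumerate(row)),
            key=lambda cell: cell[2],
        )
        if value > 0.5:  # assumed detection threshold
            self.on_touch(TouchEvent(x, y, pressure=value))
```

For example, TouchController(print).process_raw_signal([[0.0, 0.9], [0.1, 0.2]]) would report a touch at position (1, 0).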
Referring to Fig. 1, the proximity sensor 141 may be arranged in the internal region of the mobile terminal surrounded by the touch screen or near the touch screen. The proximity sensor 141 is a sensor that detects, without mechanical contact, the presence or absence of an object approaching or located near a certain detection surface by using the force of an electromagnetic field or infrared light. The proximity sensor 141 has a longer lifetime and higher utility than a contact sensor.
Examples of the proximity sensor 141 include a transmissive photoelectric sensor, a direct reflective photoelectric sensor, a mirror reflective photoelectric sensor, a high-frequency oscillating proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. When the touch screen is of a capacitive type, it is configured to detect the proximity of a pointer by using a change in electric field due to the proximity of the pointer. In this case, the touch screen (touch sensor) may be classified as a proximity sensor.
For the convenience of description, an action of recognizing that the pointer approaches the touch screen and is located over the touch screen without contact is referred to as a “proximity touch”, and an action in which the pointer actually contacts the touch screen is referred to as a “contact touch”. The position of a proximity touch over the touch screen refers to the position at which the pointer is perpendicular to the touch screen when the pointer makes the proximity touch.
The proximity sensor senses the proximity touch and proximity touch patterns (e.g., a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, and a proximity touch change state). Information corresponding to the sensed proximity touch operation and proximity touch patterns may be output through the touch screen.
The sound output module 152 may output audio data received from the wireless communication unit 110 or stored in the memory 160 in a call signal receiving mode, a call mode, a recording mode, a voice recognition mode, or a broadcast receiving mode. The sound output module 152 also outputs a sound signal related to a function performed by the mobile terminal 100 (e.g., a call signal receiving sound or a message receiving sound). The sound output module 152 may include a receiver, a speaker, and a buzzer.
The alarm unit 153 outputs a signal for providing notice of event generation in the mobile terminal 100. Examples of an event generated by the mobile terminal include call signal reception, message reception, a key signal input, and a touch input. The alarm unit 153 may also output a signal for providing notice of event generation in forms other than a video or audio signal, for example, vibration. Since the video or audio signal may also be output through the display unit 151 or the sound output module 152, the components 151 and 152 may be classified as portions of the alarm unit 153.
The haptic module 154 generates various tactile effects that a user may feel. A typical example of a tactile effect generated by the haptic module 154 is vibration. It is possible to control the intensity and patterns of the vibration generated by the haptic module 154. For example, different vibrations may be synthesized and output or may be sequentially output.
In addition to the vibration, the haptic module 154 may generate various tactile effects, such as an effect due to a pin arrangement moving perpendicular to a contacted skin surface, a jet or suction force of air through a jet hole or a suction hole, an effect of grazing a skin surface, an electrode contact, a stimulus from an electrostatic force, and an effect of reproducing coldness or warmth by using a heat-absorbing or heat-emitting device.
The haptic module 154 may be implemented to deliver a tactile effect through direct contact and also to enable a user to feel a tactile effect through the muscle sense of a finger or an arm. Depending on the configuration of the mobile terminal 100, two or more haptic modules 154 may be provided.
The memory 160 may store programs for the operation of the control unit 180 and temporarily store data (e.g., a phone book, a message, a still image, and a video) that is input and output. The memory 160 may store data on sound and vibrations having various patterns that are output when there is a touch input on the touch screen.
The memory 160 may include, as a storage medium, at least one of a flash memory type, hard disk type, micro type, and card type memory (for example, a secure digital (SD) card or extreme digital (XD) card), and other types of memories including a random access memory (RAM), a static RAM (SRAM), a read-only memory (ROM), a programmable ROM (PROM), an electrically erasable PROM (EEPROM), a magnetic RAM (MRAM), a magnetic disk, and an optical disk type memory. The mobile terminal 100 may also operate along with a web storage that performs the storage function of the memory 160 over the Internet.
The interface unit 170 serves as a passage to all external devices connected to the mobile terminal 100. The interface unit 170 receives data or power from external devices and delivers it to each component of the mobile terminal 100, or transmits data from the mobile terminal 100 to the external devices. For example, the interface unit 170 may include a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device that includes an identification module, an audio input and output (I/O) port, and an earphone port.
The identification module is a chip storing various pieces of information for authenticating the authority to use the mobile terminal 100 and may include a user identity module (UIM), a subscriber identity module (SIM), and a universal subscriber identity module (USIM). A device that includes the identification module (hereinafter, referred to as an “identification device”) may be manufactured in the form of a smart card. Thus, the identification device may be connected to the terminal 100 via a port.
When the mobile terminal 100 is connected to an external cradle, the interface unit may serve as a passage through which power from the cradle is supplied to the mobile terminal or through which various command signals input from the cradle by a user are transmitted to the mobile terminal. The power or the various command signals input from the cradle may also operate as a signal for recognizing that the mobile terminal is correctly mounted on the cradle.
The controller 180 typically controls the overall operations of the mobile terminal. For example, the controller 180 performs the control and processing associated with voice calls, data communications, and video calls. The controller 180 may also include a multimedia module 181 for providing multimedia playback functions. The multimedia module 181 may be implemented in the controller 180 or may be implemented separately from the controller 180.
The controller 180 may perform pattern recognition processing through which a handwriting input or a drawing input performed on the touch screen may be recognized as characters or images.
In particular, the controller 180 may obtain an image of a subject based on an optical signal projected onto the image sensor 270 and detect a movement of the subject by using the obtained image. The controller 180 may control the mobile terminal 100 so that the mobile terminal 100 performs its function according to the detected movement of the subject. The operations of the controller 180 associated therewith are described below in detail.
The power supply unit 190 receives internal power or external power under the control of the controller 180 and provides the power required for operating each of the components. Various embodiments described herein may be implemented in a recording medium that may be read by a computer or a similar device by using software, hardware, or a combination thereof.
According to a hardware implementation, the embodiments described herein may be implemented by using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and other electrical units for performing functions. In some cases, the embodiments may be implemented by the controller 180.
According to a software implementation, embodiments associated with procedures or functions may be implemented along with a separate software module that performs at least one function or operation. Software codes may be implemented by software applications written in suitable programming languages. The software codes may be stored in the memory 160 and executed by the controller 180.
An additional configuration of a mobile terminal according to an embodiment is described below with reference to Fig. 2.
Fig. 2 is a block diagram of an additional component of a mobile terminal according to an embodiment.
As shown in Fig. 2, the mobile terminal 100 further includes a gesture recognition module 200.
In an embodiment, when the mobile terminal 100 further includes the gesture recognition module 200, the mobile terminal 100 may not include the proximity sensor 141 described with reference to Fig. 1.
The gesture recognition module 200 may be arranged on the upper part of the touch screen of the mobile terminal 100 and may be arranged adjacent to a front camera.
The gesture recognition module 200 may obtain an image of a subject and recognize a movement of the subject through the obtained image of the subject. For example, the subject may be a human body and in particular, a hand.
The gesture recognition module 200 may be a module used for recognizing a hand movement (gesture). The controller 180 of the mobile terminal 100 may recognize a gesture through the gesture recognition module 200 and may control the mobile terminal 100 so that a function corresponding to the recognized gesture is performed.
The gesture recognition module 200 may include an infrared radiating unit 210, a lens 230, an optical filter 250, and an image sensor 270.
The infrared radiating unit 210 may externally radiate infrared light. In an embodiment, the infrared radiating unit 210 may include infrared light-emitting diode (LED) lighting.
The lens 230 may collect light incoming from the outside.
The optical filter 250 may filter only the infrared light of the incoming lights.
The image sensor 270 may use the filtered infrared light to obtain an image of a subject. The image sensor 270 may include a low-resolution complementary metal-oxide-semiconductor (CMOS) image sensor.
Each component of the gesture recognition module 200 is described below in detail.
Fig. 3 is a flow chart of a method of operating a mobile terminal according to an embodiment.
In Fig. 3, the method of operating the mobile terminal according to an embodiment is described with reference to Figs. 1 and 2.
Referring to Fig. 3, the infrared radiating unit 210 of the gesture recognition module 200 installed in the mobile terminal 100 externally radiates infrared light in step S101. That is, the infrared radiating unit 210 of the mobile terminal 100 may radiate infrared light belonging to an infrared wavelength band in order to obtain an image of a subject.
In an embodiment, the infrared radiating unit 210 may include infrared LED lighting.
The lens 230 of the gesture recognition module 200 focuses light incoming from the outside in step S103. In an embodiment, the light incoming from the outside may include infrared light reflected from a subject and light belonging to wavelength bands other than the infrared wavelength band. The lens 230 focuses the incoming light so that the infrared light reflected from the subject may be used to obtain an image of the subject.
In an embodiment, the subject may be a human body and, in particular, a hand.
The optical filter 250 of the gesture recognition module 200 filters only the infrared light out of the incoming light in step S105. In an embodiment, the optical filter 250 may pass only light belonging to the infrared wavelength band and block light belonging to other wavelength bands.
In an embodiment, the optical filter 250 may include a band pass filter that passes only light belonging to the infrared wavelength band.
The image sensor 270 of the gesture recognition module 200 uses the filtered infrared light to obtain an image of the subject in step S107.
In an embodiment, the image sensor 270 may include a low-resolution CMOS image sensor in order to obtain the shape of the subject accurately. The CMOS image sensor is an image sensor that uses a switching scheme in which a plurality of transistors correspond respectively to a plurality of pixels and are used to sequentially read out the outputs in the x-axis and y-axis directions. Each of the plurality of pixels of the CMOS image sensor may include one photodiode and four MOS transistors.
The CMOS image sensor receives the filtered infrared light at the pixels, converts it into electrical signals, i.e., digital signals, and thereby obtains an image of the subject.
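Steps S101 to S107 can be pictured as one frame-acquisition loop. The sketch below is a minimal illustration under assumed stand-in drivers, not the patent's implementation: the names IrLed, CmosSensor, and BandPassFilter are hypothetical, and the sensor output is simulated with random data.

```python
import numpy as np

class IrLed:
    def radiate(self):
        pass  # S101: drive the infrared LED (a hardware call in practice)

class CmosSensor:
    def expose(self):
        # S103: the lens focuses incoming light onto a low-resolution pixel
        # array; simulated here as a random 32x32 frame.
        return np.random.randint(0, 256, size=(32, 32))

class BandPassFilter:
    # The real filter 250 is optical; this stand-in only mimics its role of
    # keeping the infrared band, by clipping the simulated frame values.
    def apply(self, frame):
        return np.clip(frame, 0, 255)

def acquire_frame(led, sensor, band_pass):
    led.radiate()                         # S101: radiate infrared light
    incoming = sensor.expose()            # S103: focus and expose
    ir_only = band_pass.apply(incoming)   # S105: keep only the infrared band
    return ir_only.astype(np.uint8)       # S107: digital image of the subject
```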
The controller 180 of the mobile terminal 100 checks whether the obtained image of the subject corresponds to an initial gesture in step S109. In an embodiment, the initial gesture is a gesture that should be recognized first in order to activate the gesture recognition module 200. That is, the initial gesture should be recognized before the gesture recognition module 200 begins recognizing user gestures for operating the mobile terminal 100.
The initial gesture is used for preventing a gesture not intended by a user from being recognized and acted on by the mobile terminal 100.
The initial gesture may be set by a user of the mobile terminal 100 or may be set by a manufacturer when manufacturing the mobile terminal 100.
The initial gesture may correspond to various hand movements. Related descriptions are provided with reference to Fig. 4.
Fig. 4 is an example of an initial gesture according to an embodiment.
Referring to Fig. 4, the initial gesture may correspond to any one of a hand movement 301 made when a user opens his/her hand, a hand movement 302 made when a user stretches and gathers only the thumbs and index fingers of both hands, a hand movement 303 made when a user stretches only his or her index finger, and a hand movement 304 made when a user stretches only his or her index finger and middle finger. However, the initial gesture need not be limited thereto and may also include other hand movements, such as a hand movement made when a user stretches only his/her thumb.
The gesture recognition module 200 of the mobile terminal 100 may recognize any one of gestures shown in Fig. 4 as the initial gesture and then recognize the following gestures of a user.
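One plausible way to implement the check in step S109 (the patent does not specify an algorithm) is to compare the captured frame against stored templates of the gestures shown in Fig. 4. In the sketch below, the normalized-correlation rule and the 0.8 threshold are assumptions.

```python
import numpy as np

def is_initial_gesture(frame, templates, threshold=0.8):
    """Return True if the captured frame matches any initial-gesture template."""
    f = (frame - frame.mean()) / (frame.std() + 1e-8)
    for template in templates:
        t = (template - template.mean()) / (template.std() + 1e-8)
        score = float((f * t).mean())  # normalized cross-correlation
        if score >= threshold:
            return True
    return False
```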
Refer back to Fig. 3.
If the obtained image of the subject is not identified as the initial gesture, the process returns to step S101.
If the obtained image of the subject is identified as the initial gesture, the controller 180 of the mobile terminal 100 increases the frame rate of the image sensor 270 in order to recognize gestures following the initial gesture in step S111. In particular, the controller 180 may maintain the frame rate of the image sensor 270 at a low speed in order to recognize the initial gesture and, after the initial gesture is recognized, may maintain the frame rate at a high speed in order to recognize the following gestures. Since the gestures following the initial gesture involve movement, the controller 180 adjusts the frame rate of the image sensor 270 to track a gesture movement more accurately.
For example, the controller 180 may maintain the frame rate of the image sensor 270 at 6 fps in order to recognize the initial gesture and, after the initial gesture is recognized, may maintain the frame rate at 400 fps to 5000 fps. By adjusting the frame rate depending on whether the initial gesture has been recognized, unlike a typical gesture recognition module that always maintains a high frame rate, it is possible to prevent an unnecessary waste of memory.
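Using the example figures from the text, the frame-rate policy reduces to a single switch. In this sketch, sensor.set_frame_rate is a hypothetical driver call, and 400 fps is picked from the cited 400 fps to 5000 fps range.

```python
IDLE_FPS = 6      # low rate while waiting for the initial gesture
ACTIVE_FPS = 400  # high rate once the initial gesture is recognized

def update_frame_rate(sensor, initial_gesture_recognized):
    # Keep memory use low at idle; track fast movements only when needed.
    rate = ACTIVE_FPS if initial_gesture_recognized else IDLE_FPS
    sensor.set_frame_rate(rate)  # hypothetical sensor-driver call
```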
The controller 180 of the mobile terminal 100 increases the frame rate, then detects a variation in an infrared signal projected into the image sensor 270 in step S113, and recognizes a gesture based on the detected variation in the infrared signal in step S115.
The controller 180 may increase the frame rate of the image sensor in order to recognize user gestures following the initial gesture and then detect the x-axis and y-axis variations of an infrared signal projected into the image sensor 270.
Moreover, the controller 180 may also detect a variation in z-axis of an infrared signal projected into the image sensor 270 and recognize a gesture changing in the z-axis direction.
Related descriptions are provided in detail with reference to Figs. 5 to 7.
Fig. 5 is an example of detecting an optical signal and obtaining a gesture image according to an embodiment, Fig. 6 is a diagram for explaining processes of detecting variations in optical signals on the x-axis and y-axis and recognizing a gesture movement according to an embodiment, and Fig. 7 is a diagram for explaining processes of detecting a variation in an optical signal on the z-axis and recognizing a gesture movement according to an embodiment.
In particular, Fig. 5 is a diagram for explaining a method of detecting a variation in optical signal and recognizing a gesture when a CMOS image sensor is used as the image sensor 270.
Referring to Fig. 5, an image obtaining unit 271 of the image sensor may use an infrared signal on the x-axis and an infrared signal on the y-axis to obtain a 2D gesture image A. The image obtaining unit 271 may include a plurality of pixels. The gesture image A projected onto the plurality of pixels is not an actual image of the user's hand but is treated as representing the user's hand movement.
The image sensor 270 may combine an x-axis infrared signal of the x-axis direction with a y-axis infrared signal of the y-axis direction to obtain the 2D gesture image A.
If a user gesture moves in the x-axis direction, the x-axis infrared signal may move in the left or right direction and the image obtaining unit 271 of the image sensor 270 may obtain a gesture image that is formed by reflecting the infrared signal moving in the x-axis direction. Related descriptions are provided with reference to Fig. 6.
Referring to Fig. 6, the controller 180 may detect a variation in the x-axis infrared signal moving in the x-axis direction and recognize based on the detected variation in the x-axis infrared signal that a gesture has moved in the x-axis direction. In particular, when a user gesture has moved to the right side, the x-axis infrared signal X1 may also move to the right side and thus the gesture image A obtained from the image sensor 270 may also move in the x-axis direction.
The controller 180 may sense a movement variation of the x-axis infrared light and thereby detect a movement of the gesture.
Although Fig. 6 describes only the x-axis movement of a gesture as an example, the embodiment is not limited thereto, and the controller 180 may also detect the y-axis movement of the gesture and simultaneous movements in the x-axis and y-axis directions.
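As one concrete reading of steps S113 and S115 for the x-axis and y-axis cases (the patent does not fix an algorithm), the variation in the infrared signal can be tracked as the shift of the infrared blob's centroid between consecutive frames. The 2-pixel dead zone below is an assumption.

```python
import numpy as np

def ir_centroid(frame):
    """Centroid (x, y) of the infrared intensities in a 2D frame."""
    total = float(frame.sum())
    if total == 0.0:
        return None  # no infrared signal projected into the sensor
    ys, xs = np.indices(frame.shape)
    return (xs * frame).sum() / total, (ys * frame).sum() / total

def detect_xy_movement(prev_frame, cur_frame, min_shift=2.0):
    """Report the (dx, dy) shift of the gesture, or None if it barely moved."""
    a, b = ir_centroid(prev_frame), ir_centroid(cur_frame)
    if a is None or b is None:
        return None
    dx, dy = b[0] - a[0], b[1] - a[1]
    if abs(dx) < min_shift and abs(dy) < min_shift:
        return None
    return dx, dy  # e.g., positive dx when the gesture moves to the right
```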
Next, Fig. 7 is described.
Fig. 7 is a diagram for explaining a process of recognizing a movement of a gesture moving in the z-axis direction according to an embodiment.
Referring to Fig. 7, two gesture images B and C obtained by the image obtaining unit 271 of the image sensor 270 are shown. The controller 180 may detect, based on the brightness of a gesture image, whether the distance between the gesture recognition module 200 and a user becomes longer or shorter. For example, when it is assumed that the brightness of the gesture image C is lower than that of the gesture image B, it may mean that the distance between the gesture corresponding to the gesture image C and the gesture recognition module 200 is shorter than the distance between the gesture corresponding to the gesture image B and the gesture recognition module 200.
This is because an image obtained by the image obtaining unit 271 of the image sensor 270 becomes clearer as the distance between the gesture recognition module 200 and a user gesture becomes shorter. That is, the darker the color of the obtained image is, the shorter the distance between the gesture recognition module 200 and the user gesture may be.
The controller 180 may use the absolute magnitude of an infrared signal projected into the image sensor 270 to obtain a gesture image and may detect the distance between the gesture recognition module 200 and a gesture based on the brightness of the obtained gesture image. The controller 180 may determine, based on the detected distance, whether a gesture has moved in the z-axis direction, and may control the operation of the mobile terminal 100 according to the determination result.
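For the z-axis case, one simple realization consistent with the text is to compare mean frame brightness between consecutive frames; following the convention above (a darker image means a shorter distance), falling brightness is read as the hand approaching. The threshold value is an assumption.

```python
def detect_z_movement(prev_frame, cur_frame, threshold=5.0):
    """Classify z-axis motion from the change in mean infrared brightness."""
    delta = float(cur_frame.mean()) - float(prev_frame.mean())
    # Per the text, a darker gesture image implies a shorter distance, so a
    # drop in brightness is interpreted as the gesture approaching the module.
    if delta < -threshold:
        return "approaching"
    if delta > threshold:
        return "receding"
    return None  # no significant z-axis movement
```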
Refer back to Fig. 3.
The controller 180 of the mobile terminal 100 controls the mobile terminal 100 so that an operation corresponding to a movement of the recognized gesture is performed, in step S117.
In an embodiment, a plurality of gesture movements may correspond respectively to a plurality of operations of the mobile terminal 100. For example, when a gesture moves in the x-axis direction, the mobile terminal 100 may change its screen to another screen in response to the gesture movement in the x-axis direction.
Various examples of the operation of the mobile terminal according to the gesture movement are described with reference to Figs. 8 to 13.
FIGS. 8 to 13 are diagrams for explaining the operation of a mobile terminal corresponding to a gesture movement according to an embodiment.
In the following, a hand movement made with an open hand is described as an example of a gesture, but the gesture is not limited thereto.
Firstly, related descriptions are provided with reference to Figs. 8 and 9. As shown in Fig. 8, if a user gesture moves in the x-axis direction, the gesture recognition module 200 recognizes a movement of the gesture in the x-axis direction. The mobile terminal 100 may change a user screen to another screen as shown in Fig. 9, in response to the x-axis gesture movement recognized through the gesture recognition module 200. In this example, the screen change of the mobile terminal 100 corresponding to the x-axis gesture movement is merely an example.
Next, Figs. 10 and 11 are described. As shown in Fig. 10, if a user gesture moves in the x-axis and y-axis directions (diagonal direction), the gesture recognition module 200 recognizes a movement of a gesture moving in the x-axis and y-axis directions. The mobile terminal 100 may change a user screen to a setting screen for setting the functions of the mobile terminal 100 as shown in Fig. 11, in response to the gesture movement of the x-axis and y-axis directions recognized through the gesture recognition module 200.
Next, Figs. 12 and 13 are described. As shown in Fig. 12, if a user gesture moves in the z-axis direction, the gesture recognition module 200 recognizes a movement of a gesture moving in the z-axis direction. In response to the gesture movement of the z-axis direction recognized through the gesture recognition module 200, the mobile terminal 100 may highlight an icon K of a specific application displayed on the screen of the mobile terminal 100 to inform that a corresponding icon K is selected, as shown in Fig. 13.
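The three mappings just described (Figs. 8 to 13) amount to a dispatch table from a recognized movement to a terminal operation. The sketch below illustrates step S117 under that reading; the handler method names are hypothetical.

```python
# Hypothetical mapping from recognized gesture movements to operations.
GESTURE_ACTIONS = {
    "move_x": "switch_screen",       # Figs. 8-9: x-axis movement changes the screen
    "move_xy": "open_settings",      # Figs. 10-11: diagonal movement opens settings
    "move_z": "select_highlighted",  # Figs. 12-13: z-axis movement selects an icon
}

def perform_gesture_action(terminal, movement):
    action = GESTURE_ACTIONS.get(movement)
    if action is not None:
        getattr(terminal, action)()  # e.g., terminal.switch_screen()
```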
The mobile terminal 100 according to various embodiments may use an inexpensive gesture recognition module 200 to recognize a user gesture effectively.
Moreover, the mobile terminal 100 according to various embodiments may easily detect gesture movements in the y-axis and z-axis directions in addition to the x-axis direction and perform a corresponding operation.
Embodiments may also be applied to a stationary terminal such as a TV set in addition to a mobile terminal.
According to an embodiment, the above-described method may also be embodied as processor-readable codes on a program-recorded medium. Examples of the processor-readable medium are a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and a carrier wave (such as data transmission through the Internet).
The above-described mobile terminal is not limited to the configuration and method of the above-described embodiments, and some or all of the embodiments may also be selectively combined so that various variations may be made.

Claims (15)

  1. A method of operating a mobile terminal comprising a gesture recognition module, the method comprising:
    externally radiating an infrared light;
    recognizing a user gesture based on a reflected infrared light;
    extracting an infrared signal variation and detecting a movement of the gesture based on the infrared signal variation; and
    performing a function of the mobile terminal corresponding to the detected movement of the gesture.
  2. The method according to claim 1, wherein the detecting of the movement of the gesture comprises extracting an infrared signal variation of at least one of x-axis and y-axis directions and detecting a movement of the gesture.
  3. The method according to claim 2, wherein the detecting of the movement of the gesture further comprises extracting an infrared signal variation of z-axis direction and detecting a movement of the gesture.
  4. The method according to claim 3, wherein the recognizing of the user gesture based on the reflected infrared light comprises filtering only an infrared light of incoming lights, and
    the detecting of the movement of the gesture further comprises:
    using the filtered infrared light to obtain an image of the gesture; and
    detecting a gesture movement of z-axis direction based on the brightness of the obtained gesture image.
  5. The method according to claim 1, wherein the recognizing of the user gesture based on the reflected infrared light comprises filtering only an infrared light of incoming lights,
    obtaining an image of the gesture using the filtered infrared light, and
    checking whether a gesture corresponding to the obtained gesture image is an initial gesture required for activating the gesture recognition module.
  6. The method according to claim 5, further comprising increasing a frame rate of an image sensor comprised in the gesture recognition module when the gesture corresponding to the obtained gesture image is the initial gesture.
  7. The method according to claim 1, wherein the function of the mobile terminal comprises any one of a movement of a mobile terminal screen, a switch of the mobile terminal screen, and a selection of an application displayed on a display unit of the mobile terminal.
  8. A mobile terminal comprising:
    a gesture recognition module configured to externally radiate an infrared light, wherein the gesture recognition module recognizes a user gesture based on a reflected infrared light; and
    a control unit configured to extract an infrared signal variation, detect a movement of the gesture, and perform a function of the mobile terminal corresponding to the detected movement of the gesture.
  9. The mobile terminal according to claim 8, wherein the gesture recognition module comprises:
    an infrared light radiating unit configured to externally radiate the infrared light;
    a lens configured to receive an incoming light comprising a reflected infrared light;
    an optical filter configured to filter the reflected infrared light out of the incoming light; and
    an image sensor configured to obtain an image of the gesture based on the filtered infrared light.
  10. The mobile terminal according to claim 9, wherein the control unit extracts an infrared signal variation of at least one of x-axis and y-axis directions and detects a movement of the gesture based on the infrared signal variation of at least one of x-axis and y-axis directions.
  11. The mobile terminal according to claim 10, wherein the control unit extracts an infrared signal variation of a z-axis direction and detects the movement of the gesture based on the infrared signal variation of the z-axis direction.
  12. The mobile terminal according to claim 11, wherein the control unit obtains the image of the gesture using the filtered infrared light and detects a gesture movement of z-axis direction based on a brightness of the obtained gesture image.
  13. The mobile terminal according to claim 9, wherein the control unit checks whether a gesture corresponding to the obtained gesture image is an initial gesture required for activating the gesture recognition module.
  14. The mobile terminal according to claim 13, wherein the control unit increases a frame rate of an image sensor comprised in the gesture recognition module when the gesture corresponding to the obtained gesture image is the initial gesture.
  15. The mobile terminal according to claim 8, wherein the function of the mobile terminal comprises any one of a movement of a mobile terminal screen, a switch of the mobile terminal screen, and the selection of an application displayed on a display unit of the mobile terminal.
PCT/KR2014/003214 2013-10-10 2014-04-14 Mobile terminal and operating method thereof WO2015053451A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2013-0120765 2013-10-10
KR20130120765A KR20150042039A (en) 2013-10-10 2013-10-10 Mobile terminal and operating method thereof

Publications (1)

Publication Number Publication Date
WO2015053451A1 true WO2015053451A1 (en) 2015-04-16

Family

ID=52813254

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2014/003214 WO2015053451A1 (en) 2013-10-10 2014-04-14 Mobile terminal and operating method thereof

Country Status (2)

Country Link
KR (1) KR20150042039A (en)
WO (1) WO2015053451A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080040692A1 (en) * 2006-06-29 2008-02-14 Microsoft Corporation Gesture input
US20080244468A1 (en) * 2006-07-13 2008-10-02 Nishihara H Keith Gesture Recognition Interface System with Vertical Display
US20100321289A1 (en) * 2009-06-19 2010-12-23 Samsung Electronics Co. Ltd. Mobile device having proximity sensor and gesture based user interface method thereof
KR20110054256A (en) * 2009-11-17 2011-05-25 엘지전자 주식회사 Mobile terminal and method for controlling thereof
JP2013534009A (en) * 2010-06-17 2013-08-29 クゥアルコム・インコーポレイテッド Method and apparatus for non-contact gesture recognition and power reduction

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102016201704A1 (en) * 2016-02-04 2017-08-10 Bayerische Motoren Werke Aktiengesellschaft A gesture recognition apparatus and method for detecting a gesture of an occupant of a vehicle
WO2018006481A1 (en) * 2016-07-04 2018-01-11 中兴通讯股份有限公司 Motion-sensing operation method and device for mobile terminal
CN112527093A (en) * 2019-09-18 2021-03-19 华为技术有限公司 Gesture input method and electronic equipment
WO2021052139A1 (en) * 2019-09-18 2021-03-25 华为技术有限公司 Gesture input method and electronic device

Also Published As

Publication number Publication date
KR20150042039A (en) 2015-04-20


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 14851842
Country of ref document: EP
Kind code of ref document: A1
NENP Non-entry into the national phase
Ref country code: DE
122 Ep: pct application non-entry in european phase
Ref document number: 14851842
Country of ref document: EP
Kind code of ref document: A1