WO2015152845A1 - A device with touch screen - Google Patents

A device with touch screen Download PDF

Info

Publication number
WO2015152845A1
WO2015152845A1 (PCT/TR2015/000129)
Authority
WO
WIPO (PCT)
Prior art keywords
touch screen
point
touched
area
memory
Prior art date
Application number
PCT/TR2015/000129
Other languages
French (fr)
Inventor
Abdullah Ruhi SOYLU
Gorkem YAVAS
Bora ERGIN
Original Assignee
Soylu Abdullah Ruhi
Yavas Gorkem
Ergin Bora
Priority date
Filing date
Publication date
Application filed by Soylu Abdullah Ruhi, Yavas Gorkem, Ergin Bora
Publication of WO2015152845A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Abstract

The present invention relates to a device (1) which has a touch screen (2) and ensures that the point or area touched on the touch screen (2) is indicated by means of various visual elements, so that the intended point or area is touched with fewer accidental errors. The inventive device (1) comprises: a touch screen (2), a touch screen control unit (3), a memory (4), a processor (5) and a peripheral interface (6).

Description

DESCRIPTION
A DEVICE WITH TOUCH SCREEN
Technical Field
The present invention relates to a device which has a touch screen and ensures that the point or area touched on the touch screen is indicated by means of various visual elements, so that the intended point or area is touched with fewer accidental errors.
Background of the Invention
Today, the use of devices with touch screens is continuously increasing, and the touch screens of such devices can be used both as input and output units. The fact that a user knows and understands exactly where she/he touches on the touch screens of devices such as smartphones and tablets is of vital importance for carrying out an input process with fewer errors.
Particularly when a text entry tool, for example a text keyboard, is displayed on a touch screen, the ability of users to touch the area where the intended letter is located is a significant factor which ensures error-free text entry by preventing incorrect writing.
In the state of the art, various visual feedback methods that serve to show the user which area is touched on the touch screen are applied in order to realize the said factor. In these methods and the devices wherein they are applied, however, the center or area of touch is displayed only as unidimensional and uniaxial (only vertically, only horizontally, or as a point), or the marked touch center cannot be seen clearly because it remains under the finger of the user or the object enabling the touch. In this case, the user does not receive feedback strong enough to know where she/he has touched.
Considering existing techniques, it is realized that there is a need for novel methods which indicate touch centers along a plurality of axes, such as both vertical and horizontal, for touches performed on a touch screen and which enable a user to see the visual tips easily when s/he looks at an area away from the touch center, and for devices wherein these methods are applied.
The United States patent document no. US2008204427, an application in the state of the art, discloses a device which comprises a pressure-sensitive touch screen. In the said device, an indication which varies with the intensity of the pressure exerted on the touch screen is provided in the touched area of the touch screen during use. The monitor provides a visual indication depending on the magnitude of the pressure registered by the touch screen. The indication is rendered as centered on the touch area and has an attribute that depends on the pressure exerted.
Summary of the Invention
An objective of the present invention is to realize a device which has a touch screen and ensures that the intended point or area is touched with fewer accidental errors, by determining the point or area touched on the touch screen and indicating it as strong feedback by means of various visual elements on the touch screen.
Detailed Description of the Invention
"A Device with Touch Screen" realized to fulfill the objectives of the present invention is shown in the figures attached, in which: Figure 1 is a schematic block diagram of the inventive system.
Figure 2 is a view which indicates the touch and the representation realized in cases when a keyboard is displayed on the touch screen.
The components illustrated in Figure 1 are individually numbered, where the numbers refer to the following:
1. Device
2. Touch screen
3. Touch screen control unit
4. Memory
5. Processor
6. Peripheral interface
A device (1) having a touch screen and enabling indication of a touched area upon determining the said area comprises:
at least one touch screen (2) which serves as a visual input/output unit; at least one touch screen control unit (3) which ensures that signals relating to touches performed on the touch screen (2) and to the image to be displayed on the touch screen (2) are exchanged with the touch screen (2);
at least one memory (4) wherein visual data and commands required for determining the touched point on the touch screen (2) and displaying the touched point on the touch screen (2) are located;
at least one processor (5) which carries out the process of determining the touched point on the touch screen (2) by using the data and the commands on the memory (4) and sending the image that will enable display of the touched point to the touch screen (2); and at least one peripheral interface (6) which serves as an interface between the touch screen (2) and similar input/output units on one side, and the memory (4) and the processor (5) on the other
(Figure 1).
The touch screen (2) is a unit which serves as a visual input/output unit. The touch screen (2) receives the touches performed on itself as input and displays a certain image as output. The said image may comprise items such as text, pictures, figures and video. In different embodiments of the invention, the touch screen (2) may be a screen made of LED (Light Emitting Diode), LCD (Liquid Crystal Display) or another screen technology. In a preferred embodiment of the invention, the touch screen (2) enables indication and use of a keyboard on itself and, while the keyboard is being used, indicates the point which is touched on the keyboard in real time by means of various visual items.
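As a rough illustration of this keyboard behaviour (not the patent's own implementation), the following Python/tkinter sketch draws a simple on-screen keyboard and, while the pointer button is held, echoes the key under the pointer in a bubble displaced above the press position so that the touched key stays visible even when it is covered. Mouse events stand in for touch events, and the key layout, sizes and function names are assumptions made for this example.

import tkinter as tk

KEYS = ["QWERTYUIOP", "ASDFGHJKL", "ZXCVBNM"]
KEY_W, KEY_H, PAD = 40, 48, 4

root = tk.Tk()
root.title("Touch feedback sketch")
canvas = tk.Canvas(root, width=10 * (KEY_W + PAD) + PAD, height=3 * (KEY_H + PAD) + 80)
canvas.pack()

key_rects = []  # (x0, y0, x1, y1, letter) for simple hit testing
for row, letters in enumerate(KEYS):
    x = PAD + row * (KEY_W // 2)              # stagger rows like a real keyboard
    y = 60 + row * (KEY_H + PAD)
    for letter in letters:
        canvas.create_rectangle(x, y, x + KEY_W, y + KEY_H, fill="#ddd")
        canvas.create_text(x + KEY_W / 2, y + KEY_H / 2, text=letter)
        key_rects.append((x, y, x + KEY_W, y + KEY_H, letter))
        x += KEY_W + PAD

bubble = None  # canvas items of the feedback bubble currently on screen

def key_at(x, y):
    for x0, y0, x1, y1, letter in key_rects:
        if x0 <= x <= x1 and y0 <= y <= y1:
            return letter
    return None

def show_feedback(event):
    # Redraw the bubble above the touch point while the "finger" is down.
    global bubble
    hide_feedback(None)
    letter = key_at(event.x, event.y)
    if letter is None:
        return
    bx, by = event.x, max(event.y - KEY_H - 10, 26)   # displaced above the covered key
    oval = canvas.create_oval(bx - 22, by - 22, bx + 22, by + 22, fill="#ffd", outline="#333")
    text = canvas.create_text(bx, by, text=letter, font=("TkDefaultFont", 18, "bold"))
    bubble = (oval, text)

def hide_feedback(event):
    global bubble
    if bubble:
        for item in bubble:
            canvas.delete(item)
        bubble = None

canvas.bind("<Button-1>", show_feedback)          # touch down
canvas.bind("<B1-Motion>", show_feedback)         # finger dragged across keys
canvas.bind("<ButtonRelease-1>", hide_feedback)   # touch up

root.mainloop()

Dragging across the keys updates the bubble continuously, which corresponds to the real-time indication described for the preferred embodiment.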
The touch screen control unit (3) is a unit which receives the touches performed on the touch screen (2) from the touch screen (2) and transfers the information about the touch to the peripheral interface (6), and which receives the signals relating to the image to be displayed on the touch screen (2) from the peripheral interface (6) and transfers them to the touch screen (2).
The memory (4) is a unit wherein the data and commands required for determining the point touched on the touch screen (2) and displaying the touched point on the touch screen (2) are kept for use by the processor (5).
In one embodiment of the invention, the memory (4) comprises volatile and non-volatile storage units. In a preferred embodiment of the invention, commands serving for determining the point touched on the touch screen (2) and for indicating the touched point on the touch screen (2) during the touch by means of various images are located in the non-volatile storage unit of the memory (4). In one embodiment of the invention, the commands located on the memory (4) are commands which enable a real-time indication of the point touched on the keyboard only when the keyboard is being used over the touch screen (2). The processor (5) uses the volatile memory on the memory (4) when running the said commands.
On the memory (4), there are images to be used for indicating, in real time, the area touched on the touch screen (2). In different embodiments of the invention, the said images may be one, several or all of: a line/s which intersect/s each other and whose intersection indicates the point or the area touched; an arrow mark/s whose endpoints indicate the point or the area touched; a line/s which is/are located inside a circle and whose intersections indicate the point or the area touched; a square/s whose intersections of diagonals indicate the point or the area touched; preferably a two-dimensional image/s, such as a bubble, which is/are indicated near the touched area or indicate/s a representation of the touched area together with the touched point or area therein. In one embodiment of the invention, these visual effects can also be indicated by effects such as a change of colour or shade in the form of an animation on the touch screen (2).
The processor (5) is a unit which carries out the process of determining the point or the area touched on the touch screen (2) by using the data and the commands on the memory (4) and sending the image that will enable display of the touched point to the touch screen (2) during the touch. The processor (5) performs at least two tasks within the invention by using the data and the commands on the memory (4): one of these tasks is to determine the location touched on the touch screen (2), and the other is to enable indication of this point on the touch screen (2) by means of visual elements. Thus, in a preferred embodiment of the invention, the point or the area touched on the touch screen (2) is determined and displayed on the touch screen (2). The processor (5) determines the point or the area touched on the touch screen (2) by running the commands in the memory (4) which enable determining the point or the area touched on the touch screen (2), together with the information received from the peripheral interface (6), and then transmits the related image information to the touch screen (2) over the peripheral interface (6) by running, with this information, the commands in the memory (4) which enable displaying the point or the area touched on the touch screen (2) by means of various images. In one embodiment of the invention, the visual tip which will indicate the point touched on the touch screen (2) can be prepared by the processor (5) such that it will be within a change of colour, shade or contrast as long as it is displayed on the touch screen (2). In different embodiments of the invention, the processor (5) can use a CPU (Central Processing Unit) or a GPU (Graphics Processing Unit).
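The paragraph above lists the kinds of visual tips the memory (4) may hold. The short Python sketch below generates a few of them as plain two-dimensional primitives centred on a touch point; the names, sizes and data structures are assumptions made for illustration and are not taken from the patent.

from typing import Callable, Dict, List, Tuple

Point = Tuple[float, float]
Segment = Tuple[Point, Point]

def crosshair(p: Point, r: float = 30) -> List[Segment]:
    """Two intersecting lines whose crossing marks the touched point."""
    x, y = p
    return [((x - r, y), (x + r, y)), ((x, y - r), (x, y + r))]

def circled_cross(p: Point, r: float = 30) -> dict:
    """A circle with interior lines whose intersection marks the touched point."""
    return {"circle": (p, r), "lines": crosshair(p, r)}

def square_diagonals(p: Point, half: float = 30) -> List[Segment]:
    """A square whose diagonals intersect at the touched point."""
    x, y = p
    tl, tr = (x - half, y - half), (x + half, y - half)
    bl, br = (x - half, y + half), (x + half, y + half)
    # four edges of the square plus its two diagonals
    return [(tl, tr), (tr, br), (br, bl), (bl, tl), (tl, br), (tr, bl)]

# Stand-in for the visual data held on the memory (4): a lookup of tip builders.
VISUAL_TIPS: Dict[str, Callable] = {
    "crosshair": crosshair,
    "circled_cross": circled_cross,
    "square_diagonals": square_diagonals,
}

if __name__ == "__main__":
    touch_point = (120.0, 200.0)   # location reported for a touch
    for name, build in VISUAL_TIPS.items():
        print(name, build(touch_point))

A processor-side routine could then select one of these builders, instantiate it at the reported touch location, and hand the resulting primitives to whatever drawing layer the device uses, optionally animating their colour or shade as described above.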
The peripheral interface (6) is a unit which serves as an interface between all input/output control units, such as the touch screen control unit (3), that are located or may be located within the device (1), and the memory (4) and the processor (5).
By means of the inventive device (1), it is ensured that the point or the area touched on the touch screen (2) is determined and that the touched point or area is displayed on the touch screen (2) by means of various visual elements which give feedback. The signals which are created when the touch screen (2) is touched by a user's finger, or by any object that enables touch on the touch screen (2), or when such an object approaches within a certain distance while the said process is being carried out, are received by the touch screen control unit (3), and the information about the said touch is transferred to the memory (4) and the processor (5) by means of the peripheral interface (6). The processor (5) firstly determines the location (point or area) touched on the touch screen (2) by using the information received from the peripheral interface (6) and the commands kept in the memory (4), and then transfers the image wherein the touched area is displayed precisely and in real time, by means of apparent two-dimensional visual tips, to the touch screen (2) by means of the peripheral interface (6). The said display takes place on the touch screen (2) through visual tips arranged such that the touch center (point or area) will be understood even if it is covered. Examples of this type of display are provided in Figure 2. The center of touch can be estimated easily even if it is covered by a finger or a tool such as a touch pen and the user looks at another part of the screen.
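The data flow just described can be summarised with a schematic (and purely hypothetical) Python model: the control unit reports a raw touch, the peripheral interface routes it to the processor, and the processor uses the tip-building commands held on the memory to produce a frame in which the touched location is marked by a displaced two-dimensional tip. Class and method names are illustrative only and do not come from the patent.

from dataclasses import dataclass
from typing import Callable, List, Tuple

Point = Tuple[int, int]

@dataclass
class Memory:                      # stands in for the memory (4): holds the tip-drawing "commands"
    tip_builder: Callable[[Point], List[str]]

class Processor:                   # stands in for the processor (5)
    def __init__(self, memory: Memory):
        self.memory = memory

    def handle_touch(self, raw: Point) -> List[str]:
        x, y = raw                                   # task 1: determine the touched location
        return self.memory.tip_builder((x, y))       # task 2: build the feedback image

class TouchScreen:                 # stands in for the touch screen (2): visual input/output unit
    def display(self, frame: List[str]) -> None:
        print("\n".join(frame))

class PeripheralInterface:         # stands in for the peripheral interface (6): routes signals
    def __init__(self, processor: Processor, screen: TouchScreen):
        self.processor, self.screen = processor, screen

    def on_touch(self, raw: Point) -> None:          # called with data from the control unit (3)
        frame = self.processor.handle_touch(raw)
        self.screen.display(frame)

def displaced_bubble(p: Point) -> List[str]:
    x, y = p
    # Describe a tip drawn 60 px above the touch so it is not hidden under the finger.
    return [f"crosshair at ({x}, {y})", f"bubble showing the touched area at ({x}, {y - 60})"]

if __name__ == "__main__":
    screen = TouchScreen()
    interface = PeripheralInterface(Processor(Memory(displaced_bubble)), screen)
    interface.on_touch((150, 420))   # a raw touch reported by the control unit

Running the script prints the textual "frame" that would be handed back to the touch screen; in a real device the same routing would carry actual image data.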
It is possible to develop a great variety of embodiments of the inventive device (1); the invention cannot be limited to the examples disclosed herein or to the representations shown in Figure 2, and it is essentially as defined in the claims.

Claims

CLAIMS
1. A device (1) having a touch screen and enabling indication of a touched area upon determining the said area, comprising:
at least one touch screen (2) which serves as a visual input/output unit; at least one touch screen control unit (3) which ensures that signals relating to touches performed on the touch screen (2) and to the image to be displayed on the touch screen (2) are exchanged with the touch screen (2);
at least one memory (4);
at least one processor (5);
at least one peripheral interface (6) which serves as an interface between the touch screen (2) and similar input/output units, and the memory (4) and the processor (5);
and characterized by
at least one memory (4) wherein visual data and commands required for determining the point or the area touched on the touch screen (2) and displaying the point or the area touched on the touch screen (2) are located;
at least one processor (5) which carries out the process of determining the point or the area touched on the touch screen (2) by using the data and the commands on the memory (4) and sending the image that will enable display of the point or the area touched to the touch screen (2).
2. A device (1) according to Claim 1, characterized by the touch screen
(2) which enables indication and use of a keyboard on itself and indicates the point or the area which is touched on the keyboard, in real time while the keyboard is being used, by means of various visual items.
3. A device (1) according to Claim 1 or Claim 2, characterized by the touch screen control unit (3) which transfers the information about the touch to the peripheral interface (6) upon receiving the signals relating to touches performed on the touch screen (2) from the touch screen (2), and transfers the signals relating to the information about the image to be displayed on the touch screen (2) to the touch screen (2) upon receiving them from the peripheral interface (6).
4. A device (1) according to any of the preceding claims, characterized by the memory (4) wherein data and commands required for determining the point or the area touched on the touch screen (2) and displaying the point or the area touched on the touch screen (2) are kept by the processor (5) in compliance with use.
5. A device (1) according to any of the preceding claims, characterized by the memory (4) which comprises volatile memory and/or non-volatile storage units.
6. A device (1) according to any of the preceding claims, characterized by the memory (4) wherein commands serving for determining the point or the area touched on the touch screen (2) and for indicating the point or the area touched on the touch screen (2) during the touch, by means of various images comprising feedback, are located in the non-volatile storage unit thereof.
7. A device (1) according to any of the preceding claims, characterized by the memory (4) wherein there are commands which enable a real-time indication of the point or the area touched on the keyboard only when the keyboard is being used over the touch screen (2).
8. A device (1) according to any of the preceding claims, characterized by the processor (5) which uses the volatile memory on the memory (4) when running the commands in the memory (4).
9. A device (1) according to any of the preceding claims, characterized by the memory (4) which comprises the images to be used for indicating, in real time, the area touched on the touch screen (2).
10. A device (1) according to any of the preceding claims, characterized by the memory (4) wherein the images which will be used for indicating, in real time, the area touched on the touch screen (2) are one, several or all of: a line/s which intersect/s each other and whose intersection indicates the point or the area touched; an arrow mark/s whose endpoints indicate the point or the area touched; a line/s which is/are located inside a circle and whose intersections indicate the point or the area touched; a square/s whose intersections of diagonals indicate the point or the area touched; preferably a two-dimensional image/s, such as a bubble, which is/are indicated near the touched area or indicate/s a representation of the touched area together with the touched point or area therein.
11. A device (1) according to any of the preceding claims, characterized by the processor (5) which determines the point or the area touched on the touch screen (2) as point or area information on the touch screen (2).
12. A device (1) according to any of the preceding claims, characterized by the processor (5) which determines the point or the area touched on the touch screen (2) by running the commands in the memory (4) that enable determining the point or the area touched on the touch screen (2), together with the information received from the peripheral interface (6), and then transmits the related image information to the touch screen (2) over the peripheral interface (6) by running, with this information, the commands in the memory (4) that enable displaying the point or the area touched on the touch screen (2) by means of various images.
13. A device (1) according to any of the preceding claims, characterized by the processor (5) which prepares the visual tip that will indicate the point touched on the touch screen (2) such that it will be within a change of colour, shade or contrast as long as it is displayed on the touch screen (2).
14. A device (1) according to any of the preceding claims, characterized by the processor (5) which transfers the image - wherein the location touched on the touch screen (2) is displayed precisely and in real time by means of apparent, two-dimensional visual tips - to the touch screen (2) by means of the peripheral interface (6).
15. A device (1) according to any of the preceding claims, characterized by the touch screen (2) where the location touched on the touch screen (2) is displayed by visual tips such that they will indicate the touch center and there will be extensions and traces thereof on the touch screen (2).
PCT/TR2015/000129 (priority date 2014-04-01, filing date 2015-03-27): A device with touch screen, published as WO2015152845A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
TR201403772 2014-04-01
TR2014/03772 2014-04-01

Publications (1)

Publication Number Publication Date
WO2015152845A1 (en) 2015-10-08

Family

ID=53059387

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/TR2015/000129 WO2015152845A1 (en) 2014-04-01 2015-03-27 A device with touch screen

Country Status (1)

Country Link
WO (1) WO2015152845A1 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004051392A2 (en) * 2002-11-29 2004-06-17 Koninklijke Philips Electronics N.V. User interface with displaced representation of touch area
US20080204427A1 (en) 2004-08-02 2008-08-28 Koninklijke Philips Electronics, N.V. Touch Screen with Pressure-Dependent Visual Feedback
CN101315592A (en) * 2008-07-18 2008-12-03 华硕电脑股份有限公司 Touch control type mobile operation device and display method used on the same
CN101825990A (en) * 2010-04-28 2010-09-08 宇龙计算机通信科技(深圳)有限公司 Touch point positioning method and system and touch screen device

Legal Events

Code Description
121: EP: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 15721358; Country of ref document: EP; Kind code of ref document: A1)
NENP: Non-entry into the national phase
122: EP: PCT application non-entry in European phase (Ref document number: 15721358; Country of ref document: EP; Kind code of ref document: A1)