US5551876A - Target practice apparatus - Google Patents

Target practice apparatus

Info

Publication number
US5551876A
US5551876A (application number US08/393,397)
Authority
US
United States
Prior art keywords
image
hit
miss
screen
target practice
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
US08/393,397
Inventor
Sumio Koresawa
Masanori Yamazaki
Hidetoshi Imaide
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hitachi Ltd
Mitsubishi Hitachi Power Systems Ltd
Original Assignee
Babcock Hitachi KK
Application filed by Babcock Hitachi KK
Assigned to HITACHI, LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: IMAIDE, HIDETOSHI; KORESAWA, SUMIO
Application granted
Publication of US5551876A
Anticipated expiration
Current legal status: Expired - Fee Related

Classifications

    • F: MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41: WEAPONS
    • F41J: TARGETS; TARGET RANGES; BULLET CATCHERS
    • F41J5/00: Target indicating systems; Target-hit or score detecting systems
    • F: MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41: WEAPONS
    • F41G: WEAPON SIGHTS; AIMING
    • F41G3/00: Aiming or laying means
    • F41G3/26: Teaching or practice apparatus for gun-aiming or gun-laying
    • F41G3/2616: Teaching or practice apparatus for gun-aiming or gun-laying using a light emitting device
    • F41G3/2694: Teaching or practice apparatus for gun-aiming or gun-laying using a light emitting device for simulating a target

Definitions

  • FIG. 10 is a chart showing the structure of a target member of a target practice apparatus according to a fifth embodiment of the present invention.
  • In the fifth embodiment, a temperature sensor 26 and a further acoustic sensor are provided in the target member shown in FIG. 2.
  • The temperature sensor 26 is used to compensate for variations in the sound velocity.
  • The additional acoustic sensor is provided at a corner of the target member and is commonly used for detection in the X-axis direction and detection in the Y-axis direction.
  • The acoustic sensor provided at the corner eliminates the dead angle in which the impact position of a bullet cannot be detected.
  • The time when this acoustic sensor detects the impact of a bullet can be used as a reference time for detection in the X-axis direction and for detection in the Y-axis direction, thereby greatly increasing the accuracy in measuring the impact point.
  • FIG. 11 is an explanatory chart showing data stored in a memory device 7 of a target practice apparatus according to a sixth embodiment of the present invention.
  • In the sixth embodiment, a target in a training image is divided into a plurality of areas, such as an area P0 outside the pattern, areas P1 corresponding to the hands and legs, and the like, and different scores are given to these divided areas.
  • Each frame of the image stored in the recording medium of the playback unit is subjected to image processing to obtain the profile line of the pattern of a person.
  • The profile line thus obtained is properly modified to set divided areas and scores therefor. Accordingly, even if various parts of the pattern move in a complex manner, various hit areas can be set, and the areas can be set in more detail.
  • FIG. 12 is a block diagram showing a target practice apparatus according to a seventh embodiment of the present invention.
  • the target practice apparatus according to the present embodiment differs from the target practice apparatus shown in FIG. 7 in that there is further provided a recording and playback unit 24 which can perform recording and playback of an image.
  • the output signal from the superimpose compositing unit is partially branched and is input to the recording and playback unit 24.
  • the recorded training image can be used in a playback mode in which the training image is again passed through the superimpose compositing unit to composite an information message such as a message of a shooting chance with the training image and to output the composited image to the projector.
  • the output unit 8A is not limited to a printer, and any device, such as a CRT display, which displays or records information can be used.

Abstract

A target practice apparatus which allows a trainee to shoot at a target on a moving image with live bullets. The target practice apparatus includes a screen on which a training image is projected, three or more acoustic sensors for detecting an impact sound, an impact coordinate measuring section for computing the coordinates of an impact position on the screen based on detection signals from the acoustic sensors, a projector, a playback unit for supplying the projector with an image signal, a recording medium which stores a hit image and a miss image as well as the training image, a memory unit in which a hit range and a miss range are previously stored for each frame of the training image, and a data processing and controlling section which compares the detected coordinates of the impact position with data of the hit range and the miss range stored in the memory unit to perform a hit/miss judgment, selects the hit image or the miss image based on the results of the judgment, and reproduces the selected image. The trainee can improve his ability to make a proper circumstantial judgment and effectively gain experience in shooting with live bullets.

Description

BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates to a target practice apparatus, and more particularly to a target practice apparatus utilizing a target in the form of a projected image which allows shooting with live bullets and which improves the ability of a trainee to make a proper circumstantial judgment.
2. Related Art
Target practice is categorized into basic target practice in which a trainee improves his hitting accuracy when using live bullets, and advanced target practice in which the trainee shoots while judging a suitable timing and situation for firing.
In target practice, a trainee generally shoots at a stationary, moving or bobbing target, and the trainee or a judge visually checks the impact position on the target to evaluate the hitting accuracy and the ability of the trainee to make a proper circumstantial judgment.
To automatically and safely check such an impact position, various target practice apparatuses have been proposed. For example, in the target practice apparatus disclosed in Japanese Patent Application Laid-Open (kokai) No. 53-121657, a sound wave due to firing of a live bullet is detected by a plurality of acoustic sensors to obtain an impact position on the target based on differences among points of time when the sound wave is detected by the sensors. In the target practice apparatus disclosed in Japanese Patent Application Laid-Open (kokai) No. 5-196395, images of a target are captured by a video camera, and images before and after hitting of a bullet are processed to obtain the impact position based on the bullet mark.
Also, Japanese Patent Application Laid-Open (kokai) 56-119499 discloses a target practice apparatus which provides a moving target. This target practice apparatus utilizes a projected target device in which an image projected on a screen is used as a target. In the target device, a plurality of spaced strips of conductive rubber for an X-axis and a plurality of spaced strips of conductive rubber for a Y-axis are superposed onto each other in a grid-like configuration, and a screen on which an image is projected is attached to the surface thereof. When a bullet hits the screen, the conductive rubber for the X-axis contacts the conductive rubber for the Y-axis due to impact of the bullet so that an electrical connection is established between them. The position of the electrical connection represents the coordinates of the impact position. The impact position is then displayed on the screen using a spot light.
Other examples of target practice apparatuses in which bullets are actually shot at a projected image are disclosed in Japanese Patent Application Laid-Open (kokai) No. 2-61499 and Japanese Patent Application Laid-Open (kokai) No. 1-193600. In the apparatus disclosed in the former document, transparent conductive film is used instead of conductive rubber. In the apparatus disclosed in the latter document, a transparent target is used to allow an image to be projected from the rear side of the target, and transparent pressure sensitive resistors and photo diodes are arranged along a designated pattern. In this apparatus, an impact position is displayed by selectively turning on the photo diodes. However, these apparatuses use dummy bullets and a projected target which responds to the impact of the dummy bullets.
In the advanced target practice for improving the ability to make a proper circumstantial judgment, it is preferred to use a target practice apparatus in which a projected target is used.
However, the apparatus disclosed in Japanese Patent Application Laid-Open (kokai) 56-119499 has the following disadvantages. First, a position detection mechanism which is formed by conductive rubber, conductive film, or pressure sensitive resistors may be destroyed when live bullets are used. Therefore, instead of live bullets, dummy bullets are used in the apparatuses of Japanese Patent Application Laid-Open (kokai) Nos. 1-193600 and 2-61499.
Second, since the projection screen is formed by superposing conductive rubber sheets or conductive films in a grid-like configuration, an enormous number of rubber sheets or films is needed. For example, in the case where an impact position must be detected on a screen measuring 2,400 mm×1,800 mm with an accuracy of 5 mm, 172,800 (480×360) conductive rubber sheets or conductive films must be bonded to the screen. This bonding work is troublesome, and the target must be replaced with a new one due to damage caused by live bullets. Therefore, this target incurs high operational costs.
Third, in the projection screen, an impact position on a screen which is detected by an impact position detector is displayed only by a spot light or a photo diode which is not related to various portions of a projected image. Therefore, it is impossible to vary the scoring depending on which parts of a human image are struck, such as the hands, legs, or head of the image.
SUMMARY OF THE INVENTION
An object of the present invention is to provide an improved target practice apparatus which uses a projected target to prevent a position detecting mechanism from being destroyed even if shooting is performed using live bullets.
Another object of the present invention is to provide an improved target practice apparatus in which a score for a hit can be varied depending on the location of an impact on the image.
Briefly, a target practice apparatus according to the present invention includes a screen on which an image for training (hereinafter referred to as a "training image") is projected, three or more acoustic sensors for detecting a sound generated when a bullet passes through the screen, an impact coordinate measuring section for computing the coordinates of an impact position on the screen based on differences among points of time when the acoustic sensors detect the sound, a projector for projecting the training image on the screen, a playback unit for supplying the projector with an image signal and including a recording medium in which an image representing a hit (hereinafter referred to as a "hit image") and an image representing a miss (hereinafter referred to as a "miss image") are recorded as well as the training image, a memory unit in which a hit range and a miss range are previously stored for each frame of the training image, and a data processing and controlling section. The data processing and controlling section controls the playback unit in a frame-by-frame fashion, compares the coordinates of the impact position fed from the impact coordinate measuring section with data of the hit range and the miss range stored in the memory unit to judge whether a target is hit or missed, selects the hit image or the miss image, and reproduces the selected image. The target practice apparatus according to the present invention further includes a superimpose compositing unit for displaying a mark representing an impact position on the screen by superimposing the mark on the training image, and an output device for outputting a list of scores representing the results of the training which are determined based on the results of the judgment.
In the target practice apparatus according to the present invention, it is preferred to integrate the screen with the acoustic sensors to form a target member. That is, the target member is provided with a frame for holding the edge of the screen, and a group of at least three acoustic sensors is provided in one side of the frame.
In the target practice apparatus according to the present invention, it is preferred that the group of acoustic sensors be provided in each of two adjacent sides of the frame which intersect each other perpendicularly.
Also, it is preferred that the screen of the target practice apparatus be made of rubber and that the surface of the screen be white or silver-white.
The data processing and controlling section preferably includes time judging means for judging whether it is currently a time in which shooting is allowed or is a time in which shooting is prohibited, and hit judging means for judging whether or not a bullet hits an area within the hit range during the time that shooting is allowed.
Moreover, the target practice apparatus according to the present invention preferably includes a computer which generates a computer graphic image. In this case, the computer inputs a hit signal or a miss signal which is output from the data processing and controlling section so as to superimpose an image corresponding to the input signal on the computer graphic image. An image signal representing the superimposed computer graphic image is then output to the playback unit, which reproduces a graphic image from the image signal output from the computer. The projector, which may be a liquid crystal display projector, projects the image reproduced by the playback unit onto the screen.
In the shooting apparatus according to the present invention, shooting at an image of a moving target can be performed with live bullets. Accordingly, a trainee can improve his ability to make a proper circumstantial judgment and effectively gain experience in shooting with live bullets.
BRIEF DESCRIPTION OF THE ACCOMPANYING DRAWINGS
Various other objects, features and many of the attendant advantages of the present invention will be readily appreciated as the same becomes better understood with reference to the following detailed description of the preferred embodiments when considered in connection with the accompanying drawings, in which:
FIG. 1 is a block diagram showing a target practice apparatus according to a first embodiment of the present invention;
FIG. 2 is a perspective view showing a state in which the target practice apparatus shown in FIG. 1 is used;
FIG. 3 is an explanatory chart showing a method of computing the coordinates of an impact position on the screen;
FIG. 4 is a flowchart showing a method of performing a hit/miss judgment used in the first embodiment;
FIG. 5 is an explanatory chart showing the contents of a training image used in the first embodiment;
FIG. 6 is an explanatory chart showing the start frame number and the like of each image of the training image;
FIG. 7 is a block diagram showing a target practice apparatus according to a second embodiment of the present invention;
FIG. 8 is a block diagram showing a target practice apparatus according to a third embodiment of the present invention;
FIG. 9 is a block diagram showing a target practice apparatus according to a fourth embodiment of the present invention;
FIG. 10 is a view showing the structure of a target member of a target practice apparatus according to a fifth embodiment of the present invention;
FIG. 11 is an explanatory chart showing data stored in a memory device of a target practice apparatus according to a sixth embodiment of the present invention; and
FIG. 12 is a block diagram showing a target practice apparatus according to a seventh embodiment of the present invention.
DESCRIPTION OF PREFERRED EMBODIMENTS
Embodiments of the present invention will now be described with reference to the accompanying drawings.
FIG. 1 is a block diagram showing a target practice apparatus according to a first embodiment of the present invention. The target practice apparatus includes a screen 4 on which a training image is projected, three or more acoustic sensors 15 for detecting a sound generated when a bullet passes through the screen 4, impact coordinate measuring section 5 for computing the coordinates of an impact position on the screen 4 based on differences among points of time when the respective acoustic sensors 15 detect the sound, a projector 3 for projecting the training image on the screen 4, a playback unit 1 for supplying the projector 3 with an image signal and including a recording medium 16 in which an image representing a hit (hit image) and an image representing a miss (miss image) are recorded, as well as the training image, a memory unit 7 in which a hit range and a miss range are previously stored for each frame of the training image, and a data processing and controlling section 6. The data processing and controlling section 6 controls the playback unit 1 in a frame-by-frame fashion, compares the coordinates of the impact position fed from the impact coordinate measuring section 5 with data of the hit range and the miss range stored in the memory unit 7 to perform a hit/miss judgment, selects the hit image or the miss image based on the results of the judgment, and reproduces the selected image. The target practice apparatus further includes a superimpose compositing unit 2 for displaying a mark representing an impact position on the screen by superimposing the mark on the training image, and a printer 8 serving as an output device for outputting a list of scores representing the results of the training.
The superimpose compositing unit 2 superimposes a hit signal or a miss signal which is output from the playback unit 1 on a training image which is also output from the playback unit 1, and outputs the composited signal to the projector 3. The superimpose compositing unit 2 superimposes the hit or miss signal on the training image in accordance with a control signal output from the data processing and controlling section 6.
As shown in FIG. 1, a target section of the target practice apparatus according to the present embodiment has a structure such that the acoustic sensors 15 are arranged at the periphery of the screen 4. FIG. 2 is a perspective outside view showing a state in which the target practice apparatus according to the present embodiment is used. As shown in FIG. 2, the screen 4 and the acoustic sensors 15 are integrated to form a target member 30. In detail, the target member 30 has a frame 20 which holds the peripheral edge of the screen 4. A group of three or more acoustic sensors 15 is provided in each side of the frame 20. Although the group of acoustic sensors 15 is provided in each side of the frame 20 shown in FIG. 2, the group of acoustic sensors 15 may be provided only in a single side of the frame 20. In the present embodiment, a large size screen measuring 2.4 m×1.5 m is used. However, the impact of a bullet can be accurately detected, because the group of acoustic sensors is provided in each side.
The screen 4 is made of rubber, and the surface of the screen 4 is white or silver-white. In the present embodiment, a rubber having a hardness of 37±3 IRHD and a stretching ratio of 700% is used.
The detailed structure of the screen 4 will be described below. Since live bullets are shot against the screen 4, as shown in FIG. 2, the screen 4 is made of a rubber which is soft and has a large stretching ratio to reduce damage caused by bullets as much as possible. To improve the visibility of a training image projected by the projector 3, the screen 4 is formed such that its surface is delustered white or silver-white. Further, a white or silver-white pigment is contained in the rubber itself. This minimizes a decrease in visibility due to damage of the rubber caused by the bullets, and eliminates the necessity of painting to repair the screen 4. A film 22 made of cloth or a polymer resin may be adhered to the back surface of the target member 30. The screen 4 and the film 22 cooperate to provide the capability of isolating outside sounds, thereby improving the detection accuracy of the acoustic sensors 15. Further, many through holes 21 communicating with the inner side of the frame 20 are formed in the frame 20 to allow air to circulate between the inside and outside of the target member 30. This structure maintains the internal temperature of the screen 4 and the target member 30 constant, thereby further increasing the detection accuracy of the acoustic sensors 15.
FIG. 3 is an explanatory chart showing a method of computing the coordinates of an impact position on the screen 4. In FIG. 3, points S1, S2, S3 respectively show the positions where the acoustic sensors 15 are attached, i.e., the positions of the three acoustic sensors 15 provided at one peripheral side of the screen 4. In this embodiment, the sensors 15 are lined in parallel with the x-axis.
In FIG. 3, let the difference between the time when the sensor 15a detects a sound and the time when the sensor 15b detects the sound be T1, let the difference between the time when the sensor 15c detects the sound and the time when the sensor 15b detects the sound be T2, let the distance between the sensors 15a and 15b and the distance between the sensors 15b and 15c both be L, and let the coordinates of the impact position be P(x, y). The coordinates of the sensors 15a, 15b and 15c are then S1(-L, 0), S2(0, 0) and S3(L, 0), respectively. In this case, the X-axis coordinate xO and the Y-axis coordinate yO of the impact position are represented by the following mathematical expressions (1) and (2):
xO = {(T1 - T2)(C²T1T2 + L²)} / {2L(T1 + T2)}                    (1)
yO = √(L1² - xO²)                                                (2)
wherein C is the sound velocity, and L1 is the distance between P and S2, which is represented by the following expression:
L1 = {2L² - C²(T1² + T2²)} / {2C(T1 + T2)}.
Since the sensors are lined in the X-axis direction, the accuracy in detecting the Y-axis coordinate yO of the impact position is lower than that for the X-axis coordinate xO. This was confirmed by experiments, which also revealed that the errors in detecting the Y-axis coordinate yO are more than double the errors in detecting the X-axis coordinate xO. Accordingly, only the X-axis coordinate xO is used. The Y-axis coordinate yO is obtained by using three or more sensors 15 which are provided on the right or left-hand side of the screen 4 and which are linearly lined in the Y-axis direction, as shown in FIG. 2. The Y-axis coordinate yO can be accurately calculated in the same way as in the case of calculating the x-axis coordinate xO.
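For illustration, the computation of expressions (1) and (2) can be sketched as follows (a minimal example, not part of the patent; the function and variable names are hypothetical, and a sound velocity of 343 m/s is assumed as the default):

    import math

    def impact_coordinates(t1, t2, L, c=343.0):
        """Estimate the impact point P(xO, yO) from detection-time differences.

        t1 -- detection time of sensor 15a minus that of the centre sensor 15b [s]
        t2 -- detection time of sensor 15c minus that of the centre sensor 15b [s]
        L  -- spacing between adjacent sensors [m]
        c  -- sound velocity [m/s]; temperature-dependent (see the fifth embodiment)
        """
        # Expression (1): X-axis coordinate of the impact position
        x0 = (t1 - t2) * (c * c * t1 * t2 + L * L) / (2.0 * L * (t1 + t2))
        # Distance L1 between the impact point P and the centre sensor S2
        l1 = (2.0 * L * L - c * c * (t1 * t1 + t2 * t2)) / (2.0 * c * (t1 + t2))
        # Expression (2): Y-axis coordinate; in the patent this row of sensors is
        # used only for xO, and yO is taken from a second row along the Y-axis
        y0 = math.sqrt(max(l1 * l1 - x0 * x0, 0.0))
        return x0, y0

A second call with the time differences from the vertical sensor group would give the more accurate Y-axis coordinate in the same way.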
After a bullet hits the screen 4, the target practice apparatus goes into its pause state, in which an impact mark is composited with the stationary training image, and the composited training image is displayed.
Next, other operations of the data processing and controlling section 6 will be described. The data processing and controlling section 6 judges whether or not a bullet reached the screen 4 and hit a bad fellow in an image which was projected on the screen 4 at that time. FIG. 4 is a flowchart showing a method of performing a hit/miss judgment.
First, the data processing and controlling section 6 waits until a bullet hits the screen 4 (S1, S2). When the bullet hits the screen 4, the data processing and controlling section 6 inputs the coordinates of the impact position calculated by the impact coordinate measuring section 5 (S3). Thereafter, the data processing and controlling section 6 calculates an address of the memory unit 7 where hit/miss data and a score are stored for the divided area corresponding to the impact coordinates in the training image which was projected when the bullet hit the screen 4 (S4). In this embodiment, each frame of the training image (about 30 frames of image exist in 1 second) is finely divided in the vertical and the horizontal directions to form divided sections. In the memory unit 7, hit/miss data and a score are stored for each of the divided areas. For example, each frame is divided into 480 sections vertically and 512 sections horizontally.
Then, the data processing and controlling section 6 reads out data from the memory unit 7 based on the calculated address (S5), and performs a hit/miss judgment (S6). When it is judged in step S7 that the trainee hit a target, a hit image is obtained from a certain frame and is played back (S8). On the other hand, when it is judged in step S7 that the trainee missed the target, a miss image is obtained from another frame and is played back (S9). By the above-described processing, the hit/miss judgment is completed for one bullet (S10).
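As a rough sketch of steps S4 through S6 of FIG. 4, the address calculation and lookup might look like the following (illustrative only; the memory layout, the names, and the rule that a stored score of zero means a miss are assumptions, not taken from the patent):

    ROWS, COLS = 480, 512   # vertical and horizontal divisions of each frame

    def judge_impact(row, col, frame_no, score_table):
        """Steps S4 to S6: compute the address of the divided area hit by the
        bullet, read the stored hit/miss data, and judge hit or miss.

        row, col    -- divided section containing the impact coordinates
        frame_no    -- frame of the training image projected at the impact
        score_table -- bytes-like object, one score byte per divided area,
                       stored frame after frame in the memory unit
        """
        address = frame_no * ROWS * COLS + row * COLS + col     # step S4
        score = score_table[address]                            # step S5
        is_hit = score > 0                                      # step S6
        return is_hit, score   # S7 then selects the hit image (S8) or miss image (S9)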
Moreover, the data processing and controlling section 6 measures the length of time between the time when a shooting chance starts and the time when a shot is actually fired, and computes a score representing the response time based on the measured length of time. This score is output to the printer 8 together with a hit/miss score.
In principle, data which represent the training image and are stored in the memory medium 16 are related to the data which represent the hit/miss score and are stored in the memory unit 7. Therefore, when a new different training image is prepared, the data of hit/miss score stored in the memory unit 7 are replaced with new data, simultaneously with the replacement of the data in the memory medium 16. With this replacement, target practice using the new training image becomes possible.
Assuming that (1) an image of each frame is divided into 480 dots in the vertical direction and 512 dots in the horizontal direction, that (2) data of a score in each divided section is represented by 8 bits, that (3) data for displaying a hit/miss score for 5 seconds is stored for each training image, and that (4) 15 kinds of training images are used, the amount of data stored in the memory unit 7 becomes approximately 553 megabytes (480 dots × 512 dots × 8 bits × 5 seconds × 30 frames/second × 15 kinds ÷ 8 bits/byte). Accordingly, the memory unit 7 is formed by a memory unit having a large capacity, such as a magnetic disc-type memory unit or a photomagnetic disc-type memory unit.
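The 553-megabyte figure follows directly from the parameters listed above, as this quick check shows (assuming 30 frames per second):

    # 480 x 512 divided sections, 8 bits per score, 5 seconds per training image,
    # 30 frames per second, 15 kinds of training images
    total_bits = 480 * 512 * 8 * 5 * 30 * 15
    print(total_bits / 8 / 1_000_000)   # -> 552.96, i.e. about 553 megabytes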
When the number of divisions per image, the number of bits of the score data, the time length of the training image, or the number of kinds of training images increases, a possibility arises that these data cannot be stored in the memory unit 7 even if the memory unit 7 is formed by a magnetic disc-type memory unit or a photomagnetic disc-type memory unit. To overcome this problem, the operation of the data processing and controlling unit 6 may be modified such that the hit/miss judgment is performed only within the period of a shooting chance, and the hit/miss judgment is not performed when the impact of a bullet is detected outside the period of the shooting chance. In the latter case, an impact mark and a message indicating that the shooting was performed outside the shooting chance are composited with the training image by the superimpose compositing unit 2 for display. With this modification, the amount of data to be stored can be reduced.
The amount of data stored in the memory unit 7 can be further reduced by the following measures. In the first measure, the hit/miss data stored in the memory unit 7 are not used for a single frame only, but are commonly used for two or more frames. For example, assuming that hit/miss data are set for every two frames, the total time length of these two frames becomes approximately 66.7 ms ((1 second/30 frames)×2), because 30 frames of image exist in one second. When the amount of movement of a person in the training image is within the size of a single divided section, the same scoring results can be obtained as in the case where data are set for every frame.
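The arithmetic behind the first measure (assuming 30 frames per second, as stated above):

    frames_per_second = 30
    shared_frames = 2                       # one set of hit/miss data per two frames
    span_ms = shared_frames / frames_per_second * 1000
    print(span_ms)                          # -> 66.66..., i.e. about 66.7 ms
    # Storage for the hit/miss data is halved while the scoring is unchanged,
    # provided the person in the image moves less than one divided section
    # within this span.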
The second measure is to reduce the amount of data by restricting the movement of a person in the training image within a certain region.
Next, the process for switching between the hit image and the miss image, which is shown in FIG. 4, will be described in further detail. FIG. 5 is an explanatory chart showing the contents of a training image, and FIG. 6 is an explanatory chart showing the start frame number and the like of each image of the training image.
The contents of the image in FIG. 5 show contents of training for each shooter. 15 kinds of training images are prepared for one round of training. FIG. 6 shows the start frame number, the end frame number and the like of each training image which is stored in the recording medium 16 and is played back by the playback unit 1. The information of the start frame number, the end frame number and the like is stored in the memory unit 7.
When a first target practice is started, the playback unit 1 plays back an image in "START FRAME NO. (m011) OF IMAGE FOR OUTSIDE OF SHOOTING CHANCE OF 1ST TARGET PRACTICE AND FOR MISS", shown in FIG. 6. This image is projected on the screen 4. The reason why the "image for outside of a shooting chance" and the "image for the case where a bullet missed a target during a shooting chance" are treated as a continuous image is to obtain a continuous smooth image even in the transition period from the outside to the inside of a shooting chance. This continuous image can be obtained by preparing a training image without cutting.
When the frame number of the image currently played back reaches or becomes greater than the frame number of "START FRAME NO. (m012) OF IMAGE FOR SHOOTING CHANCE OF 1ST TARGET PRACTICE", the hit/miss judgment is carried out based on the detection of the impact of a bullet.
If the impact of a bullet is detected during the shooting chance and the bullet hit a target, a playback operation is performed from "START FRAME NO. (m015) OF IMAGE FOR HIT IN 1ST TARGET PRACTICE" to "END FRAME NO. (m016) OF IMAGE FOR HIT IN 1ST TARGET PRACTICE".
When the impact of a bullet is not detected within the shooting chance or when the impact of a bullet is detected but the bullet missed the target, the playback of the training image is continued to "END FRAME NO. (m013) OF IMAGE FOR SHOOTING CHANCE OF 1ST TARGET PRACTICE".
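A minimal sketch of this frame-number bookkeeping, using hypothetical field names for the entries of FIG. 6 (m011 to m016); the actual playback unit would be a random-access device such as a laser disc player:

    from dataclasses import dataclass
    from typing import Optional, Tuple

    @dataclass
    class PracticeSegment:
        outside_and_miss_start: int   # m011: image for outside the chance / miss
        chance_start: int             # m012: start of the shooting chance
        chance_end: int               # m013: end of the shooting chance
        hit_start: int                # m015: start of the hit image
        hit_end: int                  # m016: end of the hit image

    def frames_to_play(seg: PracticeSegment, current_frame: int,
                       impact_detected: bool, bullet_hit: bool
                       ) -> Optional[Tuple[int, int]]:
        """Return a (start, end) frame range to jump to, or None to continue
        playing the current continuous image."""
        in_chance = seg.chance_start <= current_frame <= seg.chance_end
        if in_chance and impact_detected and bullet_hit:
            # Hit during the shooting chance: play the hit image
            return seg.hit_start, seg.hit_end
        # No impact, a miss, or an impact outside the chance: keep playing the
        # continuous outside-chance/miss image up to the end of the chance
        return None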
By playing back the images in the above-described frames, one round of target practice is completed. In FIG. 5, the time period of each shooting chance is set to 5 seconds, while the time period outside each shooting chance is set to 15 seconds. However, when the lengths of these time periods are the same in all the training images, the effect of training decreases as the trainee's skill improves. Accordingly, the lengths of the time periods, such as the length of the shooting chance, are varied for each training image.
Although the playback of each frame of an image is managed using the frame numbers in the above-described embodiment, the playback of each frame of the image may be managed using time codes.
As described above, and as shown in FIG. 4, the image can be switched between the case where a bullet hits the target and the case where the bullet misses the target in the present embodiment. Since this switching must be performed within about 1 second, it is preferred that the playback unit 1 use a laser disc having a random access function in which an image can be played back from an arbitrary frame.
With the above-described structure, the target practice apparatus according to the present embodiment allows a trainee to shoot at a target in a moving image with live bullets. Accordingly, it is possible to improve the trainee's ability to make a proper circumstantial judgment and to allow the trainee to effectively gain experience comparable to actual shooting. Thus, the trainee's shooting skill can be improved in a shorter period.
FIG. 7 is a block diagram showing a target practice apparatus according to a second embodiment of the present invention. The target practice apparatus of the present embodiment differs from the target practice apparatus of the first embodiment shown in FIG. 1 in that the target practice apparatus according to the present embodiment is provided with two playback units 1a and 1b, and an image switching unit 9.
The playback unit 1a plays back the image for the case where a bullet missed the target, whereas the playback unit 1b plays back the image for the case where a bullet hit the target. In the playback unit 1b, the playback head is positioned in advance so that the hit image can be played back immediately upon start-up.
When a bullet misses the target, the image switching unit 9 outputs the image from the playback unit 1a to the superimpose compositing unit 2 as is. Conversely, when a bullet hits the target, the playback unit 1b is started and the image switching unit 9 is switched so that the image from the playback unit 1b is output to the superimpose compositing unit 2.
With this structure, the time lag at the time of switching images can be shortened.
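The two-deck arrangement can be sketched as follows, with hypothetical placeholder classes standing in for the playback units 1a/1b and the image switching unit 9; this is an illustration of the described switching, not the patent's implementation.

```python
# Sketch only: the two-deck arrangement of FIG. 7, with placeholder objects
# standing in for the playback units and the image switching unit.

class PlaybackUnit:
    def __init__(self, name: str, cue_frame: int = 0):
        self.name = name
        self.frame = cue_frame          # head pre-positioned at this frame

    def start(self):
        print(f"{self.name}: playback started at frame {self.frame}")

class ImageSwitchingUnit:
    def __init__(self, miss_deck: PlaybackUnit, hit_deck: PlaybackUnit):
        self.miss_deck = miss_deck
        self.hit_deck = hit_deck
        self.selected = miss_deck       # the miss image passes through by default

    def on_judgment(self, hit: bool) -> str:
        if hit:
            self.hit_deck.start()       # deck 1b is already cued at the hit image
            self.selected = self.hit_deck
        return self.selected.name       # name of the deck routed to the compositor

switcher = ImageSwitchingUnit(PlaybackUnit("1a"), PlaybackUnit("1b", cue_frame=900))
print(switcher.on_judgment(hit=True))   # -> "1b", with no seek delay
```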
FIG. 8 is a block diagram showing a target practice apparatus according to a third embodiment of the present invention. The target practice apparatus of the present embodiment differs from the target practice apparatus of the first embodiment shown in FIG. 1 in that the target practice apparatus according to the present embodiment is provided with a video tape recorder 13 in which a training image is recorded, and an image switching unit 14.
The video tape recorder 13 plays back a predetermined training image prepared with a video camera. When the impact of a bullet is detected, the video tape recorder 13 is stopped and brought into its pause state, and an impact mark is displayed at the position where the bullet struck. In this embodiment, the training image played back by the video tape recorder 13 is not related to the data output from the memory unit which stores the hit/miss data. Accordingly, the function of switching images based on the hit/miss judgment is not provided.
The switching between the training image output from the playback unit 1 and the training image output from the video tape recorder 13 is performed by the image switching unit 14 in accordance with the state of a selector switch provided in the data processing and controlling section 6.
With this structure, the target practice apparatus according to the present embodiment can use various training images captured by a third person, and the training image can be changed easily.
FIG. 9 is a block diagram showing a target practice apparatus according to a fourth embodiment of the present invention. In the above-described first through third embodiments, images which were actually captured by a video camera or the like are used. By contrast, the target practice apparatus according to the present embodiment uses artificial images which are prepared using computer graphics technology. To this end, the target practice apparatus according to the present embodiment is provided with a computer 17.
With this structure, the target practice apparatus according to the present embodiment can use, as a training image, an image that is difficult or impossible to capture in the real world.
FIG. 10 is a chart showing the structure of a target member of a target practice apparatus according to a fifth embodiment of the present invention.
In this embodiment, a temperature sensor 26 and an acoustic sensor are further provided in the target member shown in FIG. 2. The temperature sensor 26 is used to compensate the measurement for changes in sound velocity. The additional acoustic sensor is provided at a corner of the target member and is commonly used for detection in the X-axis direction and detection in the Y-axis direction. In particular, the acoustic sensor at the corner eliminates the dead angle in which the impact position of a bullet cannot be detected. Also, the time at which this acoustic sensor detects the impact of a bullet can be used as a reference time for detection in the X-axis direction and in the Y-axis direction, thereby greatly increasing the accuracy of measuring the impact point.
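As a minimal sketch of the temperature compensation and a one-dimensional arrival-time-difference calculation, assuming two sensors at the ends of one side and the common approximation v ≈ 331.3 + 0.606·T m/s for the speed of sound in air; the sensor spacing and timings below are hypothetical, and a real two-dimensional measurement would use additional sensors as described above.

```python
# Sketch only: temperature-compensated one-axis impact estimate from the
# arrival-time difference between two acoustic sensors at x = 0 and x = L.

def sound_velocity(temp_c: float) -> float:
    """Approximate speed of sound in air (m/s) at the measured temperature."""
    return 331.3 + 0.606 * temp_c

def impact_x(t_left: float, t_right: float, length_m: float, temp_c: float) -> float:
    """Estimate the impact x-coordinate (m) along a side of length length_m.

    With sensors at x = 0 and x = length_m, the path-length difference equals
    v * (t_left - t_right), giving x = (length_m + v * (t_left - t_right)) / 2.
    """
    v = sound_velocity(temp_c)
    return (length_m + v * (t_left - t_right)) / 2.0

# Hypothetical values: 2 m wide target, 1.2 ms arrival-time difference at 20 degC.
print(round(impact_x(t_left=0.0042, t_right=0.0030, length_m=2.0, temp_c=20.0), 3))
```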
FIG. 11 is an explanatory chart showing data stored in the memory unit 7 of a target practice apparatus according to a sixth embodiment of the present invention. A target in a training image is divided into a plurality of areas, such as an area P0 outside the pattern and areas P1 corresponding to the hands and legs, and different scores are given to these divided areas. Each frame of the image stored in the recording medium of the playback unit 1 is subjected to image processing to obtain the profile line of the pattern of a person, and the profile line thus obtained is suitably modified to set the divided areas and their scores. Accordingly, even if the various parts of the pattern move in a complex manner, various hit areas can be set, and the areas can be set in more detail.
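A minimal sketch of such per-frame scored areas is shown below; the axis-aligned rectangles and score values are hypothetical stand-ins for the regions derived from the modified profile line.

```python
# Sketch only: per-frame scored areas, using hypothetical axis-aligned
# rectangles (x0, y0, x1, y1) in normalized screen coordinates.

scored_areas_by_frame = {
    0: [
        ((0.00, 0.00, 1.00, 1.00), 0),   # P0: outside the pattern, no score
        ((0.40, 0.10, 0.60, 0.90), 10),  # torso region
        ((0.30, 0.50, 0.40, 0.90), 5),   # P1: arm/leg region
    ],
    # 1: [...], set per frame as the pattern moves
}

def score_for_impact(frame: int, x: float, y: float) -> int:
    """Return the score of the most specific (last-listed) area containing (x, y)."""
    score = 0
    for (x0, y0, x1, y1), s in scored_areas_by_frame.get(frame, []):
        if x0 <= x <= x1 and y0 <= y <= y1:
            score = s
    return score

print(score_for_impact(0, 0.35, 0.7))   # lands in the arm/leg region -> 5
```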
FIG. 12 is a block diagram showing a target practice apparatus according to a seventh embodiment of the present invention. The target practice apparatus according to the present embodiment differs from the target practice apparatus shown in FIG. 7 in that a recording and playback unit 24, which can record and play back an image, is further provided. The output signal from the superimpose compositing unit 2 is partially branched and input to the recording and playback unit 24. With this construction, the recorded training image can be used in a playback mode in which it is again passed through the superimpose compositing unit to composite an information message, such as a message indicating a shooting chance, with the training image, and the composited image is output to the projector.
The output unit 8A is not limited to a printer, and any device, such as a CRT display, which displays or records information can be used.
Obviously, numerous modifications and variations of the present invention are possible in light of the above teachings. It is therefore to be understood that within the scope of the appended claims, the present invention may be practiced otherwise than as specifically described herein.

Claims (10)

What is claimed is:
1. A target practice apparatus comprising:
a screen on which a training image is projected;
three or more acoustic sensors for detecting a sound generated when a bullet passes through said screen;
an impact coordinate measuring section for computing the coordinates of an impact position on said screen based on differences among points of time when said acoustic sensors detect the sound;
a projector for projecting the training image on said screen;
a playback unit for supplying the projector with an image signal and including a recording medium in which an image representing a hit and an image representing a miss are recorded as well as the training image;
a memory unit in which a hit range and a miss range are previously stored for each frame of said training image;
a data processing and controlling section which controls said playback unit in a frame-by-frame fashion, compares the coordinates of the impact position fed from said impact coordinate measuring section with data of the hit range and the miss range stored in said memory unit so as to perform judgment as to whether a hit or a miss, and selects the image representing a hit or the image representing a miss based on the results of the judgment, and reproduces the selected image;
a superimpose compositing unit for displaying a mark representing an impact position on the screen by superimposing the mark on said training image; and
an output device for outputting a list of scores representing the results of the training which are determined based on the results of said judgment.
2. A target practice apparatus according to claim 1, wherein said screen is integrated with said acoustic sensors to form a target member, said target member is provided with a frame for holding the edge of said screen, and a group of at least three acoustic sensors is provided in one side of the frame.
3. A target practice apparatus according to claim 2, wherein said group of acoustic sensors is provided in each of two adjacent sides of said frame which are perpendicularly intersecting each other.
4. A target practice apparatus according to claim 3, further comprising:
an acoustic sensor which is provided at a corner between perpendicularly intersecting two sides of said frame and is commonly used for detection in an X-axis direction and detection in a Y-axis direction; and
a temperature sensor used for performing compensation based on sound velocity.
5. A target practice apparatus according to claim 1, wherein said screen is made of rubber and the surface of said screen is white or silver-white.
6. A target practice apparatus according to claim 1, wherein said data processing and controlling section comprises:
time judging means for judging whether it is currently a time in which shooting is allowed or is a time in which shooting is prohibited; and
hit judging means for judging whether or not a bullet hits an area within the hit range during the time that shooting is allowed.
7. A target practice apparatus according to claim 1, wherein said playback unit is composed of a first playback unit for playing back a hit image and a second playback unit for playing back a miss image, and an image switching unit is further provided so as to selectively use the hit and miss images.
8. A target practice apparatus according to claim 1, further comprising:
a video tape recorder for recording and playing back an image of results of a target practice; and
an image switching unit for selecting the image from said video tape recorder and the image from said playback unit.
9. A target practice apparatus according to claim 1, wherein a recording and playback unit is further provided in parallel to said playback unit, and the output signal from said superimpose compositing unit is branched to be input to said recording and playback unit, and an image switching unit is provided to selectively use the output signal from said recording and playback unit and the output signal from said playback unit.
10. A target practice apparatus according to claim 1, wherein said target practice apparatus further comprises a computer which generates a computer graphic image and which inputs a hit signal or a miss signal output from said data processing and controlling section so as to superimpose an image corresponding to the input signal on the computer graphic image and to output an image signal representing the superimposed computer graphic image to said playback unit, said playback unit reproducing a graphic image from the image signal output from said computer, and said projector projecting the image, which has been reproduced by said playback unit, on the screen using a liquid crystal display projector.
US08/393,397 1994-02-25 1995-02-23 Target practice apparatus Expired - Fee Related US5551876A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP6-028195 1994-02-25
JP6028195A JP2691247B2 (en) 1994-02-25 1994-02-25 Shooting training equipment

Publications (1)

Publication Number Publication Date
US5551876A true US5551876A (en) 1996-09-03

Family

ID=12241902

Family Applications (1)

Application Number Title Priority Date Filing Date
US08/393,397 Expired - Fee Related US5551876A (en) 1994-02-25 1995-02-23 Target practice apparatus

Country Status (7)

Country Link
US (1) US5551876A (en)
EP (1) EP0669512B1 (en)
JP (1) JP2691247B2 (en)
KR (1) KR0169504B1 (en)
CN (1) CN1063545C (en)
AU (1) AU692209B2 (en)
DE (1) DE69507349T2 (en)

Families Citing this family (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2899632B2 (en) * 1994-03-31 1999-06-02 バブコック日立株式会社 Shooting training equipment
JP3298043B2 (en) * 1995-11-01 2002-07-02 バブコック日立株式会社 Shooting training equipment
JP3564213B2 (en) * 1995-11-02 2004-09-08 バブコック日立株式会社 Group shooting training device
JPH09166398A (en) * 1995-12-15 1997-06-24 Babcock Hitachi Kk Rifle drill equipment
JP3803589B2 (en) * 2002-02-15 2006-08-02 Necパーソナルプロダクツ株式会社 Shooting training system and method
JP4034623B2 (en) * 2002-09-17 2008-01-16 バブコック日立株式会社 Shooting training equipment
JP3997143B2 (en) * 2002-10-30 2007-10-24 バブコック日立株式会社 Video shooting training device
KR100647066B1 (en) * 2004-04-30 2006-11-23 홍선태 Operating Apparatus for BB-Bullet Detection Data for Survival Game
JP4203769B2 (en) * 2006-03-30 2009-01-07 日本電気株式会社 Landing point display device and shooting system
DE202009014784U1 (en) * 2009-11-02 2010-12-16 Spiller, Jürgen shooting system
CN101839677B (en) * 2010-04-08 2013-03-13 西安工业大学 Acousto-optic automatic target reporting system
KR101034024B1 (en) * 2010-12-16 2011-05-11 주식회사 가온지에스 Wind direction / wind speed information display for shooting training
RU2460031C1 (en) * 2011-03-24 2012-08-27 Рафас Максумович Шарипов Target complex (versions)
KR101493207B1 (en) * 2013-05-03 2015-02-13 주식회사 홍인터내셔날 Dart game apparatus, method and computer readable medium thereof
JP6548082B2 (en) * 2013-12-17 2019-07-31 株式会社エイテック Target system
DE202014101791U1 (en) * 2014-04-15 2014-04-29 Reiner Bayer Device for event presentations in duel-shooting
CN105617646A (en) * 2014-11-07 2016-06-01 伊吉士摩斯科技股份有限公司 Shooting training and game system provided with virtual targets
CN105136028A (en) * 2015-06-26 2015-12-09 哈尔滨工程大学 Shooting precision measuring instrument used for multi-launch rocket and provided with quadrotor
JP6799848B2 (en) * 2015-10-30 2020-12-16 株式会社エイテック Target system
CN107702598B (en) * 2017-08-30 2020-01-14 京东方科技集团股份有限公司 Dart target disc, dart scoring device and dart scoring method
CN108120345B (en) * 2017-12-19 2019-12-13 北京君盾装备技术有限公司 target-scoring method for detecting radar and thermal infrared combined accurate positioning
CN109029132B (en) * 2018-08-31 2021-02-23 南京强钧防务科技研究院有限公司 Accurate positioning method of target scoring system

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3802099A (en) * 1972-09-25 1974-04-09 Carter Ind Inc Method and apparatus for training policemen
US3849910A (en) * 1973-02-12 1974-11-26 Singer Co Training apparatus for firearms use
AU530979B2 (en) * 1978-12-07 1983-08-04 Aus. Training Aids Pty. Ltd., Detecting position of bullet fired at target
FR2556827B1 (en) * 1983-12-15 1988-04-22 Giravions Dorand INDOOR SHOOTING TRAINING DEVICE
SE451504B (en) * 1985-04-03 1987-10-12 Saab Training Systems Ab SLIDING BOARD
US5035622A (en) * 1989-11-29 1991-07-30 The United States Of America As Represented By The Secretary Of The Navy Machine gun and minor caliber weapons trainer
US5213503A (en) * 1991-11-05 1993-05-25 The United States Of America As Represented By The Secretary Of The Navy Team trainer
JP2969492B2 (en) * 1991-11-15 1999-11-02 バブコック日立株式会社 Shooting training equipment

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4264309A (en) * 1978-09-08 1981-04-28 Brooksby Brian Thomas Projected image target apparatus
JPS56119499A (en) * 1980-02-23 1981-09-19 Kyosan Electric Mfg Slide target unit
US4552533A (en) * 1981-11-14 1985-11-12 Invertron Simulated Systems Limited Guided missile fire control simulators
JPS6217193A (en) * 1985-07-13 1987-01-26 Shirakawa Seisakusho:Kk Gas permeable membrane
JPH01193600A (en) * 1988-01-29 1989-08-03 Toshiba Tesuko Kk Shooting training device
JPH0261499A (en) * 1988-08-29 1990-03-01 Toshiba Tesuko Kk Impact position sensor

Cited By (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030003424A1 (en) * 1997-08-25 2003-01-02 Motti Shechter Network-linked laser target firearm training system
US20040014010A1 (en) * 1997-08-25 2004-01-22 Swensen Frederick B. Archery laser training system and method of simulating weapon operation
US6283861B1 (en) * 1999-03-23 2001-09-04 Square Co., Ltd. Video game device and video game method
US20030175661A1 (en) * 2000-01-13 2003-09-18 Motti Shechter Firearm laser training system and method employing modified blank cartridges for simulating operation of a firearm
US6935864B2 (en) 2000-01-13 2005-08-30 Beamhit, Llc Firearm laser training system and method employing modified blank cartridges for simulating operation of a firearm
US6966775B1 (en) * 2000-06-09 2005-11-22 Beamhit, Llc Firearm laser training system and method facilitating firearm training with various targets and visual feedback of simulated projectile impact locations
US20020163576A1 (en) * 2001-03-22 2002-11-07 Nikon Corporation Position detector and attitude detector
US6993206B2 (en) * 2001-03-22 2006-01-31 Nikon Corporation Position detector and attitude detector
US7329127B2 (en) 2001-06-08 2008-02-12 L-3 Communications Corporation Firearm laser training system and method facilitating firearm training for extended range targets with feedback of firearm control
US20030134700A1 (en) * 2001-12-19 2003-07-17 Salva Francesc Casas Ball-trapping device with electronic detection of impact on a target and detection method used therewith
US6875019B2 (en) 2002-02-11 2005-04-05 United Defense, Lp Naval virtual target range system
US20030152892A1 (en) * 2002-02-11 2003-08-14 United Defense, L.P. Naval virtual target range system
US6699041B1 (en) * 2002-11-07 2004-03-02 The United States Of America As Represented By The United States Department Of Energy Self-assessing target with automatic feedback
US9891030B1 (en) 2004-03-29 2018-02-13 Victor B. Kley Molded plastic cartridge with extended flash tube, sub-sonic cartridges, and user identification for firearms and site sensing fire control
US9470485B1 (en) 2004-03-29 2016-10-18 Victor B. Kley Molded plastic cartridge with extended flash tube, sub-sonic cartridges, and user identification for firearms and site sensing fire control
US8621774B1 (en) 2004-03-29 2014-01-07 Metadigm Llc Firearm with multiple targeting laser diodes
US7687751B2 (en) * 2004-07-15 2010-03-30 Cubic Corporation Enhancement of aimpoint in simulated training systems
US20080212833A1 (en) * 2004-07-15 2008-09-04 Cubic Corporation Enhancement of aimpoint in simulated training systems
US20070160960A1 (en) * 2005-10-21 2007-07-12 Laser Shot, Inc. System and method for calculating a projectile impact coordinates
US8360776B2 (en) 2005-10-21 2013-01-29 Laser Shot, Inc. System and method for calculating a projectile impact coordinates
US20080213732A1 (en) * 2005-10-21 2008-09-04 Paige Manard System and Method for Calculating a Projectile Impact Coordinates
US7926408B1 (en) * 2005-11-28 2011-04-19 Metadigm Llc Velocity, internal ballistics and external ballistics detection and control for projectile devices and a reduction in device related pollution
US9453711B2 (en) 2012-09-21 2016-09-27 Randy Wayne Martin Weapons firing range system and apparatus employing reflected imagery
US9921017B1 (en) 2013-03-15 2018-03-20 Victor B. Kley User identification for weapons and site sensing fire control
US9759530B2 (en) 2014-03-06 2017-09-12 Brian D. Miller Target impact sensor transmitter receiver system
US9786191B2 (en) * 2014-09-04 2017-10-10 The United States Of America, As Represented By The Secretary Of The Army Emission of a commencement sound and a conclusion sound
US20160275808A1 (en) * 2014-09-04 2016-09-22 The Government Of The United States, As Represented By The Secretary Of The Army Emission of a Commencement Sound and a Conclusion Sound
US10458758B2 (en) 2015-01-20 2019-10-29 Brian D. Miller Electronic audible feedback bullet targeting system
US9429397B1 (en) 2015-02-27 2016-08-30 Kevin W. Hill System, device, and method for detection of projectile target impact
US10712133B2 (en) * 2017-08-01 2020-07-14 nTwined LLC Impact indication system
US10288381B1 (en) 2018-06-22 2019-05-14 910 Factor, Inc. Apparatus, system, and method for firearms training
US11397071B1 (en) 2021-09-14 2022-07-26 Vladimir V. Maslinkovskiy System and method for anti-blinding target game

Also Published As

Publication number Publication date
JP2691247B2 (en) 1997-12-17
EP0669512B1 (en) 1999-01-20
CN1063545C (en) 2001-03-21
AU1348795A (en) 1995-09-07
DE69507349T2 (en) 1999-09-02
KR950025410A (en) 1995-09-15
AU692209B2 (en) 1998-06-04
JPH07234096A (en) 1995-09-05
CN1122445A (en) 1996-05-15
DE69507349D1 (en) 1999-03-04
KR0169504B1 (en) 1999-02-01
EP0669512A1 (en) 1995-08-30

Similar Documents

Publication Publication Date Title
US5551876A (en) Target practice apparatus
US5366229A (en) Shooting game machine
US6179720B1 (en) Correlation method and apparatus for target-oriented sports activities
US4713686A (en) High speed instantaneous multi-image recorder
US6626756B2 (en) Amusement game system and a computer-readable storage medium
US4828500A (en) Apparatus and method for motion teaching
US5190286A (en) Image synthesizing system and shooting game machine using the same
US5419562A (en) Method and apparatus for analyzing movements of an individual
EP0536810A1 (en) Method for rating billiards shots and displaying optimal paths
JP2002233665A5 (en)
JP3242276B2 (en) Video target shooting training device
US4639222A (en) Gunnery training apparatus
JP3619223B2 (en) Game device
JPH04322672A (en) Indoor golf trainer
JPH08257191A (en) Golf swing diagnostic device and method
JP3796662B2 (en) Video shooting training device
JP3997143B2 (en) Video shooting training device
JP4034623B2 (en) Shooting training equipment
JPH11142097A (en) Shooting training device
JP2004108632A (en) Shooting training device and hit determining method in the shooting training device
US7394917B2 (en) Apparatus for measuring a trajectory
JPH07213672A (en) Image processor for golf swing analysis
JP3901203B2 (en) Shooting game equipment
JPH08289951A (en) Swing analyzer for golf club
JP2000102671A (en) Position specifying device and recording medium storing position specifying program

Legal Events

Date Code Title Description
AS Assignment

Owner name: HITACHI, LTD., JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KORESAWA, SUMIO;IMAIDE, HIDETOSHI;REEL/FRAME:007627/0123

Effective date: 19950328

FPAY Fee payment

Year of fee payment: 4

FPAY Fee payment

Year of fee payment: 8

REMI Maintenance fee reminder mailed
LAPS Lapse for failure to pay maintenance fees
STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20080903