US20060223633A1 - Gaming program, gaming machine, and record medium - Google Patents
- Publication number
- US20060223633A1 (application US11/392,618)
- Authority
- US
- United States
- Prior art keywords
- character
- action
- character action
- control
- processing
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- A63F13/10
- A63F13/67—Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor, adaptively or by learning from player actions, e.g. skill level adjustment or by storing successful combat sequences for re-use
- A63F13/42—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment, by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
- A63F13/45—Controlling the progress of the video game
- A63F2300/303—Output arrangements for receiving control signals generated by the game device, for displaying additional data, e.g. simulating a Head Up Display
- A63F2300/638—Methods for processing data by generating or executing the game program for controlling the execution of the game in time according to the timing of operation or a time limit
- A63F2300/6692—Methods for processing data by generating or executing the game program for rendering three dimensional images using special effects, generally involving post-processing, e.g. blooming
Abstract
A gaming program, a gaming machine, and a record medium are provided that suppress the loss of interest in a game and enhance the game presence. The gaming machine selects a character action mode based on an operation signal and a plurality of pieces of character data, and performs character action control based on the selected character action mode. The gaming machine performs display control of both the selection of the character action mode and the character action control. In particular, the gaming machine performs specific image control while the character action mode is being selected during a battle, and limits specific image control while the character action control is performed.
Description
- This application claims the priority of Japanese Patent Application No. 2005-106337 filed on Apr. 1, 2005, which is incorporated herein by reference.
- 1. Field of the Invention
- This invention relates to a gaming program, a gaming machine, and a record medium, and in particular to a gaming program, a gaming machine, and a record medium for determining the action order of a plurality of characters, selecting the action mode of each character, and controlling the action of each character based on the action mode.
- 2. Description of the Related Arts
- Hitherto, various gaming programs have been provided in which a player plays the role of a character in a virtual world displayed on the screen of a computer or a display: commands are entered in response to player operation, the action mode of a character in the game is selected according to a predetermined action order, and a preset story is advanced. Such a game is generally called an RPG (Role Playing Game).
- Gaming programs are generally known that contain a battle scene in which a character operated in response to player operation (hereinafter referred to as a "player character") and an enemy character controlled by a computer fight a battle; the player gains an experience value, virtual money, etc., by beating the enemy character in the battle, and raises the character level to advance the story.
- In such a gaming program, when a player character takes an action such as an attack in a battle scene, a command is entered in response to player operation for each player character according to a predetermined action order, and the action mode of each character is selected based on the character data and the operation corresponding to the character, as shown, for example, in Japanese Unexamined Patent Publication No. 2004-237071. Display control under which such action mode selection is made is called character action selection display control. After the action mode of a character is selected, action control of the character is performed: display control makes the character fight based on the selected action mode. Display control under which the actual character action is taken is called character action display control. The character action selection display control and the character action display control are thus repeatedly switched according to a predetermined order, whereby a battle scene is executed. When the character action display control is executed, display control emphasizes the acting character so that the player can clearly recognize which character is acting.
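The alternation between the two display controls can be sketched as follows. This is a minimal illustration of the loop described above; the function and character names are hypothetical and not taken from the patent.

```python
# Minimal sketch of a battle scene alternating between character action
# selection display control and character action display control, processed
# according to a predetermined action order. Names are illustrative.

def run_battle_round(characters, select_action, perform_action):
    """Process one battle round in action order."""
    for character in characters:  # assumed already sorted into action order
        # Character action selection display control: choose the action mode.
        action_mode = select_action(character)
        # Character action display control: act out the selected mode,
        # emphasizing the acting character so the player can recognize it.
        perform_action(character, action_mode)

# Example with stub controls:
log = []
run_battle_round(
    ["hero", "enemy"],
    select_action=lambda c: f"{c}:attack",
    perform_action=lambda c, m: log.append(m),
)
print(log)  # ['hero:attack', 'enemy:attack']
```

The two phases repeat for each character in turn, which is the switching behavior the passage describes.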
- For example, Japanese Unexamined Patent Publication No. 2001-351123 discloses a gaming program that suppresses specific image control, such as fog processing, for the display objects of characters shown on the screen, in order to produce image display rich in presence and improve the interest in the game. It discloses a gaming program in which a display area for displaying display objects and a non-display area in which display objects are not displayed are set based on the capacity of the display objects; if a display object is determined to be contained in the non-display area, it is not displayed and specific image control for it is suppressed. This makes it possible to display a large number of characters on one screen in the image processing of the game.
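The related-art area-based suppression can be sketched roughly as follows; the coordinate field and the area bounds are invented for illustration, not taken from the cited publication.

```python
# Sketch of the related-art approach described above: display objects that
# fall in the non-display area are skipped entirely (and the specific image
# control, e.g. fog processing, is suppressed for them), so that many
# characters fit on one screen. Fields and bounds are illustrative.

def visible_objects(objects, display_min, display_max):
    """Return only the objects whose position lies inside the display area."""
    return [o for o in objects if display_min <= o["x"] <= display_max]

scene = [{"name": "A", "x": 5}, {"name": "B", "x": 50}, {"name": "C", "x": 120}]
shown = visible_objects(scene, display_min=0, display_max=100)
print([o["name"] for o in shown])  # objects outside the area go undisplayed
```

As the following paragraph notes, the drawback is that a character can end up undisplayed purely because of its position.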
- However, if specific image control is simply performed throughout the battle scene, the control load rises, the displayed image may be degraded by frame delay, frame skip, etc., and the interest in the game may be decreased. On the other hand, if the number of display objects is limited to prevent such image degradation, a character that should be displayed may be skipped and left undisplayed depending on its position, which may give the player a feeling of being out of place; it is hard to say that such a game is high in presence.
- It is therefore an object of the invention to provide a gaming program, a gaming machine, and a record medium that suppress the loss of interest in a game and enhance the game presence.
- To this end, according to the invention, there are provided the following:
- (1) A gaming program product for use in a computer including an input device that can be operated by a player, the product comprising: a character data storage module for storing a plurality of pieces of character data concerning a plurality of characters; a character action order determination module for determining the action order of the plurality of characters; a character action mode selection module for selecting a character action mode based on an operation signal from the input device and the plurality of pieces of character data; a character action controller for performing character action control based on the character action mode selected by the character action mode selection module; a special character action controller for executing selection of the character action mode by the character action mode selection module and the character action control by the character action controller in accordance with the action order of the plurality of characters determined by the character action order determination module; a character action display controller for performing display control of selection of the character action mode and the character action control executed by the special character action controller; a specific image control execution module for executing specific image control for selection of the character action mode and the character action control subjected to display control by the character action display controller; and a specific image control limitation module for limiting the specific image control of the specific image control execution module while the special character action controller performs the character action control.
- (2) The gaming program product described in (1), further comprising a specific image control invalidation module for invalidating the specific image control, as the limitation of the specific image control, while the character action controller performs the character action control.
- (3) A gaming machine including an input device that can be operated by a player; a character data storage module for storing a plurality of pieces of character data concerning a plurality of characters; a character action order determination module for determining the action order of the plurality of characters; a character action mode selection module for selecting a character action mode based on an operation signal from the input device and the plurality of pieces of character data;
- a character action controller for performing character action control based on the character action mode selected by the character action mode selection module; a special character action controller for executing selection of the character action mode by the character action mode selection module and the character action control by the character action controller in accordance with the action order of the plurality of characters determined by the character action order determination module; a character action display controller for performing display control of selection of the character action mode executed by the special character action controller and the character action control; a specific image control execution module for executing specific image control for selection of the character action mode and the character action control subjected to display control by the character action display controller; and a specific image control limitation module for limiting the specific image control of the specific image control execution module while the special character action controller performs the character action control.
- According to the invention described in (1) or (3), selection of the action mode of a character, based on the operation signal from the input device operated by the player and on a plurality of pieces of character data, and action control of the character, based on the selected action mode, are executed in accordance with the action order of the characters, and display control of both the action mode selection and the action control is performed. Of these two phases, execution of specific image control is limited while the character action control is performed. Therefore, while the character action control, whose control load is relatively high, is performed, the high-load specific image control is limited, so that degradation of the image can be prevented and the loss of interest in the game can be suppressed. While the character action mode selection, whose control load is relatively small, is made, the specific image control is executed; the characters to be displayed are displayed without being skipped because the number of displayed characters is not limited, and the specific image processing enhances the game presence.
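The claimed behavior can be sketched as follows, assuming hypothetical phase names and a stand-in "fog" effect for the specific image control; this is an illustration of the load-balancing idea, not the patent's implementation.

```python
# Hedged sketch: the specific image control (a high-load effect such as fog
# or blur) is executed while the action mode is being selected, but limited
# (here: invalidated) while the character action control runs, keeping the
# control load down and avoiding frame delay or frame skip.

SELECTING, ACTING = "selecting", "acting"

def render_frame(phase, apply_specific_image_control):
    effects = ["base render"]
    if phase == SELECTING:
        # Selection display control has a relatively small control load,
        # so the specific image control can be executed on top of it.
        effects.append(apply_specific_image_control())
    # During ACTING the specific image control is limited (not executed);
    # only the other, lower-load image control runs.
    return effects

print(render_frame(SELECTING, lambda: "fog"))  # ['base render', 'fog']
print(render_frame(ACTING, lambda: "fog"))     # ['base render']
```

Invalidating the effect outright, as in (2), corresponds to the `ACTING` branch simply never calling the effect.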
- According to the invention described in (2), as the limitation of specific image control, the specific image control is invalidated while the character action control is performed. Therefore, only image control other than the high-load specific image control is executed, so that degradation of the image can be prevented still further and the loss of interest in the game can be suppressed.
- According to the invention, the loss of interest in a game can be suppressed and the game presence can be enhanced.
- Additional objects and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objects and advantages of the invention may be realized and obtained by means of the instrumentalities and combinations particularly pointed out hereinafter.
- The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate embodiments of the invention and, together with the general description given above and the detailed description of the embodiments given below, serve to explain the principles of the invention.
- In the drawings:
- FIG. 1 is a drawing to show the general configuration of a gaming machine incorporating the invention;
- FIG. 2 is a block diagram to show the system configuration of the gaming machine in FIG. 1;
- FIG. 3 is a drawing to show the character individual skills of player characters A and B;
- FIG. 4 shows display examples of a title screen and a world map;
- FIG. 5 is a schematic representation to show a battle scene;
- FIG. 6 is a schematic representation to show a battle scene;
- FIG. 7 is a schematic representation to show a battle scene;
- FIG. 8 is a drawing to show the display mode of a judgment ring displayed at the command determination time;
- FIG. 9 is a drawing to show the display mode of the judgment ring after the command determination;
- FIG. 10 is a drawing to show other examples of 120% areas;
- FIG. 11 is a flowchart to show a procedure of main game processing;
- FIG. 12 is a flowchart to show a procedure of battle processing;
- FIG. 13 is a flowchart to show a procedure of command processing;
- FIG. 14 is a flowchart to show a procedure of judgment processing;
- FIG. 15 is a flowchart to show a procedure of display control processing;
- FIG. 16 is a drawing to show the configuration of a network gaming system;
- FIG. 17 is a schematic representation to show image processing executed based on the distance from an eyepoint position; and
- FIG. 18 is a schematic representation to show a battle scene.
- Referring now to the accompanying drawings, there are shown preferred embodiments of the invention.
- (Configuration of Gaming Machine)
- FIG. 1 shows the general configuration of a gaming machine incorporating the invention. The gaming machine 200 is made up of a machine main unit 1, an input device 4 (that can be operated by a player) for outputting a control command to the machine main unit 1 in response to player operation, and a display 15 for displaying an image based on an image signal from the machine main unit 1. In the gaming machine 200, a game is executed as various images, such as a plurality of characters including a player character and an enemy character, are displayed on a display surface (screen) 16 of the display 15 such as a CRT.
- A game in the gaming machine 200 is executed as a gaming program recorded on an external record medium separate from the machine main unit 1 is read. In addition to a CD-ROM or a DVD-ROM, an FD (flexible disk) or any other record medium can be used as the external record medium recording the gaming program. In the embodiment, a DVD-ROM is used as the external record medium. A cover 2 that can be opened and closed is provided in the top center of the machine main unit 1. As the cover 2 is opened, a DVD-ROM 31 (see FIG. 2) can be placed in a DVD-ROM drive 29 (see FIG. 2) as a record medium drive provided inside the machine main unit 1.
- The input device 4 includes various input parts for outputting a control command to a CPU 21 (see FIG. 2) in the machine main unit 1 in response to operation of the player. The input device 4 is provided in the left portion with an up button 7, a down button 8, a left button 9, and a right button 10 as the input parts, mainly operated by the player to move a character appearing in a game or to move among the options of a menu. The input device 4 is provided in the right portion with a Δ button 11, a ◯ button 12, a X button 13, and a □ button 14, mainly operated by the player to determine or cancel various items. The input device 4 is provided in the center with a selection button 6 at the top and a start button 5 at the bottom.
- The display 15 has input terminals for a video signal and an audio signal, which are connected to a video output terminal and an audio output terminal of the machine main unit 1 by terminal cables. The display 15 is an existing television having in one piece the screen 16, which can display image data output from an image output section 25 described later (see FIG. 2), and speakers, which can output audio from an audio output section 27 described later (see FIG. 2). The machine main unit 1 and the input device 4 are connected by a signal cable 20 as shown in FIG. 1.
- The machine main unit 1 is provided on one side with a memory slot 3 as an insertion slot of a memory card 32 (see FIG. 2). The memory card 32 is a storage medium for temporarily recording game data, for example when the player interrupts the game. The data recorded on the memory card 32 is read through a communication interface 30 described later (see FIG. 2) having a card reader function.
- (Electric Configuration of Gaming Machine)
FIG. 2 shows the system configuration of the gaming machine 200. The machine main unit 1 includes the CPU 21 as a controller, ROM 22 and RAM 23 as storage modules, an image processing section 24, the image output section 25, an audio processing section 26, the audio output section 27, a decoder 28, the DVD-ROM drive 29, and the communication interface 30.
- The DVD-ROM 31 can be attached to and detached from the DVD-ROM drive 29, and the gaming program in the DVD-ROM 31 placed in the DVD-ROM drive 29 is read by the CPU 21 in accordance with a basic operation program of an OS (operating system), etc., stored in the ROM 22. The read gaming program is converted into predetermined signals by the decoder 28 for storage in the RAM 23.
- The gaming program stored in the RAM 23 is executed by the CPU 21 in accordance with the basic operation program or an input signal from the input device 4. Image data and audio data are read from the DVD-ROM 31 in response to the executed gaming program. The image data is sent to the image processing section 24 and the audio data is sent to the audio processing section 26.
- The image processing section 24 converts the received image data into an image signal and supplies the image signal to the display 15 through the image output section 25, thereby displaying an image on the screen 16. In particular, the image processing section 24 has functions of calculating, at every predetermined timing, the positional relationship between a display object placed in a virtual three-dimensional coordinate space (for example, a display object such as a character) and the eyepoint position, generating image data viewing the display object from the eyepoint position, and displaying an image based on the generated image data on the screen 16.
- The audio processing section 26 converts the received audio data into an audio signal and supplies the audio signal to the speakers through the audio output section 27.
- The communication interface 30 enables the input device 4 and the memory card 32 to be connected detachably to the machine main unit 1. Through the communication interface 30, data is read from and written into the memory card 32, and a signal from the input device 4 is sent to the components such as the CPU 21.
- (Character Individual Skills)
- The RAM 23 stores the gaming program from the DVD-ROM 31 and parameters concerning characters based on the memory card 32. As a specific example of the parameters concerning characters, the character individual skills will be discussed with FIG. 3. FIG. 3 is a schematic representation to show the character individual skills of player characters A and B. The character individual skills of the player characters A and B will be discussed below; the character individual skills of other player characters and enemy characters are also stored in the RAM 23.
- The character individual skills shown in FIG. 3 are stored for each of the characters appearing in a game. The types of character individual skills include hit points (HP), magic points (MP), sanity points (SP), physical offensive power (STR), physical defensive power (VIT), agility (AGL), magic offensive power (INT), magic defensive power (POW), and luck (LUC). Each of them is represented by a numeric value, and a different value is set depending on the type of character even when the character level is the same.
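The skill table above might be laid out as follows; the abbreviations follow the patent's list, while the concrete values are invented for the example.

```python
# Illustrative data layout for the character individual skills listed above
# (HP, MP, SP, STR, VIT, AGL, INT, POW, LUC); the field names follow the
# patent's abbreviations, but the numbers here are invented examples.
from dataclasses import dataclass

@dataclass
class CharacterSkills:
    LV: int   # character level
    HP: int   # hit points
    MP: int   # magic points
    SP: int   # sanity points
    STR: int  # physical offensive power
    VIT: int  # physical defensive power
    AGL: int  # agility
    INT: int  # magic offensive power
    POW: int  # magic defensive power
    LUC: int  # luck

player_a = CharacterSkills(LV=1, HP=32, MP=10, SP=8,
                           STR=7, VIT=5, AGL=6, INT=4, POW=3, LUC=5)
player_b = CharacterSkills(LV=1, HP=28, MP=14, SP=9,
                           STR=5, VIT=4, AGL=7, INT=6, POW=5, LUC=4)
# Different values are set per character even at the same level:
print(player_a.LV == player_b.LV, player_a.HP != player_b.HP)
```

One record of this shape per character, for both player and enemy characters, would be held in the RAM 23 as the passage describes.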
- The character individual skills are loaded into the
RAM 23 as described above. The character individual skills change with the arm, the protector, the item, etc., with which the character is equipped. The character individual skills change with the magic worked on the character and the used item. - Thus, the
CPU 21 reads the parameters concerning the characters such as the character individual skills, etc., stored in theRAM 23 from theRAM 23. TheCPU 21 for loading such a character individual skill table into theRAM 23 and theRAM 23 correspond to an example of a character data storage module for storing a plurality of pieces of character data concerning a plurality of characters. The DVD-ROM 31 storing such a character individual skill table also corresponds to the character data storage module. - (Display Screen)
- Next, specific examples of display screens displayed on the
screen 16 accompanying the game content executed by theCPU 21 based on the gaming program recorded in the DVD-ROM 31 will be discussed with FIGS. 4 to 7. - When the DVD-
ROM 31 is placed in the DVD-ROM drive 29 and power of the machinemain unit 1 is turned on, “opening demonstration” is displayed on thescreen 16. The “opening demonstration” is effect display for telling the player about the start of a game. After the “opening demonstration” is displayed for a predetermined time, a “title screen” drawing a game title large is displayed as shown inFIG. 4A . - Here, specifically the character string of the game title, SHADOW HEARTS, is displayed and two options (NEW GAME and CONTINUE) are displayed below the game title. A
cursor 41 is displayed at the left position of the option of either NEW GAME or CONTINUE and as the player operates the upbutton 7 or thedown button 8, the position of thecursor 41 is changed. When the player operates the ◯button 12, the option pointed to by thecursor 41 is selected. - If the player selects NEW GAME on the “title screen,” a prolog and the game content are displayed and then a “world map” is displayed as shown in
FIG. 4B . On the other hand, if the player selects CONTINUE on the “title screen,” the “world map” is displayed based on the saved data when the previous game was over without displaying the prolog or the game content. - Specifically, the main cities of “A country” as the stage of the game story are displayed on the “world map” and options indicated by five city names (
CITY A 42 a,CITY B 42 b,CITY C 42 c,CITY D 42 d, andCITY E 42 e) are displayed. They are options to make a transition to a provided “submap.” As the player operates the upbutton 7 or thedown button 8, thecursor 41 indicating each option moves and as the player operates the ◯button 12, one option is selected. When one “submap” is thus selected, the “world map” makes a transition to the screen corresponding to the “submap” and the player can play various games set in response to the “submap.” Specifically, the visual scene in each city is prerender-displayed as a background image conforming to scene development and while the player characters move therein, various events are conquered and the story proceeds. - When the player operates the □
button 14 on the “world map,” a “menu screen” is displayed, enabling the player to make various settings, etc., on the menu screen. When the “world map” is displayed, if the player selects a city, the start screen of the “submap” corresponding to the city is displayed. As the action on the “submap,” the player character can walk, can speak to a pedestrian, and can make a purchase. - As the game according to the embodiment, a play character which acts based on operation of the player and an enemy character which acts based only on the gaming program appear and a game developed centering on the battle between the characters is realized on the
screen 16. In the embodiment, four characters ofplayer character A 111,player character B 112,player character C 113, and player character D 114 (seeFIG. 5A ) appear as the player characters and the game proceeds in the party unit made up of the four characters. Various types of status are set for each character. The experience value, money, arms, skill, and the like added by the number of gaming times, the number of times an enemy character has been beaten, etc, are defined as the status. - Then, if the player character party starting action on the “submap” encounters an enemy character, a battle with the enemy character is started as shown in
FIG. 5A . In this case, for example, a battle with three enemy characters of enemy character a 115, enemy character b 116, and enemy character c 117 is started. - In such a battle screen, the action order of all characters (including the player characters and the enemy characters) is determined, and each action is selected and controlled according to the action order.
- Specifically, a battle between the
player character A 111, the player character B 112, the player character C 113, and the player character D 114 and the enemy character a 115, the enemy character b 116, and the enemy character c 117 is started as shown in FIG. 5A . A predicted action order image 118 is displayed in the upper portion of the screen as shown in FIG. 5A . An action character image 119 indicating the character whose action turn comes is displayed. - If the
player character A 111 becomes an action character according to the action order, an action selection screen is displayed as shown in FIG. 5B and the player selects the action type of the player character A 111, such as attack, by operating the input device 4. An action object selection screen is displayed as shown in FIG. 6A and the player selects the action object of the player character A 111 for the action type. A judgment ring screen is then displayed, on which a judgment ring 100 and a rotation bar 101 appear as shown in FIG. 6B , and the action success or failure, the action effect, and the like are determined based on operation of the input device 4. The judgment ring is described later in detail with FIGS. 8 to 10. - An action effect screen of the
player character A 111 is displayed as shown in FIG. 7A and FIG. 7B based on the action type, the action object, the action success or failure, the action effect, etc., of the player character A 111 selected and determined based on operation of the input device 4. - Although described later in detail, blurring processing to realistically represent depth in a virtual three-dimensional space is executed in the action selection screen, the action object selection screen, and the judgment ring screen as shown in
FIGS. 5 and 6 ; however, the blurring processing is invalidated (limited) in the action effect screen as shown in FIG. 7 . - [Blurring Processing]
- The blurring processing is processing to produce the effect of a virtual three-dimensional space, providing perspective and a feeling of distance for the game screen. Such blurring processing will be discussed with
FIGS. 17 and 18 . - The blurring processing is executed based on the distance between an
eyepoint position 214 and a character 212 in the depth direction thereof as shown in FIG. 17 . Specifically, the eyepoint position 214 and the position of the character 212 are at a distance of symbol a in the depth direction. A plurality of reference positions are set in the depth direction from the eyepoint position 214. The two reference positions sandwiching the character 212 (for example, the reference positions 210 a and 210 b) are determined based on the distance between the eyepoint position 214 and the character 212. That is, the first reference position 210 a, shorter than the character distance, and the second reference position 210 b, longer than the character distance, are detected based on the character distance between the eyepoint position and the character in the depth direction. - The permeability of texture at the
reference position 210 a is set as α1 and the permeability of texture at the reference position 210 b is set as α2. The permeability α1, whose distance in the depth direction from the eyepoint position 214 is short, is set lower than the permeability α2, whose distance in the depth direction from the eyepoint position 214 is long. - The distances from the two
reference positions to the character 212 are calculated. Specifically, the distance from the first reference position 210 a to the character 212 becomes symbol b1 and the distance from the second reference position 210 b to the character 212 becomes symbol b2. The permeability at the character 212 is calculated based on the distances of the character 212 from the first reference position 210 a and the second reference position 210 b and the permeability at the first reference position 210 a and the permeability at the second reference position 210 b. Accordingly, it is made possible to display an image low in permeability, namely, a highly reflective and clear image, at a reference position near to the eyepoint position 214. - In addition to the blurring processing, bilinear interpolation, trilinear interpolation, etc., is also executed for the color shade of texture actually put on the polygon of the
character 212. - As a specific example, if the distance from the eyepoint position increases in the order of the
player character A 111, the player character B 112, the player character C 113, and the player character D 114, the player character A 111 is displayed as the clearest image and the player character B 112, the player character C 113, and the player character D 114 are displayed as progressively less clear images in this order, as shown in FIG. 18 . To represent the blurring degree of each display object, the types of lines indicating the characters are changed from one player character to another in FIG. 18 . Specifically, the player character C 113 and the player character D 114 are indicated using dashed lines; in fact, however, they are not displayed as dashed lines. - Thus, blurring control is performed based on the permeability set in response to the distance in the depth direction from the eyepoint position, whereby, for example, an image that is more blurred in response to the distance from the focus (the point set as a focal point) of the eyepoint position can be generated and representation of the depth of field (depth of focus) is made possible. Accordingly, unlike a conventional image where all subjects in a screen are brought into focus, a real and natural game image brought into focus in response to the distance from the viewpoint, like a view image in the real world, can be generated. Consequently, the virtual reality experienced by the player can be improved markedly.
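The interpolation described above can be summarized in a short sketch: the character's transparency is interpolated from the two reference positions that sandwich it, weighted by the distances b1 and b2. The following Python fragment is a minimal illustration only; the function name, the data layout, and the clamping behavior are assumptions, not taken from the patent.

```python
def blur_alpha(eye_to_char: float, ref_positions: list[tuple[float, float]]) -> float:
    """Interpolate a texture transparency ("permeability") for a character.

    ref_positions: (depth_from_eyepoint, alpha) pairs sorted by depth,
    corresponding to the reference positions 210 a, 210 b, ... of FIG. 17.
    Nearer reference planes carry lower alpha (a clearer image).
    """
    # Clamp to the nearest/farthest reference plane when outside the range.
    if eye_to_char <= ref_positions[0][0]:
        return ref_positions[0][1]
    if eye_to_char >= ref_positions[-1][0]:
        return ref_positions[-1][1]
    # Find the two reference positions sandwiching the character.
    for (d1, a1), (d2, a2) in zip(ref_positions, ref_positions[1:]):
        if d1 <= eye_to_char <= d2:
            # Distances b1 and b2 from each reference plane to the character.
            b1, b2 = eye_to_char - d1, d2 - eye_to_char
            # Weight each alpha by the distance to the *other* plane.
            return (a1 * b2 + a2 * b1) / (b1 + b2)

# A character halfway between two planes gets the average of their alphas.
print(blur_alpha(15.0, [(10.0, 0.2), (20.0, 0.6)]))  # 0.4
```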
- (Description of Judgment Ring)
- The above-described judgment ring will be discussed with FIGS. 8 to 10.
- The
judgment ring 100 as shown in FIG. 8 is displayed just before action control of the player character based on a selection command is performed against the target character, and the judgment ring 100 is used to determine the parameters required for determining the effect described above. FIG. 8 shows the judgment ring 100 at the command determination time when the player character A 111 becomes an action character and soft hit is selected. - The
judgment ring 100 as a reference area is displayed in a state in which it is inclined in a slanting direction as shown in FIG. 6B . Displayed on the judgment ring 100 is the rotation bar 101 as a varying area that rotates clockwise like a clock hand with the center point of the judgment ring 100 as a pivot, as shown in FIG. 8 . This means that the rotation bar 101 as a varying area varies relative to the reference area. A variable display area whose display mode changes with the passage of time is formed of the reference area and the varying area varying relative to the reference area. - Also displayed on the
judgment ring 100 are areas colored in predetermined angle ranges, which will be hereinafter referred to as timing areas. The timing areas are “effective areas” relatively advantageous to the player. In the judgment ring 100, the areas except the “effective areas” become “non-effective areas” relatively disadvantageous to the player. The timing areas contain a 120% area as a “special effective area” described later. - That is, the reference area is made up of the effective areas relatively advantageous to the player and the non-effective areas relatively disadvantageous to the player, and the effective areas contain the special effective area that is still more advantageous to the player. Accordingly, the effect of the action mode is determined as any of the first effect, relatively advantageous to the player, the second effect, relatively disadvantageous to the player, or the third effect, more advantageous to the player than the first effect.
- Then, the settings of the parameters are changed depending on whether or not the player can operate the ◯
button 12 when rotation of the rotation bar 101 is started and the rotation bar 101 passes through any of the timing areas. There are three timing areas as shown in FIG. 8 . The timing area through which the rotation bar 101 first passes is a “first timing area” 102, the timing area through which the rotation bar 101 next passes is a “second timing area” 103, and the timing area through which the rotation bar 101 last passes is a “third timing area” 104. - For example, when the player can well operate the ◯
button 12 on any of the three timing areas, namely, when the player can operate the ◯ button 12 with the rotation bar 101 on any of the three timing areas, the action taken by the player character against the enemy character becomes effective. If a FIGHT command is selected, three attacks are made on the enemy character to cause damage thereto by predetermined offensive power. If a SPECIAL command is selected and recovery magic is used, magic having predetermined recovery power can be applied to the player character three times, giving recovery power to the player character. - In contrast, if the player misses the operation timing of the ◯
button 12 on one timing area, the effect assigned to the timing area becomes ineffective. Particularly, if the player fails three times, the effect becomes zero. In the embodiment, the player visually recognizes the effective areas of the judgment ring 100; the point is that any of the five senses of the player may be used to enable the player to recognize the operation timing. For example, it is also possible to adopt an auditory configuration wherein a specific voice (sound) is generated for a predetermined time and the player is requested to operate within the generation section, or a tactile configuration wherein the input device 4 or a portable terminal is vibrated and the player is requested to operate within the vibration generation section. - The
judgment ring 100 is formed according to the angle ranges of the timing areas. Specifically, if the action character is the player character A 111 and an attack command is selected, it is determined that the top angle and the termination angle of the first timing area 102 are 45 degrees and 135 degrees, that those of the second timing area 103 are 180 degrees and 247 degrees, and that those of the third timing area 104 are 292 degrees and 337 degrees. As shown in FIG. 8 , the “120% area” in the first timing area 102 is a range 102 a from 105 degrees, resulting from subtracting 30 degrees from the termination angle 135 degrees, to the termination angle 135 degrees; the “120% area” in the second timing area 103 is a range 103 a from 224 degrees, resulting from subtracting 23 degrees from the termination angle 247 degrees, to the termination angle 247 degrees; and the “120% area” in the third timing area 104 is a range 104 a from 322 degrees, resulting from subtracting 15 degrees from the termination angle 337 degrees, to the termination angle 337 degrees. -
FIG. 9 shows the display mode of the judgment ring 100 after the command determination. It shows a state in which the rotation bar 101 starts to rotate and passes through the first timing area 102. - The “120% areas” are not limited to those described above. For example, the “120% area” may be provided in the range of the top angle to a predetermined angle as shown in
FIG. 10A or two “120% areas” may be provided in one timing area as shown in FIG. 10B . FIG. 10A shows the case where the range 102 a from the top angle 45 degrees to the angle 65 degrees (45 degrees+20 degrees) is set as the “120% area”. FIG. 10B shows the case where the range 102 a from the top angle 45 degrees to the angle 65 degrees (45 degrees+20 degrees) and the range from the angle 105 degrees, resulting from subtracting 30 degrees from the termination angle 135 degrees, to the termination angle 135 degrees are set as the “120% areas”.
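The timing-area geometry above reduces to an angle-membership test. The sketch below encodes the FIG. 8 ranges for player character A's attack command and returns an effect multiplier for a button press; the function and constant names, and the 0.0/1.0/1.2 multiplier encoding of the non-effective, effective, and “120%” areas, are illustrative assumptions rather than the patent's own implementation.

```python
# Timing areas for player character A's attack command, per FIG. 8:
# (start_angle, end_angle, start_of_120%_area) in degrees.
TIMING_AREAS = [
    (45, 135, 105),   # first timing area 102; 120% area 102a spans 105-135
    (180, 247, 224),  # second timing area 103; 120% area 103a spans 224-247
    (292, 337, 322),  # third timing area 104; 120% area 104a spans 322-337
]

def judge_press(angle: float, area_index: int) -> float:
    """Return the effect multiplier for a button press at the given
    rotation-bar angle while the bar crosses timing area `area_index`:
    0.0 for a miss, 1.0 inside the effective area, 1.2 inside the
    special "120%" area."""
    start, end, special = TIMING_AREAS[area_index]
    if not (start <= angle <= end):
        return 0.0          # non-effective area: this hit is lost
    if angle >= special:
        return 1.2          # special effective area ("120% area")
    return 1.0              # ordinary effective area

print(judge_press(100, 0))  # 1.0
print(judge_press(110, 0))  # 1.2
print(judge_press(150, 0))  # 0.0
```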
- Various types of processing executed in the configuration described above will be discussed below with FIGS. 11 to 15.
- (Main Game Processing)
- If the DVD-
ROM 31 is placed in the DVD-ROM drive 29 when the power of the machine main unit 1 is on, “opening demonstration” is displayed on the screen 16 as described above and then main game processing as shown in FIG. 11 is executed. - First, whether or not NEW GAME of the two options is selected on the “title screen” as shown in
FIG. 4A is determined (ST1) as shown in FIG. 11 . If it is determined that NEW GAME is selected (YES at ST1), a prolog and the game content are displayed (ST2). If it is not determined that NEW GAME is selected (NO at ST1), namely, if it is determined that CONTINUE is selected on the “title screen,” the data saved when the previous game ended is set without displaying the prolog or the game content (ST3). - Next, the “world map” shown in
FIG. 4B is displayed (ST4). Whether or not any of the options displayed on the “world map” is selected is determined (ST5). When the determination at ST5 is YES, a start screen of the “submap” responsive to the selected option is displayed and the party of the player characters starts action on the “submap” (ST6). On the other hand, when the determination at ST5 is NO, whether or not the player operates the □ button 14 on the “world map” for making a “menu screen” display request is determined (ST20). When the determination at ST20 is YES, the “menu screen” is displayed and various types of setting processing are performed in response to operation of the player (ST21), and then the process is transferred to ST5. On the other hand, when the determination at ST20 is NO, again the process is transferred to ST5. The action on the “submap” is for the player character to walk, talk to a pedestrian, do shopping, etc. The player can also display the “menu screen” by operating the □ button 14 on the “submap” and various types of operation are made possible. For example, if the player selects a TOOL command, tool command processing is executed and the skills of the player character can be recovered; if the player selects a DEALING command, dealing processing is executed and dealing of the possessed item is made possible. - Next, whether or not the player character party starting action on the “submap” encounters an enemy character is determined (ST7). When the determination at ST7 is YES, “battle processing” is started (ST8). When the “battle processing” is started, a transition is made to a “battle scene” where a battle is fought between the player character party and the enemy character. The “battle processing” is described later with
FIG. 12 . On the other hand, when the determination at ST7 is NO, whether or not some event occurs is determined (ST9). When the determination at ST9 is YES, the process goes to ST16; when the determination at ST9 is NO, again the process is transferred to ST6. - Next, whether or not the player character party succeeds in escaping from the enemy character in the “battle scene” executed by performing the “battle processing” is determined (ST10). When the determination at ST10 is YES, the process goes to ST16. On the other hand, if the player character party fails to escape from the enemy character or the player character party fights a battle with the enemy character, whether or not the player character party defeats the enemy character in the “battle scene” is determined (ST11). When the determination is YES, namely, when the player character party defeats the enemy character, points of the experience value, etc., are added or an item, money, etc., is given to each character of the party in response to the type of enemy character and the battle content (ST12). The level of each character is raised in response to the experience value of the character (ST13). On the other hand, when the determination at ST11 is NO, namely, when the player character party cannot defeat the enemy character, whether all characters of the player character party have died is determined (ST14). When the determination at ST14 is YES, the game is over (ST15) and the main game processing is terminated. When the determination at ST14 is NO, the process is transferred to ST16.
- At ST16, a movie responsive to the situation is displayed and subsequently whether or not the selected submap request condition has been cleared is determined (ST17). When the determination at ST17 is NO, again the process is transferred to ST6. When the determination at ST17 is YES, whether or not a transition is to be made to the ending is determined (ST18). When the determination at ST18 is YES, a predetermined ending is displayed (ST19) and the main game processing is terminated. On the other hand, when the determination at ST18 is NO, again the process is transferred to ST4.
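The battle-outcome branch (ST10 to ST15) can be condensed into a small helper. This is a hedged Python sketch in which the reward amount and the level formula are invented for illustration; the patent states only that experience points, items, or money are added on a win and that levels rise in response to the experience value.

```python
def resolve_battle(outcome: str, party: list[dict], reward_exp: int = 10) -> str:
    """Condensed sketch of steps ST10-ST15: escape, win, or lose."""
    if outcome == "escaped":                  # ST10 YES
        return "continue"                     # proceed to ST16
    if outcome == "won":                      # ST11 YES
        for member in party:                  # ST12: add experience etc.
            member["exp"] += reward_exp
            # ST13: raise the level in response to the experience value
            # (the 100-points-per-level rule here is purely assumed).
            member["level"] = 1 + member["exp"] // 100
        return "continue"
    # ST14: the game ends only when every party member has died.
    if all(m["hp"] <= 0 for m in party):
        return "game over"                    # ST15
    return "continue"                         # ST14 NO -> ST16

party = [{"exp": 95, "level": 1, "hp": 30}]
print(resolve_battle("won", party))   # continue
print(party[0]["level"])              # 2
```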
- (Battle Processing)
- The “battle processing” will be discussed with
FIG. 12 . - First, processing of setting the parameters concerning the characters and calculating and setting the turn interval value is executed (ST30) as shown in
FIG. 12 . In this processing, the CPU 21 reads the parameters concerning the characters from a predetermined region of the RAM 23 or the DVD-ROM 31 and sets the parameters in a predetermined region of the RAM 23. The characters mentioned here correspond to a plurality of characters including the player characters and the enemy characters appearing in a “battle scene.” The turn interval value is a value to determine the action order and is calculated for each of the plurality of characters (including the player characters and the enemy characters). The turn interval value is determined in response to the agility (AGL) and luck (LUC) set for each of the plurality of characters and an execution command correction value corresponding to the executed command type. The CPU 21 sets the turn interval value calculated for each of the plurality of characters appearing in a “battle scene” in a predetermined region of the RAM 23. This means that the CPU 21 for executing such processing corresponds to an example of a character action order determination module for determining the action order of the characters. Upon completion of the processing, the process is transferred to ST31. - At ST31, a battle scene start screen of a “battle scene” as shown in
FIG. 5A is displayed. On the start screen, the player character party (player character A 111, player character B 112, player character C 113, and player character D 114) is displayed facing the player. The enemy characters (for example, enemy character a 115, enemy character b 116, and enemy character c 117) are displayed at the positions corresponding to the player characters, on the side opposed to the player characters. Information concerning the status of each player character is displayed in the lower right portion of the start screen, although not shown in FIG. 5A . Further, the action order image 118 for executing actions of the player characters and the enemy characters and the action character image 119 indicating the character whose action turn comes are displayed in the upper portion of the start screen. - At ST32, “turn order processing” is performed to manage the order in which the player characters and the enemy characters can take action of attack, etc. In this processing, the
CPU 21 manages the turn order of the characters for which command selection is made effective, based on the turn interval value calculated from the skills, etc., of the characters. This means that the CPU 21 for executing such processing corresponds to an example of the character action order determination module for determining the action order of the characters. - The
CPU 21 displays an image indicating the turn order on the screen 16. The CPU 21 zooms in on the player character for which command selection is made effective (here, the player character A 111) and displays a “command selection screen” as shown in FIG. 5B . Upon completion of the processing, the process is transferred to ST33. - At ST33, whether or not the character for which command selection is made effective in the “turn order processing” is an enemy character is determined. If the determination at ST33 is YES, automatic processing is performed in accordance with the gaming program so that the enemy character makes an attack on the player character (ST34). In the battle automatic processing, the
CPU 21 executes action effect screen display control processing for the enemy character. Accordingly, processing of displaying an action effect screen for the enemy character is executed in display control processing described later (see FIG. 15 ). This means that the CPU 21 for executing such processing performs display control of action control of the character in a battle scene. The CPU 21 for executing such processing corresponds to an example of a character action display controller. Although described later in detail, specific image control is limited (invalidated) in the action effect screen. Upon completion of the processing, the process is transferred to ST36. - On the other hand, if it is determined at ST33 that the character for which command selection is made effective is the player character, subsequently “command processing” of accepting command selection of the player is performed (ST35). In the processing, a command is selected in response to operation input from the
input device 4 and the action mode based on the selected command is determined. This means that the CPU 21 for executing such processing executes selection of the action mode of the character and action control of the character (described later) in accordance with the action order of the characters determined at ST30, ST32, ST36, etc. The CPU 21 for executing such processing corresponds to an example of a special character action controller. - The
CPU 21 displays a command menu 44 indicating commands to determine the action mode of the player character A 111 as options on the screen 16. The CPU 21 moves a selection cursor 45 (see FIGS. 5 and 6 ) displayed at the left of the command menu 44 (see FIGS. 5 and 6 ) as the player operates the up button 7 or the down button 8 of the input device 4. When the player operates the ◯ button 12, the command with the selection cursor 45 displayed at its left position is selected and the action mode of the player character A 111 is determined. Various commands, represented by ATTACK, MAGIC, ITEM, DEFEND, and ESCAPE, are displayed on the command menu 44. - Effect display responsive to the determined action mode is produced. For example, if the player selects an attack, magic, specific act, or item use command (an “action” command described later), display processing is executed such that action is taken against the target character, i.e., the action target of the player character, the enemy character, etc. In the “command processing,” “judgment processing” for enabling technical intervention according to the operation timing of the player is also performed. The “command processing” is described later in detail with
FIG. 13 . Upon completion of the processing, the process is transferred to ST36. - At ST36, whenever the character takes action, the turn order is updated. In the processing, the
CPU 21 stores the character taking the action in a predetermined region of the RAM 23 and updates the turn order of the character taking the action. Accordingly, when the “turn order processing” is again executed, the turn orders are compared and the next character for which command selection is to be made effective can be determined. Once all characters have executed an action, the characters that have acted are stored again as characters that have not yet acted. This means that the CPU 21 for executing such processing corresponds to an example of the character action order determination module for determining the action order of the characters. Upon completion of the processing, the process is transferred to ST37. - At ST37, update display processing of the turn order is executed. In the processing, the
CPU 21 updates and displays the turn order for executing action in the next turn based on the turn order updated at ST36. Upon completion of the processing, the process is transferred to ST38. - At ST38, whether or not the “battle processing” exit condition is satisfied is determined. When the determination at ST38 is NO, the process returns to ST32; when the determination at ST38 is YES, “soul point addition processing” is executed (ST39) and the “battle processing” is exited. The “battle processing” exit condition is any of the fact that the enemy characters appearing on the battle screen suffer a crushing defeat, the fact that the player selects an “ESCAPE” command and the player character party succeeds in escaping from the enemy characters, the fact that the player character party suffers a crushing defeat, or the fact that an event for terminating the battle occurs.
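The turn-interval ordering used at ST30, ST32, and ST36 can be sketched as follows. The patent states that the value depends on agility (AGL), luck (LUC), and an execution command correction value, but gives no formula, so the formula below is purely an assumption (a smaller interval meaning an earlier turn), and all names are illustrative.

```python
def turn_interval(agl: int, luc: int, command_correction: float = 1.0) -> float:
    # Assumed formula: faster, luckier characters get shorter intervals,
    # scaled by a correction value for the executed command type.
    return command_correction * 100.0 / (agl + luc)

def action_order(characters: list[dict]) -> list[str]:
    """Return character names, fastest (smallest interval) first."""
    return [c["name"] for c in sorted(
        characters,
        key=lambda c: turn_interval(c["agl"], c["luc"], c.get("corr", 1.0)))]

fighters = [
    {"name": "player character A", "agl": 30, "luc": 20},
    {"name": "enemy character a", "agl": 25, "luc": 10},
    {"name": "player character B", "agl": 40, "luc": 25},
]
print(action_order(fighters))
# ['player character B', 'player character A', 'enemy character a']
```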
- (Command Processing)
- The “command processing” will be discussed with
FIG. 13 . - First, command screen display control processing is executed (ST200) as shown in
FIG. 13 . In the processing, the CPU 21 sets data to display a command selection screen in the RAM 23. Accordingly, command screen display processing is executed based on the setup data in display control processing described later (see FIG. 15 ). This means that the CPU 21 for executing such processing performs display control of selection of the action mode of the character in a battle scene. The CPU 21 for executing such processing corresponds to an example of the character action display controller. Upon completion of the processing, the process is transferred to ST201. - At ST201, whether or not the command is an ACTION SELECTION command is determined. In this processing, the
CPU 21 determines whether or not the command is an ACTION SELECTION command in response to an input signal from the input device 4, etc. The ACTION SELECTION command mentioned here includes the ATTACK command and the magic command as well as a SPECIFIC ACT command such as a FUSION command, an ITEM command, etc. If the CPU 21 determines that the command is an ACTION SELECTION command, the CPU 21 executes “judgment processing” of physical attack, magic, specific act, item use, etc. (ST202) and exits the subroutine. The “judgment processing” is described later in detail with FIG. 14 . On the other hand, if the CPU 21 does not determine that the command is an ACTION SELECTION command, the CPU 21 transfers the process to ST203. - Subsequently, at ST203, the
CPU 21 determines whether or not the command is a DEFEND command. If the CPU 21 determines that the command is a DEFEND command, the CPU 21 executes defense processing (ST204) and exits the subroutine. On the other hand, if the CPU 21 does not determine that the command is a DEFEND command, the CPU 21 executes escape processing (ST205). Upon completion of the processing, the subroutine is exited.
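The ST201 to ST205 branch is a simple three-way dispatch, sketched here with illustrative command and function names (the patent itself does not name a routing function):

```python
def command_processing(command: str) -> str:
    """Condensed sketch of ST201-ST205: route the selected command."""
    # ST201: ATTACK, MAGIC, SPECIFIC ACT (e.g. FUSION) and ITEM commands
    # all count as ACTION SELECTION commands and go to judgment processing.
    if command in ("ATTACK", "MAGIC", "FUSION", "ITEM"):
        return "judgment processing"      # ST202
    if command == "DEFEND":               # ST203
        return "defense processing"       # ST204
    return "escape processing"            # ST205: any other command escapes

print(command_processing("MAGIC"))   # judgment processing
print(command_processing("DEFEND"))  # defense processing
print(command_processing("ESCAPE"))  # escape processing
```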
- The “judgment processing” will be discussed with
FIG. 14 . - First, the
CPU 21 executes action selection screen display control processing for setting the data to display an action selection screen on the screen 16 in the RAM 23 (ST221) and executes “command acceptance processing” (ST222). Accordingly, processing of displaying an action selection screen as shown in FIG. 5B is executed based on the setup data in display control processing described later (see FIG. 15 ). This means that the CPU 21 for executing such processing executes display control of selection of the action mode of the character in a battle scene. The CPU 21 for executing such processing corresponds to an example of a character action display controller. The CPU 21 determines the action to be executed based on the character data corresponding to the character stored in the RAM 23 in response to an operation input signal supplied from the input device 4. Specifically, the CPU 21 determines the type of attack, magic, specific act, or item use. For example, when an ACTION SELECTION command is selected, the type of hit is determined. The hit types are soft hit, normal hit, hard hit, etc. Upon completion of the processing, the process is transferred to ST223. - At ST223, the
CPU 21 executes action target selection screen display control processing for setting the data to display an action target selection screen on the screen 16 in the RAM 23 and executes action target selection command acceptance processing (ST224). Accordingly, processing of displaying an action target selection screen as shown in FIG. 6A is executed based on the setup data in the display control processing described later (see FIG. 15 ). This means that the CPU 21 for executing such processing executes display control of selection of the action mode of the character in a battle scene. The CPU 21 for executing such processing corresponds to an example of the character action display controller. The CPU 21 determines the character (target character) as the target of the action (attack, attack magic use, recovery magic use, specific act use, item use, etc.) taken based on the command selected at ST222 in response to an operation input signal supplied from the input device 4 and stores the character in a predetermined region of the RAM 23. Upon completion of the processing, the process is transferred to ST225. - At ST225, “judgment ring determination processing” is executed. In the processing, the
CPU 21 determines the display mode of the judgment ring 100 and the rotation bar 101 in response to the skills, etc., of the character taking action. This means that the CPU 21 determines the display mode of the judgment ring 100 and the rotation bar 101 based on the character data corresponding to the character stored in the RAM 23. Upon completion of the processing, the process is transferred to ST226. - At ST226, the
CPU 21 executes judgment ring screen display control processing for setting the data to display a judgment ring screen on the screen 16 in the RAM 23. Accordingly, processing of displaying a judgment ring screen as shown in FIG. 6B is executed based on the setup data in the display control processing described later (see FIG. 15 ). This means that the CPU 21 for executing such processing executes display control of selection of the action mode of the character in a battle scene. The CPU 21 for executing such processing corresponds to an example of the character action display controller. Upon completion of the processing, the process is transferred to ST227. - At ST227, “judgment ring judgment processing” is executed. In the processing, the
CPU 21 determines the success or failure of the action of attack, etc., the effect of the action, etc., in response to operation input from the input device 4. Upon completion of the processing, the process is transferred to ST228. - Thus, the
CPU 21 selects the character action mode based on the operation signal from the input device 4 and the character data stored in the RAM 23. The CPU 21 for executing such processing corresponds to an example of a character action mode selection module. - At ST228, update processing of the individual skill parameters of HP, MP, SP, AGL, LUC, etc., and the status is executed. In the processing, the
CPU 21 updates the values of HP, MP, and SP based on the damage amount or the recovery value calculated by performing the “judgment ring judgment processing.” HP and MP are incremented or decremented and SP is decremented in response to the damage amount, the recovery value, etc. Whenever such processing is executed, SP is decremented by one. That is, SP is decremented by one every turn of the character. The CPU 21 updates and stores the individual skill parameters of AGL, LUC, etc., based on the used special item or the executed special act. Further, the CPU 21 updates the status of the character in response to the action executed by performing the “judgment ring judgment processing.” At this time, if the status of the character is updated to “abnormal status,” the character enters an abnormal state different from the usual state. The “abnormal status” varies depending on the type of attack item, magic, etc. For example, “poison” abnormal status is an abnormal status in which the physical strength of the character is automatically decreased every turn in which the character takes action, entered when the character receives magic from the enemy or receives an attack based on a predetermined item. “Lithification” abnormal status is an abnormal status in which the character is hardened like a stone and command entry is made impossible, entered when the character receives magic from the enemy or receives an attack based on a predetermined item. Upon completion of the processing, the process is transferred to ST229. - At ST229, “action effect screen display control processing” is executed. In the processing, the
CPU 21 executes action effect screen display control processing for setting, in the RAM 23, data to display an action effect screen as shown in FIG. 7A and FIG. 7B on the screen 16, where a predetermined action of the character (attack, magic, specific act, use of item, etc.) is taken based on the action type selected by performing the “command acceptance processing,” the action target selected by performing the “action target selection command acceptance processing,” and the success or failure and the effect of the action determined by performing the “judgment ring judgment processing.” Accordingly, processing of displaying the action effect screen of the player character is executed based on the setup data in the display control processing described later (see FIG. 15). Although described later in detail, specific image control is limited (invalidated) while the action effect screen is displayed. The CPU 21 also displays a parameter image of HP, MP, SP, etc., on the screen 16 based on the updated parameters. Upon completion of the processing, the subroutine is exited. - Thus, the
CPU 21 performs character action control based on the action mode of the character selected at steps ST221 to ST227, etc. The CPU 21 for executing such processing corresponds to an example of a character action controller. The CPU 21 also performs display control of the action control of the character in a battle scene, and thus corresponds to an example of the character action display controller. - (Display Control Processing)
- The “display control processing” called at a predetermined timing in the
image processing section 24, unlike the processing described above, will be discussed with reference to FIG. 15. In the processing, a screen is displayed based on the screen display data set in the RAM 23 in the various types of display control processing described above. - First, the
image processing section 24 sets the eyepoint position, the light source position, etc., and executes display object three-dimensional coordinate calculation processing (ST231). In the processing, the image processing section 24 selects the display objects, such as the characters to be displayed, based on the data supplied from the CPU 21. The image processing section 24 reads the three-dimensional coordinates of all polygons concerning the selected display objects and places the display objects in virtual three-dimensional coordinates. Upon completion of the processing, the process is transferred to ST232. - At ST232, display object three-dimensional conversion processing is executed. In the processing, the
image processing section 24 calculates the positional relationship of each display object with the eyepoint position as the reference, based on the eyepoint position and the three-dimensional coordinates of the placed display object. Upon completion of the processing, the process is transferred to ST233. - At ST233, whether or not action control is being performed is determined. In the processing, the
image processing section 24 determines whether or not action control is being performed depending on whether or not the action effect screen of the player character or the enemy character is displayed, based on the data set in the various types of display control processing. If the image processing section 24 determines that action control is being performed, it skips ST234 and transfers the process to ST235. Otherwise, it transfers the process to ST234. - At ST234, specific image control processing is executed. In the processing, the
image processing section 24 executes blurring processing as the specific image control processing. The image processing section 24 calculates the distance between the eyepoint position and the center of the display object. The CPU 21 detects two reference positions based on that distance and calculates the permeability at the center of the display object based on the distances between the two reference positions and the center of the display object and the permeability set at the two reference positions. Accordingly, the CPU 21 sets data to display an image with a depth of focus based on the distance of the display object from the eyepoint position. Upon completion of the processing, the process is transferred to ST235. - At ST235, rendering processing is executed. In the processing, the
image processing section 24 generates data for visualization based on the positional relationship of the display object with the eyepoint position calculated at ST232 as the reference, and the like. In particular, when the image processing section 24 has executed ST234, it executes rendering processing that applies the specific image control processing; when it has not executed ST234, it executes rendering processing that skips the specific image control processing. Upon completion of the processing, the process is transferred to ST236. - This means that the
image processing section 24 for executing such processing executes specific image control during selection of the action mode of the character subjected to display control. On the other hand, the image processing section 24 invalidates the specific image control, thereby limiting it, while character action control is performed. The image processing section 24 for executing such processing corresponds to an example of a specific image control execution module, an example of a specific image control limitation module, and an example of a specific image control invalidation module. - At ST236, whether or not processing of all display objects has terminated is determined. In the processing, the
image processing section 24 determines whether or not processing of all display objects has terminated depending on whether or not the processing at steps ST231 to ST235 has been executed for all display objects (for example, characters, etc.) to be displayed, based on the data set in the various types of display control processing. If the image processing section 24 determines that processing of all display objects has terminated, it transfers the process to ST237. Otherwise, it again transfers the process to ST231 to execute the processing at steps ST231 to ST235 for the remaining display objects. - At ST237, image display processing is executed. In the processing, the
image processing section 24 draws a display object with the eyepoint position as the reference as a result of the rendering at ST235 and displays the drawn image on the screen 16. This means that the image processing section 24 for executing such processing executes display control of the selection of the action mode of the character executed in a battle scene and of the action control of the character. The image processing section 24 for executing such processing corresponds to an example of the character action display controller. Upon completion of the processing, the subroutine is exited. - Thus, selection of the action mode of the character based on the operation signal from the input device that can be operated by the player and a plurality of pieces of character data, and action control of the character based on the selected action mode, are executed in accordance with the action order of the characters, and display control of the selection of the action mode and of the action control of the character is performed. While the character action control is performed, execution of the specific image control is limited. Therefore, while the character action control, whose control load is relatively large, is performed, the specific image control, whose control load is also large, is limited, so that degradation of the image can be prevented and loss of interest in the game can be suppressed. While the character action mode selection, whose control load is relatively small, is made, the specific image control is executed, and the characters to be displayed are displayed without being skipped because the number of displayed characters is not limited; further, the specific image processing enhances the sense of presence in the game.
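The distance-based blurring of ST234 can be sketched as follows. This is a minimal illustration only: the linear interpolation between the two reference positions and all names (`permeability_at`, `near_ref`, `far_ref`) are assumptions for exposition, not details given in the embodiment.

```python
def permeability_at(distance, near_ref, far_ref, near_perm, far_perm):
    """Interpolate the permeability (blend weight of the blurred image)
    at the center of a display object from its distance to the eyepoint,
    given two reference distances and the permeability set at each."""
    if far_ref <= near_ref:
        raise ValueError("reference positions must satisfy near_ref < far_ref")
    if distance <= near_ref:
        return near_perm          # closer than the near reference: unblurred
    if distance >= far_ref:
        return far_perm           # beyond the far reference: fully blurred
    t = (distance - near_ref) / (far_ref - near_ref)
    return near_perm + t * (far_perm - near_perm)
```

Objects near the eyepoint thus keep a low permeability (sharp), while distant objects blend more of the blurred image, producing the depth-of-focus appearance described above.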
As limitation of the specific image control, the specific image control is invalidated while the character action control is performed. Therefore, only image control other than the specific image control, whose control load is high, is executed, so that degradation of the image can be prevented still more reliably and loss of interest in the game can be suppressed.
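The per-object flow of ST231 to ST236, including the skipping of the specific image control while character action control is performed, can be sketched as below. The function names passed in (`place`, `transform`, `blur`, `render`) are illustrative placeholders, not names from the embodiment.

```python
def display_control(objects, action_control_active, place, transform, blur, render):
    """One pass of ST231-ST236 over every display object. The specific image
    control step (ST234, e.g. blurring) is skipped while action control runs."""
    frame = []
    for obj in objects:                      # ST236: repeat until all objects done
        placed = place(obj)                  # ST231: place in virtual 3-D coordinates
        viewed = transform(placed)           # ST232: eyepoint-relative conversion
        if not action_control_active:        # ST233: is action control in progress?
            viewed = blur(viewed)            # ST234: specific image control
        frame.append(render(viewed))         # ST235: rendering
    return frame                             # ST237 would then draw this frame
```

The single branch at ST233 is what invalidates the high-load specific image control exactly while the high-load character action control is running.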
- (Program)
- The above-described gaming program will be discussed in detail. The gaming program specifically is a program for causing a computer to function as the following modules. In other words, the gaming program is also a program for causing a computer to execute the following modules (processes, steps), and a program for causing a computer to implement the various functions of the following processes. Such a computer includes an input device that can be operated by the player and a display module for displaying an image concerning a game.
- (A1) Character data storage module (process) for storing a plurality of pieces of character data concerning a plurality of characters.
- (A2) Character action order determination module (process) for determining the action order of the plurality of characters.
- (A3) Character action mode selection module (process) for selecting the character action mode based on an operation signal from the input device and the plurality of pieces of character data.
- (A4) Character action controller (process) for performing character action control based on the character action mode selected by the character action mode selection module.
- (A5) Special character action controller (process) for executing selection of the character action mode by the character action mode selection module and the character action control by the character action controller in accordance with the action order of the plurality of characters determined by the character action order determination module.
- (A6) Character action display controller (process) for performing display control of selection of the character action mode and the character action control executed by the special character action controller.
- (A7) Specific image control execution module (process) for executing specific image control for selection of the character action mode and the character action control subjected to display control by the character action display controller.
- (A8) Specific image control limitation module (process) for limiting the specific image control of the specific image control execution module while the special character action controller performs the character action control.
- (A9) Specific image control invalidation module (process) for invalidating the specific image control as limitation of the specific image control while the character action controller performs the character action control.
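The control-side modules (A1) to (A5) can be sketched as one controller class; this is an illustrative skeleton under assumptions (agility-based ordering, method names) not stated in the text, and the display-side modules (A6) to (A9) would hook into the rendering loop and are omitted.

```python
class GameController:
    """Illustrative skeleton tying modules (A1)-(A5) together."""

    def __init__(self, characters):
        self.characters = characters              # (A1) character data storage

    def determine_action_order(self):             # (A2) here: by agility (AGL)
        return sorted(self.characters, key=lambda c: c["AGL"], reverse=True)

    def select_action_mode(self, character, signal):   # (A3) from operation signal
        return signal                             # e.g. "attack", "magic", "item"

    def perform_action(self, character, mode):    # (A4) character action control
        return (character["name"], mode)

    def run_turn(self, signals):                  # (A5) selection + control in order
        results = []
        for ch in self.determine_action_order():
            mode = self.select_action_mode(ch, signals[ch["name"]])
            results.append(self.perform_action(ch, mode))
        return results
```

With two characters of differing AGL, `run_turn` executes the faster character's selected action first, matching the determined action order.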
- (Storage Medium)
- In a computer-readable record medium recording such a gaming program, a skill parameter and a possessed item parameter may be stored for each of the plurality of characters in addition to the gaming program.
- Although the embodiment has been described, the invention is not limited to the specific embodiment. For example, the
input device 4 operated by the player may be integrated into the machine main unit 1. - In the embodiment, after all characters appearing in a “battle scene” execute action, the turn order for all characters to execute action is determined again; however, the invention is not limited to this mode and another embodiment may be adopted. For example, a character that has finished executing its action may execute its next action before all characters have executed action.
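The round-based ordering of the embodiment can be sketched as follows; recomputing the order by agility each round is an assumption for illustration, as is every name in the sketch.

```python
def round_based_order(characters, rounds):
    """Round-based ordering: after every character has acted once,
    the order for the next round is determined again."""
    sequence = []
    for _ in range(rounds):
        # the order is recomputed at the start of each round (here: by AGL)
        for ch in sorted(characters, key=lambda c: c["AGL"], reverse=True):
            sequence.append(ch["name"])
    return sequence
```

The alternative mode mentioned above would instead maintain a ready queue so that a fast character can re-enter the queue and act again before slower characters have taken their first action.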
- In the embodiment, image control executing the blurring processing is adopted as the specific image control, but the invention is not limited to it. For example, fog processing of masking the display screen color with separately set color data for adjustment, called fog data, may be performed as the specific image control. Using the fog processing, a distant object is viewed as a dim object according to the fog color. For example, if the fog color is white, a fog effect is produced; if the fog color is blue, an effect of dimming into the distance is produced. In the embodiment, as limitation of the specific image control, the specific image control is invalidated while the character action control is performed, but the invention is not limited to this mode. For example, while the character action control is performed, the specific image control may be limited by switching execution of the blurring processing, etc., on and off for each frame, so as to lighten the image control load without completely invalidating the specific image control. Further, the specific image control need not always be limited over the entire time during which the character action control is performed; it may be limited in at least part of that time.
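The lighter, frame-alternating limitation described above can be sketched as a single per-frame decision; the function name and the every-other-frame schedule are illustrative assumptions.

```python
def should_run_specific_image_control(frame_number, action_control_active):
    """Decide per frame whether to run blurring/fog processing. Outside
    character action control it always runs; during action control it runs
    on alternate frames only, lightening the load without fully
    invalidating the specific image control."""
    if not action_control_active:
        return True
    return frame_number % 2 == 0
```

Any other partial schedule (every third frame, only during selected sub-intervals of the action control, etc.) would equally satisfy the "limited in at least part of that time" variation.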
- Further, the invention can also be applied to a portable gaming machine or a desk-top gaming machine including in one piece an operation section that can be operated by a player, a display section for displaying an image and audio (sound), a storage section for storing a gaming program, and a control section for executing control processing in accordance with the gaming program.
- Further, the invention can also be applied to a network game of the type wherein the above-described gaming program is stored in a server connected to a network such as the Internet 56 (see
FIG. 16), etc., and a player can play a game by connecting to the server from a personal computer, a mobile telephone, a portable information terminal (PDA), etc. - A network game system in
FIG. 16 will be discussed by way of example. In the network game system, mobile telephones are connected, for example through base stations, to a PDC network 51 capable of conducting packet communications, and an information center 55 is accessed through the PDC network 51 in response to player operation and the game state. The information center 55 acquires various pieces of information from servers through a network such as the Internet 56 and supplies them to the mobile telephones. As with the server 58 in FIG. 16, a server storing the game data, etc., may be connected to the information center 55 by a private or leased communication line 60, not via a network such as the Internet 56. - To play a game, the player previously downloads a gaming program from the
server to the mobile telephone. The mobile telephone executes the downloaded gaming program and, as the game progresses, exchanges the necessary data with the server through the PDC network 51. - In the embodiment, the
judgment ring 100 containing the reference area and the rotation bar 101 as a varying area are provided, but the invention is not limited to this mode and another embodiment may be adopted. For example, the judgment ring may be a varying area, and an area like the rotation bar may be the reference area. That is, the reference area or the varying area is formed containing a plurality of effective areas relatively advantageous to the player and a non-effective area relatively disadvantageous to the player. Such a judgment ring may also be left unused. - Although the embodiment of the invention has been described, it is to be understood that the embodiment is illustrative and not restrictive and that the invention is not limited to the specific embodiment thereof. That is, the invention is mainly characterized by a gaming program for causing a computer including an input device that can be operated by a player to function as: a character data storage module for storing a plurality of pieces of character data concerning a plurality of characters; a character action order determination module for determining the action order of the plurality of characters; a character action mode selection module for selecting a character action mode based on an operation signal from the input device and the plurality of pieces of character data; a character action controller for performing character action control based on the character action mode selected by the character action mode selection module; a special character action controller for executing selection of the character action mode by the character action mode selection module and the character action control by the character action controller in accordance with the action order of the plurality of characters determined by the character action order determination module; a character action display controller for performing display control of selection of the character action mode executed by the special character action controller and
the character action control; a specific image control execution module for executing specific image control for selection of the character action mode and the character action control subjected to display control by the character action display controller; and a specific image control limitation module for limiting the specific image control of the specific image control execution module while the special character action controller performs the character action control. However, the specific configurations of the input device, the character data storage module, the character action order determination module, the character action mode selection module, the character action controller, the special character action controller, the character action display controller, the specific image control execution module, the specific image control limitation module, the specific image control invalidation module, etc., can be changed in design as required.
- The advantages described in the embodiment of the invention merely enumerate the most favorable advantages produced from the invention; the advantages of the invention are not limited to those described in the embodiment.
- Additional advantages and modifications will readily occur to those skilled in the art. Therefore, the invention in its broader aspects is not limited to the specific details and representative embodiments shown and described herein. Accordingly, various modifications may be made without departing from the spirit or scope of the general inventive concept as defined by the appended claims and their equivalents.
Claims (3)
1. A gaming program product for use in a computer having an input device that can be operated by a player, comprising:
a character data storage module for storing a plurality of pieces of character data concerning a plurality of characters;
a character action order determination module for determining the action order of the plurality of characters;
a character action mode selection module for selecting a character action mode based on an operation signal from the input device and the plurality of pieces of character data;
a character action controller for performing character action control based on the character action mode selected by said character action mode selection module;
a special character action controller for executing selection of the character action mode by said character action mode selection module and the character action control by said character action controller in accordance with the action order of the plurality of characters determined by said character action order determination module;
a character action display controller for performing display control of selection of the character action mode and the character action control executed by said special character action controller;
a specific image control execution module for executing specific image control for selection of the character action mode and the character action control subjected to display control by said character action display controller; and
a specific image control limitation module for limiting the specific image control of said specific image control execution module while said special character action controller performs the character action control.
2. The gaming program product as claimed in claim 1, further comprising:
a specific image control invalidation module for invalidating the specific image control as limitation of the specific image control while said character action controller performs the character action control.
3. A gaming machine comprising:
an input device that can be operated by a player;
a character data storage module for storing a plurality of pieces of character data concerning a plurality of characters;
a character action order determination module for determining the action order of the plurality of characters;
a character action mode selection module for selecting a character action mode based on an operation signal from said input device and the plurality of pieces of character data;
a character action controller for performing character action control based on the character action mode selected by said character action mode selection module;
a special character action controller for executing selection of the character action mode by said character action mode selection module and the character action control by said character action controller in accordance with the action order of the plurality of characters determined by said character action order determination module;
a character action display controller for performing display control of selection of the character action mode executed by said special character action controller and the character action control;
a specific image control execution module for executing specific image control for selection of the character action mode and the character action control subjected to display control by said character action display controller; and
a specific image control limitation module for limiting the specific image control of said specific image control execution module while said special character action controller performs the character action control.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2005106337A JP2006280731A (en) | 2005-04-01 | 2005-04-01 | Game program, game apparatus and recording medium |
JP2005-106337 | 2005-04-01 |
Publications (1)
Publication Number | Publication Date |
---|---|
US20060223633A1 (en) | 2006-10-05 |
Family
ID=37071293
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/392,618 Abandoned US20060223633A1 (en) | 2005-04-01 | 2006-03-30 | Gaming program, gaming machine, and record medium |
Country Status (2)
Country | Link |
---|---|
US (1) | US20060223633A1 (en) |
JP (1) | JP2006280731A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20210394064A1 (en) * | 2019-03-07 | 2021-12-23 | Cygames, Inc. | Information processing program, information processing method, information processing device, and information processing system |
US11318382B2 (en) * | 2014-02-25 | 2022-05-03 | Gree, Inc. | Game control method, system, and non-transitory computer-readable recording medium |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10504317B2 (en) * | 2007-04-30 | 2019-12-10 | Cfph, Llc | Game with player actuated control structure |
EP3113159A1 (en) * | 2015-06-30 | 2017-01-04 | Thomson Licensing | Method and device for processing a part of an immersive video content according to the position of reference parts |
CN108525300B (en) * | 2018-04-27 | 2020-01-24 | 腾讯科技(深圳)有限公司 | Position indication information display method, position indication information display device, electronic device and storage medium |
Citations (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5498002A (en) * | 1993-10-07 | 1996-03-12 | Gechter; Jerry | Interactive electronic games and screen savers with multiple characters |
US6146277A (en) * | 1996-08-21 | 2000-11-14 | Konami Co., Ltd. | Command input method and recording medium |
US6154197A (en) * | 1995-08-10 | 2000-11-28 | Kabushiki Kaisha Sega Enterprises | Virtual image generation method and its apparatus |
US6217446B1 (en) * | 1996-12-06 | 2001-04-17 | Kabushi Kaisha Sega Enterprises | Game device and picture processing device |
US6322448B1 (en) * | 1996-06-05 | 2001-11-27 | Kabushiki Kaisha Sega Enterprises | Fictitious virtual centripetal calculation and simulation system |
US6409604B1 (en) * | 1999-06-30 | 2002-06-25 | Square Co., Ltd. | Computer readable program product, method of controlling display of game and game system |
US6496981B1 (en) * | 1997-09-19 | 2002-12-17 | Douglass A. Wistendahl | System for converting media content for interactive TV use |
US6666764B1 (en) * | 1999-04-27 | 2003-12-23 | Konami Co., Ltd. | Method of controlling a character in a video game |
US6712700B1 (en) * | 1999-10-29 | 2004-03-30 | Kabushiki Kaisha Square Enix | Stereo model displaying method and apparatus in video game, game apparatus, and computer-readable recording medium stored with stereo model displaying program for video game |
US6716103B1 (en) * | 1999-10-07 | 2004-04-06 | Nintendo Co., Ltd. | Portable game machine |
US6717575B2 (en) * | 2000-02-17 | 2004-04-06 | Sony Computer Entertainment Inc. | Image drawing method, image drawing apparatus, recording medium, and program |
US6771265B1 (en) * | 1999-12-28 | 2004-08-03 | Kabushiki Kaisha Square Enix | Computer-readable recording medium storing a program for 3-D computer image processing, method of depicting shadow, and video game apparatus |
US20040166935A1 (en) * | 2001-10-10 | 2004-08-26 | Gavin Andrew Scott | Providing game information via characters in a game environment |
US20050054402A1 (en) * | 2003-07-17 | 2005-03-10 | Namco Ltd. | Game apparatus, game control method and recording medium storing program for executing the method |
US20050208993A1 (en) * | 2004-03-11 | 2005-09-22 | Aruze Corp. | Gaming machine and program thereof |
US6971957B2 (en) * | 2002-06-07 | 2005-12-06 | Nintendo Co., Ltd. | Game system and program using a shadow volume to display a shadow of an object in a three dimensional video game |
US20060040738A1 (en) * | 2002-11-20 | 2006-02-23 | Yuichi Okazaki | Game image display control program, game device, and recording medium |
US7079176B1 (en) * | 1991-11-25 | 2006-07-18 | Actv, Inc. | Digital interactive system for providing full interactivity with live programming events |
US7091931B2 (en) * | 2001-08-17 | 2006-08-15 | Geo-Rae Co., Ltd. | Method and system of stereoscopic image display for guiding a viewer's eye motion using a three-dimensional mouse |
US7104891B2 (en) * | 2002-05-16 | 2006-09-12 | Nintendo Co., Ltd. | Game machine and game program for displaying a first object casting a shadow formed by light from a light source on a second object on a virtual game space |
US7254265B2 (en) * | 2000-04-01 | 2007-08-07 | Newsight Corporation | Methods and systems for 2D/3D image conversion and optimization |
US7309287B2 (en) * | 2003-12-10 | 2007-12-18 | Nintendo Co., Ltd. | Game machine having display screen with touch panel |
- 2005-04-01: JP JP2005106337A patent/JP2006280731A/en not_active Withdrawn
- 2006-03-30: US US11/392,618 patent/US20060223633A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
JP2006280731A (en) | 2006-10-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR100743128B1 (en) | | Image processing method, video game apparatus and storage medium |
JP4756632B2 (en) | | GAME PROGRAM AND GAME DEVICE |
EP0990457A1 (en) | | Recording medium and entertainment system |
US7497777B2 (en) | | Gaming machine and computer-readable program product |
US7204758B2 (en) | | Video game apparatus and control method thereof, and program of video game and computer-readable recording medium having program recorded thereon |
JPH1142370A (en) | | Game device, game image processing method and readable recording medium |
US20040259613A1 (en) | | Gaming machine and computer-readable program product |
US20040259636A1 (en) | | Gaming machine and computer-readable program product |
JP2001149640A (en) | | Game machine, game processing method, and recording medium recording program |
JP4181200B2 (en) | | Video game processing apparatus and video game processing program |
US20060223633A1 (en) | | Gaming program, gaming machine, and record medium |
JP2001232056A (en) | | Video game device, image expression method, program control method and storage medium |
US6283854B1 (en) | | Video game apparatus, video game method and storage medium |
JP4864120B2 (en) | | GAME PROGRAM, GAME DEVICE, GAME CONTROL METHOD |
US6672964B2 (en) | | Video game with readily distinguishable characters displaying the character's line of sight |
CN115708956A (en) | | Game picture updating method and device, computer equipment and medium |
US20020137563A1 (en) | | Role playing video game using cards |
JP6211667B1 (en) | | System, method, and program for providing game |
US6585594B2 (en) | | Storage medium storing display control program, entertainment apparatus, and display control program |
JP2005204947A (en) | | Program, information storage medium and image generation system |
JP2001013952A (en) | | Image generating system and information recording medium |
JP3496149B1 (en) | | Shooting game apparatus and program |
US20020082078A1 (en) | | Storage medium storing display control program, entertainment apparatus, and display control program |
JP4194456B2 (en) | | Game device and program for causing computer to function |
JP7068417B2 (en) | | Systems, methods, and programs for providing games |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: ARUZE CORP., JAPAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HAMAMOTO, IZUMI;REEL/FRAME:017701/0622. Effective date: 20060322 |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |