US20140274373A1 - System and method for animating virtual characters - Google Patents

System and method for animating virtual characters Download PDF

Info

Publication number
US20140274373A1
US20140274373A1 US13/831,333 US201313831333A
Authority
US
United States
Prior art keywords
virtual character
character
virtual
toy
bones
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/831,333
Inventor
Adam Olshan
Daniel Doptis
Anthony Pardee
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Activision Publishing Inc
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to US13/831,333 priority Critical patent/US20140274373A1/en
Assigned to ACTIVISION PUBLISHING, INC. reassignment ACTIVISION PUBLISHING, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: PARDEE, ANTHONY, DOPTIS, DANIEL, OLSHAN, ADAM
Assigned to BANK OF AMERICA, N.A. reassignment BANK OF AMERICA, N.A. SECURITY AGREEMENT Assignors: ACTIVISION BLIZZARD, INC.
Publication of US20140274373A1 publication Critical patent/US20140274373A1/en
Assigned to BLIZZARD ENTERTAINMENT, INC., ACTIVISION PUBLISHING, INC., ACTIVISION BLIZZARD INC., ACTIVISION ENTERTAINMENT HOLDINGS, INC. reassignment BLIZZARD ENTERTAINMENT, INC. RELEASE BY SECURED PARTY (SEE DOCUMENT FOR DETAILS). Assignors: BANK OF AMERICA, N.A.
Priority to US16/539,827 priority patent/US10885694B2/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00 - Animation
    • G06T13/80 - 2D [Two Dimensional] animation, e.g. using sprites
    • G06T13/20 - 3D [Three Dimensional] animation
    • G06T13/40 - 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 - Controlling the output signals based on the game progress
    • A63F13/52 - Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/60 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • A63F13/65 - Generating or modifying game content automatically by game devices or servers from real world data, e.g. measurement in live racing competition
    • A63F13/80 - Special adaptations for executing a specific game genre or game mode
    • A63F13/825 - Fostering virtual characters
    • A63F13/90 - Constructional details or arrangements of video game devices not provided for in groups A63F13/20 or A63F13/25, e.g. housing, wiring, connections or cabinets
    • A63F13/98 - Accessories, i.e. detachable arrangements optional for the use of the video game device, e.g. grip supports of game controllers

Definitions

  • the present invention relates generally to animating a virtual character, such as those found in video games or animated movies, and more particularly to animating a virtual character comprised of parts of other virtual characters.
  • Animating a virtual character typically involves building an animated three-dimensional (3D) character model that is rigged with a virtual skeleton.
  • a character rig generally includes the character's virtual skeleton bound to the character's 3D mesh or skin.
  • the virtual skeleton typically includes a plurality of joints and/or bones that can be manipulated to move or deform the model into different poses.
  • the virtual skeleton provides the basic form of the virtual character.
  • a humanoid virtual character may have a virtual skeleton that has bones representing a human form (e.g., a head bone, spine bones, arm/hand bones, leg/feet bones, pelvic bones, etc.).
  • a reptilian virtual character may have a virtual skeleton that has bones representing a reptile (e.g., a tail bone, four leg bones, spine bones, elongated skull and facial bones, etc.).
  • FIG. 12 depicts a generic humanoid virtual skeleton 1200 comprising a torso body part 1210 and a legs body part 1220 with various bones and joints. Surfaces, which may be used to convey skin, hair, texture, eyes, mouth, etc., may also be added to the virtual character.
  • a given virtual character may have one animation rig and, correspondingly, one virtual skeleton.
  • Animators create animation clips (or animations) for the virtual character that manipulate the virtual character's joints and/or bones into various positions and poses.
  • These animation clips can be used to define the virtual character's movements and behaviors.
  • virtual characters in a video game may have predefined animation clips associated with movements and actions such as idling, walking, running, attacking, jumping, receiving damage, casting spells, climbing, flying, speaking, using items, or any other movement or action.
  • the animation clips may apply to and/or control all or a subset of the virtual skeleton's bones. Often, the animation clips for each virtual character will be used to impart personality to the virtual character.
  • the idling animation for virtual character A may be rigid and upright, suggesting a formal or restricted personality.
  • the idling animation for virtual character B may be slouched and relaxed, suggesting a laid back personality.
  • Virtual characters may be non-human characters and/or objects, including monsters, animals, robots, weapons, clothing, vehicles, or any other in-game characters or objects.
  • the animation clips defined for virtual characters may help portray the unique characteristics of those virtual characters.
  • the idle animations for a monster-like virtual character may include menacing actions like showing his teeth or growling.
  • multiple animations may affect one or more bones of a virtual character's skeleton.
  • This technique is sometimes called layering.
  • virtual character A may be animated by simultaneously applying a running animation clip and a laughing animation clip to the virtual character.
  • the running animation may control all of the bones of virtual character A, while the laughing animation may only control a subset of bones, such as those in the face.
  • virtual character A appears to be laughing while running.
  • Different weights may be applied to the various animation clips that are layered on a given virtual character's skeleton.
  • the weights define the relative impact of the layered animation clips.
  • the laughing animation clip may be heavily weighted relative to the running animation clip with respect to the virtual character's facial bones, thus allowing the laughing animation to assert more control over the facial bones when the two animations are layered.
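  • As an illustrative sketch of the layering and weighting just described, the following Python example blends two clips on a per-bone basis using normalized weights. The clip poses, bone names, and weight values are assumed for illustration and are not taken from the patent.

```python
# Minimal sketch of per-bone animation layering with weights.
# Poses are reduced to a single rotation value per bone for brevity.

def blend_layers(layers, bones):
    """Blend layered clips per bone using normalized weights.

    layers: list of (pose, weights), where pose and weights map bone -> value.
    Bones absent from a layer's weights get weight 0 for that layer.
    """
    blended = {}
    for bone in bones:
        total = sum(weights.get(bone, 0.0) for _, weights in layers)
        if total == 0.0:
            continue  # no layer controls this bone
        blended[bone] = sum(
            pose[bone] * weights.get(bone, 0.0) / total
            for pose, weights in layers
            if bone in pose
        )
    return blended

bones = ["spine", "arm_l", "arm_r", "jaw", "brow"]

# Running clip controls the whole body; laughing clip only facial bones.
running_pose = {"spine": 10.0, "arm_l": 45.0, "arm_r": -45.0, "jaw": 0.0, "brow": 0.0}
running_weights = {b: 1.0 for b in bones}
laughing_pose = {"jaw": 20.0, "brow": 5.0}
laughing_weights = {"jaw": 4.0, "brow": 4.0}  # heavily weighted on the face

print(blend_layers([(running_pose, running_weights),
                    (laughing_pose, laughing_weights)], bones))
```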
  • aspects of the present invention relate to situations in which virtual characters are combinations of interchangeable parts from one or more different and independently animated virtual characters.
  • Such combination characters will be referred to herein as composite virtual characters.
  • An example composite virtual character might have the upper body of a reptilian virtual character and the lower body of a robot virtual character.
  • this is merely an example, and the number of composite virtual characters is limited only by the number of virtual characters available for combination.
  • Animating composite virtual characters presents certain challenges. To illustrate some of these challenges, consider the case of a composite character that combines the upper body of virtual character A with the lower body of virtual character B. If virtual character A and virtual character B have distinct and conflicting animation clips denoting different behaviors and personalities, simply applying the predefined animation clips for virtual character A and virtual character B to their respective body parts may result in disjointed and unconvincing animations for the composite virtual character. For example, virtual character A's personality may be portrayed as rigid and formal, thus his idle animation clip may be stiff and relatively still. Virtual character B, on the other hand, might be portrayed as excitable and nervous, and thus his idle animation clip may include fidgeting and twitching motions.
  • a software program running on a gaming platform comprises a plurality of virtual characters, each comprising a plurality of interchangeable body parts.
  • An animation clip may be defined for an interchangeable body part of a first virtual character.
  • the animation clip may be defined for the first virtual character's interchangeable body part using a virtual skeleton that has the bones of the interchangeable body part and the bones of one or more generic body parts.
  • the defined animation clip may control one or more of the bones of the interchangeable body part and one or more of the bones of the one or more generic body parts.
  • the plurality of virtual characters comprises a second virtual character with interchangeable body parts.
  • the defined animation clip for the first virtual character's interchangeable body part may be applied to the generic bones of the second virtual character.
  • the software program running on the gaming platform may command display of a composite virtual character on a display device associated with the gaming platform, the composite virtual character comprising the first interchangeable body part from the first virtual character and a second interchangeable body part from the second virtual character.
  • the software program may control animation of the second interchangeable body part from the second virtual character using the defined animation clip for the first interchangeable body part from the first virtual character.
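  • The relationships among interchangeable body parts, generic bones, and shared animation clips described above can be pictured with a small data-model sketch. This is a hypothetical Python illustration rather than the patent's code; the class names, bone sets, and reptile/robot parts are assumptions chosen to mirror the example characters discussed in the disclosure.

```python
# Hypothetical data model: characters built from interchangeable body parts,
# each part exposing its own specific bones plus shared generic bones.

GENERIC_TORSO_BONES = {"spine_01", "spine_02", "arm_l", "arm_r", "head"}
GENERIC_LEGS_BONES = {"hip", "leg_l", "leg_r"}

class BodyPart:
    def __init__(self, name, specific_bones, generic_bones):
        self.name = name
        self.specific_bones = set(specific_bones)   # unique to this part
        self.generic_bones = set(generic_bones)     # shared across characters

class AnimationClip:
    def __init__(self, name, controlled_bones):
        # A clip defined for one part may drive that part's specific bones
        # and generic bones of the remaining (generic) body parts.
        self.name = name
        self.controlled_bones = set(controlled_bones)

# First character's interchangeable torso, with an attack clip that also
# drives generic leg bones so it can be layered onto other characters.
reptile_torso = BodyPart("reptile_torso", {"tail", "jaw"}, GENERIC_TORSO_BONES)
reptile_attack = AnimationClip(
    "reptile_attack",
    reptile_torso.specific_bones | {"spine_01", "hip", "leg_l", "leg_r"},
)

# Second character's interchangeable legs retain the generic leg bones,
# so the clip above can be applied to those bones directly.
robot_legs = BodyPart("robot_legs", {"piston_l", "piston_r"}, GENERIC_LEGS_BONES)
applicable = reptile_attack.controlled_bones & robot_legs.generic_bones
print(applicable)  # the generic leg bones the reptile clip can drive
```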
  • the first virtual character may correspond to a first toy comprising a plurality of toy parts
  • the second virtual character may correspond to a second toy comprising a plurality of toy parts.
  • Toy parts from the first toy and second toy may be connected, combined or assembled together to form a toy assembly representing the composite virtual character.
  • the toy assembly and/or individual toy parts may be configured to communicate with the gaming platform.
  • the toy assembly may communicate with the gaming platform either directly or via a peripheral device.
  • the software program running on the gaming platform may be used to identify the individual toy parts and determine the corresponding toy assembly and composite virtual character.
  • the gaming platform then displays the composite virtual character in a virtual environment on the display device.
  • a user of the gaming platform can interchange the first toy part and second toy part with additional toy parts from a plurality of virtual characters.
  • the interchanging of toy parts causes a contemporaneous graphical display of the new toy assembly's corresponding composite virtual character. Accordingly, a user can affect the appearance and interaction between the composite virtual character and the virtual environment by modifying the physical toy parts and accessory parts.
  • each toy part may be configured with an identification tag, such as an RFID tag with a numeric or alphanumeric code providing an identification of the toy part.
  • Each toy part may communicate with the gaming platform, either directly, via a peripheral or via other toy parts or any combination thereof, to provide the gaming platform with the identification information in the tag.
  • a peripheral is in communication with the gaming platform.
  • the toy assembly comprising the plurality of toy parts may be placed on or in proximity of the peripheral.
  • the toy part closest to the peripheral may include an antenna for communicating with the peripheral.
  • the other toy parts comprising the toy assembly may communicate with the toy part closest to the peripheral either through wireless transmission or wired transmission.
  • each toy part includes a rewritable memory.
  • Information relating to the toy part may be stored in the memory.
  • information pertaining to the ownership of the toy part, the use of the toy part in connection with one or more gaming platforms or attributes of the toy part within the virtual environment may be stored in the memory.
  • data relating to accomplishments and challenges overcome by the user in the video game may be stored in the memory of the toy part.
  • the user may be given opportunities to modify certain virtual attributes associated with one or more toy parts as he or she plays the video game.
  • the stored information may be used in subsequent gaming sessions and across various gaming platforms so that the virtual attributes of each toy part and each accessory part persist.
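  • One way to picture the persistent per-part data described above is a small record that is read at the start of a session and written back afterwards, as in the sketch below. The patent does not prescribe a storage format, and every field name here is an assumption.

```python
import json

# Hypothetical contents of a toy part's rewritable memory. Field names are
# illustrative; the patent only says ownership, usage, attributes, and
# achievement data may be stored and updated.
toy_part_record = {
    "part_id": "A1B2C3",          # identifier reported to the game platform
    "owner": "player_42",
    "platforms_used": ["console", "mobile"],
    "attributes": {"strength": 12, "health": 80},
    "achievements": ["first_boss_defeated"],
}

def update_after_session(record, earned_achievements, attribute_deltas):
    """Write session results back so they persist across platforms."""
    record["achievements"].extend(earned_achievements)
    for key, delta in attribute_deltas.items():
        record["attributes"][key] = record["attributes"].get(key, 0) + delta
    return record

update_after_session(toy_part_record, ["castle_cleared"], {"strength": 1})
print(json.dumps(toy_part_record, indent=2))
```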
  • toy parts may comprise accessories.
  • a toy part may be a weapon, clothing item, hat, shield, armor, shoes or other accessories that may be connected, attached, interlocked with or otherwise combined with a toy assembly having one or more parts.
  • Some aspects of the invention provide a computer-implemented method for animating a composite virtual character, comprising: displaying a composite virtual character comprising a first part from a first virtual character and a second part from a second virtual character, and animating the composite virtual character, wherein animating the composite virtual character comprises substantially simultaneously animating first portions of the first part from the first virtual character and second portions of the second part from the second virtual character using an animation defined for the first virtual character and animating at least first portions of the second part from the second virtual character using an animation defined for the second virtual character.
  • a video game system comprising: a game device having an input device and a processor for executing program code for providing for play of a videogame; a plurality of physical parts from a plurality of toy figures that are physically combinable to form a composite toy assembly, the plurality of physical parts including memory providing a unique identification of each physical part and configured to communicate said unique identification; said program code having instructions for providing a graphical display of a composite virtual character representing said composite toy assembly, said composite virtual character comprised of virtual parts representing physical parts combined to form the composite toy assembly; said program code further having instructions for providing a virtual environment for said composite virtual character; wherein said movement and actions of said composite virtual character are controlled, in response to inputs received by said input device, by at least one animation defined for at least one of said virtual parts and at least one other animation defined for at least one other one of said virtual parts, with the at least one animation defined for at least one of said virtual parts completely controlling movement and actions of some elements of said composite virtual character and partially controlling movement and actions of other elements
  • Some aspects of the invention provide a computer implemented method including animating a character comprised of different portions derived from other characters, some of the different portions including body elements of a first type and some of the different portions including body elements of a second type, the method comprising: applying, for each of the different portions including body elements of the first type, an animation routine defined for the different portion for the other character from which the different portion was derived for the body elements of the first type; and applying in a weighted manner, for each of the different portions including body elements of the second type, the animation routines defined for the different portion for the other characters for the body elements of the second type.
  • Some aspects of the invention provide a method of animating a character defined by a combination of elements of other characters, some of whose elements are considered generic to a plurality of characters and some of whose elements are considered specific to each of the other characters, with no animation routines being predefined for the character but with animation routines being predefined for the other characters, the method comprising: determining characters serving as a source of elements of the character; receiving a command for display of a particular animation routine for the character; generating the particular animation routine for the character by: applying corresponding character specific predefined animation routines to elements of the character that are considered specific to the characters serving as the source of elements of the character, and applying weighted sums of the corresponding character specific predefined animation routines to elements of the character that are considered generic to the characters serving as the source of elements of the character.
  • FIG. 1 is a block diagram illustrating an example of a video game system in accordance with aspects of the present invention
  • FIG. 2A depicts an example of a toy assembly for use in conducting a video game in accordance with aspects of the present invention
  • FIG. 2B depicts an example of a toy assembly for use in conducting a video game in accordance with aspects of the present invention
  • FIG. 2C depicts an example of a composite toy assembly for use in conducting a video game in accordance with aspects of the present invention
  • FIG. 3 is a diagram depicting electronic components of toy parts in accordance with aspects of the present invention.
  • FIG. 4 is a diagram depicting electronic components of toy parts in accordance with aspects of the present invention.
  • FIG. 5 illustrates an example of a video game system in accordance with aspects of the invention
  • FIG. 6 is a flow diagram of a process for selecting and communicating with toy parts in accordance with aspects of the present invention.
  • FIG. 7 is a flow diagram of a process for conducting video game play in accordance with aspects of the present invention.
  • FIG. 8 is a flow diagram of a process for conducting video game play in accordance with aspects of the present invention.
  • FIG. 9 depicts a flow diagram of a process for identification of one or more toy assemblies by the game platform.
  • FIG. 10 depicts a flow diagram of a process defining an animation for a virtual character's interchangeable body part in accordance with the present invention.
  • FIG. 11 depicts a flow diagram of a process for animating composite virtual characters in accordance with aspects of the present invention.
  • FIG. 12 depicts an exemplary generic virtual skeleton in accordance with the present invention.
  • FIG. 13 depicts an exemplary modified virtual skeleton in accordance with the present invention.
  • FIG. 14 depicts an exemplary virtual skeleton of a composite virtual character in accordance with the present invention.
  • FIG. 1 is a block diagram illustrating an example of a video game system in accordance with aspects of the present invention.
  • the video game system 100 includes game platform 140 and a toy assembly comprised of a plurality of component toy parts 120 a - n .
  • the toy assembly may consist of toy parts associated with a single character, or the toy assembly may consist of toy parts from a plurality of characters (referred to as a “composite toy assembly”).
  • the toy parts may be physically combined, coupled, connected or otherwise adjoined to create a toy assembly.
  • the toy parts may be coupled in an interlocked fashion to create a toy assembly, for example via a physical locking mechanism, electromagnetic or other locking mechanism.
  • the toy parts 120 a - n may be connected by a force, for example a physical or electromagnetic force, such as by way of interlocking physical components, frictional fittings, or magnetic couplings, or by way of other known connections.
  • Each of the toy parts 120 a - n may include a rewritable data storage component, such as RAM or a rewritable RFID tag.
  • the memory or tag may store data reflecting the identification of the toy part.
  • the memory may store other data corresponding to a portion of a character or other object within the game executed on game platform 140 which the toy part represents.
  • the other data may include data such as strength, experience, wealth, health, ownership, achievements, activity level, use or other game play data of the portion of the character or other object.
  • For example, if the toy part corresponds to an arm of a character in game play, the memory of the toy part may store information regarding strength or health of the arm.
  • the memory may store other data, for example the other data mentioned above, with respect to a character or object as a whole, and in some embodiments all toy parts which in combination correspond to the character or object may store some or all of such information.
  • the memory may be rewritable so that the stored attributes and characteristics of the toy parts may be updated during each game session and utilized in subsequent game sessions.
  • the game platform 140 is a system for executing game software and in various embodiments may comprise a device such as a personal computer, laptop, tablet, game console, portable game platform, or mobile device, or in some embodiments one or more devices in communication with one or more servers.
  • the game platform 140 comprises a processor for executing program instructions providing for game play and associated circuitry, a video game controller 180 , a display device 170 , and in some embodiments a peripheral device (not shown in FIG. 1 ) for communicating with a toy or toy parts.
  • the game platform 140 may connect or be coupled to a display device or have a display device integrated with or within the game platform for displaying graphics associated with the game operating on the game platform 140 .
  • the instructions providing for game play may be stored on removable media, for example, an optical disk or cartridge, or otherwise stored in memory of the game platform.
  • the game platform for example a game console, may include an optical drive, for example, a DVD-ROM drive, for reading the instructions for game play.
  • the instructions providing for game play may be stored in a remote server that is accessed by a game platform, for example a computer, PC, game console, or mobile device.
  • the instructions providing for game play may be stored locally in the game device memory.
  • the toy parts 120 a - n may communicate with game platform 140 directly or via a peripheral device.
  • a first toy part 120 a may communicate information to second toy part 120 b and the second toy part 120 b may communicate information relating to both first toy part 120 a and second toy part 120 b to game platform 140 , either directly or via peripheral 130 as depicted in FIG. 1 .
  • multiple toy parts may communicate information to the second toy part, either directly or through one or more intervening toy parts, with the second toy part communicating information to the game platform, either directly or through the peripheral.
  • the toy parts 120 a - n communicate with game platform 140 independently.
  • FIGS. 2A and 2B depict examples of toy assemblies for use in conducting a video game in accordance with aspects of the present invention.
  • the toy assemblies depicted in FIGS. 2A and 2B each consist of toy parts for a single character.
  • FIG. 2A depicts a toy assembly 200 configured as a reptilian toy figure.
  • FIG. 2B depicts a toy assembly 250 configured as a robot toy figure.
  • toy assemblies 200 and 250 could instead be configured as an action figure, robot figure, a vehicle, humanoid figure, monster figure, or other toy figure.
  • Toy assemblies 200 and 250 of FIGS. 2A and 2B each include two toy parts: a torso 220 , 260 and legs 230 , 270.
  • Although two toy parts are shown, the number and type of toy parts are exemplary only and should not be considered as limiting.
  • the head and/or arms included in torso 220 , 260 and the tail 240 included in the legs 230 may also be provided as separate toy parts.
  • FIG. 2C depicts an example of a composite toy assembly for use in conducting a video game in accordance with aspects of the present invention.
  • composite toy assemblies consist of toy parts from a plurality of characters.
  • Composite toy assembly 280 includes two toy parts: a torso 290 and legs 295 .
  • composite toy assembly 280 combines the torso of toy assembly 200 and legs of toy assembly 250 .
  • Each of the different toy parts may be part of a class of toy parts for use in various toy assemblies. That is, a toy assembly may be configured according to preference using a plurality of interchangeable torso parts and a plurality of interchangeable leg parts. For example, either torso 220 or torso 260 may be replaced with a different torso from a different character to create a new composite character.
  • the toy parts comprise accessories or other objects to be used by the toy character.
  • a toy part may comprise a weapon, shield, tool, clothing, accoutrements or other item.
  • the toy parts may be physically combined, coupled, connected or otherwise adjoined to create a toy assembly.
  • the toy parts may be coupled in an interlocked fashion to create a toy assembly, for example via a physical locking mechanism, electromagnetic mechanism or other locking mechanism.
  • the connectors for each of the toy parts may be configured so as to restrict connection of toy parts, for example, to restrict use of a torso toy part to replace a legs toy part.
  • Each toy part includes machine-readable information, for example, memory, a radio frequency identification (RFID) tag or a barcode.
  • the machine-readable information may be sensed, read, and/or in some embodiments written, directly by a game console, or in some embodiments indirectly by way of sending data and commands to the toy to write the data to memory of the toy parts.
  • the machine-readable information may include a numeric identifier.
  • the communication with the toy may be conducted via a peripheral device, such as a reader.
  • the machine-readable information allows the reader, or the processor of the game console, to distinguish one toy part from other toy parts, and the machine-readable information may therefore be considered to include a toy part identifier, and in some embodiments, each particular toy part may have its own distinct identifier.
  • the machine readable information includes additional information related to player achievement in a video game when the part is in use.
  • FIG. 3 is a diagram depicting an embodiment of the electronic components of toy parts in connection with the present invention.
  • First toy part 310 comprises an RFID tag 315 .
  • RFID tag 315 utilizes a wireless system that uses radio-frequency electromagnetic fields to transfer data from (and in various embodiments to) the tag, for example for purposes of automatic identification and tracking. Some tags require no battery and are powered by the electromagnetic fields used to read them. Others use a local power source and emit radio waves (electromagnetic radiation at radio frequencies).
  • RFID tag 315 contains numerical information for identifying first toy part 310 .
  • First toy part 310 may be physically coupled to a second toy part 320 .
  • Second toy part 320 includes a circuit 325 , for example an inductor circuit, for receiving the RFID electromagnetic field from RFID tag 315 in first toy part 310 .
  • the numerical information in RFID tag 315 is transmitted to the inductor circuit 325 .
  • Inductor circuit 325 is electronically coupled to an interface 327 , such as a near field transmitter, in second toy part 320 .
  • Interface 327 communicates with peripheral 330 .
  • the near field transmitter may also be an RFID tag, in some embodiments.
  • the peripheral 330 includes a radio-frequency interface 335 to communicate with toys and/or toy parts.
  • the radio-frequency interface is an RFID interface.
  • the peripheral may include a different interface for communicating with toys, such as an optical interface or a wired interface.
  • the toy may include a wired connection to the peripheral device, or in some embodiments, a wired connection to the game platform, possibly dispensing with the peripheral device.
  • the toy may include wireless communication capabilities of the type commonly used with computers, for example Bluetooth, NFC or Wi-Fi capabilities.
  • the peripheral 330 may then transmit the information received from RFID tag 315 associated with first toy part 310 and information received from an RFID tag in second toy part 320 to a game platform utilizing antenna 340 .
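  • The identification chain of FIG. 3, in which one toy part reads a coupled part's tag and forwards both identifiers through the peripheral to the game platform, can be summarized as a simple message relay, as sketched below. The sketch is schematic only and does not describe the actual radio protocol; the identifiers and hop functions are assumptions.

```python
# Schematic of the FIG. 3 identification chain:
# first part's RFID tag -> second part's inductor/interface -> peripheral -> platform.
# Identifiers and hop behavior are illustrative assumptions.

def read_coupled_tag(first_part_id):
    """Second toy part reads the RFID tag of the part coupled to it."""
    return first_part_id

def forward_to_peripheral(second_part_id, coupled_id):
    """Second toy part transmits its own id plus the coupled part's id."""
    return {"parts": [coupled_id, second_part_id]}

def peripheral_to_platform(message):
    """Peripheral relays the collected identifiers to the game platform."""
    return {"source": "peripheral", **message}

coupled_id = read_coupled_tag("TORSO-0315")
message = forward_to_peripheral("LEGS-0320", coupled_id)
print(peripheral_to_platform(message))
# -> {'source': 'peripheral', 'parts': ['TORSO-0315', 'LEGS-0320']}
```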
  • FIG. 4 is a diagram depicting an embodiment of the electronic components of toy parts in connection with the present invention.
  • First toy part 410 comprises an RFID tag or a storage device.
  • the RFID tag contains numerical information for identifying first toy part 410 .
  • First toy part 410 may be physically coupled to second toy part 420 .
  • First toy part 410 includes one or more plugs or connectors 430 that fit into a receptacle(s) 440 on second toy part 420 .
  • the connector(s) 430 and receptacle(s) 440 may be used to provide an electrical connection between the two toy parts to enable the transmission of data between the connected toy parts.
  • the first toy part 410 and second toy part 420 may utilize magnetic connectors to maintain contact between the toy parts.
  • the first toy part 410 and second toy part 420 may each have a magnetic element.
  • When the connector(s) 430 and receptacle(s) 440 are brought into proximity, the magnetic attraction between the magnet and its complement, whether another magnet or a ferromagnetic material, maintains the toy parts in contact with one another.
  • the magnetic elements may maintain the contacts in an electrically conductive relationship.
  • When the connector(s) associated with first toy part 410 are in contact with the receptacle(s) associated with second toy part 420 , data from a memory device or numerical information in an RFID tag in first toy part 410 may be transmitted to second toy part 420 for subsequent transmission to a game platform, and/or in some embodiments vice versa.
  • The strength of the transmitters used for communication between the two toy parts may be selected to be sufficiently low that contact between the toy parts is required for successful communication between the toy parts.
  • Such a configuration may be beneficial, for example, to reduce or eliminate interference with other communications to the game platform or a peripheral, or receipt of extraneous communications by same.
  • FIG. 5 illustrates an example of a video game system in accordance with aspects of the invention.
  • the video game system includes a game console 550 with a processor for executing program instructions providing for game play and associated circuitry, user input devices such as a game controller 555 , a display device 560 for displaying game action, a peripheral device 540 , and a toy assembly 575 .
  • Toy assembly 575 is comprised of a plurality of interconnected toy parts, including head part 575 a , torso part 575 b , arm parts 575 c , leg parts 575 d , and tail part 575 e , each of which includes memory storing identification information.
  • the peripheral device 540 may provide the capability to read and write information to the toy assembly 575 and/or its component toy parts.
  • the processor, responsive to inputs from the user input devices and the peripheral device, generally commands display on the display device of game characters in, and interacting with, a virtual world of game play, and possibly with each other.
  • the processor, responsive to inputs from the peripheral device, may be used to add characters and objects to the virtual world, with the characters able to manipulate the added objects and move about the virtual world.
  • the processor may include characters in game play based on inputs from the peripheral device, and the processor may control actions and activities of game characters based on inputs from the user input devices.
  • the instructions providing for game play are generally stored on removable media, for example, an optical disk.
  • the game console may include an optical drive, for example, a DVD-ROM drive, for reading the instructions for game play.
  • the game console may be a personal computer, including similar internal circuitry as herein described, as well as, for example, a built-in display and built-in user input devices, such as a keyboard and a touch pad.
  • the instructions providing for game play may be stored in a remote server that is accessed by a computer or mobile device.
  • the instructions providing for game play may be stored locally in the game device memory.
  • the display device is generally coupled to the game platform by a cable, although in some embodiments a wireless connection may be used.
  • the display device is a liquid crystal display.
  • the display device is a television.
  • the display device is a cathode ray display, a plasma display, an electroluminescent display, an LED or OLED display, or other display.
  • a display screen 570 of the display device displays video images of game play, generally as commanded by the processor or other associated circuitry of the game platform.
  • the display screen shows a screen shot of video game play. As illustrated, the screen shot shows a display of a character, generally controlled by and animated in accordance with user inputs, approaching an inanimate item in the form of what may be considered a castle.
  • the peripheral device in some embodiments and as shown in FIG. 5 , has a substantially flat upper surface for placement of toys thereon.
  • the game player generally places game toys, for example, toy assembly 575 in the form and representative of a dragon as shown in FIG. 5 , on the flat surface of the peripheral device during game play.
  • the toy assembly 575 is generally in the form of and representative of a game item such as a game character or other game item. In several embodiments, the toy assembly is associated with a game character during game play.
  • Peripheral 540 includes a surface 545 where toy assembly 575 may be placed. Peripheral 540 may be coupled with a game platform 550 either through a wired or wireless connection.
  • Game platform 550 may be any form of game platform, such as game console (e.g., Xbox, Playstation, Wii, NDS), computer, mobile device or other device for executing game software either locally or from a server.
  • the game platform 550 executes software for a video game.
  • the game platform 550 may be connected to a display 560 . In other embodiments, a display may be incorporated into the game platform 550 , such as in mobile devices or portable computer devices.
  • the display 560 provides for the visual display of graphics associated with the game 570 .
  • a software program running on the game platform 550 allows the game platform 550 to identify the individual toy parts and determine the corresponding toy assembly 575 .
  • the game platform 550 displays graphically a virtual character representing the toy assembly 575 comprised of the toy parts assembled or combined together.
  • the toy assembly 575 may be a composite toy assembly comprised of toy parts from different toy figures.
  • the corresponding virtual character representing the toy assembly 575 would be a composite virtual character.
  • the virtual character or composite virtual character may be displayed in a virtual environment on a display device 560 associated with the game platform 550 .
  • the toy parts interact dynamically with the software program so that the virtual character representing the toy on the display device corresponds to the physical appearance of the toy assembly.
  • the user can interchange toy parts with a contemporaneous graphical display of the corresponding virtual character. Accordingly, a user can affect in real time the appearance and interaction between the virtual character and the virtual environment by modifying the physical toy parts and accessory parts.
  • a user may control the movements of the virtual character (or composite virtual character) in the game using a controller 555 .
  • the controller 555 may be separate from the game platform 550 or integrated therein.
  • Each toy part 575 a - e may include a memory or tag for identifying the part.
  • each part 575 a - e includes an RFID tag with a numerical code to uniquely identify the part.
  • the information pertaining to the identification of each part 575 a - e may be communicated to the game platform 550 through the peripheral 540 .
  • the toy parts 575 a - e may communicate with the game platform 550 directly.
  • the toy parts 575 a - e may communicate with each other and provide combined information to the game platform 550 either directly or through a peripheral 540 .
  • each toy part includes a rewritable memory.
  • Information relating to the toy part may be stored in the memory.
  • information pertaining to the ownership of the toy part, the use of the toy part in connection with one or more game platforms, achievements accomplished in the game while using the toy part, or attributes of the toy part within the virtual environment may be stored and updated in the memory.
  • data relating to accomplishments and challenges overcome by the user in the video game may be stored in the memory of the toy part.
  • the user may be given opportunities to modify certain virtual attributes associated with one or more toy parts as he or she plays the video game.
  • the stored information may be used in subsequent game sessions and across various game platforms so that the virtual attributes of each toy part and each accessory part persist.
  • FIG. 6 is a flow diagram of a process for selecting and communicating with toy parts in accordance with aspects of the present invention.
  • the process is performed by a game platform, for example as discussed with respect to FIG. 1 .
  • the process identifies toy parts.
  • the process may identify toy parts within a defined region. For example, the process may determine what toy parts are on the surface of a video game peripheral as shown in FIG. 5 .
  • the toy parts may be identified by RFID, barcodes, or optical recognition.
  • identification of toy parts includes a video game peripheral reading identifiers of the toys and supplying the identifiers to a video game console.
  • the process selects a toy part for communication.
  • the process may select multiple toy parts of a toy assembly for communication.
  • the process may select the toy part by transmitting a selection command having an identifier matching the identifier of the toy part.
  • the process expects to receive an acknowledgment of the selection from the toy part. When an acknowledgment is not received, the process may retransmit the selection command or may signal a video game associated with the process that the selected toy is not available.
  • the process configures a virtual character.
  • the process may configure the virtual character based on the identified parts.
  • the identified parts may be from different characters.
  • the process may configure the virtual character based on configuration information indicating how the identified toy parts are connected.
  • the configuration information may include the identification of coupled toy parts and information regarding the connector and receptacle through which the toy parts are coupled.
  • the process communicates with the toy parts.
  • the process may read from a particular memory location of the toy parts or may write to a particular memory location of the toy parts.
  • the process communicates with the toy parts during game play, for example communications relating to the presence of a corresponding virtual character in the game or changes to the state of the virtual character.
  • the process expects to receive an acknowledgment or response from the toy parts, and when not received, the process may retransmit the command or may signal the video game associated with the process that the selected toy part is not available. The process thereafter returns.
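  • A minimal sketch of the FIG. 6 selection-and-communication loop follows. The transport function, retry count, and memory address are assumptions for illustration; the patent only requires that a selection command carry the matching identifier and that a missing acknowledgment lead to a retransmission or an "unavailable" signal to the game.

```python
# Illustrative sketch of the FIG. 6 flow: identify parts, select one by id,
# retry on missing acknowledgment, then read/write part memory.
# The transport functions below are stand-ins for RFID/peripheral I/O.

MAX_RETRIES = 2

def select_part(part_id, transmit, retries=MAX_RETRIES):
    """Send a selection command and wait for an acknowledgment."""
    for _ in range(retries + 1):
        ack = transmit({"cmd": "select", "id": part_id})
        if ack:
            return True
    return False  # caller signals the game that the part is unavailable

def communicate(part_id, transmit):
    if not select_part(part_id, transmit):
        print(f"toy part {part_id} not available")
        return None
    # Read a particular memory location of the selected toy part.
    return transmit({"cmd": "read", "id": part_id, "addr": 0x10})

# Fake transport for demonstration: acknowledges selection, returns a byte on read.
def fake_transport(command):
    return {"cmd_ok": True} if command["cmd"] == "select" else {"value": 0x2A}

print(communicate("TORSO-0315", fake_transport))
```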
  • FIG. 7 is a flow diagram of a process for conducting video game play in accordance with aspects of the present invention.
  • the process is performed by a game platform, for example as discussed with respect to FIG. 1 .
  • the process requests toy part identification.
  • the process may identify toy parts within a defined region. For example, the process may determine what toy parts are on the surface of a video game peripheral as shown in FIG. 5 .
  • the toy parts may be identified by RFID, barcodes, or optical recognition.
  • identification of toy parts includes a video game peripheral reading identifiers of the toys and supplying the identifiers to a video game console.
  • the process determines a toy configuration based on the toy part identifications.
  • the process may use a lookup table or other database to determine a configuration based on the toy parts identified.
  • the process may communicate with the toy parts to receive connection information indicating the other parts a particular toy part is connected to and an indication of which connector of the toy part is used to make such connection.
  • the process may generate a virtual character or composite virtual character corresponding to a physical toy assembly including each of the identified toy parts.
  • the process conducts video game play using the virtual character or composite virtual character.
  • data relating to accomplishments and challenges overcome by the user in the video game may be stored in the memory of the toy parts of the toy assembly.
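  • The lookup-based configuration step of FIG. 7 might be pictured as a table mapping part identifiers to virtual body parts, as in the sketch below. The table contents and identifiers are invented for illustration.

```python
# Hypothetical lookup table mapping toy part identifiers to virtual body parts.
PART_TABLE = {
    "TORSO-0315": {"character": "reptile", "slot": "torso"},
    "LEGS-0320":  {"character": "robot",   "slot": "legs"},
    "LEGS-0230":  {"character": "reptile", "slot": "legs"},
}

def determine_configuration(identified_part_ids):
    """Resolve identified parts into a slot -> part configuration."""
    config = {}
    for part_id in identified_part_ids:
        entry = PART_TABLE.get(part_id)
        if entry is None:
            raise KeyError(f"unknown toy part: {part_id}")
        config[entry["slot"]] = entry
    return config

# Parts from two different toys form a composite virtual character.
config = determine_configuration(["TORSO-0315", "LEGS-0320"])
is_composite = len({entry["character"] for entry in config.values()}) > 1
print(config, "composite:", is_composite)
```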
  • FIG. 8 is a flow diagram of a process for conducting video game play in accordance with aspects of the present invention.
  • the process is performed by a game platform, for example as discussed with respect to FIG. 1 .
  • the process determines a toy configuration based on toy parts identified.
  • the process may use a lookup table or other database to determine a configuration based on the toy parts identified.
  • the process may communicate with the toy parts to receive connection information indicating the other parts a particular toy part is connected to and an indication of which connector of the toy part is used to make such connection.
  • the process detects a change in the configuration of the physical toy.
  • a change may be detected when an identified toy part is removed from a defined area, for example, a surface of peripheral 540 shown in FIG. 5 .
  • the process may receive toy part identification information.
  • the process may identify toy parts located in a predefined region.
  • the process may determine the toy part identification only for the new toy parts added.
  • the process may determine a new toy configuration.
  • the process may use a lookup table or other database to determine a configuration based on the toy parts identified including the new toy part(s).
  • the process may communicate with the toy parts to receive connection information indicating the other parts a particular toy part is connected to and an indication of which connector of the toy part is used to make such connection.
  • the process may conduct game play with a virtual character or composite virtual character corresponding to the new toy assembly. Thereafter, the process returns.
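  • The change-detection step of FIG. 8 can be pictured as comparing the set of parts currently identified in the defined region against the previous scan and re-resolving only what changed, as in the following sketch. The polling model and identifiers are assumptions.

```python
# Illustrative FIG. 8 sketch: detect a change in the physical toy configuration
# and rebuild the composite character from the new set of parts.

def detect_change(previous_ids, current_ids):
    """Return (added, removed) part ids relative to the previous scan."""
    return current_ids - previous_ids, previous_ids - current_ids

previous = {"TORSO-0315", "LEGS-0230"}
current = {"TORSO-0315", "LEGS-0320"}   # player swapped the legs part

added, removed = detect_change(previous, current)
if added or removed:
    print("new parts:", added, "removed parts:", removed)
    # Only the new parts need to be identified; the configuration is then
    # re-resolved (e.g., via a lookup table) and game play continues with
    # the virtual character corresponding to the new toy assembly.
```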
  • FIG. 9 depicts a flow diagram of a process for identification of one or more toy assemblies by the game platform.
  • the toy assemblies comprise two parts—a top part and a bottom part.
  • When a top toy part is properly connected to the bottom toy part, a complete toy assembly is formed.
  • the system will not recognize toy parts that do not comprise a complete toy assembly.
  • the system starts up. At this time, the system is capable of communicating with the toy parts and receiving identification information for toy parts.
  • the system determines if a complete toy assembly is in communication with the game platform. If no complete toy assembly is detected by the system, the system prompts the user to place a complete toy assembly in communication with the game platform at block 930 .
  • the system determines if more than one toy assembly is in communication with the game platform. If only a single complete toy assembly is in communication with the game platform, the system can depict the toy assembly in the game environment for game play in block 950 . If multiple complete toy assemblies come into communication with the game platform asynchronously, the system can determine the respective toy assemblies based on the timing of the communication of the toy parts with the system in block 970 .
  • the game system can determine that the first toy assembly comprises toy part A and toy part B (as opposed to some other combination with toy part X or toy part Y) because toy part A and toy part B are in communication with the system at or about the same time, and toy part X and toy part Y come into communication with the system at a different time.
  • the system may have difficulty identifying which toy parts constitute the respective toy assemblies, since four or more toy parts have been identified by the game platform at or about the same time. In this situation, the user may be prompted to place the toy assemblies in communication with the system at different times in block 980.
  • toy parts may be associated with a player based on the toy part identification number. Therefore, the game can easily recognize two players using the same type of parts and still update each toy's data based on player association.
  • the first toy part may comprise an RFID chip that provides an indication of whether a second toy part is in contact with the first toy part.
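  • The timing-based grouping described for FIG. 9 can be sketched by clustering part arrivals that occur within a short window of one another. The window length and timestamps below are assumptions; the patent states only that parts communicating at or about the same time are treated as one assembly.

```python
# Illustrative FIG. 9 sketch: group toy parts into assemblies by arrival time.
# Parts seen within WINDOW seconds of each other are assumed to form one assembly.

WINDOW = 2.0  # seconds; an assumed threshold

def group_assemblies(arrivals):
    """arrivals: list of (timestamp, part_id), e.g. as reported by the peripheral."""
    assemblies, current, last_time = [], [], None
    for timestamp, part_id in sorted(arrivals):
        if last_time is not None and timestamp - last_time > WINDOW:
            assemblies.append(current)
            current = []
        current.append(part_id)
        last_time = timestamp
    if current:
        assemblies.append(current)
    return assemblies

# Parts A and B placed together, parts X and Y placed later.
print(group_assemblies([(0.0, "A"), (0.4, "B"), (10.0, "X"), (10.3, "Y")]))
# -> [['A', 'B'], ['X', 'Y']]
# If four or more parts arrive at about the same time, grouping is ambiguous
# and the user would be prompted to place the assemblies at different times.
```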
  • FIG. 10 is a flow diagram of a process for defining an animation for a virtual character's interchangeable body part in accordance with the present invention.
  • the process is performed by a computer, and in some embodiments the process is performed by a network of computers.
  • the process begins with a generic virtual skeleton and then modifies the generic virtual skeleton to reflect the specific features of the interchangeable body part.
  • the process then defines the animation for the interchangeable body part using the modified virtual skeleton that has the bones of the interchangeable body part and generic bones corresponding to the remaining generic body parts.
  • FIG. 12 depicts an exemplary generic virtual skeleton in accordance with the present invention.
  • the generic virtual skeleton 1200 of FIG. 12 is in the form of a humanoid with a generic torso body part 1210 and a generic legs body part 1220 , each having a plurality of bones and/or joints.
  • the shape, form, number, and arrangement of the bones and/or joints in the virtual skeleton are merely exemplary, and essentially any form of a generic virtual skeleton may be adopted.
  • the generic virtual skeleton may be the basis for a plurality of virtual characters.
  • the virtual skeletons of a plurality of virtual characters may be derived from the generic virtual skeleton and share one or more of the generic virtual skeleton's bones.
  • the process modifies the generic virtual skeleton to reflect the specific features and form of the virtual character's interchangeable body part.
  • the modification may be made based on inputs provided by an animator or game designer.
  • generic bones may be removed, offset, or otherwise modified such that the bones of the generic virtual skeleton become representative of the bones of the virtual character's interchangeable body part.
  • new bones may be added.
  • the result of the modifications is a modified virtual skeleton that includes the unique set of bones representative of the virtual character's interchangeable body part and the generic bones of the remaining generic body parts.
  • FIG. 13 depicts an exemplary modified virtual skeleton in accordance with the present invention. In the embodiment of FIG. 13, the virtual character has an interchangeable legs body part that takes the form of squid-like tentacles 1320.
  • the generic legs body part of the generic virtual skeleton has been modified to reflect the virtual character's squid-like tentacles.
  • the remainder of the body (i.e., torso 1310 ) remains generic. The process then proceeds to block 1015 of FIG. 10 .
  • the process defines the animation for the interchangeable body part using the modified virtual skeleton.
  • the animation may be defined based on inputs by an animator or game designer.
  • the defined animation controls the bones in the virtual character's interchangeable body part, the generic bones corresponding to the remaining body parts of the generic virtual character, or both.
  • an animator providing information defining an attack animation clip for the virtual character's squid-like legs body part 1320 may provide inputs specifying the animation to control one or more unique tentacle bones.
  • the animator may provide inputs specifying how the animation controls one or more generic bones in the generic torso body part 1310 . As discussed in more detail below with respect to FIG. 11 , the animation clip may be layered onto other virtual characters that share the same generic bones.
  • the ability to layer animations defined for one virtual character onto other virtual characters provides an efficient and effective process for smoothly and cohesively animating composite virtual characters.
  • the process assigns weights to one or more bones in the modified virtual skeleton.
  • the weights may be assigned based on inputs by an animator or game designer. These weights specify the effect of the animation relative to other animations that may be layered on the virtual character. In some embodiments, this step may be skipped, and no weights are assigned.
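  • Taken together, the steps of FIG. 10 (modify the generic skeleton, define the clip over the modified skeleton, and optionally assign weights) might be expressed as the following authoring sketch. The bone names, the tentacle modification, and the weight values are invented for illustration and are not the patent's data.

```python
# Illustrative FIG. 10 sketch: start from a generic skeleton, modify one body
# part's bones into the character-specific form, define a clip over the result,
# and assign per-bone weights for later layering. All names are hypothetical.

GENERIC_SKELETON = {
    "torso": ["spine_01", "spine_02", "arm_l", "arm_r", "head"],
    "legs":  ["hip", "leg_l", "leg_r"],
}

def modify_part(skeleton, part, remove, add):
    """Replace generic bones of one body part with character-specific bones."""
    modified = {p: list(bones) for p, bones in skeleton.items()}
    modified[part] = [b for b in modified[part] if b not in remove] + list(add)
    return modified

# Squid-like tentacle legs replace the generic leg bones; the torso stays generic.
squid_skeleton = modify_part(GENERIC_SKELETON, "legs",
                             remove={"leg_l", "leg_r"},
                             add=["tentacle_01", "tentacle_02", "tentacle_03"])

# The attack clip controls the specific tentacle bones and also some generic
# torso bones, so it can later be layered onto other characters.
attack_clip = {
    "name": "squid_legs_attack",
    "controls": ["tentacle_01", "tentacle_02", "tentacle_03", "spine_01", "arm_l"],
}

# Optional weights specifying this clip's influence when layered with others.
attack_weights = {bone: (1.0 if bone.startswith("tentacle") else 0.5)
                  for bone in attack_clip["controls"]}
print(squid_skeleton, attack_weights, sep="\n")
```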
  • FIG. 11 is a flow diagram of a process for animating composite virtual characters in accordance with the present invention.
  • the process is performed by a game platform, for example as discussed with respect to FIG. 1 .
  • the process determines the body parts of the virtual characters that comprise the composite virtual character.
  • a composite virtual character may be comprised of two body parts, e.g., a torso body part from a first virtual character and a legs body part from a second virtual character.
  • FIG. 14 depicts an exemplary virtual skeleton of a composite virtual character comprising torso 1410 from a first virtual character and legs 1420 from a second virtual character.
  • legs body part of the second virtual character 1420 is squid-like and in the form of tentacles.
  • Torso body part of the first virtual character 1410 is humanoid but one-handed. In some embodiments, these virtual skeletons were defined in accordance with the process defined with respect to FIG. 10 .
  • the process determines whether body parts remain to be processed for the composite virtual character. If no body parts need to be processed for the composite virtual character, the process returns. If body parts need to be processed for the composite virtual character, the process proceeds to block 1115 . Referring to the above example of animating the composite virtual character corresponding to the virtual skeleton of FIG. 14 , the process may determine that the torso and/or legs body parts of the composite virtual character need to be animated and thus proceed to block 1115 . On the other hand, if all body parts have been animated or no animations need to be applied, the process returns.
  • the process selects a body part of the composite virtual character for processing and determines the defined animations corresponding to the selected body part that need to be applied. Referring again to the example of animating the composite virtual character corresponding to the virtual skeleton of FIG. 14 , the process may select for processing the torso body part 1410 of the composite virtual character.
  • the process determines whether animations defined for the selected body part remain to be applied to the composite virtual character. If no animations corresponding to the selected body part remain to be applied to the composite virtual character, the process returns to block 1110 . If animations remain, the process proceeds to step 1125 . Continuing the example of animating the composite virtual character corresponding to the virtual skeleton of FIG. 14 , the process may determine that an attack animation and run animation defined for torso 1410 of the composite virtual character are to be applied to the composite virtual character. Because the process determines that animations remain to be applied, the process proceeds to block 1125 . In some embodiments, the animations defined for a particular body part may be defined in accordance with the process of FIG. 10 .
  • the process selects an animation defined for the selected body part to apply to the composite virtual character.
  • the process then proceeds to block 1130 .
  • the process may select an attack animation defined for torso 1410 of the composite virtual character.
  • the process applies the animation defined for the selected body part to the selected body part.
  • the animation may apply to all of the bones in the selected body part. In other cases, the animation may affect a subset or none of the bones in the selected body part.
  • the process then proceeds to step 1130 .
  • the process may apply the attack animation defined for torso 1410 of the composite virtual character to one or more applicable bones in torso 1410 .
  • the process determines whether generic bones exist in body parts other than the selected body part. If generic bones exist in the other body parts, the process proceeds to block 1140 . If generic bones do not exist in the other body parts, the process returns to block 1120 . Following the example of animating the composite virtual character corresponding to the virtual skeleton of FIG. 14 , the process may determine that no generic bones exist in the squid-like tentacle legs body part of the composite virtual character. Because generic bones do not exist in the legs body part, the process would proceed to block 1120 . On the other hand, if generic bones did exist in the legs body part, the process would proceed to block 1140 .

Abstract

Systems and methods for defining and applying animations to virtual characters with interchangeable body parts that may be combined to form composite virtual characters are disclosed. According to aspects of the invention, a software program running on a gaming platform comprises a plurality of virtual characters, each comprising a plurality of interchangeable body parts. An animation clip may be defined for an interchangeable body part of a first virtual character. The animation clip may be defined for the first virtual character's interchangeable body part using a virtual skeleton that has the bones of the interchangeable body part and the bones of one or more generic body parts. The defined animation clip may control one or more of the bones of the interchangeable body part and one or more of the bones of the one or more generic body parts.

Description

    BACKGROUND OF THE INVENTION
  • The present invention relates generally to animating a virtual character, such as those found in video games or animated movies, and more particularly to animating a virtual character comprised of parts of other virtual characters.
  • Animating a virtual character typically involves building an animated three-dimensional (3D) character model that is rigged with a virtual skeleton. A character rig generally includes the character's virtual skeleton bound to the character's 3D mesh or skin. The virtual skeleton typically includes a plurality of joints and/or bones that can be manipulated to move or deform the model into different poses. The virtual skeleton provides the basic form of the virtual character. For example, a humanoid virtual character may have a virtual skeleton that has bones representing a human form (e.g., a head bone, spine bones, arm/hand bones, leg/feet bones, pelvic bones, etc.). On the other hand, a reptilian virtual character may have a virtual skeleton that has bones representing a reptile (e.g., a tail bone, four leg bones, spine bones, elongated skull and facial bones, etc.). FIG. 12, discussed in greater detail below, depicts a generic humanoid virtual skeleton 1200 comprising a torso body part 1210 and a legs body part 1220 with various bones and joints. Surfaces, which may be used to convey skin, hair, texture, eyes, mouth, etc., may also be added to the virtual character.
  • Traditionally, a given virtual character may have one animation rig and, correspondingly, one virtual skeleton. Animators create animation clips (or animations) for the virtual character that manipulate the virtual character's joints and/or bones into various positions and poses. These animation clips can be used to define the virtual character's movements and behaviors. For example, virtual characters in a video game may have predefined animation clips associated with movements and actions such as idling, walking, running, attacking, jumping, receiving damage, casting spells, climbing, flying, speaking, using items, or any other movement or action. The animation clips may apply to and/or control all or a subset of the virtual skeleton's bones. Often, the animation clips for each virtual character will be used to impart personality to the virtual character. For example, the idling animation for virtual character A may be rigid and upright, suggesting a formal or restricted personality. On the other hand, the idling animation for virtual character B may be slouched and relaxed, suggesting a laid back personality. Virtual characters, as the term is used herein, may be non-human characters and/or objects, including monsters, animals, robots, weapons, clothing, vehicles, or any other in-game characters or objects. The animation clips defined for virtual characters may help portray the unique characteristics of those virtual characters. For example, the idle animations for a monster-like virtual character may include menacing actions like showing his teeth or growling.
  • In a given period of time (or timeline), multiple animations may affect one or more bones of a virtual character's skeleton. This technique is sometimes called layering. For example, virtual character A may be animated by simultaneously applying a running animation clip and a laughing animation clip to the virtual character. The running animation may control all of the bones of virtual character A, while the laughing animation may only control a subset of bones, such as those in the face. By layering these two animation clips, virtual character A appears to be laughing while running. Different weights may be applied to the various animation clips that are layered on a given virtual character's skeleton. The weights define the relative impact of the layered animation clips. For example, the laughing animation clip may be heavily weighted relative to the running animation clip with respect to the virtual character's facial bones, thus allowing the laughing animation to assert more control over the facial bones when the two animations are layered.
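  • As an illustration of layering, the following sketch shows one way per-bone weighted blending of two animation clips could be computed for a single pose. It is a simplified example under assumed conventions: the bone names, scalar rotation values (standing in for full joint transforms), and weight values are hypothetical and are not drawn from this disclosure.

      def blend_layered_pose(base_pose, layer_pose, layer_weights):
          """Blend a layered clip's pose onto a base pose, per bone.

          base_pose and layer_pose map bone names to rotation angles (a
          simplified stand-in for full joint transforms); layer_weights maps
          bone names to a value in [0, 1] giving the layered clip's relative
          influence on that bone.
          """
          blended = {}
          for bone, base_rotation in base_pose.items():
              weight = layer_weights.get(bone, 0.0)
              layer_rotation = layer_pose.get(bone, base_rotation)
              # Linear interpolation: weight 0 keeps the base clip,
              # weight 1 lets the layered clip fully control the bone.
              blended[bone] = (1.0 - weight) * base_rotation + weight * layer_rotation
          return blended

      # Example: a running clip drives the whole body while a laughing clip
      # is heavily weighted only on the facial bones.
      running = {"spine": 10.0, "jaw": 0.0}
      laughing = {"spine": 2.0, "jaw": 25.0}
      weights = {"spine": 0.1, "jaw": 0.9}
      print(blend_layered_pose(running, laughing, weights))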
  • Aspects of the present invention relate to situations in which virtual characters are combinations of interchangeable parts from one or more different and independently animated virtual characters. Such combination characters will be referred to herein as composite virtual characters. An example composite virtual character might have the upper body of a reptilian virtual character and the lower body of a robot virtual character. Of course, this is merely an example, and the number of composite virtual characters is limited only by the number of virtual characters available for combination.
  • Animating composite virtual characters presents certain challenges. To illustrate some of these challenges, consider the case of a composite character that combines the upper body of virtual character A with the lower body of virtual character B. If virtual character A and virtual character B have distinct and conflicting animation clips denoting different behaviors and personalities, simply applying the predefined animation clips for virtual character A and virtual character B to their respective body parts may result in disjointed and unconvincing animations for the composite virtual character. For example, virtual character A's personality may be portrayed as rigid and formal, thus his idle animation clip may be stiff and relatively still. Virtual character B, on the other hand, might be portrayed as excitable and nervous, and thus his idle animation clip may include fidgeting and twitching motions. By simply applying virtual character A's idle animation clip to the composite character's upper body and virtual character B's idle animation clip to the composite character's lower body, the resulting idle animation for the composite character would appear to portray contradictory personalities for the upper and lower bodies and would lack cohesiveness.
  • Another challenge of animating composite virtual characters is properly aligning and layering the animations between the various parts of the combined virtual characters, as Character A and Character B's animations are built for skeletons that have no notion of the final composite skeleton. The attack animation for virtual character A's upper body, for example, may not align properly with the attack animation for virtual character B's lower body. Traditionally, one solution to this problem has been to create a skeleton mapper that maps all bones from a particular animation to fit into the new skeleton rig of a composite character. This process, however, is labor intensive and requires an animator or some other designer to make decisions as to which bone animations will affect the respective bone(s) in the composite skeleton.
  • Another solution to the described challenges is to create animation clips for every conceivable combination of virtual characters from scratch. But this solution is also burdensome and impractical. Furthermore, anytime a new virtual character is introduced, animations for an entire new set of composite virtual characters using this new virtual character must be created.
  • BRIEF SUMMARY OF THE INVENTION
  • In accordance with aspects of the invention, a software program running on a gaming platform comprises a plurality of virtual characters, each comprising a plurality of interchangeable body parts. An animation clip may be defined for an interchangeable body part of a first virtual character. The animation clip may be defined for the first virtual character's interchangeable body part using a virtual skeleton that has the bones of the interchangeable body part and the bones of one or more generic body parts. The defined animation clip may control one or more of the bones of the interchangeable body part and one or more of the bones of the one or more generic body parts.
  • In accordance with aspects of the invention, the plurality of virtual characters comprises a second virtual character with interchangeable body parts. The defined animation clip for the first virtual character's interchangeable body part may be applied to the generic bones of the second virtual character.
  • In accordance with aspects of the invention, the software program running on the gaming platform may command display of a composite virtual character on a display device associated with the gaming platform, the composite virtual character comprising the first interchangeable body part from the first virtual character and a second interchangeable body part from the second virtual character. The software program may control animation of the second interchangeable body part from the second virtual character using the defined animation clip for the first interchangeable body part from the first virtual character.
  • According to aspects of the invention, the first virtual character may correspond to a first toy comprising a plurality of toy parts, and the second virtual character may correspond to a second toy comprising a plurality of toy parts. Toy parts from the first toy and second toy may be connected, combined or assembled together to form a toy assembly representing the composite virtual character. The toy assembly and/or individual toy parts may be configured to communicate with the gaming platform. The toy assembly may communicate with the gaming platform either directly or via a peripheral device. The software program running on the gaming platform may be used to identify the individual toy parts and determine the corresponding toy assembly and composite virtual character. The gaming platform then displays the composite virtual character in a virtual environment on the display device.
  • In some embodiments, a user of the gaming platform can interchange the first toy part and second toy part with additional toy parts from a plurality of virtual characters. The interchanging of toy parts causes a contemporaneous graphical display of the new toy assembly's corresponding composite virtual character. Accordingly, a user can affect the appearance and interaction between the composite virtual character and the virtual environment by modifying the physical toy parts and accessory parts.
  • In some embodiments, each toy part may be configured with an identification tag, such as an RFID tag with a numeric or alphanumeric code providing an identification of the toy part. Each toy part may communicate with the gaming platform, either directly, via a peripheral, via other toy parts, or any combination thereof, to provide the gaming platform with the identification information in the tag. For example, in some embodiments, a peripheral is in communication with the gaming platform. The toy assembly comprising the plurality of toy parts may be placed on or in proximity of the peripheral. The toy part closest to the peripheral may include an antenna for communicating with the peripheral. The other toy parts comprising the toy assembly may communicate with the toy part closest to the peripheral either through wireless transmission or wired transmission.
  • In some embodiments, each toy part includes a rewritable memory. Information relating to the toy part may be stored in the memory. For example, information pertaining to the ownership of the toy part, the use of the toy part in connection with one or more gaming platforms, or attributes of the toy part within the virtual environment may be stored in the memory. For example, as the user uses the toy part in connection with playing a video game on a gaming platform, data relating to accomplishments and challenges overcome by the user in the video game may be stored in the memory of the toy part. As another example, the user may be given opportunities to modify certain virtual attributes associated with one or more toy parts as he or she plays the video game. The stored information may be used in subsequent gaming sessions and across various gaming platforms so that the virtual attributes of each toy part and each accessory part persist.
  • In some embodiments, toy parts may comprise accessories. For example, a toy part may be a weapon, clothing item, hat, shield, armor, shoes or other accessories that may be connected, attached, interlocked with or otherwise combined with a toy assembly having one or more parts.
  • Some aspects of the invention provide a computer-implemented method for animating a composite virtual character, comprising: displaying a composite virtual character comprising a first part from a first virtual character and a second part from a second virtual character, and animating the composite virtual character, wherein animating the composite virtual character comprises substantially simultaneously animating first portions of the first part from the first virtual character and second portions of the second part from the second virtual character using an animation defined for the first virtual character and animating at least first portions of the second part from the second virtual character using an animation defined for the second virtual character.
  • Some aspects of the invention provide a video game system, comprising: a game device having an input device and a processor for executing program code for providing for play of a videogame; a plurality of physical parts from a plurality of toy figures that are physically combinable to form a composite toy assembly, the plurality of physical parts including memory providing a unique identification of each physical part and configured to communicate said unique identification; said program code having instructions for providing a graphical display of a composite virtual character representing said composite toy assembly, said composite virtual character comprised of virtual parts representing physical parts combined to form the composite toy assembly; said program code further having instructions for providing a virtual environment for said composite virtual character; wherein said movement and actions of said composite virtual character are controlled, in response to inputs received by said input device, by at least one animation defined for at least one of said virtual parts and at least one other animation defined for at least one other one of said virtual parts, with the at least one animation defined for at least one of said virtual parts completely controlling movement and actions of some elements of said composite virtual character and partially controlling movement and actions of other elements of said composite virtual character.
  • Some aspects of the invention provide a computer implemented method including animating a character comprised of different portions derived from other characters, some of the different portions including body elements of a first type and some of the different portions including body elements of a second type, the method comprising: applying, for each of the different portions including body elements of the first type, an animation routine defined for the different portion for the other character from which the different portion was derived for the body elements of the first type; and applying in a weighted manner, for each of the different portions including body elements of the second type, the animation routines defined for the different portion for the other characters for the body elements of the second type.
  • Some aspects of the invention provide a method of animating a character defined by a combination of elements of other characters, some of whose elements are considered generic to a plurality of characters and some of whose elements are considered specific to each of the other characters, with no animation routines being predefined for the character but with animation routines being predefined for the other characters, the method comprising: determining characters serving as a source of elements of the character; receiving a command for display of a particular animation routine for the character; generating the particular animation routine for the character by: applying corresponding character specific predefined animation routines to elements of the character that are considered specific to the characters serving as the source of elements of the character, and applying weighted sums of the corresponding character specific predefined animation routines to elements of the character that are considered generic to the characters serving as the source of elements of the character.
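  • To make the weighted-sum rule recited above concrete, one plausible reading (introduced here purely for illustration, with hypothetical notation) is that, for each element e of the character that is considered generic to the set S of source characters, the applied pose is

      P(e) = \frac{\sum_{i \in S} w_i(e) \, A_i(e)}{\sum_{i \in S} w_i(e)}

    where A_i(e) is the pose that character i's predefined animation routine prescribes for element e and w_i(e) is the weight assigned to that routine for the element; the normalization by the total weight is an assumed convention rather than a requirement of this disclosure. For an element specific to a single source character, only that character's routine contributes, which reduces to applying that character's predefined routine directly, as recited in the first applying step above.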
  • These and other aspects of the invention are more fully comprehended upon review of this disclosure.
  • BRIEF DESCRIPTION OF THE FIGURES
  • FIG. 1 is a block diagram illustrating an example of a video game system in accordance with aspects of the present invention;
  • FIG. 2A depicts an example of a toy assembly for use in conducting a video game in accordance with aspects of the present invention;
  • FIG. 2B depicts an example of a toy assembly for use in conducting a video game in accordance with aspects of the present invention;
  • FIG. 2C depicts an example of a composite toy assembly for use in conducting a video game in accordance with aspects of the present invention;
  • FIG. 3 is a diagram depicting electronic components of toy parts in accordance with aspects of the present invention;
  • FIG. 4 is a diagram depicting electronic components of toy parts in accordance with aspects of the present invention;
  • FIG. 5 illustrates an example of a video game system in accordance with aspects of the invention;
  • FIG. 6 is a flow diagram of a process for selecting and communicating with toy parts in accordance with aspects of the present invention;
  • FIG. 7 is a flow diagram of a process for conducting video game play in accordance with aspects of the present invention;
  • FIG. 8 is a flow diagram of a process for conducting video game play in accordance with aspects of the present invention;
  • FIG. 9 depicts a flow diagram of a process for identification of one or more toy assemblies by the game platform;
  • FIG. 10 depicts a flow diagram of a process for defining an animation for a virtual character's interchangeable body part in accordance with the present invention;
  • FIG. 11 depicts a flow diagram of a process for animating composite virtual characters in accordance with aspects of the present invention;
  • FIG. 12 depicts an exemplary generic virtual skeleton in accordance with the present invention;
  • FIG. 13 depicts an exemplary modified virtual skeleton in accordance with the present invention; and
  • FIG. 14 depicts an exemplary virtual skeleton of a composite virtual character in accordance with the present invention.
  • DETAILED DESCRIPTION
  • FIG. 1 is a block diagram illustrating an example of a video game system in accordance with aspects of the present invention. Referring to FIG. 1, the video game system 100 includes game system 140 and a toy assembly comprised of a plurality of component toy parts 120 a-n. The toy assembly may consist of toy parts associated with a single character, or the toy assembly may consist of toy parts from a plurality of characters (referred to as a “composite toy assembly”). The toy parts may be physically combined, coupled, connected or otherwise adjoined to create a toy assembly. In some embodiments, the toy parts may be coupled in an interlocked fashion to create a toy assembly, for example via a physical locking mechanism, electromagnetic or other locking mechanism. In various embodiments the toy parts 120 a-n may be connected by a force, for example a physical or electromagnetic force, such as by way of interlocking physical components, frictional fittings, or magnetic couplings, or by way of other known connections.
  • Each of the toy parts 120 a-n may include a rewritable data storage component, such as RAM or a rewritable RFID tag. The memory or tag may store data reflecting the identification of the toy part. In addition, in various embodiments the memory may store other data corresponding to a portion of a character or other object within the game executed on game platform 140 which the toy part represents. The other data may include data such as strength, experience, wealth, health, ownership, achievements, activity level, use or other game play data of the portion of the character or other object. For example, if the toy part corresponds to an arm of a character in game play, the memory of the toy part may store information regarding strength or health of the arm. In some embodiments the memory may store other data, for example the other data mentioned above, with respect to a character or object as a whole, and in some embodiments all toy parts which in combination correspond to the character or object may store some or all of such information. The memory may be rewritable so that the stored attributes and characteristics of the toy parts may be updated during each game session and utilized in subsequent game sessions.
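  • Purely as an illustration of the kind of per-part data described above, the sketch below lays out a hypothetical record that a toy part's rewritable memory might hold; the field names and values are assumptions introduced for explanation, not a format defined by this disclosure.

      from dataclasses import dataclass, field

      @dataclass
      class ToyPartRecord:
          """Illustrative layout of data a toy part's rewritable memory might hold.

          The field names are hypothetical; the disclosure only requires an
          identification of the part plus optional game-play data such as
          strength, experience, health, ownership, or achievements.
          """
          part_id: str                                        # unique identification of the toy part
          part_type: str                                      # e.g. "torso", "legs", "weapon"
          owner: str = ""                                      # ownership information
          attributes: dict = field(default_factory=dict)       # e.g. {"strength": 12, "health": 80}
          achievements: list = field(default_factory=list)     # accomplishments recorded during play

      # Example: an arm part whose strength is updated at the end of a session.
      arm = ToyPartRecord(part_id="0451-AC", part_type="arm", owner="player1")
      arm.attributes["strength"] = 12
      arm.achievements.append("defeated castle boss")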
  • The game platform 140 is a system for executing game software and in various embodiments may comprise a device such as a personal computer, laptop, tablet, game console, portable game platform, or mobile device, or in some embodiments one or more devices in communication with one or more servers. In some embodiments the game platform 140 comprises a processor for executing program instructions providing for game play and associated circuitry, a video game controller 180, a display device 170, and in some embodiments a peripheral device (not shown in FIG. 1) for communicating with a toy or toy parts.
  • The game platform 140 may connect or be coupled to a display device or have a display device integrated with or within the game platform for displaying graphics associated with the game operating on the game platform 140. The instructions providing for game play may be stored on removable media, for example, an optical disk or cartridge, or otherwise stored in memory of the game platform. Accordingly, the game platform, for example a game console, may include an optical drive, for example, a DVD-ROM drive, for reading the instructions for game play. In other embodiments, the instructions providing for game play may be stored on a remote server and accessed by a game platform, for example a computer, PC, game console, or mobile device. In yet other embodiments, the instructions providing for game play may be stored locally in the game device memory.
  • The toy parts 120 a-n may communicate with game platform 140 directly or via a peripheral device. In some embodiments a first toy part 120 a may communicate information to second toy part 120 b and the second toy part 120 b may communicate information relating to both first toy part 120 a and second toy part 120 b to game platform 140, either directly or via peripheral 130 as depicted in FIG. 1. Similarly, in some embodiments multiple toy parts may communicate information to the second toy part, either directly or through one or more intervening toy parts, with the second toy part communicating information to the game platform, either directly or through the peripheral. In alternative embodiments, the toy parts 120 a-n communicate with game platform 140 independently.
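  • A minimal sketch of the relaying arrangement described above is shown below, assuming a hypothetical object model in which the toy part nearest the peripheral gathers and forwards the identifiers of the parts attached to it; the class and method names are illustrative only.

      class ToyPart:
          """Toy part that can hold its own ID and relay IDs from attached parts."""
          def __init__(self, part_id):
              self.part_id = part_id
              self.attached = []          # parts that communicate through this one

          def collect_ids(self):
              # Gather this part's ID plus the IDs of every part that relays
              # its information through this part, directly or indirectly.
              ids = [self.part_id]
              for part in self.attached:
                  ids.extend(part.collect_ids())
              return ids

      def report_to_platform(gateway_part):
          """The part nearest the peripheral forwards all collected IDs."""
          return {"assembly_ids": gateway_part.collect_ids()}

      # Example: legs relay through the torso, which talks to the peripheral.
      torso = ToyPart("torso-200")
      legs = ToyPart("legs-250")
      torso.attached.append(legs)
      print(report_to_platform(torso))   # {'assembly_ids': ['torso-200', 'legs-250']}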
  • FIGS. 2A and 2B depict examples of toy assemblies for use in conducting a video game in accordance with aspects of the present invention. The toy assemblies depicted in FIGS. 2A and 2B each consist of toy parts for a single character. FIG. 2A depicts a toy assembly 200 configured as a reptilian toy figure. FIG. 2B depicts a toy assembly 250 configured as a robot toy figure. Of course, toy assemblies 200 and 250 could instead be configured as an action figure, a robot figure, a vehicle, a humanoid figure, a monster figure, or other toy figure. Toy assemblies 200 and 250 of FIGS. 2A and 2B each include two toy parts: a torso 220, 260 and legs 230, 270. Although two toy parts are shown, the number and type of toy parts are exemplary only and should not be considered as limiting. For example, the head and/or arms included in torso 220, 260 and the tail 240 included in the legs 230 may also be provided as separate toy parts.
  • FIG. 2C depicts an example of a composite toy assembly for use in conducting a video game in accordance with aspects of the present invention. As discussed, composite toy assemblies consist of toy parts from a plurality of characters. Composite toy assembly 280 includes two toy parts: a torso 290 and legs 295. In this particular example, composite toy assembly 280 combines the torso of toy assembly 200 and the legs of toy assembly 250.
  • Each of the different toy parts may be part of a class of toy parts for use in various toy assemblies. That is, a toy assembly may be configured according to preference using a plurality of interchangeable torso parts and a plurality of interchangeable leg parts. For example, either torso 220 or torso 260 may be replaced with a different torso from a different character to create a new composite character. In some embodiments, the toy parts comprise accessories or other objects to be used by the toy character. For example, a toy part may comprise a weapon, shield, tool, clothing, accoutrements or other item.
  • The toy parts may be physically combined, coupled, connected or otherwise adjoined to create a toy assembly. In some embodiments, the toy parts may be coupled in an interlocked fashion to create a toy assembly, for example via a physical locking mechanism, electromagnetic mechanism or other locking mechanism. In some embodiments, the connectors for each of the toy parts may be configured so as to restrict connection of toy parts, for example, to restrict use of a torso toy part to replace a legs toy part.
  • Each toy part includes machine-readable information, for example, memory, a radio frequency identification (RFID) tag or a barcode. The machine-readable information may be sensed, read, and/or in some embodiments written, directly by a game console, or in some embodiments indirectly by way of sending data and commands to the toy to write the data to memory of the toy parts. The machine-readable information may include a numeric identifier. In some embodiments, the communication with the toy may be conducted via a peripheral, such as a reader. The machine-readable information allows the reader, or the processor of the game console, to distinguish one toy part from other toy parts, and the machine-readable information may therefore be considered to include a toy part identifier, and in some embodiments, each particular toy part may have its own distinct identifier. In addition, in some embodiments the machine-readable information includes additional information related to player achievement in a video game when the part is in use.
  • FIG. 3 is a diagram depicting an embodiment of the electronic components of toy parts in connection with the present invention. First toy part 310 comprises an RFID tag 315. RFID tag 315 utilizes a wireless system that uses radio-frequency electromagnetic fields to transfer data from (and in various embodiments to) the tag, for example for purposes of automatic identification and tracking. Some tags require no battery and are powered by the electromagnetic fields used to read them. Others use a local power source and emit radio waves (electromagnetic radiation at radio frequencies).
  • RFID tag 315 contains numerical information for identifying first toy part 310. First toy part 310 may be physically coupled to a second toy part 320. Second toy part 320 includes a circuit 325, for example an inductor circuit, for receiving the RFID electromagnetic field from RFID tag 315 in first toy part 310. When first toy part 310 and second toy part 320 are sufficiently proximate to one another or in contact with one another, the numerical information in RFID tag 315 is transmitted to the inductor circuit 325. Inductor circuit 325 is electronically coupled to an interface 327, such as a near field transmitter, in second toy part 320. Interface 327 communicates with peripheral 330. The near field transmitter may also be an RFID tag, in some embodiments.
  • The peripheral 330 includes a radio-frequency interface 335 to communicate with toys and/or toy parts. In many embodiments, the radio-frequency interface is an RFID interface. In other embodiments, the peripheral may include a different interface for communicating with toys, such as an optical interface or a wired interface. Further in some embodiments the toy may include a wired connection to the peripheral device, or in some embodiments, a wired connection to the game platform, possibly dispensing with the peripheral device. Similarly, in some embodiments the toy may include wireless communication capabilities of the type commonly used with computers, for example Bluetooth, NFC or Wi-Fi capabilities. The peripheral 330 may then transmit the information received from RFID tag 315 associated with first toy part 310 and information received from an RFID tag in second toy part 320 to a game platform utilizing antenna 340.
  • FIG. 4 is a diagram depicting an embodiment of the electronic components of toy parts in connection with the present invention. First toy part 410 comprises an RFID tag or a storage device. The RFID tag contains numerical information for identifying first toy part 410. First toy part 410 may be physically coupled to second toy part 420. First toy part 410 includes one or more plugs or connectors 430 that fit into a receptacle(s) 440 on second toy part 420. In some embodiments, the connector(s) 430 and receptacle(s) 440 may be used to provide an electrical connection between the two toy parts to enable the transmission of data between the connected toy parts. In some embodiments, the first toy part 410 and second toy part 420 may utilize magnetic connectors to maintain contact between the toy parts. For example, the first toy part 410 and second toy part 420 may each have a magnetic element. When the connector(s) 430 and receptacle(s) 440 are brought into proximity, the magnetic attraction between the magnet and its complement, whether another magnet or a ferromagnetic material, maintains the toy parts in contact with one another. In embodiments in which the connectors are used to provide electrical connection between the toy parts, the magnetic elements may maintain the contacts in an electrically conductive relationship. When the connector(s) 430 associated with first toy part 410 are in contact with the receptacle(s) 440 associated with second toy part 420, data from a memory device or numerical information in an RFID tag in first toy part 410 may be transmitted to second toy part 420 for subsequent transmission to a game platform, and/or in some embodiments vice versa. In some embodiments the strength of the transmitters used for communication between the two toy parts is selected to be sufficiently low to require contact between the toy parts to allow for successful communication between the toy parts. Such a configuration may be beneficial, for example, to reduce or eliminate interference with other communications to the game platform or a peripheral, or receipt of extraneous communications by same.
  • FIG. 5 illustrates an example of a video game system in accordance with aspects of the invention. The video game system includes a game console 550 with a processor for executing program instructions providing for game play and associated circuitry, user input devices such as a game controller 555, a display device 560 for displaying game action, a peripheral device 540, and a toy assembly 575. Toy assembly 575 is comprised of a plurality of interconnected toy parts, including head part 575 a, torso part 575 b, arm parts 575 c, leg parts 575 d, and tail part 575 e, each of which includes memory storing identification information.
  • The peripheral device 540 may provide the capability to read and write information to the toy assembly 575 and/or its component toy parts. The processor, responsive to inputs from the user input devices and the peripheral device, generally commands display on the display device of game characters in and interacting with a virtual world of game play and possibly each other. In addition, the processor, responsive to inputs from the peripheral device, may be used to add characters and objects to the virtual world, with the characters able to manipulate the added objects and move about the virtual world. For example, the processor may include characters in game play based on inputs from the peripheral device, and the processor may control actions and activities of game characters based on inputs from the user input devices.
  • The instructions providing for game play are generally stored on removable media, for example, an optical disk. Accordingly, the game console may include an optical drive, for example, a DVD-ROM drive, for reading the instructions for game play. In some embodiments, the game console may be a personal computer, including similar internal circuitry as herein described, as well as, for example, a built-in display and built-in user input devices, such as a keyboard and a touch pad. In other embodiments, the instructions providing for game play may be stored on a remote server and accessed by a computer or mobile device. In yet other embodiments, the instructions providing for game play may be stored locally in the game device memory.
  • The display device is generally coupled to the game platform by a cable, although in some embodiments a wireless connection may be used. In many embodiments, the display device is a liquid crystal display. In some embodiments, the display device is a television. In some embodiments, the display device is a cathode ray display, a plasma display, an electroluminescent display, an LED or OLED display, or other display. A display screen 570 of the display device displays video images of game play, generally as commanded by the processor or other associated circuitry of the game platform. In the embodiment of FIG. 5, the display screen shows a screen shot of video game play. As illustrated, the screen shot shows a display of a character, generally controlled by and animated in accordance with user inputs, approaching an inanimate item in the form of what may be considered a castle.
  • The peripheral device, in some embodiments and as shown in FIG. 5, has a substantially flat upper surface for placement of toys thereon. The game player generally places game toys, for example, toy assembly 575 in the form and representative of a dragon as shown in FIG. 5, on the flat surface of the peripheral device during game play. The toy assembly 575 is generally in the form of and representative of a game item such as a game character or other game item. In several embodiments, the toy assembly is associated with a game character during game play.
  • Peripheral 540 includes a surface 545 where toy assembly 575 may be placed. Peripheral 540 may be coupled with a game platform 550 either through a wired or wireless connection. Game platform 550 may be any form of game platform, such as game console (e.g., Xbox, Playstation, Wii, NDS), computer, mobile device or other device for executing game software either locally or from a server. The game platform 550 executes software for a video game. The game platform 550 may be connected to a display 560. In other embodiments, a display may be incorporated into the game platform 550, such as in mobile devices or portable computer devices.
  • The display 560 provides for the visual display of graphics associated with the game 570. A software program running on the game platform 550 allows the game platform 550 to identify the individual toy parts and determine the corresponding toy assembly 575. The game platform 550 then displays graphically a virtual character representing the toy assembly 575 comprised of the toy parts assembled or combined together. In some embodiments, the toy assembly 575 may be a composite toy assembly comprised of toy parts from different toy figures. In such embodiments, the corresponding virtual character representing the toy assembly 575 would be a composite virtual character. The virtual character or composite virtual character may be displayed in a virtual environment on a display device 560 associated with the game platform 550. The toy parts interact dynamically with the software program so that the virtual character representing the toy on the display device corresponds to the physical appearance of the toy assembly. The user can interchange toy parts with a contemporaneous graphical display of the corresponding virtual character. Accordingly, a user can affect in real time the appearance and interaction between the virtual character and the virtual environment by modifying the physical toy parts and accessory parts.
  • A user may control the movements of the virtual character (or composite virtual character) in the game using a controller 555. The controller 555 may be separate from the game platform 550 or integrated therein.
  • Each toy part 575 a-e may include a memory or tag for identifying the part. For example, in some embodiments, each part 575 a-e includes an RFID tag with a numerical code to uniquely identify the part. The information pertaining to the identification of each part 575 a-e may be communicated to the game platform 550 through the peripheral 540. In alternative embodiments, the toy parts 575 a-e may communicate with the game platform 550 directly. In still other embodiments, the toy parts 575 a-e may communicate with each other and provide combined information to the game platform 550 either directly or through a peripheral 540. In other embodiments, each toy part includes a rewritable memory. Information relating to the toy part may be stored in the memory. For example, information pertaining to the ownership of the toy part, the use of the toy part in connection with one or more game platforms, achievements accomplished in the game while using the toy part, or attributes of the toy part within the virtual environment may be stored and updated in the memory. For example, as the user uses the toy part in connection with playing a video game on a game platform, data relating to accomplishments and challenges overcome by the user in the video game may be stored in the memory of the toy part. As another example, the user may be given opportunities to modify certain virtual attributes associated with one or more toy parts as he or she plays the video game. The stored information may be used in subsequent game sessions and across various game platforms so that the virtual attributes of each toy part and each accessory part persist.
  • FIG. 6 is a flow diagram of a process for selecting and communicating with toy parts in accordance with aspects of the present invention. In some embodiments the process is performed by a game platform, for example as discussed with respect to FIG. 1. At block 605, the process identifies toy parts. In some embodiments, the process may identify toy parts within a defined region. For example, the process may determine what toy parts are on the surface of a video game peripheral as shown in FIG. 5. In various embodiments, the toy parts may be identified by RFID, barcodes, or optical recognition. In one embodiment, identification of toy parts includes a video game peripheral reading identifiers of the toys and supplying the identifiers to a video game console.
  • In block 610, the process selects a toy part for communication. In some embodiments, the process may select multiple toy parts of a toy assembly for communication. The process may select the toy part by transmitting a selection command having an identifier matching the identifier of the toy part. In many embodiments, the process expects to receive an acknowledgment of the selection from the toy part. When an acknowledgment is not received, the process may retransmit the selection command or may signal a video game associated with the process that the selected toy is not available.
  • In block 615, the process configures a virtual character. The process may configure the virtual character based on the identified parts. In some embodiments, the identified parts may be from different characters. In some embodiments, the process may configure the virtual character based on configuration information indicating how the identified toy parts are connected. For example, the configuration information may include the identification of coupled toy parts and information regarding the connector and receptacle through which the toy parts are coupled.
  • In block 620, the process communicates with the toy parts. For example, the process may read from a particular memory location of the toy parts or may write to a particular memory location of the toy parts. In various embodiments the process communicates with the toy parts during game play, for example communications relating to the presence of a corresponding virtual character in the game or changes to the state of the virtual character. In many embodiments, the process expects to receive an acknowledgment or response from the toy parts, and when not received, the process may retransmit the command or may signal the video game associated with the process that the selected toy part is not available. The process thereafter returns.
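  • The sketch below outlines the flow of FIG. 6 in simplified form; the peripheral object and its methods (identify_parts, select, read, write, report_unavailable) are hypothetical stand-ins introduced for illustration rather than an API defined by this disclosure.

      def configure_virtual_character(part_id):
          # Placeholder for block 615: in a real game this would add the
          # identified part's virtual counterpart to the displayed character.
          print(f"adding virtual body part for {part_id}")

      def select_and_communicate(peripheral, max_retries=2):
          """Sketch of the FIG. 6 flow; the peripheral and its methods are assumed."""
          for part_id in peripheral.identify_parts():            # block 605: identify toy parts
              # Block 610: select the part and expect an acknowledgment, retrying a few times.
              acknowledged = any(peripheral.select(part_id) for _ in range(max_retries + 1))
              if not acknowledged:
                  peripheral.report_unavailable(part_id)          # signal the game the part is missing
                  continue
              configure_virtual_character(part_id)                # block 615: configure the character
              data = peripheral.read(part_id)                     # block 620: read/write part memory
              peripheral.write(part_id, data)

      class FakePeripheral:
          """Stand-in used only to exercise the sketch."""
          def identify_parts(self):
              return ["torso-200", "legs-250"]
          def select(self, part_id):
              return True
          def report_unavailable(self, part_id):
              print(f"{part_id} unavailable")
          def read(self, part_id):
              return {"health": 80}
          def write(self, part_id, data):
              pass

      select_and_communicate(FakePeripheral())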
  • FIG. 7 is a flow diagram of a process for conducting video game play in accordance with aspects of the present invention. In some embodiments the process is performed by a game platform, for example as discussed with respect to FIG. 1. In block 705, the process requests toy part identification. In some embodiments, the process may identify toy parts within a defined region. For example, the process may determine what toy parts are on the surface of a video game peripheral as shown in FIG. 5. In various embodiments, the toy parts may be identified by RFID, barcodes, or optical recognition. In one embodiment, identification of toy parts includes a video game peripheral reading identifiers of the toys and supplying the identifiers to a video game console.
  • In block 710, the process determines a toy configuration based on the toy part identifications. In some embodiments, the process may use a lookup table or other database to determine a configuration based on the toy parts identified. In some embodiments, the process may communicate with the toy parts to receive connection information indicating the other parts a particular toy part is connected to and an indication of which connector of the toy part is used to make such connection.
  • At block 725, the process may generate a virtual character or composite virtual character corresponding to a physical toy assembly including each of the identified toy parts.
  • In block 730, the process conducts video game play using the virtual character or composite virtual character. As the virtual toy is used to progress through the video game, data relating to accomplishments and challenges overcome by the user in the video game may be stored in the memory of the toy parts of the toy assembly.
  • Thereafter the process returns.
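  • The following sketch illustrates one way the lookup performed at blocks 710 and 725 might be organized; the table contents and part identifiers are hypothetical, and a lookup table is only one of the approaches the disclosure contemplates.

      # Hypothetical lookup table mapping part identifiers to (character, part type).
      PART_TABLE = {
          "0200-T": ("reptile", "torso"),
          "0200-L": ("reptile", "legs"),
          "0250-T": ("robot", "torso"),
          "0250-L": ("robot", "legs"),
      }

      def determine_configuration(identified_ids):
          """Block 710: map identified toy parts to (character, part type) entries."""
          return [PART_TABLE[i] for i in identified_ids if i in PART_TABLE]

      def generate_virtual_character(configuration):
          """Block 725: the character is composite if its parts span several source characters."""
          source_characters = {character for character, _ in configuration}
          kind = "composite virtual character" if len(source_characters) > 1 else "virtual character"
          return {"kind": kind, "parts": configuration}

      # Example: a reptile torso combined with robot legs yields a composite character.
      print(generate_virtual_character(determine_configuration(["0200-T", "0250-L"])))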
  • FIG. 8 is a flow diagram of a process for conducting video game play in accordance with aspects of the present invention. In some embodiments the process is performed by a game platform, for example as discussed with respect to FIG. 1. In block 805, the process determines a toy configuration based on toy parts identified. In some embodiments, the process may use a lookup table or other database to determine a configuration based on the toy parts identified. In some embodiments, the process may communicate with the toy parts to receive connection information indicating the other parts a particular toy part is connected to and an indication of which connector of the toy part is used to make such connection.
  • In block 810, the process detects a change in the configuration of the physical toy. In some embodiments, a change may be detected when an identified toy part is removed from a defined area, for example, a surface of peripheral 550 shown in FIG. 5.
  • In block 815, the process may receive toy part identification information. In some embodiments, the process may identify toy parts located in a predefined region. In some embodiments, the process may determine the toy part identification only for the new toy parts added.
  • In block 820, the process may determine a new toy configuration. In some embodiments, the process may use a lookup table or other database to determine a configuration based on the toy parts identified including the new toy part(s). In some embodiments, the process may communicate with the toy parts to receive connection information indicating the other parts a particular toy part is connected to and an indication of which connector of the toy part is used to make such connection.
  • In block 825, the process may conduct game play with a virtual character or composite virtual character corresponding to the new toy assembly. Thereafter, the process returns.
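  • As a simplified sketch of the change detection described with respect to FIG. 8, the routine below re-identifies the parts in the defined region and reports when the set differs from what is currently displayed; the peripheral call it relies on is an assumption for illustration.

      def monitor_toy_configuration(peripheral, displayed_ids):
          """Sketch of the FIG. 8 flow: react to toy parts being added or removed.

          peripheral.identify_parts() is an assumed call returning the part
          identifiers currently in the defined region, e.g. on the portal surface.
          """
          detected = set(peripheral.identify_parts())       # blocks 810/815: re-identify parts
          if detected != set(displayed_ids):                # a part was added or removed
              # Blocks 820/825: determine the new toy configuration and swap the
              # displayed (composite) virtual character to match it.
              print("configuration changed:", sorted(detected))
              return detected
          return set(displayed_ids)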
  • FIG. 9 depicts a flow diagram of a process for identification of one or more toy assemblies by the game platform. In the present embodiment, the toy assemblies comprise two parts: a top part and a bottom part. When a top toy part is properly connected to the bottom toy part, a complete toy assembly is formed. In some embodiments, the system will not recognize toy parts that do not comprise a complete toy assembly. In block 910, the system starts up. At this time, the system is capable of communicating with the toy parts and receiving identification information for the toy parts.
  • In block 920, the system determines if a complete toy assembly is in communication with the game platform. If no complete toy assembly is detected by the system, the system prompts the user to place a complete toy assembly in communication with the game platform at block 930.
  • In block 940, the system determines if more than one toy assembly is in communication with the game platform. If only a single complete toy assembly is in communication with the game platform, the system can depict the toy assembly in the game environment for game play in block 950. If multiple complete toy assemblies come into communication with the game platform asynchronously, the system can determine the respective toy assemblies based on the timing of the communication of the toy parts with the system in block 970. For example, if a first complete toy assembly comprising toy part A and toy part B and a second complete toy assembly comprising toy part X and toy part Y are in communication with the game system, the game system can determine that the first toy assembly comprises toy part A and toy part B (as opposed to some other combination with toy part X or toy part Y) because toy part A and toy part B are in communication with the system at or about the same time, and toy part X and toy part Y come into communication with the system at a different time. If, however, the first toy assembly and second toy assembly come into communication with the system at or about the same time, as determined in block 960, the system may have difficulty identifying which toy parts constitute the respective toy assemblies, since four or more toy parts have been identified by the game platform at or about the same time. In this situation, the user may be prompted at block 980 to place the toy assemblies in communication with the system at different times.
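  • The timing-based grouping described above might be sketched as follows; the time window used to decide that parts arrived "at or about the same time" is a hypothetical tolerance, not a value given in this disclosure.

      def group_parts_into_assemblies(arrival_times, window=2.0):
          """Sketch of the FIG. 9 idea: parts appearing at about the same time form one assembly.

          arrival_times maps a part identifier to the time (in seconds) at which it
          came into communication with the platform; window is a hypothetical
          tolerance for "at or about the same time".
          """
          assemblies = []
          for part_id, t in sorted(arrival_times.items(), key=lambda item: item[1]):
              if assemblies and t - assemblies[-1]["time"] <= window:
                  assemblies[-1]["parts"].append(part_id)              # same assembly as before
              else:
                  assemblies.append({"time": t, "parts": [part_id]})   # start a new assembly
          return [a["parts"] for a in assemblies]

      # Parts A and B arrive together; parts X and Y arrive later, so two assemblies
      # are resolved. Had all four arrived simultaneously, the grouping would be
      # ambiguous and the user would be prompted to retry.
      print(group_parts_into_assemblies({"A": 0.1, "B": 0.3, "X": 10.0, "Y": 10.2}))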
  • After recognition and identification by the system, toy parts may be associated with a player based on the toy part identification number. Therefore, the game can easily recognize two players using the same type of parts and still update each toy's data based on player association.
  • In other embodiments, more sophisticated RFID chips may be utilized to provide communication between the various toy parts and the game platform. For example, the first toy part may comprise an RFID chip that provides an indication of whether a second toy part is in contact with the first toy part.
  • As discussed above, animating a composite virtual character as described herein presents certain challenges. In accordance with aspects of the invention, a process for defining animations for virtual characters that may be smoothly and cohesively applied to composite virtual characters is described.
  • FIG. 10 is a flow diagram of a process for defining an animation for a virtual character's interchangeable body part in accordance with the present invention. In some embodiments the process is performed by a computer, and in some embodiments the process is performed by a network of computers. As will be discussed in more detail below, the process begins with a generic virtual skeleton and then modifies the generic virtual skeleton to reflect the specific features of the interchangeable body part. The process then defines the animation for the interchangeable body part using the modified virtual skeleton that has the bones of the interchangeable body part and generic bones corresponding to the remaining generic body parts.
  • In block 1005, the process begins by defining a generic virtual skeleton corresponding to a generic virtual character. FIG. 12 depicts an exemplary generic virtual skeleton in accordance with the present invention. The generic virtual skeleton 1200 of FIG. 12 is in the form of a humanoid with a generic torso body part 1210 and a generic legs body part 1220, each having a plurality of bones and/or joints. However, the shape, form, number, and arrangement of the bones and/or joints in the virtual skeleton are merely exemplary, and essentially any form of a generic virtual skeleton may be adopted. The generic virtual skeleton may be the basis for a plurality of virtual characters. Thus, as will be appreciated more fully below, the virtual skeletons of a plurality of virtual characters may be derived from the generic virtual skeleton and share one or more of the generic virtual skeleton's bones.
  • Returning to FIG. 10, in block 1010, the process modifies the generic virtual skeleton to reflect the specific features and form of the virtual character's interchangeable body part. In some embodiments, the modification may be made based on inputs provided by an animator or game designer. In some embodiments, generic bones may be removed, offset, or otherwise modified such that the bones of the generic virtual skeleton become representative of the bones of the virtual character's interchangeable body part. In addition, or in the alternative, new bones may be added. The result of the modifications is a modified virtual skeleton that includes the unique set of bones representative of the virtual character's interchangeable body part and the generic bones of the remaining generic body parts. FIG. 13 depicts an exemplary modified virtual skeleton in accordance with the present invention. In the embodiment of FIG. 13, the virtual character has an interchangeable legs body part that takes the form of squid-like tentacles 1320. Thus, the generic legs body part of the generic virtual skeleton has been modified to reflect the virtual character's squid-like tentacles. The remainder of the body (i.e., torso 1310) remains generic. The process then proceeds to block 1015 of FIG. 10.
  • In block 1015, the process defines the animation for the interchangeable body part using the modified virtual skeleton. In some embodiments, the animation may be defined based on inputs by an animator or game designer. In some embodiments, the defined animation controls the bones in the virtual character's interchangeable body part, the generic bones corresponding to the remaining body parts of the generic virtual character, or both. Following the above example, an animator providing information defining an attack animation clip for the virtual character's squid-like legs body part 1320 may provide inputs specifying the animation to control one or more unique tentacle bones. In addition, or in the alternative, the animator may provide inputs specifying how the animation controls one or more generic bones in the generic torso body part 1310. As discussed in more detail below with respect to FIG. 11, by specifying how the animation clip controls the generic bones in generic torso 1310, the animation clip may be layered onto other virtual characters that share the same generic bones. The ability to layer animations defined for one virtual character onto other virtual characters provides an efficient and effective process for smoothly and cohesively animating composite virtual characters.
  • In block 1020, the process assigns weights to one or more bones in the modified virtual skeleton. In some embodiments, the weights may be assigned based on inputs by an animator or game designer. These weights specify the effect of the animation relative to other animations that may be layered on the virtual character. In some embodiments, this step may be skipped, and no weights are assigned.
  • The process then returns.
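  • A compact sketch of the FIG. 10 flow, under assumed data layouts, is shown below: a generic skeleton is modified for one interchangeable body part, and an animation clip for that part is defined so that it keyframes both the part's unique bones and a shared generic bone, with per-bone weights. All bone names, keyframe values, and weights are hypothetical.

      # Generic skeleton (block 1005): body parts mapped to generic bone names.
      GENERIC_SKELETON = {
          "torso": ["spine", "left_arm", "right_arm", "head"],
          "legs": ["left_leg", "right_leg"],
      }

      def modify_body_part(skeleton, part, new_bones):
          """Block 1010: replace a generic body part's bones with character-specific ones."""
          modified = {name: list(bones) for name, bones in skeleton.items()}
          modified[part] = list(new_bones)
          return modified

      # The squid character's interchangeable legs replace the generic leg bones.
      squid_skeleton = modify_body_part(GENERIC_SKELETON, "legs",
                                        ["tentacle_1", "tentacle_2", "tentacle_3"])

      # Blocks 1015/1020: the attack clip defined for the squid legs keyframes the
      # tentacle bones and also the generic spine bone, with per-bone weights, so the
      # clip can later be layered onto any character sharing that generic bone.
      squid_legs_attack = {
          "controls": {
              "tentacle_1": [0.0, 35.0, 0.0],   # rotations at three sample keyframes
              "tentacle_2": [0.0, -20.0, 0.0],
              "spine": [0.0, 5.0, 0.0],         # generic bone controlled by this clip
          },
          "weights": {"tentacle_1": 1.0, "tentacle_2": 1.0, "spine": 0.4},
      }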
  • FIG. 11 is a flow diagram of a process for animating composite virtual characters in accordance with the present invention. In some embodiments the process is performed by a game platform, for example as discussed with respect to FIG. 1. At block 1105, the process determines the body parts of the virtual characters that comprise the composite virtual character. For example, a composite virtual character may be comprised of two body parts, e.g., a torso body part from a first virtual character and a legs body part from a second virtual character. FIG. 14 depicts an exemplary virtual skeleton of a composite virtual character comprising torso 1410 from a first virtual character and legs 1420 from a second virtual character. In the example of FIG. 14, legs body part of the second virtual character 1420 is squid-like and in the form of tentacles. Torso body part of the first virtual character 1410 is humanoid but one-handed. In some embodiments, these virtual skeletons were defined in accordance with the process defined with respect to FIG. 10.
  • At block 1110, the process determines whether body parts remain to be processed for the composite virtual character. If no body parts need to be processed for the composite virtual character, the process returns. If body parts need to be processed for the composite virtual character, the process proceeds to block 1115. Referring to the above example of animating the composite virtual character corresponding to the virtual skeleton of FIG. 14, the process may determine that the torso and/or legs body parts of the composite virtual character need to be animated and thus proceed to block 1115. On the other hand, if all body parts have been animated or no animations need to be applied, the process returns.
  • At block 1115, the process selects a body part of the composite virtual character for processing and determines the defined animations corresponding to the selected body part that need to be applied. Referring again to the example of animating the composite virtual character corresponding to the virtual skeleton of FIG. 14, the process may select for processing the torso body part 1410 of the composite virtual character.
  • At block 1120, the process determines whether animations defined for the selected body part remain to be applied to the composite virtual character. If no animations corresponding to the selected body part remain to be applied to the composite virtual character, the process returns to block 1110. If animations remain, the process proceeds to block 1125. Continuing the example of animating the composite virtual character corresponding to the virtual skeleton of FIG. 14, the process may determine that an attack animation and run animation defined for torso 1410 of the composite virtual character are to be applied to the composite virtual character. Because the process determines that animations remain to be applied, the process proceeds to block 1125. In some embodiments, the animations defined for a particular body part may be defined in accordance with the process of FIG. 10.
  • At block 1125, the process selects an animation defined for the selected body part to apply to the composite virtual character. The process then proceeds to block 1130. Following the example of animating the composite virtual character corresponding to the virtual skeleton of FIG. 14, the process may select an attack animation defined for torso 1410 of the composite virtual character.
  • At block 1130, the process applies the animation defined for the selected body part to the selected body part. In some cases, the animation may apply to all of the bones in the selected body part. In other cases, the animation may affect a subset or none of the bones in the selected body part. The process then proceeds to block 1135. Referring to the example of animating the composite virtual character corresponding to the virtual skeleton of FIG. 14, the process may apply the attack animation defined for torso 1410 of the composite virtual character to one or more applicable bones in torso 1410.
  • At block 1135, the process determines whether generic bones exist in body parts other than the selected body part. If generic bones exist in the other body parts, the process proceeds to block 1140. If generic bones do not exist in the other body parts, the process returns to block 1120. Following the example of animating the composite virtual character corresponding to the virtual skeleton of FIG. 14, the process may determine that no generic bones exist in the squid-like tentacle legs body part of the composite virtual character. Because generic bones do not exist in the legs body part, the process would proceed to block 1120. On the other hand, if generic bones did exist in the legs body part, the process would proceed to block 1140.
  • At block 1140, the process applies the animation defined for the selected body part to the generic bones in other body parts. The process then returns to block 1120. Following the above example, the process may apply the animation defined for torso 1410 to generic bones in legs 1420. As discussed above, the animations defined for a particular body part (in this case, the torso body part) may be animated using a virtual skeleton that has the unique set of bones designed for that body part and generic bones for the remaining body parts (in this case, the legs body part). Thus, if generic bones exist in the legs body part of the composite virtual character, the animation may be layered onto those generic bones.
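The layering pass of blocks 1120-1140 might be organized roughly as in the non-authoritative sketch below, which assumes the hypothetical clip, weight, and body-part structures sketched above and uses nearest-key sampling purely for brevity.

```python
# Non-authoritative sketch of the layering pass of blocks 1120-1140, assuming
# the hypothetical clip, weight, and body-part structures sketched above.
def layer_animations(parts, clips_by_part, time):
    """Collect, for one evaluation time, bone -> [(rotation, weight)] contributions,
    applying each part's clips to its own bones and layering them onto any
    generic bones exposed by the other parts."""
    contributions = {}
    for part in parts:
        for clip, weights in clips_by_part.get(part.name, []):
            for bone in part.bones:          # the part the clip was authored for
                _accumulate(contributions, clip, weights, bone, time)
            for other in parts:              # generic bones in the other parts
                if other is part:
                    continue
                for bone in other.generic_bones:
                    _accumulate(contributions, clip, weights, bone, time)
    return contributions

def _accumulate(contributions, clip, weights, bone, time):
    keys = clip.channels.get(bone)
    if not keys:
        return  # the clip does not drive this bone at all
    rotation = min(keys, key=lambda k: abs(k[0] - time))[1]  # nearest-key sample
    contributions.setdefault(bone, []).append((rotation, weights.get(bone, 1.0)))
```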
  • The processes described in FIGS. 10 and 11 allow animations independently defined for one virtual character to be applied to another virtual character, thus providing an improved method for animating composite virtual characters. The processes outlined in FIGS. 10 and 11 are merely exemplary, and it should be appreciated that certain steps may occur in different orders or simultaneously and still conform to the teachings of the invention. For example, in some embodiments, the selection and application of animations to the bones in the selected body part and generic bones in other body parts as described in blocks 1125-1140 may be executed simultaneously for a single pose of the entire composite skeleton.
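A single blended pose could then be resolved from the layered contributions by normalized weighted averaging, as in the sketch below; averaging Euler angles is a simplification chosen for illustration and is not asserted to be the disclosed blending method.

```python
# Illustrative resolution of layered contributions into one pose by normalized
# weighted averaging; averaging Euler angles is a simplification for the sketch,
# not an asserted implementation of the disclosed blending.
def resolve_pose(contributions):
    pose = {}
    for bone, entries in contributions.items():
        total = sum(w for _, w in entries) or 1.0
        pose[bone] = tuple(sum(r[i] * w for r, w in entries) / total
                           for i in range(3))
    return pose
```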
  • Although the invention has been discussed with respect to various embodiments, it should be recognized that the invention comprises the novel and non-obvious claims supported by this disclosure.

Claims (18)

What is claimed is:
1. A computer-implemented method for animating a composite virtual character, comprising:
displaying a composite virtual character comprising a first part from a first virtual character and a second part from a second virtual character, and
animating the composite virtual character, wherein animating the composite virtual character comprises substantially simultaneously animating first portions of the first part from the first virtual character and second portions of the second part from the second virtual character using an animation defined for the first virtual character and animating at least first portions of the second part from the second virtual character using an animation defined for the second virtual character.
2. The computer-implemented method of claim 1, wherein animating the composite character further comprises determining whether one or more generic bones exist in the second part from the second virtual character.
3. The computer-implemented method of claim 2, wherein the second portions of the second part from the second virtual character comprise the generic bones.
4. The computer-implemented method of claim 1, wherein animating the composite virtual character further comprises animating at least second portions of the first part of the first virtual character using the animation defined for the second virtual character.
5. The computer implemented method of claim 4, wherein the first portions of the first part of the first virtual character comprise a first set of bones of the first virtual character and the second portions of the first part of the first virtual character comprise a second set of bones of the first virtual character, the second set of bones not including any bones of the first set of bones.
6. The computer implemented method of claim 5, wherein the first set of bones of the first virtual character and the second set of bones of the first virtual character are predefined for the first virtual character.
7. The computer implemented method of claim 6, wherein the second set of bones comprise generic bones.
8. The computer implemented method of claim 1, further comprising animating second portions of the second part from the second virtual character using the animation defined for the second virtual character.
9. The computer implemented method of claim 1, wherein animating second portions of the second part from the second virtual character using the animation defined for the first virtual character and the animation defined for the second virtual character is performed using a weighting of the animation defined for the first virtual character and a weighting of the animation defined for the second virtual character.
10. A video game system, comprising:
a game device having an input device and a processor for executing program code for providing for play of a videogame;
a plurality of physical parts from a plurality of toy figures that are physically combinable to form a composite toy assembly, the plurality of physical parts including memory providing a unique identification of each physical part and configured to communicate said unique identification;
said program code having instructions for providing a graphical display of a composite virtual character representing said composite toy assembly, said composite virtual character comprised of virtual parts representing physical parts combined to form the composite toy assembly;
said program code further having instructions for providing a virtual environment for said composite virtual character;
wherein said movement and actions of said composite virtual character are controlled, in response to inputs received by said input device, by at least one animation defined for at least one of said virtual parts and at least one other animation defined for at least one other one of said virtual parts, with the at least one animation defined for at least one of said virtual parts completely controlling movement and actions of some elements of said composite virtual character and partially controlling movement and actions of other elements of said composite virtual character.
11. The video game system of claim 10, wherein the plurality of physical parts are physically combinable by way of magnetic couplings, the magnetic couplings being part of the physical parts.
12. The video game system of claim 10, wherein the at least one other animation defined for at least one other one of said virtual parts partially controls movement and actions of the other elements of said composite virtual character.
13. The video game system of claim 10, wherein the some elements and other elements of said composite virtual character comprise bones.
14. The video game system of claim 13, wherein each of the bones of the composite virtual character correspond to bones of the virtual parts, and the at least one animation defined for at least one of said virtual parts completely controls some of the bones corresponding to bones of said at least one of said virtual parts and partially controls movement and actions of other bones not corresponding to bones of said at least one of said virtual parts.
15. A computer implemented method including animating a character comprised of different portions derived from other characters, some of the different portions including body elements of a first type and some of the different portions including body elements of a second type, the method comprising:
applying, for each of the different portions including body elements of the first type, an animation routine defined for the different portion for the other character from which the different portion was derived for the body elements of the first type; and
applying in a weighted manner, for each of the different portions including body elements of the second type, the animation routines defined for the different portion for the other characters for the body elements of the second type.
16. The method of claim 15, wherein the body elements of the first type and the body elements of the second type are bones.
17. The method of claim 15, further comprising determining the different portions derived from other characters by at least:
receiving an indication of physical parts of toys combined to form a toy assembly.
18. A method of animating a character defined by a combination of elements of other characters, some of whose elements are considered generic to a plurality of characters and some of whose elements are considered specific to each of the other characters, with no animation routines being predefined for the character but with animation routines being predefined for the other characters, the method comprising:
determining characters serving as a source of elements of the character;
receiving a command for display of a particular animation routine for the character;
generating the particular animation routine for the character by:
applying corresponding character specific predefined animation routines to elements of the character that are considered specific to the characters serving as the source of elements of the character, and
applying weighted sums of the corresponding character specific predefined animation routines to elements of the character that are considered generic to the characters serving as the source of elements of the character.
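Purely as an illustrative reading of the structure recited in claim 18, and not as the claimed implementation, the method might be expressed as in the sketch below; the routine lookup, element classification, and scalar pose values are hypothetical.

```python
# Purely an illustrative reading of the structure recited in claim 18; the
# routine lookup, element classification, and scalar pose values are hypothetical.
def animate_composite(routine_name, specific_elements, generic_elements, routines):
    """specific_elements: character -> elements specific to that source character.
    generic_elements: element -> {character: weight}.
    routines: (character, routine_name) -> {element: value}."""
    pose = {}
    # Elements specific to a source character take its predefined routine unchanged.
    for character, elements in specific_elements.items():
        for element in elements:
            pose[element] = routines[(character, routine_name)][element]
    # Generic elements receive a weighted sum of the source characters' routines.
    for element, weights in generic_elements.items():
        total = sum(weights.values()) or 1.0
        pose[element] = sum(routines[(character, routine_name)].get(element, 0.0) * w
                            for character, w in weights.items()) / total
    return pose
```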
US13/831,333 2013-03-14 2013-03-14 System and method for animating virtual characters Abandoned US20140274373A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US13/831,333 US20140274373A1 (en) 2013-03-14 2013-03-14 System and method for animating virtual characters
US16/539,827 US10885694B2 (en) 2013-03-14 2019-08-13 System and method for animating virtual characters

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US13/831,333 US20140274373A1 (en) 2013-03-14 2013-03-14 System and method for animating virtual characters

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/539,827 Continuation US10885694B2 (en) 2013-03-14 2019-08-13 System and method for animating virtual characters

Publications (1)

Publication Number Publication Date
US20140274373A1 true US20140274373A1 (en) 2014-09-18

Family

ID=51529558

Family Applications (2)

Application Number Title Priority Date Filing Date
US13/831,333 Abandoned US20140274373A1 (en) 2013-03-14 2013-03-14 System and method for animating virtual characters
US16/539,827 Active US10885694B2 (en) 2013-03-14 2019-08-13 System and method for animating virtual characters

Family Applications After (1)

Application Number Title Priority Date Filing Date
US16/539,827 Active US10885694B2 (en) 2013-03-14 2019-08-13 System and method for animating virtual characters

Country Status (1)

Country Link
US (2) US20140274373A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113546415B (en) * 2021-08-11 2024-03-29 北京字跳网络技术有限公司 Scenario animation playing method, scenario animation generating method, terminal, device and equipment

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5766077A (en) * 1995-05-26 1998-06-16 Kabushiki Kaisha Bandai Game apparatus with controllers for moving toy and character therefor
US6522331B1 (en) * 2000-02-01 2003-02-18 Stormfront Studios, Inc. Character animation using directed acyclic graphs
US20040142628A1 (en) * 2003-01-17 2004-07-22 Pino Scott A. Toy assembly

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
"The Elder Scrolls IV: Oblivion", released by Bethesda Softworks, LLC, on March 20, 2006. pp1-2. *
Jodie Azhar, "Creating Realistic Locomotion in the Animation of a Fantasy Creature", Dec 06, 2010, page 1-11. http://ncca.bournemouth.ac.uk/gallery/view/427/Creating_realistic_Locomotion_in_the_Animation_of_a_Fantasy_Creature *

Cited By (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130059696A1 (en) * 2010-03-23 2013-03-07 Industrial Research Limited Exercise system and controller
US9387396B2 (en) * 2010-03-23 2016-07-12 Callaghan Innovation Exercise system and controller
US20150155901A1 (en) * 2013-12-02 2015-06-04 Patent Category Corp. Holder for Smart Device
US9764228B2 (en) * 2014-05-21 2017-09-19 Activision Publishing, Inc. Contextual play pattern switching system and method
US20150335997A1 (en) * 2014-05-21 2015-11-26 Karthik Bala Contextual play pattern switching system and method
US20150335998A1 (en) * 2014-05-21 2015-11-26 Karthik Bala Video game portal
US9421463B2 (en) * 2014-05-21 2016-08-23 Activision Publishing, Inc. Video game portal
US9901827B2 (en) 2015-01-06 2018-02-27 Spin Master Ltd. Methods and system relating to physical constructions and virtual representations
WO2016153657A1 (en) * 2015-03-25 2016-09-29 Intel Corporation Reality animation mechanism
US20180028904A1 (en) * 2015-04-09 2018-02-01 Warner Bros. Entertainment Inc. Portal device and cooperating video game machine
US10583352B2 (en) * 2015-04-09 2020-03-10 Warner Bros. Entertainment Inc. Portal device and cooperating video game machine
US20230144168A1 (en) * 2015-04-09 2023-05-11 Warner Bros. Entertainment Inc. Portal device and cooperating video game machine
US11478695B2 (en) 2015-04-09 2022-10-25 Warner Bros. Entertainment Inc. Portal device and cooperating video game machine
US11766607B2 (en) * 2015-04-09 2023-09-26 Warner Bros. Entertainment Inc. Portal device and cooperating video game machine
US9604135B2 (en) * 2015-04-22 2017-03-28 Activision Publishing, Inc. Video game system and toy with RF antenna
US20160310839A1 (en) * 2015-04-22 2016-10-27 Activision Publishing, Inc. Video game system and toy with rf antenna
US20160364178A1 (en) * 2015-06-12 2016-12-15 Nintendo Co., Ltd. Information processing apparatus, information processing system, storage medium and information processing method
US10102316B2 (en) * 2015-12-15 2018-10-16 Dassault Systemes Simulia Corp. Virtual reality authoring method
WO2017107995A1 (en) * 2015-12-25 2017-06-29 Zheng Shi System and method for playing toy with combinatorial attributes
US11529567B2 (en) * 2016-01-06 2022-12-20 Evollve, Inc. Robot having a changeable character
US20200129875A1 (en) * 2016-01-06 2020-04-30 Evollve, Inc. Robot having a changeable character
US20170200390A1 (en) * 2016-01-07 2017-07-13 Arthur E. Seymour Educational system utilizing a contest and a method of conducting the same
CN105447902A (en) * 2016-01-15 2016-03-30 网易(杭州)网络有限公司 An animation processing method and apparatus
US10628537B2 (en) 2016-04-12 2020-04-21 Dassault Systemes Simulia Corp. Simulation augmented reality system for emergent behavior
US10769325B2 (en) 2016-04-12 2020-09-08 Dassault Systemes Simulia Corp. Simulation augmented reality system for emergent behavior
US10627978B2 (en) 2017-12-05 2020-04-21 Dodles, Inc. Virtual stack button for software having a branched task structure
US10916046B2 (en) * 2019-02-28 2021-02-09 Disney Enterprises, Inc. Joint estimation from images
US11132606B2 (en) * 2019-03-15 2021-09-28 Sony Interactive Entertainment Inc. Reinforcement learning to train a character using disparate target animation data
WO2020190415A1 (en) * 2019-03-15 2020-09-24 Sony Interactive Entertainment Inc. Reinforcement learning to train a character using disparate target animation data
US20220193550A1 (en) * 2019-04-30 2022-06-23 Netease (Hangzhou) Network Co.,Ltd. Action Generation Method, Electronic Device, and Non-Transitory Computer-Readable Medium
US11295504B1 (en) * 2019-08-01 2022-04-05 Meta Platforms, Inc. Systems and methods for dynamic digital animation
US11328480B1 (en) * 2019-11-14 2022-05-10 Radical Convergence Inc. Rapid generation of three-dimensional characters
US11816797B1 (en) 2019-11-14 2023-11-14 Radical Convergence Inc. Rapid generation of three-dimensional characters
CN110992495A (en) * 2019-12-26 2020-04-10 珠海金山网络游戏科技有限公司 Virtual model deformation method and device
WO2023130815A1 (en) * 2022-01-07 2023-07-13 腾讯科技(深圳)有限公司 Scene picture display method and apparatus, terminal, and storage medium
CN114870407A (en) * 2022-04-29 2022-08-09 深圳市中视典数字科技有限公司 Digital human body data acquisition system and method based on virtual reality

Also Published As

Publication number Publication date
US20200219303A1 (en) 2020-07-09
US10885694B2 (en) 2021-01-05

Similar Documents

Publication Publication Date Title
US10885694B2 (en) System and method for animating virtual characters
US11369864B2 (en) Interactive video game with toys having in interchangeable parts
EP3200886B1 (en) Game system
US9802126B2 (en) Interactive video game system comprising toys with rewritable memories
US10639544B2 (en) Gaming system for modular toys
US10561953B2 (en) Interactive video game system comprising toys with rewritable memories
US9649565B2 (en) Server based interactive video game with toys
US8882595B2 (en) Systems and methods of importing virtual objects using barcodes
US20120295702A1 (en) Optional animation sequences for character usage in a video game
US10238977B2 (en) Collection of marketing information developed during video game play
WO2016003844A1 (en) Interactive play sets
CN105474221A (en) Realizing boxed experience for digital content acquisition
US20180071626A1 (en) Tag reader and system comprising a tag reader
US10349250B2 (en) System and method for creating physical objects used with videogames
US20190133388A1 (en) System and method for presenting user progress on physical figures
KR20150116106A (en) Robot nurturance system of 3D printing robot and robot nurturance method using the same
Håkansson et al. Developing a Workflow for Cross-platform 3D Apps using Game Engines

Legal Events

Date Code Title Description
AS Assignment

Owner name: ACTIVISION PUBLISHING, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:OLSHAN, ADAM;DOPTIS, DANIEL;PARDEE, ANTHONY;SIGNING DATES FROM 20130404 TO 20130422;REEL/FRAME:030391/0596

AS Assignment

Owner name: BANK OF AMERICA, N.A., WASHINGTON

Free format text: SECURITY AGREEMENT;ASSIGNOR:ACTIVISION BLIZZARD, INC.;REEL/FRAME:031435/0138

Effective date: 20131011

AS Assignment

Owner name: ACTIVISION ENTERTAINMENT HOLDINGS, INC., CALIFORNIA

Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:BANK OF AMERICA, N.A.;REEL/FRAME:040381/0487

Effective date: 20161014

Owner name: ACTIVISION BLIZZARD INC., CALIFORNIA

Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:BANK OF AMERICA, N.A.;REEL/FRAME:040381/0487

Effective date: 20161014

Owner name: ACTIVISION PUBLISHING, INC., CALIFORNIA

Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:BANK OF AMERICA, N.A.;REEL/FRAME:040381/0487

Effective date: 20161014

Owner name: BLIZZARD ENTERTAINMENT, INC., CALIFORNIA

Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:BANK OF AMERICA, N.A.;REEL/FRAME:040381/0487

Effective date: 20161014

STCV Information on status: appeal procedure

Free format text: APPEAL BRIEF (OR SUPPLEMENTAL BRIEF) ENTERED AND FORWARDED TO EXAMINER

STCV Information on status: appeal procedure

Free format text: EXAMINER'S ANSWER TO APPEAL BRIEF MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION