US20090312100A1 - Face Simulation in Networking - Google Patents

Face Simulation in Networking

Info

Publication number
US20090312100A1
US20090312100A1 (application US12/138,221)
Authority
US
United States
Prior art keywords
game
application
person
picture
time
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/138,221
Inventor
Scott C. Harris
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Harris Technology LLC
Original Assignee
Harris Technology LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Harris Technology LLC filed Critical Harris Technology LLC
Priority to US12/138,221
Assigned to HARRIS TECHNOLOGY, LLC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: HARRIS, SCOTT C
Publication of US20090312100A1
Status: Abandoned

Classifications

    • A63F13/10
    • A63F13/655 Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor automatically by game devices or servers from real world data, e.g. measurement in live racing competition by importing photos, e.g. of the player
    • A63F13/12
    • A63F13/213 Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • A63F13/30 Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
    • A63F13/335 Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers using wide area network [WAN] connections using Internet
    • A63F13/45 Controlling the progress of the video game
    • A63F13/71 Game security or game management aspects using secure communication between game devices and game servers, e.g. by encrypting game data or authenticating players
    • H04L67/131 Protocols for games, networked simulations or virtual reality
    • A63F2300/407 Data transfer via internet
    • A63F2300/532 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers details of basic data processing using secure communication, e.g. by encryption, authentication
    • A63F2300/6692 Methods for processing data by generating or executing the game program for rendering three dimensional images using special effects, generally involving post-processing, e.g. blooming
    • A63F2300/695 Imported photos, e.g. of the player

Definitions

  • the bulletin board 411 may include not only ongoing activities, but also requests. For example, any user may post a request such as “does anyone want to talk about sesame ice cream?”. The bulletin board may also say “activity x is going on in quadrant 89 in five minutes”. Different activities such as spelling bees, Sudoku, tests, and the like may be carried out.
  • the huddles can also include, however, commercial content.
  • huddles may have advertisers, and/or storefronts into which the user can be directed.
  • huddle B shows a Domino's advertisement shown as 420 .
  • a user can walk onto the advertisement to automatically be teleported to the storefronts.
  • the storefront can be used to provide, within the huddle, real time information about a transaction.
  • a pizza can be purchased using the online virtual ordering system.
  • the user enters into the huddle, and is transported into the storefront shown as 430 . Within the storefront, the user is provided with a number of different possibilities.
  • the menu has a number of different items on it. Each of the items, for example, may be associated with different information about the item. By selecting the item, that information can be brought out, such as a real photo of the item, a time to delivery, pricing, and nutritional information.
  • the user can order any of these items off the menu and, upon pushing a button, go to a payment window shown as 433. After payment, the order is in process, and the user returns to the virtual restaurant storefront.
  • the real-time status window may say pizza is being made, and may show a camera version of the pizza being made using a web cam.
  • the order status changes to out for delivery.
  • the out for delivery status may have a real-time estimate of time to delivery.
  • There may also be a camera, for example, in the delivery person's car, which shows the real location of the delivery person.
  • a GPS tracker in either the delivery person's car or the delivery person's cell phone may show the delivery person's real-time location on a map.
  • the status window may also show who is ahead of the user in the delivery queue, the estimated time to delivery, and the like.
  • the real-time status can be obtained from different terminals, for example the delivery person's cell phone. The same approach can be used for a common carrier, with the common carrier person's cell phone showing their real-time status.
  • the user can enter and leave the storefront at any time. The user can leave and come back and see the real-time status upon returning.
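  • For illustration, one way the real-time order status described above might be kept and shown in the storefront's status window is sketched below; the order states, class names and coordinates are illustrative assumptions rather than part of any actual storefront system.

```python
import time
from dataclasses import dataclass, field
from typing import Optional, Tuple

# Illustrative order states for the virtual-storefront status window.
STATES = ["received", "being made", "out for delivery", "delivered"]

@dataclass
class Order:
    order_id: str
    placed_at: float = field(default_factory=time.time)
    state: str = STATES[0]
    eta_minutes: Optional[float] = None                     # real-time delivery estimate
    courier_position: Optional[Tuple[float, float]] = None  # lat/lon from a GPS tracker

    def advance(self, eta_minutes: Optional[float] = None) -> None:
        """Move the order to the next state and update the delivery estimate."""
        index = STATES.index(self.state)
        if index < len(STATES) - 1:
            self.state = STATES[index + 1]
        self.eta_minutes = eta_minutes

    def status_window(self) -> str:
        """Text the storefront status board could display to the returning user."""
        parts = [f"Order {self.order_id}: {self.state}"]
        if self.eta_minutes is not None:
            parts.append(f"estimated delivery in {self.eta_minutes:.0f} min")
        if self.courier_position is not None:
            parts.append(f"courier at {self.courier_position}")
        return ", ".join(parts)

order = Order("pizza-42")
order.advance(eta_minutes=25)               # pizza is being made
order.advance(eta_minutes=12)               # out for delivery
order.courier_position = (32.71, -117.16)   # from the delivery person's phone
print(order.status_window())
```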
  • a UPS storefront may be entered, and the packages associated with the user automatically tracked on a bulletin board in the storefront.
  • the system can carry out real time tracking of those packages, e.g., showing the inside of the truck on which the item is located, or showing GPS results on a map as to exactly where the package is located.
  • the system can also be used for other purposes besides stores. For example, this can be used for doctor appointments or other kinds of appointments.
  • the appointment can be made, and the user can see the waiting room, see their last visits, see test results and see doctor messages.
  • Each item within the store area becomes part of what is, in essence, a huddle.
  • this can be used, for example, for airlines, e.g. airline reservations, itineraries, flight status, statistics, and flight check-in. Real-time versions of what is happening on the flight, the waiting room at the airport, and the lines at the airport can also be seen in an analogous way.
  • the user's identity may be automatically ascertained, e.g., by determining a user name or in some other way. Based on the user's identity being detected in a store, all information associated with that identity can be displayed within that store, e.g., on simulated boards within the store.
  • the boards can display text, maps showing the real-time location of the delivery person, windows showing camera results, and/or all of the information discussed above.
  • An advertiser for software or games or the like can also advertise their product. Stepping on the banner may open a new window that provides information about the product. It may also open a trial software version of the product that can only be played within the huddle, thereby avoiding the problems of piracy or improper use of pirated software. If the trial version is a big download, for example, the user may be guided to, or otherwise allowed to do, other things while it downloads in the background.
  • the computers described herein may be any kind of computer, either general purpose, or some specific purpose computer such as a workstation.
  • the computer may be an Intel (e.g., Pentium or Core 2 Duo) or AMD based computer, running Windows XP or Linux, or may be a Macintosh computer.
  • the computer may also be a laptop.
  • the programs may be written in C, Python, Java, Brew, or any other programming language.
  • the programs may be resident on a storage medium, e.g., magnetic or optical, e.g. the computer hard drive, a removable disk or media such as a memory stick or SD media, wired or wireless network based or Bluetooth based Network Attached Storage (NAS), or other removable medium.
  • the programs may also be run over a network, for example, with a server or other machine sending signals to the local machine, which allows the local machine to carry out the operations described herein.

Abstract

A system allows you to do things on the internet in a way you might do them in the real world. This system allows modifying your look on the internet, but only by an amount comparable to how your look could be changed in the real world, e.g., with makeup or plastic surgery. More extensive changes are not allowed, to prevent the user from masking their identity or characteristics via the editing of their looks.

Description

    BACKGROUND
  • Different applications on the Internet involve interacting with other people. Electronic games allow playing games, either against a simulated person or against other real people, either locally or remotely. Online gaming sites, such as World of Warcraft and others, allow people to carry out different operations and have their own persona indicative of those operations.
  • The Internet also allows other ways to interact with others, including chatting in chat rooms, social networking, e-mail, and more. All of these applications represent “you” on the Internet by something. That something may be an avatar, or may be some other indicia indicative of “you” and “others”.
  • SUMMARY
  • The present application describes special applications of avatars. Another application describes a special way of meeting on the Internet using this special avatar application.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • These and other aspects will now be described in detail with reference to the accompanying drawings, wherein:
  • FIG. 1 shows a basic block diagram of a person interacting with a computer system.
  • FIG. 2 illustrates a flowchart of operation;
  • FIG. 3 shows a face modeling system; and
  • FIG. 4 shows a social networking system with huddles and storefronts.
  • DETAILED DESCRIPTION
  • FIG. 1 shows a block diagram of a “user” interacting with a computer-based interactive system. The interactive system can be a personal computer running a program that interacts with the internet, or can be a dedicated video gaming console such as a Sony Playstation or Xbox, or any other type of device that allows interaction via a user interface and display. The user typically sits in a location in front of the display screen 105.
  • In an embodiment, camera 110 may be located to obtain a picture or image of at least the face of the user 100. In one embodiment, this camera may be connected to a computer system. In another embodiment, this computer may be connected for example to a dedicated video game console such as a Sony PlayStation or the like. The computer interacts with an application program 121 to create a video screen. The computer may also connect to the Internet shown as 130, or more generally to any other network connection. The display screen 105 may display an interactive scene shown generally as 140. The picture and/or video obtained by the camera 110 is modified and displayed as part of the image displayed to at least one user viewing the image 141 on the display. The display may be interactive, and may display information indicative of other users such as 142. The other user 142 may be a current user, e.g. someone in some other location at the current time. The other user may also be a previous player, someone who played at a previous time. For example, in one embodiment the camera 110 takes pictures of users who are playing a game. The camera can alternatively be built into the body of the console, having only a lens exposed on the front surface. In another embodiment, the camera takes pictures of persons who are participating in a website. Those pictures are stored in the computer.
  • The computer may execute the flowchart of FIG. 2, which may be executed for example on a dedicated gaming console system such as an Xbox or the like. Such a gaming console inputs a game that has copy protection and/or play protection from a removable memory source 121, and plays the game directly from that source. For example, the source can be a prestamped DVD in an Xbox, or a cartridge or other type of device in certain Nintendo machines. Each of these devices checks to determine whether the game copy that is currently being played has been illegally copied. Only prestamped games with appropriate characteristics, for example, are allowed to play.
  • For the first time, this system allows a dedicated game console of this type to include a picture of the actual person as part of the game being played.
  • According to another embodiment, the removable memory source 121 is a nonvolatile read/write memory, either completely changeable, or partly read only and partly changeable. This may use, for example, the techniques described in my co-pending application Ser. No. 12/013,434, filed Jan. 12, 2008, the disclosure of which is herewith incorporated by reference.
  • In one embodiment, the removable memory source 121 may include a read only portion 122, as well as a read write portion 123. The read write portion may include the main executable of the game, or only a portion of the executable of the game. The read only portion may include cryptographic keys, such as a private key for the game. In one embodiment, the read only portion 122 may include a private key which is used to form a message using a real-time clock that is stored within a gaming console 120. The message is sent over the Internet 130, and verified by a server before the game is allowed to be played. In another embodiment, a cryptographic signature within the read only portion 122 may be verified by the console 120 without sending it to the remote location, or by sending it to the remote location only at certain intervals.
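  • As a rough illustration of this kind of check, the following sketch uses an HMAC over a timestamped message as a simplified stand-in for the private-key signature held in the read only portion; the key value and function names are assumptions, and a real console would more likely use an asymmetric signature so the verifying server never holds the console's secret.

```python
import hashlib
import hmac
import time

# Simplified stand-in: an HMAC over a timestamped message plays the role of the
# private-key signature described above.
GAME_SECRET = b"key-burned-into-read-only-portion-122"   # hypothetical key material

def console_make_message(game_id: str) -> dict:
    """Console side: sign the game id together with the real-time clock value."""
    timestamp = int(time.time())
    payload = f"{game_id}|{timestamp}".encode()
    tag = hmac.new(GAME_SECRET, payload, hashlib.sha256).hexdigest()
    return {"game_id": game_id, "timestamp": timestamp, "tag": tag}

def server_verify(message: dict, max_skew_seconds: int = 300) -> bool:
    """Server side: check the signature and that the clock value is fresh,
    so a captured message cannot simply be replayed later."""
    payload = f"{message['game_id']}|{message['timestamp']}".encode()
    expected = hmac.new(GAME_SECRET, payload, hashlib.sha256).hexdigest()
    fresh = abs(time.time() - message["timestamp"]) <= max_skew_seconds
    return hmac.compare_digest(expected, message["tag"]) and fresh

message = console_make_message("example-game-title")
print("game allowed to play:", server_verify(message))
```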
  • One advantage of this system is that the game is stored in read/write memory, and hence updates to the game can be downloaded from the Internet 130 as necessary. Moreover, even if the game portion 123 is hacked, the encryption techniques may effectively prevent the console from playing the game or from connecting on the network.
  • The removable memory source in these embodiments may be a USB-based memory, as shown, or alternatively may be in a smartcard-style form factor, for example.
  • At 200, the computer system obtains a picture of the person in the field of view of the camera. The picture is processed and stored as described in further detail herein.
  • The current picture is then used as the first person at 210. However, certain games and/or applications show other people in addition to (or in place of) the first person player. For example, in a baseball or football game, there may be many players forming each team. Some of those players may have their faces selected from stored pictures at 215. Presumably these people are friends or relatives of the person who owns the gaming console, since these are people who have played previously on the gaming console and have had their faces stored. This gaming console may modify people within the game to display those people using faces and information from the current person's circle of friends. This compares with the prior art, where these games have used either prestored characters/faces or generic avatars. Here, the games and/or other application(s) display stored faces of previous players.
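  • A minimal sketch of filling a team roster with stored faces of previous players might look like the following; the player names and file paths are hypothetical.

```python
import random

# Hypothetical store of faces captured from people who previously played on this
# console, keyed by player name; in practice these would be processed face models
# rather than raw image paths.
stored_faces = {
    "previous_player_1": "faces/player1.png",
    "previous_player_2": "faces/player2.png",
    "previous_player_3": "faces/player3.png",
}

def build_team(current_player_face: str, team_size: int) -> list:
    """Put the current player's face on the first team member, fill the rest of
    the roster with faces of previous players, and fall back to generic avatars
    if not enough faces have been stored."""
    roster = [current_player_face]
    previous = list(stored_faces.values())
    random.shuffle(previous)
    roster.extend(previous[: team_size - 1])
    while len(roster) < team_size:
        roster.append("faces/generic_avatar.png")
    return roster

print(build_team("faces/current_player.png", team_size=5))
```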
  • More generally, however, this face may be placed into any interactive application: games, chat, social networking or any other kind of website.
  • Processing of the face is shown in 205. The face is added to a body to be used as part of the application. In embodiments, as described herein, the face is edited at 208 prior to being used as part of the application.
  • If a crisp picture of the face were used, I recognize that the face itself might not look natural or realistic in the online environment. People also often don't like the way they look at any given moment; they would prefer to adjust that look. However, on the other side of this issue, some applications, such as the social networking applications described herein, make it desirable that the person who is speaking look somewhat like the actual person, both for realism and also for certain security aspects.
  • According to a first embodiment, the editing at 208 modifies the face picture by cartoonizing or photo-derealizing this face picture prior to its use. Cartoonizing may use, for example, conventional programs and plugins, such as those found in Adobe Photoshop, to make the image of the face look more like a cartoon. Cartoonizing may also use the techniques described in, for example, WO/2006/003625 or United States Publication 20070008322.
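  • For illustration, a rough cartoonizing pass can be sketched with the Pillow imaging library as a stand-in for the commercial plugins mentioned above; the file names are hypothetical.

```python
from PIL import Image, ImageFilter, ImageOps

def cartoonize(path: str, bits: int = 3) -> Image.Image:
    """Smooth away fine photographic detail, keep strong outlines, and flatten
    the colors into a few bands so the captured face reads as a cartoon."""
    img = Image.open(path).convert("RGB")
    img = img.filter(ImageFilter.SMOOTH_MORE)    # soften skin texture and noise
    img = img.filter(ImageFilter.EDGE_ENHANCE)   # preserve the main facial outlines
    return ImageOps.posterize(img, bits)         # reduce each channel to 2**bits levels

# cartoonize("captured_face.png").save("cartoon_face.png")
```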
  • A special photo-processing technique is disclosed herein for modifying the obtained pictures at 208. This uses special techniques to force the image to look less like an actual photograph. It also allows editing of only certain aspects of the image, prior to display. However, only certain aspects can be edited, to make sure that the image being displayed bears a certain resemblance to the actual user. The inventor calls this technique a limited-photo-derealization system, since it makes the photo or frames of the video look less real; however, it is “limited” in the amount of derealization of the photo that it can carry out. The overall goal is to change some parts of the picture, while making the picture remain recognizable as the person.
  • A special image derealization system as described herein models the face according to a number of different parameters. In this embodiment, the face is passed through a feature-quantizing filter. That filter determines characteristics of the face including those illustrated in FIG. 3. Each of the different face characterization parameters may be measured. Without limitation, this may include a distance 301 between the eyes. It may include the shape of the eyes 302; the color of the eyes 303; the shape of the ear shown as 304; the position of the ears; the type of earlobe. It can include the distance between the eyes and mouth; the size, shape and orientation of the nose; neck location; chin size and chin location; hair color; hair line. In general, all of the sizes and shapes of the features of the face are obtained, including skin type, color and tone.
  • Each of these features is characterized, thereby forming a model of the face. The model, in essence, quantizes the face characteristics: it changes the face characteristics while still producing something that looks like the face. When the model is used to recreate a face image, the level of detail can be reduced. This may remove many of the specific characteristics of the face. The model is then reconstituted to create a drawing of the face. The reconstituted face looks like the original face, but has less individual detail than the original face. Coarser quantization of the face shape and type makes the face look more generic (more cartoonized) while still having an overall look like the original face. Hair color and hairstyle are also modeled.
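  • A minimal sketch of such a feature-quantizing model is shown below; the parameters, units and step sizes are illustrative assumptions, and the same kind of model comparison could also serve the login face check described later.

```python
from dataclasses import dataclass, fields

@dataclass
class FaceModel:
    """A few of the measured parameters from FIG. 3 (arbitrary units); a real
    model would carry many more features."""
    eye_distance: float
    eye_shape: float
    nose_size: float
    chin_size: float
    skin_tone: float

    def quantize(self, step: float) -> "FaceModel":
        """Coarser steps discard individual detail while keeping the overall
        proportions, so the reconstituted drawing still resembles the person."""
        def q(value: float) -> float:
            return round(value / step) * step
        return FaceModel(**{f.name: q(getattr(self, f.name)) for f in fields(self)})

    def resembles(self, other: "FaceModel", tolerance: float) -> bool:
        """Rough similarity test, usable e.g. to check a newly captured face
        against a previously registered one."""
        return all(abs(getattr(self, f.name) - getattr(other, f.name)) <= tolerance
                   for f in fields(self))

measured = FaceModel(eye_distance=6.2, eye_shape=1.7, nose_size=3.4,
                     chin_size=2.9, skin_tone=0.62)
generic = measured.quantize(step=0.5)               # more generic, cartoon-like version
print(generic)
print(generic.resembles(measured, tolerance=0.5))   # still recognizably the same face
```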
  • The model and the modeling can be carried out relatively quickly. In addition, the camera 110 can continually monitor the face, thereby detecting changes in the face. This monitoring allows detecting changes in the facial expressions, for example smiles, cheek movements, and the like. Head movements may also be monitored in a similar way. These movements can be used as part of the model to show similar movements on the modeled individual.
  • One embodiment may require a specified number of changes per unit time, to ensure that the person is really sitting in front of the camera. Otherwise, the system could be spoofed by putting a regular picture in front of the camera.
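  • One simple way to enforce such a change-per-unit-time requirement is sketched below; the window length and minimum change count are assumptions.

```python
from collections import deque

class LivenessMonitor:
    """Counts detected facial changes (blinks, smiles, head movements) in a
    sliding time window; too few changes suggests a static photo has been
    propped up in front of the camera."""
    def __init__(self, min_changes: int = 3, window_seconds: float = 10.0):
        self.min_changes = min_changes
        self.window_seconds = window_seconds
        self.events = deque()

    def record_change(self, timestamp: float) -> None:
        self.events.append(timestamp)

    def looks_live(self, now: float) -> bool:
        while self.events and now - self.events[0] > self.window_seconds:
            self.events.popleft()
        return len(self.events) >= self.min_changes

monitor = LivenessMonitor()
for t in (0.5, 2.0, 4.5, 7.0):        # simulated change detections from camera frames
    monitor.record_change(t)
print(monitor.looks_live(now=8.0))    # True: enough changes in the last 10 seconds
print(monitor.looks_live(now=30.0))   # False: no recent changes, possibly a photo
```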
  • An advantage of this system is that the image that is displayed on the website looks like the person who is sitting in front of the camera. It is not necessarily the exact image of the person, but is necessarily based on the person's actual looks. This is good because people interact in a number of ways. People react to others based on their looks. By seeing the way someone's eyes look, the way someone's face looks, their smile, and the like, people change the way they react and speak. In one embodiment, since eyes and mouth may be an extremely important part of the communication process, actual images of the eyes and/or mouth may be used.
  • However, there have been instances where people have logged on as someone else on a website and impersonated that other person. In the process of impersonating them, they may carry out undesirable actions such as scams or cyberbullying. An advantage of this system is that it de-anonymizes the Internet, by forcing users to use at least a portion of their own likeness on the Internet to represent them. The picture shown on the internet is not the exact likeness of the user who is in front of the camera, but has a close enough likeness to avoid someone masquerading as a completely different person.
  • The face modeling system also may allow certain kinds of edits. According to an embodiment, only some kinds of edits are permitted. Edits which would make too much change to the look of the user are disallowed. The edits may allow the user to change some parts of the way they look, in the same way that a user might apply makeup before going out for a date or going out with a friend. The user can virtually apply their makeup, change their hair, etc., but cannot actually make themselves look different than their actual look. According to this embodiment, the editing is not allowed to change anything that could not be changed by a user in the real world, e.g., makeup, clothing, etc. Another embodiment may allow surreal makeup: allowing changing things that could be done by plastic surgery, e.g., reducing weight, implants of various types, and others. This allows people to improve their look, but requires that the people keep their essential look. This thereby prevents the wholesale anonymizing of the Internet. The editing which is allowed to be carried out at 208, for example, may include makeup, hair, spot removal or other imperfection removal on the face, getting or covering tattoos, putting on jewelry or piercings, or other decorations that can be done in the real world.
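  • A minimal sketch of limiting edits to real-world-achievable changes might look like the following; the attribute names and the split between allowed and disallowed edits are illustrative assumptions.

```python
# Hypothetical split between edits a user could make on themselves in the real
# world (allowed) and edits that would change their essential look (rejected).
REAL_WORLD_EDITS = {"makeup", "hair_color", "hair_style", "spot_removal",
                    "tattoo", "jewelry", "piercing", "clothing"}

def apply_edits(face_model: dict, requested_edits: dict) -> dict:
    """Apply only the permitted edits; silently drop anything that would let
    the user stop looking like themselves."""
    edited = dict(face_model)
    for attribute, value in requested_edits.items():
        if attribute in REAL_WORLD_EDITS:
            edited[attribute] = value
    return edited

face = {"hair_color": "brown", "apparent_age": 50, "makeup": "none"}
print(apply_edits(face, {"hair_color": "black", "apparent_age": 14}))
# hair color changes; the attempt to look 14 is ignored
```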
  • In the real world, a user can use makeup to change their skin tone or hair color. A number of colorings may be controlled using color palettes, and sophisticated systems such as any of the different controls available in Adobe Photoshop. A user can set their preferred settings for colors, hair, etc., and save those settings as “presets” that can be used in other applications.
  • By limiting the amount of change, we prevent people from looking like things they aren't. For example, we do not want to let an old man look like a young boy or a young girl. We don't want a parent to be able to look like their child. Rather, this system only allows changes to certain features that improve the look of the user. However, according to this embodiment, changes that change the overall look of the user are not allowed.
  • In certain games, the whole body of the user is shown. For example, in so-called first-person games, the user may have a body that is used to walk around in the virtual area defined by the game. A social interaction embodiment is disclosed herein that allows movement in an analogous way. According to this embodiment, the body should match the face. Since the skin tone and neck size are determined as part of the modeling, that skin tone and neck size should translate into a body that is selected for the face. A database may be used to relate different face characteristics such as skin tone and head size to different sized people and body shapes. This database may also use other user information, such as their height and weight. All of this can be used to make the body and actual face look more natural.
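  • A toy version of such a face-to-body lookup is sketched below; the thresholds and body model names are purely illustrative.

```python
# A toy lookup standing in for the database that relates measured face
# characteristics (plus declared height and weight) to a body model.
BODY_MODELS = [
    {"max_neck_cm": 34, "max_weight_kg": 60, "body": "slim_small"},
    {"max_neck_cm": 38, "max_weight_kg": 85, "body": "medium"},
    {"max_neck_cm": 99, "max_weight_kg": 999, "body": "large"},
]

def select_body(neck_cm: float, weight_kg: float, skin_tone: str, height_cm: float) -> dict:
    """Pick the first body model whose limits fit the measurements, and carry
    the modeled skin tone over so the body matches the face."""
    for candidate in BODY_MODELS:
        if neck_cm <= candidate["max_neck_cm"] and weight_kg <= candidate["max_weight_kg"]:
            return {"body": candidate["body"], "skin_tone": skin_tone, "height_cm": height_cm}
    return {"body": "large", "skin_tone": skin_tone, "height_cm": height_cm}

print(select_body(neck_cm=36, weight_kg=72, skin_tone="medium", height_cm=178))
```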
  • In another embodiment, different tiers of users are allowed to make different levels of changes to their looks. For example, the basic tier of users may only be allowed to change their hair and clothes. Other tiers of people may be allowed to put on different kinds of makeup. Other tiers may be allowed to carry out plastic surgery style changes to themselves via the looks editor.
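  • For illustration, the tiers might be represented as a simple permission table like the following; the tier names and permissions are assumptions.

```python
# Hypothetical mapping from user tier to the kinds of look edits that tier may
# perform in the looks editor.
TIER_PERMISSIONS = {
    "basic":   {"hair", "clothing"},
    "plus":    {"hair", "clothing", "makeup"},
    "premium": {"hair", "clothing", "makeup", "plastic_surgery"},
}

def edit_allowed(tier: str, edit_kind: str) -> bool:
    return edit_kind in TIER_PERMISSIONS.get(tier, set())

print(edit_allowed("basic", "makeup"))    # False
print(edit_allowed("premium", "makeup"))  # True
```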
  • Another embodiment matches a user's voice at 213 to the body and/or face. A voice is recorded, and a voice model of that recorded voice is obtained. The voice model is used with the face and body whenever the user speaks within the application. By using a person's real voice associated with the body and/or face, the speaking will appear to be more natural for the body. The inventor recognizes that a voice sounds more natural when it comes from the person it appears to belong to.
  • Another embodiment, shown as 214, uses an age or sex detector, and attempts to determine characteristics of the age or sex of the person. By determining characteristics of the age or sex of the person, this determination can be used as part of determining if the person is doing appropriate things on the Internet. For example, a 50-year-old man may be prevented from speaking to a 12-year-old boy, by an automatically-enforced computer based rule.
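  • A minimal sketch of such an automatically enforced rule is shown below; the age threshold is an assumption, and the estimated ages would come from the age/sex detector at 214.

```python
ADULT_AGE = 18

def may_communicate(estimated_age_a: int, estimated_age_b: int) -> bool:
    """Allow private communication only when both parties are on the same side
    of the adult threshold."""
    both_minors = estimated_age_a < ADULT_AGE and estimated_age_b < ADULT_AGE
    both_adults = estimated_age_a >= ADULT_AGE and estimated_age_b >= ADULT_AGE
    return both_minors or both_adults

print(may_communicate(50, 12))   # False: the rule blocks this conversation
print(may_communicate(14, 12))   # True
```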
  • Another embodiment may be used with a login system to determine whether a currently-obtained face matches a face previously registered, before allowing the person to continue in the game. This may avoid, for example, a mother posing as a daughter, or the like.
  • Other applications become possible from this system, in which a person's likeness can be simulated and edited, but the amount of editing is limited to prevent changing the look of the person, e.g., changing looks that affect their age, sex or other features that may be important in a site that allows interacting with others.
  • Another embodiment relates to use with a social networking website.
  • A social networking site has a goal of allowing interaction with friends. But how do you actually make friends? In many sites, there is no easy way to make new friends beyond those you have in the real world. For example, in Facebook.com, you cannot really make friends: You can only come into Facebook with friends you already have, and try to make new connections based on the connections of those you already have. While the connections to connections may provide interesting results, it does not really provide a way to make new friends.
  • In myspace.com, you can make friends by asking someone if they're willing to be your friend. However, you have no way to find these people other than their profiles.
  • Part of this embodiment, like other embodiments herein, tries to carry out actions on a website, using computer input devices and computer hardware, that simulate the way things are done in real life. The inventor recognizes that one way of interacting with other people is by “hanging out”. You may make friends by meeting people in a store, meeting people in a bar, just going through life. Sometimes you may just want to stand around to make friends. However, standing around with nothing to do in real life may be somewhat awkward. Also, there may be a certain stigma associated with the idea of someone who just stands around and doesn't really do anything. What if you just want to stand around and watch? What if, at other times, you just want to talk to people who walk by? Similarly, at a party you may stand around, but you might feel awkward if you're standing around by yourself. Or what if you don't know anyone at the party? However, on the Internet, things are a little different.
  • The present system describes a way of semi-anonymizing yourself, but also removes many of the aspects of social awkwardness. When you're on the Internet standing by yourself, you may feel a lot less self-conscious than you would if you were actually in a place standing by yourself. The embodiment therefore uses the computer 120 connected to the Internet as a client, to contact a server which maintains a virtual system where a number of people, including yourself, can congregate. This embodiment calls this congregation a “huddle”. For example, FIG. 4 shows a huddle including “me”, shown as 400, but there are many other people within the huddle also, shown as 402 and others. In fact, while only two actual people are shown, there are many other people within the huddle, shown as 406, 408. Controls on the user interface for the computer allow a person to move around the huddle. The person 400 is you: it is chosen to look like you, although it may not look exactly like you, and the person 400 looks somewhat like you. As you move around, you “see” others. Based on seeing these others, you can tell if perhaps you know them, because what you see are virtual representations of the other people. You can also talk to them, otherwise interact with them, and follow their facial expressions. For example, each person may define around them a “wingspan” shown as 403. By coming into the “wingspan” of a person, you enable communications with that person. However, unlike in the real world, you can stand in the huddle all by yourself without feeling self-conscious. You can talk to anyone in the huddle, and be semi-anonymous, or you can agree to exchange personal information. The communication within the huddle may use the audio system described above, or may be by text chat, or the like.
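  • For illustration, the “wingspan” rule might be sketched as follows; the coordinates and radius are illustrative assumptions.

```python
import math

class HuddleMember:
    """A person in the huddle with a position and a 'wingspan' radius."""
    def __init__(self, name: str, x: float, y: float, wingspan: float = 2.0):
        self.name, self.x, self.y, self.wingspan = name, x, y, wingspan

    def can_talk_to(self, other: "HuddleMember") -> bool:
        """Communication is enabled once one person is inside the other's wingspan."""
        distance = math.hypot(self.x - other.x, self.y - other.y)
        return distance <= max(self.wingspan, other.wingspan)

me = HuddleMember("me (400)", 0.0, 0.0)
nearby = HuddleMember("other person (402)", 1.5, 0.5)
far_away = HuddleMember("other person (406)", 9.0, 9.0)
print(me.can_talk_to(nearby))     # True: inside the wingspan, chat can start
print(me.can_talk_to(far_away))   # False: move closer first
```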
  • The social networking site may have more than one huddle. FIG. 4 also shows how huddle A may include a different demographic of people than huddle B. Preferably the demographics of the huddles are different. The huddles may be arranged by age and/or sex, to avoid 12-year-olds being in the same huddle with 50-year-olds. Another embodiment may automatically determine the age as described above, and prevent a user from entering a huddle that they do not belong in.
  • In one embodiment, users on the website are allowed to enter one of a number of different huddles. You may select a huddle from among the huddles you are authorized to join. Alternatively, in another embodiment, the huddle may be selected for you randomly. In yet another embodiment, the website may automatically determine your location, for example by GPS or IP address, and use that location to set a huddle, so that people in the same geographic location are placed in the same huddle.
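  • As a purely illustrative sketch (not part of the specification), huddle selection based on an age band and a coarse geographic cell derived from GPS coordinates might look like the following; the function names and the band boundaries are assumptions chosen for the example.

def age_band(age: int) -> str:
    # Coarse demographic bands so that, e.g., 12-year-olds and 50-year-olds never mix.
    if age < 13:
        return "child"
    if age < 18:
        return "teen"
    return "adult"

def geo_cell(lat: float, lon: float, cell_degrees: float = 1.0) -> tuple:
    # Bucket coordinates into coarse cells so nearby users land in the same huddle.
    return (round(lat / cell_degrees), round(lon / cell_degrees))

def select_huddle(age: int, lat: float = None, lon: float = None) -> str:
    band = age_band(age)
    if lat is not None and lon is not None:
        return f"huddle:{band}:{geo_cell(lat, lon)}"
    return f"huddle:{band}:default"

print(select_huddle(35, 32.7, -117.2))  # adults near the same location share a huddle
print(select_huddle(12))                # -> 'huddle:child:default'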
  • The GPS coordinates, for example, may also be automatically obtained from a telephone, e.g., a cell phone, that is on the person of the user, such as a portable phone that is in communication with the computer 130.
  • An embodiment sets a maximum number of people that can be located in the huddle. When the huddle is full, the user can ask to go on a waiting list. In another embodiment, the user can make a reservation to enter a huddle at a specified time. In yet another embodiment, the huddle may expand in size as more people come into it, preventing the huddles from filling.
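  • A minimal sketch of the capacity behavior just described, with a waiting list and an optional “elastic” mode that expands the huddle instead of turning arrivals away, is given below; the class name CapacityHuddle and its methods are hypothetical.

from collections import deque

class CapacityHuddle:
    def __init__(self, max_size: int, elastic: bool = False):
        self.max_size = max_size
        self.elastic = elastic
        self.members = []
        self.waiting = deque()

    def enter(self, user: str) -> str:
        if len(self.members) < self.max_size:
            self.members.append(user)
            return "entered"
        if self.elastic:
            self.max_size += 1  # expand in size as more people come in
            self.members.append(user)
            return "entered (expanded)"
        self.waiting.append(user)  # huddle is full: go on the waiting list
        return f"waitlisted (#{len(self.waiting)})"

    def leave(self, user: str) -> None:
        self.members.remove(user)
        if self.waiting:  # admit the next person on the waiting list
            self.members.append(self.waiting.popleft())

h = CapacityHuddle(max_size=2)
print(h.enter("alice"), h.enter("bob"), h.enter("carol"))  # entered entered waitlisted (#1)
h.leave("alice")
print(h.members)  # ['bob', 'carol']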
  • The interactions in the huddle may take any of a number of different forms. One form of interaction may be games or contests that are carried out in the huddle. Another form of interaction may grade the way that people interact within the huddle. However these interactions are carried out, the participation is scored. Participants with higher scores can do more things. For example, participants with higher scores may be able to obtain more abilities to modify their looks. They may attain priority on the waiting list. They may attain a better ability to move within the huddle. They may be able to obtain different looks, for example more brightness within the huddle.
  • In addition to contests, one desirable feature within the huddle is that everybody be willing to talk to everybody else. Accordingly, the pro-social behavior within the huddle may be scored. Low scores may be obtained for cliquish behavior, for example refusing to answer a question someone asks, or refusing to talk to someone. Reductions in scores may also be obtained for rudeness, for example banging into someone within the huddle without saying “excuse me”. Increases in score may be obtained for desirable actions such as speaking to strangers, or the like.
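  • One possible scoring scheme consistent with the above (the event names, point values and perk thresholds are illustrative assumptions, not taken from the specification) is sketched below.

SCORE_RULES = {
    "spoke_to_stranger": +5,
    "answered_question": +3,
    "won_contest": +10,
    "ignored_question": -4,            # cliquish behavior
    "bumped_without_excuse_me": -2,    # rudeness
}

PERKS = [  # (minimum score, ability unlocked)
    (10, "extra appearance options"),
    (25, "waiting-list priority"),
    (50, "faster movement and a brighter look"),
]

def update_score(score: int, event: str) -> int:
    return score + SCORE_RULES.get(event, 0)

def unlocked_perks(score: int) -> list:
    return [perk for threshold, perk in PERKS if score >= threshold]

score = 0
for event in ["spoke_to_stranger", "won_contest", "ignored_question", "spoke_to_stranger"]:
    score = update_score(score, event)
print(score, unlocked_perks(score))  # 16 ['extra appearance options']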
  • A number of people may decide to meet in a meeting place within the huddle. For example, people may send a huddle invitation saying “let's meet in huddle A, area A, at 9 p.m.” The huddles may be divided into different areas. People may want to reserve a spot to avoid a waiting list. You can send a meeting invitation which includes confirmation of the spot. The meeting invitation may be confirmed by the website by sending a code that allows entry into the huddle, or by adding a user's identification name to an “approved” list.
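  • A brief, hypothetical sketch of that confirmation flow follows: either a one-time entry code or an entry on the “approved” list confirms the reservation. The names (APPROVED, ENTRY_CODES, invite) are assumptions made for the example.

import secrets
from datetime import datetime

APPROVED = {}     # (huddle, area) -> set of approved identification names
ENTRY_CODES = {}  # code -> (huddle, area, meeting time)

def invite(users, huddle, area, when, use_code=True):
    if use_code:
        code = secrets.token_hex(4)             # one-time code that allows entry
        ENTRY_CODES[code] = (huddle, area, when)
        return {u: code for u in users}
    APPROVED.setdefault((huddle, area), set()).update(users)  # approved-list variant
    return {u: "approved-list" for u in users}

print(invite(["alice", "bob"], "A", "area A", datetime(2008, 6, 12, 21, 0)))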
  • In one embodiment, different areas in the huddle have different features and different operations. For example, the huddle may have a center area, shown as 410. That center area may include a bulletin board 411 that describes different areas in the huddle and what is happening at those different areas, either right now or at some time in the future. In one embodiment, simply entering the huddle without a specific invitation may cause the user to arrive automatically near the bulletin board. Another embodiment puts all new entries into the huddle at a random location, or at the location which is least populated. As described above, the invite code may specify an area of the huddle into which the user wants to go.
  • The bulletin board 411 may include not only ongoing activities, but also requests. For example, any user may post a request such as “does anyone want to talk about sesame ice cream?”. The bulletin board may also say “activity x is going on in quadrant 89 in five minutes”. Different activities such as spelling bees, Sudoku, tests, and the like may be carried out.
  • The huddles can also include, however, commercial content. For example, huddles may have advertisers and/or storefronts into which the user can be directed. For example, huddle B shows a Domino's advertisement shown as 420. A user can walk onto the advertisement to automatically be teleported to the storefront. Say the user wants to buy a pizza from Domino's. The storefront can be used to provide, within the huddle, real-time information about a transaction. For example, a pizza can be purchased using the online virtual ordering system. In the huddle embodiment, the user enters into the huddle, and is transported into the storefront shown as 430. Within the storefront, the user is provided with a number of different possibilities. One of these possibilities is a menu shown as 431. The menu has a number of different items on it. Each of the items, for example, may be associated with different information about the item. By selecting the item, that information can be brought out, such as a real photo of the item, a time to delivery, a price, and nutritional information. The user can order any of these items off the menu, and upon pushing a button, go to a payment window shown as 433. After payment, the order is in process, and the user returns to the virtual restaurant storefront.
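  • The menu and ordering interaction described above might be modeled, purely as an illustration with hypothetical item names and fields, as follows.

MENU = {
    "large pepperoni": {"photo": "pepperoni.jpg", "minutes_to_delivery": 35,
                        "price": 12.99, "calories_per_slice": 300},
    "garden salad":    {"photo": "salad.jpg", "minutes_to_delivery": 20,
                        "price": 6.49, "calories_per_slice": 120},
}

def select_item(name: str) -> dict:
    # Selecting an item brings out its photo, delivery time, price and nutrition info.
    return MENU[name]

def place_order(name: str, pay) -> dict:
    details = select_item(name)
    if not pay(details["price"]):      # the payment window step
        raise RuntimeError("payment declined")
    return {"item": name, "status": "order in process"}

order = place_order("large pepperoni", pay=lambda amount: amount < 50)
print(order)  # {'item': 'large pepperoni', 'status': 'order in process'}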
  • An application of this system, described herein, forms a real-time monitor over the order, from the moment it is placed until the moment it is delivered. This makes it possible to order on the website and, from the moment of ordering, obtain real-time viewing of the order's progress. In essence, this provides a real-time view into the supply chain.
  • The user in the restaurant has the user's status associated with the order. The user can walk at any time, for example, to a real-time status board that shows the real-time status of the order. The order has already been transmitted to a brick-and-mortar version of the store, for example the real Domino's where the pizza is being made. Note that while this describes being used with the huddle system of FIG. 4, it can certainly also be used with other websites, simply providing the ordering and real-time status parts.
  • In the real Domino's, there may be a number of cameras, shown for example as 434. The real-time status window may say the pizza is being made, and may show a camera view of the pizza being made using a web cam. At some time after that, the order status changes to out for delivery. The out-for-delivery status may have a real-time estimate of time to delivery. There may also be a camera, for example, in the delivery person's car, which shows the real location of the delivery person. As an alternative, a GPS tracker in either the delivery person's car or in the delivery person's cell phone may show the delivery person's real-time location on a map. The status window may also show who is ahead of the user for delivery, estimated time to delivery, and the like.
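  • Purely as an illustrative sketch (the class and field names are assumptions), the real-time status record that such a status board could display might be modeled as follows, moving from a “being made” state with a webcam feed to an “out for delivery” state with a GPS position and a delivery estimate.

class OrderStatus:
    def __init__(self, order_id: str):
        self.order_id = order_id
        self.state = "being made"
        self.webcam_url = "rtsp://store-cam/oven"  # hypothetical webcam feed
        self.gps = None
        self.eta_minutes = None

    def out_for_delivery(self, lat: float, lon: float, eta_minutes: int):
        self.state = "out for delivery"
        self.webcam_url = None
        self.gps = (lat, lon)
        self.eta_minutes = eta_minutes

    def board_text(self) -> str:
        if self.state == "being made":
            return f"Order {self.order_id}: being made (live view: {self.webcam_url})"
        return (f"Order {self.order_id}: out for delivery, "
                f"driver at {self.gps}, ETA {self.eta_minutes} min")

status = OrderStatus("pizza-42")
print(status.board_text())
status.out_for_delivery(32.71, -117.16, eta_minutes=12)
print(status.board_text())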
  • While this has been shown for the embodiment of a pizza at places like Domino's, it should be understood that this can be used for many other purposes. It may be used to secure a place in line at a restaurant, getting real-time status of who is in front of you and where the various people are in their meals (for example “11 tables have received their bills”) and the like. It can be used for ordering clothing or other merchandise. The real-time status can be obtained from different terminals, for example the delivery person's cell phone. It can be used with a common carrier, where the common carrier person's cell phone shows their real-time status. Moreover, the user can enter and leave the storefront at any time. The user can leave and come back and see the real-time status upon returning. For example, you could leave the Domino's and go to another storefront or go to a different huddle, but return to see the real-time status. By sitting at your home computer, you can see the real-time status of any order you've made by entering that storefront. For example, a UPS storefront may be entered, and the packages associated with the user automatically tracked on a bulletin board in the storefront. Moreover, the system can carry out real-time tracking of those packages, e.g., showing the inside of the truck on which the item is located, or showing GPS results on a map as to exactly where the package is located.
  • The system can also be used for other purposes besides stores. For example, this can be used for doctor appointments or other kinds of appointments. The appointment can be made, and the user can see the waiting room, see their last visits, see test results, and see doctor messages. Each item within the store area becomes part of what is, in essence, a huddle.
  • According to another embodiment, this can be used, for example, for airlines, e.g. airline reservations, itineraries, flight status, statistics, and flight check-in. Real-time versions of what is happening on the flight, the waiting room at the airport, and the lines at the airport can also be seen in an analogous way.
  • When a user enters this storefront, all of the above-discussed information is displayed at different locations within the storefront. The user's identity may be automatically ascertained, e.g., by determining a user name or in another way. Based on the user's identity being detected in a store, all information associated with that identity can be displayed within that store, e.g., on simulated boards within the store. The boards can display text, maps showing the real-time location of the delivery person, etc., windows showing camera results, and/or all of the information discussed above.
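  • A small sketch of this identity-driven display is given here (the record store and board names below are hypothetical): once the user's identity is detected, every record associated with that identity is gathered and grouped onto the storefront's simulated boards.

RECORDS = {
    "user-400": [
        {"board": "orders",   "text": "Pizza out for delivery, ETA 12 min"},
        {"board": "packages", "text": "Package on truck, 3 stops away"},
        {"board": "flights",  "text": "Flight 88 on time, gate 14"},
    ],
}

def render_boards(user_id: str) -> dict:
    # Group everything associated with the detected identity by board.
    boards = {}
    for record in RECORDS.get(user_id, []):
        boards.setdefault(record["board"], []).append(record["text"])
    return boards

print(render_boards("user-400")["packages"])  # ['Package on truck, 3 stops away']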
  • An advertiser for software or games or the like can also advertise their product. Stepping on the banner may open a new window that provides you information about the product. It may open a trial software version of the product that can only be played within the huddle, thereby avoiding the problems of piracy or improper use of pirated software. If the trial version is a big download, for example, the user may be guided to, or otherwise allowed to do, other things while the download proceeds in the background.
  • The general structure and techniques, and more specific embodiments which can be used to effect different ways of carrying out the more general goals are described herein.
  • Although only a few embodiments have been disclosed in detail above, other embodiments are possible and the inventors intend these to be encompassed within this specification. The specification describes specific examples to accomplish a more general goal that may be accomplished in another way. This disclosure is intended to be exemplary, and the claims are intended to cover any modification or alternative which might be predictable to a person having ordinary skill in the art. For example, while the above describes certain kinds of operation over the internet, any other way of interacting via a shared network can be similarly controlled in this way.
  • Also, the inventors intend that only those claims which use the words “means for” are intended to be interpreted under 35 USC 112, sixth paragraph. Moreover, no limitations from the specification are intended to be read into any claims, unless those limitations are expressly included in the claims. The computers described herein may be any kind of computer, either general purpose, or some specific purpose computer such as a workstation. The computer may be an Intel (e.g., Pentium or Core 2 Duo) or AMD based computer, running Windows XP or Linux, or may be a Macintosh computer. The computer may also be a laptop.
  • The programs may be written in C or Python, or Java, Brew or any other programming language. The programs may be resident on a storage medium, e.g., magnetic or optical, e.g., the computer hard drive, a removable disk or media such as a memory stick or SD media, wired or wireless network based or Bluetooth based Network Attached Storage (NAS), or other removable medium. The programs may also be run over a network, for example, with a server or other machine sending signals to the local machine, which allows the local machine to carry out the operations described herein.
  • Where a specific numerical value is mentioned herein, it should be considered that the value may be increased or decreased by 20%, while still staying within the teachings of the present application, unless some different range is specifically mentioned. Where a specified logical sense is used, the opposite logical sense is also intended to be encompassed.

Claims (20)

1. A method comprising:
displaying an application on a screen associated with a computer;
obtaining a first picture of a first person operating said application at a first time; and
using said first picture of said first person as a part of a display created by said application at a second time subsequent to said first time, wherein said second time is a time when said first picture of said first person operating said application is no longer being obtained.
2. A method as in claim 1, further comprising obtaining a second picture of a second person at said second time, and wherein said using comprises using said second picture of said second person also as part of said display created by said application.
3. A method as in claim 2, further comprising storing a picture of at least a plurality of players who use said application.
4. A method as in claim 3, wherein said application is a game.
5. A method as in claim 4, wherein said game has teams with multiple players, and at least multiple ones of said multiple players on at least one of said teams are selected from said stored pictures.
6. A method as in claim 5, wherein at least one of said teams includes images of multiple players, each of which multiple players having images that are based on said stored pictures of previous players of said game, and wherein at least one of said multiple players is based on said first picture of said first person.
7. A method as in claim 1, wherein said application is a social networking application.
8. A method as in claim 1, further comprising modeling or cartoonizing said first picture prior to said displaying.
9. A method comprising:
playing a computer based game;
obtaining pictures of players playing said game; and
as part of said playing said game, displaying actual faces of previous players playing said game including at least one player who is not playing at a current time.
10. A method as in claim 9, further comprising storing a picture of each player who uses said application.
11. A method as in claim 9, wherein said computer-based game has teams with multiple players, and at least multiple ones of said multiple players on at least one of said teams are selected from said stored pictures.
12. A method as in claim 9, further comprising processing said pictures prior to said displaying.
13. A gaming console comprising:
a computer system, that allows receiving at least one removable computer-readable memory, including a game to be played, and checks at least one aspect of said removable computer readable media to determine whether said media is authorized for play, and allows playing an application that is stored on said media only if said determining indicates that the application is authorized for play;
a camera, associated with said computer system, and obtaining an image of a person in a vicinity of said computer system at a first time; and
said computer system using said image of said person at said first time as part of a display for said game.
14. A console as in claim 13, further comprising a memory which stores said first picture, and wherein said computer system uses said first picture of said first person as a part of a display created by said application at a second time subsequent to said first time, wherein said second time is a time when said first picture of said first person operating said application is no longer being obtained.
15. A console as in claim 14, wherein said memory stores images of at least a plurality of players who play said game.
16. A console as in claim 14, wherein said computer checks an encryption key on said removable computer readable memory, and verifies that said encryption key matches a specified criteria, prior to indicating that the application is authorized for play.
17. A console as in claim 15, wherein said removable computer readable memory stores an application to be played in read/write memory of a type where the game can be changed based on game updates.
18. A console as in claim 15, wherein said game has teams with multiple players, and at least multiple ones of said multiple players on at least one of said teams are selected from said stored pictures.
19. A method comprising:
allowing login to a site that allows social interaction with other users; and
verifying that a user logging into the site is actually the person associated with the login using at least one technique other than the login itself.
20. A method as in claim 19, wherein said verifying comprises obtaining an image of the user, and comparing the image with a pre-stored image of the user.
US12/138,221 2008-06-12 2008-06-12 Face Simulation in Networking Abandoned US20090312100A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/138,221 US20090312100A1 (en) 2008-06-12 2008-06-12 Face Simulation in Networking

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US12/138,221 US20090312100A1 (en) 2008-06-12 2008-06-12 Face Simulation in Networking

Publications (1)

Publication Number Publication Date
US20090312100A1 true US20090312100A1 (en) 2009-12-17

Family

ID=41415306

Family Applications (1)

Application Number Title Priority Date Filing Date
US12/138,221 Abandoned US20090312100A1 (en) 2008-06-12 2008-06-12 Face Simulation in Networking

Country Status (1)

Country Link
US (1) US20090312100A1 (en)

Patent Citations (64)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4710873A (en) * 1982-07-06 1987-12-01 Marvin Glass & Associates Video game incorporating digitized images of being into game graphics
US5830065A (en) * 1992-05-22 1998-11-03 Sitrick; David H. User image integration into audiovisual presentation system and methodology
US20040063084A1 (en) * 1995-01-20 2004-04-01 Macri Vincent J. Method and apparatus for tutorial, self and assisted instruction directed to simulated preparation, training and competitive play and entertainment
US6117011A (en) * 1995-07-27 2000-09-12 Lvov; Denis Ernestovich Electronic game system, method of managing and regulating said system
US20100302142A1 (en) * 1995-11-06 2010-12-02 French Barry J System and method for tracking and assessing movement skills in multidimensional space
US6139433A (en) * 1995-11-22 2000-10-31 Nintendo Co., Ltd. Video game system and method with enhanced three-dimensional character and background control due to environmental conditions
US20080227524A1 (en) * 1996-11-14 2008-09-18 Bally Gaming, Inc. Tournament qualification & characteristics in a gaming system
US6283858B1 (en) * 1997-02-25 2001-09-04 Bgk International Incorporated Method for manipulating images
US20080122786A1 (en) * 1997-08-22 2008-05-29 Pryor Timothy R Advanced video gaming methods for education and play using camera based inputs
US20060202953A1 (en) * 1997-08-22 2006-09-14 Pryor Timothy R Novel man machine interfaces and applications
US20040046736A1 (en) * 1997-08-22 2004-03-11 Pryor Timothy R. Novel man machine interfaces and applications
US20050250083A1 (en) * 1997-10-06 2005-11-10 Macri Vincent J Method and apparatus for instructors to develop pre-training lessons using controllable images
US6677967B2 (en) * 1997-11-20 2004-01-13 Nintendo Co., Ltd. Video game system for capturing images and applying the captured images to animated game play characters
US20020050997A1 (en) * 2000-01-28 2002-05-02 Square Co., Ltd. Method, game machine and recording medium for displaying motion in a video game
US20100190610A1 (en) * 2000-03-07 2010-07-29 Pryor Timothy R Camera based interactive exercise
US20080125289A1 (en) * 2000-03-07 2008-05-29 Pryor Timothy R Camera based video games and related methods for exercise motivation
US7328119B1 (en) * 2000-03-07 2008-02-05 Pryor Timothy R Diet and exercise planning and motivation including apparel purchases based on future appearance
US20040259631A1 (en) * 2000-09-27 2004-12-23 Milestone Entertainment Llc Apparatus, systems and methods for implementing enhanced gaming and prizing parameters in an electronic environment
US20080122805A1 (en) * 2000-10-11 2008-05-29 Peter Smith Books, papers, and downloaded information to facilitate human interaction with computers
US20020098885A1 (en) * 2001-01-24 2002-07-25 Square Co. Video game system and control method thereof and program of video game and computer readable record medium recorded with the program
US20020142825A1 (en) * 2001-03-27 2002-10-03 Igt Interactive game playing preferences
US6890262B2 (en) * 2001-07-19 2005-05-10 Konami Corporation Video game apparatus, method and recording medium storing program for controlling viewpoint movement of simulated camera in video game
US7112134B1 (en) * 2002-03-26 2006-09-26 Pixel Puzzles, Inc. Method and system for photographic gaming
US7309280B2 (en) * 2002-07-16 2007-12-18 Aruze Co., Ltd. Gaming machine, server, and program with image of real player
US7027054B1 (en) * 2002-08-14 2006-04-11 Avaworks, Incorporated Do-it-yourself photo realistic talking head creation system and method
US20100007665A1 (en) * 2002-08-14 2010-01-14 Shawn Smith Do-It-Yourself Photo Realistic Talking Head Creation System and Method
US20090196516A1 (en) * 2002-12-10 2009-08-06 Perlman Stephen G System and Method for Protecting Certain Types of Multimedia Data Transmitted Over a Communication Channel
US20090053970A1 (en) * 2003-07-02 2009-02-26 Ganz Interactive action figures for gaming schemes
US7614948B2 (en) * 2003-09-15 2009-11-10 Igt Multi-player bingo with slept awards reverting to progressive jackpot pool
US7785199B2 (en) * 2004-02-09 2010-08-31 Nintendo Co., Ltd. Touch-sensitive gaming system with dual displays
US20050176502A1 (en) * 2004-02-09 2005-08-11 Nintendo Co., Ltd. Game apparatus and storage medium having game program stored therein
US20050176486A1 (en) * 2004-02-09 2005-08-11 Nintendo Co., Ltd. Game apparatus and storage medium having game program stored therein
US7967674B2 (en) * 2004-08-20 2011-06-28 Igt Gaming device and method having a first interactive game which determines a function of a second wagering game
US20080214273A1 (en) * 2004-09-21 2008-09-04 Snoddy Jon H System, method and handheld controller for multi-player gaming
US20060068917A1 (en) * 2004-09-21 2006-03-30 Snoddy Jon H System, method and handheld controller for multi-player gaming
US20070008322A1 (en) * 2005-07-11 2007-01-11 Ludwigsen David M System and method for creating animated video with personalized elements
US8047915B2 (en) * 2006-01-11 2011-11-01 Lyle Corporate Development, Inc. Character for computer game and method
US20100105454A1 (en) * 2006-04-13 2010-04-29 Igt Methods and systems for interfacing with a third-party application
US20070243918A1 (en) * 2006-04-18 2007-10-18 Yahoo! Inc. Player roster selection interface
US20080001951A1 (en) * 2006-05-07 2008-01-03 Sony Computer Entertainment Inc. System and method for providing affective characteristics to computer generated avatar during gameplay
US20070283265A1 (en) * 2006-05-16 2007-12-06 Portano Michael D Interactive gaming system with animated, real-time characters
US20090075731A1 (en) * 2006-06-15 2009-03-19 Konami Digital Entertainment Co., Ltd. Game program, game apparatus, and game method
US7847808B2 (en) * 2006-07-19 2010-12-07 World Golf Tour, Inc. Photographic mapping in a simulation
US20100231790A1 (en) * 2006-12-29 2010-09-16 Prodea Systems, Inc Display inserts, overlays, and graphical user interfaces for multimedia systems
US20080030496A1 (en) * 2007-01-03 2008-02-07 Social Concepts, Inc. On-line interaction system
US20080182664A1 (en) * 2007-01-26 2008-07-31 Winster, Inc. Games Promoting Cooperative And Interactive Play
US20080254829A1 (en) * 2007-04-16 2008-10-16 Ntt Docomo, Inc. Control Apparatus, Mobile Communications System, and Communications Terminal
US20080268961A1 (en) * 2007-04-30 2008-10-30 Michael Brook Method of creating video in a virtual world and method of distributing and using same
US20080293488A1 (en) * 2007-05-21 2008-11-27 World Golf Tour, Inc. Electronic game utilizing photographs
US20080309671A1 (en) * 2007-06-18 2008-12-18 Brian Mark Shuster Avatar eye control in a multi-user animation environment
US20090042647A1 (en) * 2007-07-27 2009-02-12 Phenomenon Holdings, Ltd. Method and device for controlling a motion-sequence within a simulated game or sports event
US20090027337A1 (en) * 2007-07-27 2009-01-29 Gesturetek, Inc. Enhanced camera-based input
US20090132371A1 (en) * 2007-11-20 2009-05-21 Big Stage Entertainment, Inc. Systems and methods for interactive advertising using personalized head models
US20090153552A1 (en) * 2007-11-20 2009-06-18 Big Stage Entertainment, Inc. Systems and methods for generating individualized 3d head models
US20090135177A1 (en) * 2007-11-20 2009-05-28 Big Stage Entertainment, Inc. Systems and methods for voice personalization of video content
US20090135176A1 (en) * 2007-11-20 2009-05-28 Big Stage Entertainment, Inc. Systems and methods for creating personalized media content having multiple content layers
US20090147003A1 (en) * 2007-12-10 2009-06-11 International Business Machines Corporation Conversion of Two Dimensional Image Data Into Three Dimensional Spatial Data for Use in a Virtual Universe
US20090186700A1 (en) * 2008-01-19 2009-07-23 Tim Konkle System and method for providing interactive content for multiple networked users in a shared venue using short messaging service communication
US20080261693A1 (en) * 2008-05-30 2008-10-23 Sony Computer Entertainment America Inc. Determination of controller three-dimensional location using image analysis and ultrasonic communication
US20100099471A1 (en) * 2008-10-17 2010-04-22 Feeney Robert J Network-Based Contests Having Multiple Participating Sponsors
US20100160041A1 (en) * 2008-12-19 2010-06-24 Immersion Corporation Interactive painting game and associated controller
US20100227688A1 (en) * 2009-03-06 2010-09-09 Trion World Network, Inc. Synthetic environment character data sharing
US20100229106A1 (en) * 2009-03-06 2010-09-09 Trion World Network, Inc. Synthetic environment character data sharing
US20110248992A1 (en) * 2010-04-07 2011-10-13 Apple Inc. Avatar editing environment

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150038225A1 (en) * 2012-03-13 2015-02-05 Neowiz Bless Studio Corporation Online game providing method for providing character makeup and system therefor

Similar Documents

Publication Publication Date Title
JP6700463B2 (en) Filtering and parental control methods for limiting visual effects on head mounted displays
JP6955861B2 (en) Event control system and program
US10380798B2 (en) Projectile object rendering for a virtual reality spectator
CN106716306B (en) Synchronizing multiple head mounted displays to a unified space and correlating object movements in the unified space
US8632408B2 (en) Entertainment device and method
Chesher Neither gaze nor glance, but glaze: relating to console game screens
CN107551544A (en) Interactive entertainment process playback system
WO2020138107A1 (en) Video streaming system, video streaming method, and video streaming program for live streaming of video including animation of character object generated on basis of motion of streaming user
US20060015560A1 (en) Multi-sensory emoticons in a communication system
CN105808781B (en) Location-based on-line video game forum for special area
US11058956B2 (en) Consent verification
KR20040104753A (en) On-line gaming spectator
US20110296318A1 (en) Virtual Reality Space Provision System, Virtual Reality Space Provision Method and Program
US20170148267A1 (en) Celebrity chase virtual world game system and method
JP7300925B2 (en) Live communication system with characters
US11513656B2 (en) Distally shared, augmented reality space
JP2016123560A (en) Game system and program
JP2021131800A (en) Information control system
US20090312100A1 (en) Face Simulation in Networking
US9669297B1 (en) Using biometrics to alter game content
US20090310187A1 (en) Face Simulation in Networking
JP2016123561A (en) Game system and program
KR102169804B1 (en) Apparatus and method of handling configuration information of a character using screen shot image
JP7445723B1 (en) Programs and information processing systems
JP7375143B1 (en) Programs and information processing systems

Legal Events

Date Code Title Description
AS Assignment

Owner name: HARRIS TECHNOLOGY, LLC, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HARRIS, SCOTT C;REEL/FRAME:022050/0298

Effective date: 20090101

STCB Information on status: application discontinuation

Free format text: ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION