US20130138773A1 - Methods, Systems, and Products for Managing Multiple Data Sources


Info

Publication number
US20130138773A1
US20130138773A1 (application US13/748,621)
Authority
US
United States
Prior art keywords
signals
telepresence
user
data source
display device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/748,621
Inventor
James Carlton Bedingfield, Sr.
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
AT&T Intellectual Property I LP
Original Assignee
AT&T Intellectual Property I LP
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by AT&T Intellectual Property I LP
Priority to US13/748,621
Assigned to BELLSOUTH INTELLECTUAL PROPERTY CORPORATION: ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: BEDINGFIELD, JAMES CARLTON, SR.
Assigned to AT&T INTELLECTUAL PROPERTY I, L.P.: ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: AT&T DELAWARE INTELLECTUAL PROPERTY, INC.
Assigned to AT&T INTELLECTUAL PROPERTY, INC.: CHANGE OF NAME (SEE DOCUMENT FOR DETAILS). Assignors: BELLSOUTH INTELLECTUAL PROPERTY CORPORATION
Assigned to AT&T BLS INTELLECTUAL PROPERTY, INC.: CHANGE OF NAME (SEE DOCUMENT FOR DETAILS). Assignors: AT&T INTELLECTUAL PROPERTY, INC.
Assigned to AT&T DELAWARE INTELLECTUAL PROPERTY, INC.: CHANGE OF NAME (SEE DOCUMENT FOR DETAILS). Assignors: AT&T BLS INTELLECTUAL PROPERTY, INC.
Publication of US20130138773A1
Legal status: Abandoned (current)


Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00: Network arrangements or protocols for supporting network services or applications
    • H04L 67/01: Protocols
    • H04L 67/02: Protocols based on web technology, e.g. hypertext transfer protocol [HTTP]
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00: Television systems
    • H04N 7/14: Systems for two-way working
    • H04N 7/141: Systems for two-way working between two video terminals, e.g. videophone
    • H04N 7/147: Communication arrangements, e.g. identifying the communication as a video-communication, intermediate storage of the signals
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 65/00: Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L 65/60: Network streaming of media packets
    • H04L 65/61: Network streaming of media packets for supporting one-way streaming services, e.g. Internet radio
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 65/00: Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L 65/60: Network streaming of media packets
    • H04L 65/75: Media network packet handling
    • H04L 65/765: Media network packet handling intermediate
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00: Network arrangements or protocols for supporting network services or applications
    • H04L 67/50: Network services
    • H04L 67/535: Tracking the activity of the user
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00: Network arrangements or protocols for supporting network services or applications
    • H04L 67/50: Network services
    • H04L 67/54: Presence management, e.g. monitoring or registration for receipt of user log-on information, or the connection status of the users
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00: Network arrangements or protocols for supporting network services or applications
    • H04L 67/50: Network services
    • H04L 67/75: Indicating network or usage conditions on the user display
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14: Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 2370/00: Aspects of data communication
    • G09G 2370/20: Details of the management of multiple sources of image data

Definitions

  • Computer users perform a variety of tasks with the use of computing devices and the like. For instance, users may at times use a computer to perform work, surf the web, play games, or watch movies or the like.
  • Some employers may, however, limit their employees' use of the employer's computers or networks. For example, employers may block access to non-work related web sites. This may cause some employee computer users to use multiple computers for multiple different tasks. For instance, an employee computer user may use a work computer or network to perform work, while using a personal computer to navigate the web. Using multiple computers, however, can prove to be confusing and spatially inefficient.
  • One such device may include a data source input module configured to input a plurality of signals, which may include video signals, from multiple local data sources.
  • the device may further include an output module, which may receive the signals from the data source input module and output the signals onto a single display device.
  • Another device may include computer-readable media having instructions for selecting, from multiple data source displays each outputted from a respective local data source and each located on a single display device, one of the multiple data source displays. Further instructions may include accessing the selected local data source to allow for use of the local data source, in response to the selecting of the local data source.
  • a method described herein may include receiving data, including video data, from multiple local data sources. These local data sources may include some first local data sources that couple to a first network and other second local data sources that couple to a second network that is independent of the first network. This method may also include outputting some or all of the video data received from both the first and second local data sources onto a single display device.
  • FIG. 1 is a block diagram of an overall environment in which a manager module may operate.
  • FIG. 2 is a block diagram providing further details of another overall environment in which a manager module may operate, relating to various types of input and output to and from the manager module.
  • FIG. 3 is a block diagram of illustrative components of a manager module.
  • FIG. 4 is a block representation of a display device upon which a manager module may output a plurality of data source displays.
  • FIG. 5 is another block representation of a display device upon which a manager module may output a plurality of data source displays.
  • FIG. 6 is another block representation of a display device upon which a manager module may output a plurality of data source displays as well as an independent network data source display.
  • FIG. 7 is another block representation of a display device upon which a manager module may output a plurality of data source displays as well as an independent network data source display.
  • FIG. 8 is a block representation of a display device upon which a manager module may select and access one of a plurality of data source(s) and/or independent network data source(s).
  • FIG. 9 is a block representation of a display device upon which a manager module may output a plurality of data source display windows from a plurality of data sources.
  • FIG. 10 is a flow diagram that illustrates a process for managing data received from multiple local data sources.
  • FIG. 1 illustrates an overall environment 100 in which a manager module 110 may operate.
  • Data sources 102 ( 1 )-(N) may provide input in the form of data signals or the like to a system 104 and, hence, the manager module 110 .
  • System 104 may comprise a processor 106 , upon which may run computer-readable medium 108 or the like.
  • Computer-readable medium may comprise, in whole or in part, manager module 110 .
  • manager module 110 may be implemented as one or more software modules that, when loaded onto a processor and executed, cause the system to perform any of the functions described herein.
  • manager module 110 may itself comprise hardware, such as a processor or the like.
  • System 104 and/or manager module 110 may output these signals to one or more input and/or output devices 112 .
  • Data sources 102 ( 1 )-(N) may comprise a plurality of different devices. Illustrative but non-limiting examples include personal computers such as towers and laptops, server devices, personal digital assistants (PDA's), mobile telephones, land-based telephone lines, external memory devices, cameras, or any other suitable electronic device.
  • data sources 102 ( 1 )-(N) may comprise local devices. For instance, if data sources 102 ( 1 )-(N) are “local data sources”, they may be located locally to a user operating each respective device. These local data sources may receive data from and send data both locally and remotely.
  • Input/output devices 112 may also comprise a plurality of electronic devices or the like, described in further detail with reference to FIG. 2 .
  • input/output devices 112 may be located at a single user station 114 .
  • manager module 110 may be configured to accept signal inputs from a plurality of data sources 102 ( 1 )-(N) and output those signals to single user station 114 .
  • devices 112 may be located at a multitude of locations, and may thus not comprise single user station 114 .
  • the resulting system may thus comprise an environment capable of managing information to and from a plurality of data sources.
  • the managed information may also comprise information from multiple networks, some of which may be independent from one another.
  • FIG. 2 illustrates another environment 200 in which manager module 110 may operate.
  • Environment 200 may comprise one or more data sources 102 ( 1 )-(N), manager module 110 , input/output devices 112 , and an independent network data source 216 .
  • Although the single independent network data source 216 is illustrated, it is to be understood throughout that a plurality of independent network data sources may exist.
  • input/output devices 112 may again be located at single user station 114 in some implementations, and at a multitude of locations in other implementations.
  • Data sources 102 ( 1 )-(N) may transmit data to and receive data from a plurality of networks, or the data sources may all transmit data to and receive data from the same network.
  • data sources 102 ( 1 )-(N) are shown to all transmit data to manager module 110 from a network 218 .
  • Network 218 may be a land-line telephone-based dial up connection, a digital subscriber line (DSL), a cable-based network, a satellite network, or the like.
  • network 218 may connect to a router 220 , which may then connect to one or more of data sources 102 ( 1 )-(N).
  • data sources 102 ( 1 )-(N) may all be put to the same use or such uses may vary.
  • data source 102 ( 1 ) may be used to carry broadband entertainment signals
  • data source 102 ( 2 ) may be used to connect a virtual private network (VPN) signal.
  • the latter signal may be used, for example, to connect to a work account that requires relatively high security.
  • data source 102 (N) may be used for a connection to the internet.
  • the internet connection may be an open internet connection, as compared to some instances of the VPN connection over data source 102 ( 2 ). Because the VPN connection may be used, in some instances, to connect to a work account or the like, the managing employer may limit internet access over the network.
  • the open internet connection of data source 102 (N) may remedy this limitation in some implementations.
  • non-limiting examples of data sources 102 ( 1 )-(N) may include personal computers such as towers and laptops, server devices, personal digital assistants (PDA's), mobile telephones, land-based telephone lines, external memory devices, cameras, or any other suitable electronic device.
  • data source 102 ( 1 ) may comprise a personal computer, which may be used to carry broadband entertainment signals as discussed above.
  • data source 102 ( 2 ) may comprise a computer as well, with its purpose being to provide a VPN connection to a user's work account.
  • Data source 102 (N) may also comprise a personal computer, such as a laptop computer, with its purpose being to provide an open internet connection for the user's navigating convenience.
  • data sources 102 ( 1 )-(N) will each comprise a separate housing, although it is possible that more than one data source may occupy a single housing. Furthermore, while the above example depicts all of data sources 102 ( 1 )-(N) as computers, computers need not always be implemented.
  • An independent network data source 216 may also provide input signals to manager module 110 .
  • Independent network data source 216 may transmit and receive data via a network 222 that is independent of network 218 . If multiple independent network data sources exist, then one or more such networks may exist. Due to the independence of network 222 from network 218 , independent network 222 and thus independent network data source 216 may be relatively free from any problems or encumbrances that network 218 may experience. For instance, if network 218 congests or crashes, signals traveling via independent network 222 may be unaffected. Thus, the content itself of independent network data source 216 may be relatively more secure as compared to a data source that shares a network with other data sources. Similar to network 218 , independent network 222 may comprise a land-line telephone-based dial up connection, a digital subscriber (DSL) line, a cable-based network, a satellite network, or the like.
  • independent network data source 216 may comprise a plurality of different devices. Such non-limiting examples may include personal computers such as towers and laptops, server devices, personal digital assistants (PDA's), mobile telephones, land-based telephone lines, external memory devices, cameras, or any other electronic device. In some implementations, however, independent network data source 216 may comprise devices that define a telepresence module. Such a telepresence module may operate to provide, to a local or remote location, audio and/or video signals of a user when that user is present at a certain location, such as user station 114 . A telepresence module may also operate to receive audio and/or video signals from a remote or local location. When a user is present at the desired location, such as the user station 114 , the telepresence module may automatically or manually activate.
  • this resulting telepresence module may serve as a reliable telepresence link between two or more locations.
  • telepresence link could serve as a link between user station 114 comprised of input/output devices 112 and a user's place of work. This place of work may, in some instances, be the user's office located at his or her employer's building. This link may provide audio and/or video signals to the two or more locations, such as user station 114 and the user's office at work.
  • independent network 222 and independent network data source 216 may provide a reliable telepresence signal without concern for any congestion or failure that network 218 may encounter.
  • independent network 222 and independent network data source 216 may provide always-on point-to-point or point-to-multipoint connectivity.
  • Such a reliable, always-on system may serve to avoid not only processing errors, such as the congestion and/or failures discussed above, but also operational errors by a user, by simplifying the system.
  • An always-on system may also avoid the possibility of a data source entering “sleep mode” at an undesirable time.
  • Telephone Environment 200 may also comprise a telephone line 224 , which may also provide input signals to manager module 110 .
  • Telephone line 224 may be a traditional land-based telephone line, or it may comprise packet (IP), mobile or satellite technology. Signals from telephone line 224 may also output to some or all of input/output devices 112 .
  • Input/output devices 112 may comprise one or more displays 226 , an audio system 228 , cursor controllers 230 , cameras 232 , microphones 234 and/or sensors 236 . Input/output devices 112 may serve to input or output signals to or from data sources 102 ( 1 )-(N) or independent network data source 216 . As illustrated, input/output devices 112 may be configured to connect to manager module 110 . Furthermore, input/output devices 112 may be located in a plurality of locations, or in some implementations may be located at single user station 114 . Some components may also serve more than one purpose, or a single device could comprise all or nearly all of input/output devices 112 .
  • One or more displays 226 may receive and display video signals from data sources 102 ( 1 )-(N) and/or independent network data source 216 . If displays 226 comprises a single display, then the video signals and hence data source displays may be displayed on the single display. This is discussed in detail below with reference to FIGS. 4-9 . It is specifically noted that the term “single display device” does not require a single monitor but rather a system that gives a user the sensation of a single display. This may be accomplished by attaching multiple monitors or the like and giving each a portion of a video signal. When combined, the signals form a complete video display.
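  • By way of illustration only, the following sketch (hypothetical names and sizes, not part of the original disclosure) shows one way a composite frame might be sliced into per-monitor regions so that several physical monitors behave as a "single display device":

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Monitor:
    name: str
    width: int    # pixels this monitor contributes to the composite display
    height: int

def split_frame(frame_width: int, frame_height: int,
                monitors: List[Monitor]) -> List[Tuple[str, Tuple[int, int, int, int]]]:
    """Divide one composite frame into side-by-side regions, one per monitor.

    Each monitor renders only its (x, y, w, h) slice; together the slices give
    the user the sensation of a single display.
    """
    total_width = sum(m.width for m in monitors)
    regions, x = [], 0
    for m in monitors:
        w = round(frame_width * m.width / total_width)
        regions.append((m.name, (x, 0, w, frame_height)))
        x += w
    return regions

if __name__ == "__main__":
    wall = [Monitor("left", 1920, 1080), Monitor("center", 1920, 1080), Monitor("right", 1920, 1080)]
    for name, rect in split_frame(5760, 1080, wall):
        print(name, rect)
```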
  • display 226 may comprise a cylindrical or spherical monitor, which may span approximately 180° around a user.
  • Display 226 may comprise a liquid crystal display (LCD) screen or the like, or may comprise a projector screen. If display 226 comprises a projector screen, then input/output devices 112 may further comprise a projector for outputting video signals onto display 226 .
  • manager module 110 could itself comprise a projector. If the projector screen is cylindrical or spherical, the projector may be capable of spreading the data source displays across the screen, again as discussed in detail below.
  • input/output devices 112 may also comprise one or more audio systems 228 .
  • Audio system 228 may comprise speakers or the like for outputting audio signals from data sources 102 ( 1 )-(N) and/or independent network data source 216 .
  • Audio system 228 may further comprise a multi-channel audio reproduction system, which may comprise an automated mixer(s), amp(s), headphones, or the like.
  • An automated mixer may mix audio signals from a plurality of data sources as well as incoming telephone calls.
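  • A minimal sketch of such an automated mixer, assuming per-source sample blocks and gains chosen elsewhere by the manager module (all names hypothetical):

```python
from typing import Dict, List

def mix_sources(blocks: Dict[str, List[float]], gains: Dict[str, float]) -> List[float]:
    """Weighted sum of one block of audio samples per source (values in [-1.0, 1.0]).

    `blocks` maps a source id (e.g. "source_1" or "phone") to its samples;
    `gains` holds the per-source volume. The result is clamped to avoid clipping.
    """
    length = max(len(b) for b in blocks.values())
    mixed = [0.0] * length
    for source, samples in blocks.items():
        gain = gains.get(source, 1.0)
        for i, sample in enumerate(samples):
            mixed[i] += gain * sample
    return [max(-1.0, min(1.0, s)) for s in mixed]

print(mix_sources({"source_1": [0.5, 0.5], "phone": [0.4, -0.9]},
                  {"source_1": 0.6, "phone": 1.0}))
```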
  • Input/output devices 112 may also comprise one or more cursor controllers 230 .
  • Cursor controllers may include, without limitation, a text input device such as a keyboard, a point-and-select device such as a mouse, and/or a touch screen.
  • Input/output devices 112 may also include one or more cameras 232 .
  • Cameras 232 may comprise, for example, one or more video cameras.
  • camera 232 may be configured for use with a telepresence module discussed above. For example, a video camera signal may capture the image of a user and provide the signal to manager module 110 . Manager module 110 may then transmit the signal to one or more locations, one of which may be the user's work office.
  • the transmitted video signal may be streaming in some instances, so that an image of a user is projected at all times at one or more locations, such as the user's work office. This image may, for example, be displayed on a display device at one or more locations such as the user's work office.
  • the user may be able to work at a remote user station, such as user station 114 , while the user's coworkers or superiors can view the user and his or her activities.
  • Teleworking is merely one example of an activity for which the one or more cameras 232 may be put to use.
  • Input/output devices 112 may further comprise, in some instances, one or more microphones 234 or other multi-channel audio sensing equipment.
  • Microphone 234 may be configured to provide audio input signals to manager module 110 .
  • microphone 234 may be configured for use with the telepresence module discussed above.
  • microphone 234 may be configured to capture audio signals from a user and provide these signals to manager module 110 .
  • Manager module 110 may then transmit these signals to one or more locations, one of which may again be the user's work office.
  • the transmitted audio signal may be streaming, so that the sounds of the user are audible at one or more locations at all times.
  • microphone 234 may be noise-gated and set with a threshold value.
  • Microphone 234 may also act in conjunction with manager module 110 to enable voice recognition. In some implementations, these components may require that a user's voice be recognized before allowing the user to use microphone 234 and hence manager module 110 .
  • implementations with multiple microphones may utilize stereophonic sound techniques. For instance, if a user conducts multiple videoconference and/or teleconference meetings simultaneously, the system may dedicate a microphone to each. As a result, if a user turns and speaks into a left microphone, a recipient represented on the data source display on the left may receive the sound at greater volumes. At the same time, the microphone on the right may receive less input and, hence, the recipient depicted in the data source display on the right may receive the sound at lesser volumes.
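  • As a rough sketch of this stereophonic routing (hypothetical function; the patent does not prescribe an algorithm), the relative levels captured by the left and right microphones might simply be turned into per-meeting gains:

```python
def route_microphones(left_level: float, right_level: float) -> dict:
    """Split captured speech between two simultaneous meetings.

    A louder left microphone sends proportionally more signal to the recipient
    shown on the left-hand data source display, and vice versa.
    """
    total = (left_level + right_level) or 1.0
    return {"left_meeting_gain": left_level / total,
            "right_meeting_gain": right_level / total}

# Speaker turned toward the left microphone: the left meeting hears most of the sound.
print(route_microphones(0.8, 0.2))
```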
  • microphone 234 could be coupled with camera 232 so as to provide the image and sounds of the user at the user's work office at all times.
  • the user's work office may thus not only have a display depicting the user's image, but may also have one or more speakers to output the audio signals captured by microphone 234 and transmitted by manager module 110 .
  • the user's work office (or other exemplary locations) may contain one or more cameras, such as a video camera, and one or more microphones so that two-way audio and visual communication may occur between the office and the remote user station, such as user station 114 .
  • this may provide adequate availability of the user in the office, despite the fact that user may be remotely situated.
  • a co-worker may be able to walk over to the user's work office and see that the user is working at a remote user station.
  • the co-worker may confront a monitor or the like displaying the user at a remote user station.
  • the co-worker may also be able to ask the user questions and the like with the use of the office microphone(s), and the two may be able to communicate as efficiently or nearly as efficiently as if the user were physically present at the work office.
  • While this telepresence link may be secure, it may also be accessible by others.
  • A user's boss, for instance, may be able to bridge into the telepresence signal rather than having to walk to the user's work office to communicate with the user.
  • Input/output devices 112 may further comprise one or more sensors 236 .
  • Sensor 236 may, in some instances, sense when a user is present at a user station, such as user station 114 .
  • sensor 236 may comprise a weight-based sensor that may be configured to be situated adjacent to, or integrated with, a user's chair. Thus, in some instances when a user sits down on his or her chair, sensor 236 may detect that the user is present.
  • sensor 236 may be capable of differentiating between users.
  • sensor 236 may be capable of differentiating between a first user (User #1) and a second user (User #2) based on each user's weight. It is noted, however, that other sensors are envisioned.
  • camera 232 may serve to sense when a user is present, as may microphone 234 .
  • sensor 236 may collect various pieces of information, which it may use to determine the presence of a user. Sensor 236 may also store and/or log this collected information. For instance, sensor 236 may collect information such as the weight or height of a user, as well as other user-specific data. Furthermore, sensor 236 may record the time at which certain data was collected. In some implementations, sensor 236 may calculate and save the amount of time that a user spent at the user station, as well as the amount of time that the user was away. This collected information may be provided only to the user, or it could be made available to others, such as the user's boss in the teleworking examples.
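  • A simplified sketch of such a presence log, assuming a weight-based chair sensor and illustrative user weights (none of these values come from the patent):

```python
import time
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

KNOWN_USERS = {"User #1": 82.0, "User #2": 61.0}   # kilograms; illustrative only
TOLERANCE_KG = 5.0

@dataclass
class PresenceLog:
    events: List[Tuple[float, str, str]] = field(default_factory=list)  # (timestamp, user, event)

    def reading(self, weight_kg: Optional[float]) -> None:
        """Log an arrival or departure from one chair-sensor reading (None = empty chair)."""
        now = time.time()
        if weight_kg is None:
            self.events.append((now, "", "departed"))
            return
        for user, nominal in KNOWN_USERS.items():
            if abs(weight_kg - nominal) <= TOLERANCE_KG:
                self.events.append((now, user, "arrived"))
                return
        self.events.append((now, "unknown", "arrived"))

log = PresenceLog()
log.reading(81.2)    # recognized as User #1
log.reading(None)    # user steps away
print(log.events)
```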
  • input/output devices 112 may also collect various pieces of information. For instance, camera 232 may collect location information pertaining to a present user. Similarly, microphone 234 may collect information regarding the sounds emanating from the present user. Sensor 236 , camera 232 , microphone 234 , and other input/output devices 112 may not only collect the actual content of the user's actions, but may also collect the “recipe” of the content of the user's actions. These devices may create a chronological list of events that comprise the user's actions, which may be sent to remote locations for synthesis. Using this events list or recipe, remote locations may synthesize the user's actions. This may apply to audio, video, smell or any other characteristic that may be present at user station 114 or the like. While this synthesis may not be as real as if the remote location merely receives and displays streaming video, audio, or the like, the size of the file transporting this information may be orders of magnitude smaller.
  • Such a chronological events list created by various pieces of information collected by input/output devices 112 may look like the following:
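  • The original table is not reproduced here; based only on the events described in the following paragraphs, an illustrative reconstruction might read:

```
10:02:13  chair sensor: person arrives at user station 114
10:02:14  identification: user recognized as "Bob Jones"
10:02:20  speech: "I'm working from home today" (volume: normal, cadence: normal)
11:47:05  chair sensor: person leaves user station 114
```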
  • this events list may be transmitted to remote locations, which may then synthesize the content according to the recipe.
  • the remote location may initially show a cartoon face or the like of a generic man, or a more realistic picture of “Bob Jones”.
  • the remote location system, such as a computer at the user's work, may also project a voice, possibly Bob's voice, that says "I'm working from home today".
  • the projected voice may approximately match the volume at which Bob spoke, and may also approximately match the cadence of the spoken words.
  • the remote location may display an empty chair representing that Bob is no longer present at user station 114 .
  • Such an events list may serve to decrease file size being sent along the networks while still maintaining a realistic experience at the remote location.
  • Sensor 236 may also serve other purposes.
  • sensor 236 may adjust the sizes of data source displays or data source display windows, discussed in detail below, depending on a current angle of rotation of a user's chair. Sound volumes emanating from different data source displays may also be adjusted based on chair rotation, as may microphone volumes. For instance, if a user's chair is currently rotated to the left, suggesting that the user's attention currently focuses on the data source display to the left, the volume of that data source may be increased, as may be the window size. It is also envisioned that sensor 236 could make other similar adjustments in this manner.
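  • One possible weighting scheme, sketched below with hypothetical display bearings (the patent does not specify a formula), scales each display's size and volume by how squarely the chair faces it:

```python
import math

def attention_weights(chair_angle_deg: float, display_angles_deg: dict) -> dict:
    """Weight each data source display by how directly the chair faces it.

    `display_angles_deg` maps a display id to its bearing on the wrap-around screen
    (0 = straight ahead, negative = left). The weight can scale both the window
    size and the playback volume of that display.
    """
    weights = {}
    for display, angle in display_angles_deg.items():
        separation = min(abs(chair_angle_deg - angle), 90.0)
        weights[display] = max(0.2, math.cos(math.radians(separation)))
    return weights

# Chair rotated 40 degrees to the left: the left-hand display gets the largest weight.
print(attention_weights(-40.0, {"444(1)": -60.0, "646": 0.0, "444(N)": 60.0}))
```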
  • sensor 236 may cause one or more of data sources 102 ( 1 )-(N) and/or independent network data source 216 to turn on or off, or perform one or more other functions.
  • sensor 236 may cause a telepresence module to turn on when a user sits in his or her chair. This may thereby establish a reliable telepresence link. Again, in some implementations this could automatically turn on a telepresence link to a work office, which may comprise turning on a display, camera, and/or microphone at the work office.
  • sensor 236 may also notify others located at the work office (or other exemplary location) when the user has stepped out of the remote user station, such as user station 114 .
  • the work office display may present an away message indicating that the user has recently left the user station, but may soon return.
  • sensor 236 may recognize users in order to automatically provide user preferences tailored to the detected user. These user preferences may comprise audio or video preferences or the like, which may include the different audio and video capabilities discussed below.
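  • A minimal sketch of this sensor-driven behavior, assuming a stored preference table keyed by user (all names and values hypothetical):

```python
from typing import Optional

TELEPRESENCE_ON = "activate remote display, camera and microphone"
TELEPRESENCE_AWAY = "show away message at remote office"

def handle_chair_sensor(detected_user: Optional[str], preferences: dict) -> dict:
    """React to a chair-sensor event: bring the telepresence link up or mark the user
    as away, and load the detected user's stored audio/video preferences if any exist."""
    if detected_user is None:
        return {"telepresence": TELEPRESENCE_AWAY, "preferences": None}
    return {"telepresence": TELEPRESENCE_ON,
            "preferences": preferences.get(detected_user, {})}

stored = {"User #1": {"layout": "work", "volume": 0.6}}
print(handle_chair_sensor("User #1", stored))   # user sits down
print(handle_chair_sensor(None, stored))        # user steps away
```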
  • telephone line 224 may input to manager module 110 .
  • input/output devices 112 may comprise a telephone.
  • some or all of input/output devices 112 described above may serve telephone functionality over telephone line 224 .
  • audio system 228 may output sound from telephone line 224
  • microphone 234 may input audio signals into telephone line 224 .
  • display 226 may exhibit a video signal transmitted over telephone line 224 , such as for a video teleconferencing call.
  • Camera 232 may accordingly also provide video signals, such as of the image of the user, over telephone line 224 .
  • Other components may be included to achieve full telephone capabilities, such as a speakerphone, a headset, a handset, or the like.
  • FIG. 3 depicts a block diagram of illustrative components of manager module 110 .
  • Manager module 110 may comprise a data source input module 338 , an output module 340 and a user input module 342 .
  • Data source input module 338 may be configured to receive a plurality of signals from data sources 102 ( 1 )-(N).
  • the plurality of signals may comprise video and audio signals.
  • the plurality of signals may comprise a telepresence signal as described above.
  • Output module 340 may be configured to receive the plurality of signals from data source input module 338 and output the signals to input/output devices 112 .
  • output module 340 may be configured to output video signals to one or more displays 226 .
  • output module 340 may output the video signals to a single display 226 . Furthermore, output module 340 may be configured to output audio signals to audio system 228 . In one implementation, output module 340 may be configured to output audio signals from data sources 102 ( 1 )-(N) to a single speaker or set of speakers. In this case, the system may further be capable of managing sound volumes and other respective characteristics of audio signals coming from discrete data sources 102 ( 1 )-(N).
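  • A rough sketch of how such an output module might hand off video to the single display and per-source gains to the audio system (hypothetical classes; the frame "composition" is a placeholder):

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class SourceSignal:
    source_id: str
    frame: bytes        # one encoded video frame from a local data source
    audio_gain: float   # per-source volume managed by the system

class OutputModule:
    """Collects signals handed over by the data source input module and pushes them
    to one display device and one audio system."""

    def __init__(self) -> None:
        self.pending: List[SourceSignal] = []

    def receive(self, signal: SourceSignal) -> None:
        self.pending.append(signal)

    def flush(self) -> Dict[str, object]:
        composed = b"".join(s.frame for s in self.pending)           # placeholder composition
        gains = {s.source_id: s.audio_gain for s in self.pending}    # for the audio system
        self.pending.clear()
        return {"display_frame": composed, "audio_gains": gains}

out = OutputModule()
out.receive(SourceSignal("102(1)", b"frame-1", 0.3))
out.receive(SourceSignal("216", b"frame-2", 1.0))
print(out.flush()["audio_gains"])
```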
  • manager module 110 may include user input module 342 , which may be configured to receive user input signals or other user instructions.
  • user input module 342 may be configured to receive input from one or more of cursor controllers 230 , cameras 232 , microphones 234 , and/or sensors 236 . Signals received from cursor controller 230 may, for example, be received for the purpose of accessing a data source or modifying, activating or using a program or application running on a data source.
  • User input module 342 may also be configured to receive signals from camera 232 , which may comprise images (e.g. video images) of a user located at user station 114 . Similarly, user input module 342 may receive user-inputted audio signals from microphone 234 .
  • User input module 342 may further be configured to receive signals from sensor 236 , which may, in some instances, serve to indicate that a user is present at user station 114 . For instance, user input module 342 may receive a signal from sensor 236 indicating that a user is sitting in the user station chair and/or facing a certain direction as discussed above. All of the afore-mentioned signals may be relayed from user input module 342 , and hence manager module 110 , and to the respective data source 102 ( 1 )-(N) or independent network data source 216 destination.
  • manager module 110 and its output module 340 may output video signals from one or more data sources 102 ( 1 )-(N) to a single display.
  • Video signals from data sources 102 ( 1 )-(N) may define data source displays 444 ( 1 )-(N).
  • data source display 444 ( 1 ) may represent video signals corresponding to data source 102 ( 1 ), and so on.
  • manager module 110 may be configured to output or project a plurality of data source displays 444 ( 1 )-(N) to a single display device 226 . In the illustrated implementation of FIG. 4 , four data source displays are depicted.
  • FIG. 5 depicts a manner in which a plurality of data source displays 444 ( 1 )-(N) may be arranged on the single display device 226 .
  • data source displays 444 ( 1 )-(N) may be aligned adjacent to one another and may be displayed in equal proportions.
  • Manager module 110 may be further configured to output video signals of independent network data source 216 onto the single display device 226 .
  • Video signals of independent network data source 216 may define an independent network data source display 646 .
  • the independent network data source display 646 may comprise video images taken at another location by detection means such as one or more cameras, as discussed above. Also as described above, this location may be a work office or the like.
  • manager module 110 may, in some instances, output one or more data source displays 444 ( 1 )-(N) as well as independent network data source display 646 on a single display, such as the single display device 226 .
  • the data source displays 444 ( 1 )-(N) and 646 may be arranged adjacent to one another and in equal proportions, as depicted in FIG. 6 .
  • Data source displays 444 ( 1 )-(N) and/or independent network data source display 646 need not be arranged with such uniformity. Reference is thus made to FIG. 7 , which depicts an alternative arrangement of the data source displays 444 ( 1 )-(N) and 646 .
  • a user may decide to focus his or her attention mostly upon independent network data source display 646 (or other data source displays 444 ( 1 )-(N)). As such, the user may arrange the display 646 in the middle of display 226 and may also choose to enlarge the display 646 .
  • the user may decide that the current importance of data source display 444 ( 1 ) is minimal, and may accordingly move the display 444 ( 1 ) away from the middle of a screen of display 226 .
  • the user may choose to slide the display 444 ( 1 ) all the way adjacent to the user's left ear, for instance.
  • the user may lessen the size of data source display 444 ( 1 ) as well.
  • Other data source displays 444 ( 2 )-(N) may likewise be arranged by location and size.
  • the user may also be able to modify resolutions of the displays and manager module 110 may be capable of providing such modification.
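  • Sketched below is one way such user-driven rearrangement might be represented, with a focused display enlarged and centered while the others are shrunk and given a lower resolution scale (hypothetical structure, not taken from the disclosure):

```python
from dataclasses import dataclass
from typing import List

@dataclass
class DisplayRegion:
    source_id: str
    x: int
    y: int
    width: int
    height: int
    resolution_scale: float = 1.0   # 1.0 = native resolution, lower = reduced

def focus_on(regions: List[DisplayRegion], focused_id: str, screen_w: int, screen_h: int) -> None:
    """Enlarge and center the focused data source display; shrink and de-emphasize the rest,
    which keep whatever off-center positions the user has slid them to."""
    for r in regions:
        if r.source_id == focused_id:
            r.width, r.height = int(screen_w * 0.6), int(screen_h * 0.6)
            r.x, r.y = (screen_w - r.width) // 2, (screen_h - r.height) // 2
            r.resolution_scale = 1.0
        else:
            r.width, r.height = int(screen_w * 0.15), int(screen_h * 0.15)
            r.resolution_scale = 0.5

layout = [DisplayRegion("646", 0, 0, 800, 450), DisplayRegion("444(1)", 900, 0, 800, 450)]
focus_on(layout, "646", 1920, 1080)
print(layout[0])
```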
  • data source 102 ( 1 ) may comprise a personal computer, which may be used to carry broadband entertainment signals.
  • data source 102 ( 2 ) may comprise a computer as well, with its purpose being to provide a VPN connection to a user's work account.
  • Data source 102 (N), meanwhile, may also comprise a personal computer, such as a laptop computer, with its purpose being to provide an open internet connection for the user's navigating convenience.
  • independent network data source 216 may comprise a telepresence signal so that a user may work at a remote user station, such as user station 114 , while broadcasting audio and/or video signals to the user's work office, for instance.
  • the user becomes more focused on work than the other tasks being accomplished by data sources 102 ( 1 )-(N).
  • the user may enlarge and center independent network data source display 646 .
  • the user may be watching video from CNN® (CNN® is a registered trademark of Cable News Network, LP) or the like and may choose to lessen this data source display 444 ( 1 ) and move it away from the center of display 226 .
  • the user may be accessing a spreadsheet from a work database via the VPN from data source 102 ( 2 ).
  • the user may again decide to focus on independent network data source display 646 and may thus decide to lessen data source display 444 ( 2 ) as well as move it away from the center of display 226 .
  • the user may also be surfing the internet with the use of the open internet connection provided by data source 102 (N). Again, the user may choose to divert his or her attention from this data source display 444 (N) by lessening its size and placing it in a less noticeable location.
  • the user may also decide to provide high resolution to independent network data source display 646 . Conversely, the user may choose to lessen a resolution of data source display 444 ( 1 ) or other data sources.
  • For manager module 110 to have the capability to dynamically select, arrange, and modify data source displays 444 ( 1 )-(N) and/or independent network data source display 646 , one or more of data sources 102 ( 1 )-(N) or independent network data source 216 may be chosen, selected, highlighted, or the like.
  • cursor controller 230 may help to provide this capability. Reference is thus made to FIG. 8 , which provides an exemplary illustration of how this selection may be accomplished.
  • FIG. 8 again depicts display 226 , in this case a single display, as well as data source displays 444 ( 1 )-(N) and independent network data source display 646 .
  • FIG. 8 also depicts a point-and-select cursor 848 .
  • point-and-select cursor 848 may comprise a “meta-cursor” for each data source cursor, in some implementations.
  • each of the data sources 102 ( 1 )-(N) and 216 may comprise its own cursor operable by one or more cursor controllers.
  • manager module 110 may provide point-and-select cursor 848 , which may be configured to navigate display 226 .
  • cursor controller 230 such as a mouse or keyboard, may command point-and-select cursor 848 .
  • cursor controller 230 may navigate cursor 848 over a data source display
  • cursor 848 may then command the data source's own cursor.
  • cursor 848 may thus define a “meta-cursor” for individual data source cursors.
  • point-and-select cursor 848 may serve to navigate over the output of display 226 and select one or more of a plurality of data source displays 444 ( 1 )-(N) and 646 .
  • a data source such as the data sources 102 ( 1 )-(N) and 216 , may be selected by clicking a portion of a cursor controller 230 or by merely moving cursor 848 over a certain data source display 444 ( 1 )-(N) and 646 .
  • the user may have access to that data source (including its video and audio signals) and may also now have the ability to modify and/or arrange the data source display, as illustrated in FIG. 8 .
  • the user may select, for example, data source display 444 ( 2 ) from display 226 .
  • the user may be able to place data source display 444 ( 2 ) on display 226 in a desired location.
  • Other data source displays may similarly be arranged and re-arranged.
  • the user may now have the ability to change the size and/or resolution of data source display 444 ( 2 ).
  • data source display 444 ( 2 ) may correspond to data source 102 ( 2 ), which may be performing work-related VPN operations.
  • the user may have a work-related spreadsheet open on data source 102 ( 2 ). Nevertheless, the user may choose to focus on the contents of independent network data source 216 and may thus move data source display 444 ( 2 ), corresponding to the spreadsheet, away from the center of display 226 . Again, the user may also choose to lessen the size and possibly the resolution of the display 444 ( 2 ).
  • Other data source displays, such as the data source displays 444 ( 1 ), 444 (N), and 646 may likewise be selected, arranged, re-arranged, and modified.
  • selecting data source 102 ( 2 ) may also serve to allow for use or modification of the data source. For example, if the user in the current example selects data source display 444 ( 2 ) with point-and-select cursor 848 , then the user may be able to operate on the work-related spreadsheet. Similarly, selection of another data source display, such as the data source display 444 ( 1 ), 444 (N), or 646 , may allow for operation of the selected data source.
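  • A minimal sketch of how a meta-cursor position might be mapped back to the underlying data source (the layout rectangles below are made up for illustration):

```python
from typing import Dict, Optional, Tuple

# Each data source display occupies an (x, y, w, h) rectangle on the single display device.
LAYOUT: Dict[str, Tuple[int, int, int, int]] = {
    "444(1)": (0, 0, 960, 540),
    "444(2)": (960, 0, 960, 540),
    "646":    (0, 540, 1920, 540),
}

def meta_cursor_hit(px: int, py: int) -> Optional[Tuple[str, int, int]]:
    """Map a meta-cursor position on the composite screen to (data source display,
    local x, local y) so the selected source's own cursor can be driven."""
    for display, (x, y, w, h) in LAYOUT.items():
        if x <= px < x + w and y <= py < y + h:
            return display, px - x, py - y
    return None

print(meta_cursor_hit(1200, 300))   # lands inside data source display 444(2)
```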
  • manager module 110 may also be configured to allow for the selection, modification, arrangement, and/or re-arrangement of data source display windows 950 ( 1 )( 1 )-(N)(N).
  • These data source display windows 950 ( 1 )( 1 )-(N)(N) may correspond to programs or applications that may be running in each respective data source 102 ( 1 )-(N) and on each respective data source display 444 ( 1 )-(N).
  • data source display window 950 ( 1 )( 1 ) may correspond to a first window open in data source display 444 ( 1 ), which itself may correspond to data source 102 ( 1 ).
  • data source display window 950 (N)(N) may correspond to an N th window open on data source display 444 (N), which itself may correspond to data source 102 (N).
  • Data source display windows 950 ( 1 )( 1 )-(N)(N) may be modified, selected, arranged, and re-arranged in display 226 in many of the ways discussed above in regards to data source displays 444 ( 1 )-(N). For instance, once a data source 102 ( 1 )-(N) is selected, such as in the manner depicted in FIG. 7 , data source display windows located within the data source displays may be used, modified, selected, arranged or re-arranged. Data source display windows 950 ( 1 )( 1 )-(N)(N) may also be extracted from their respective data source displays 444 ( 1 )-(N) and placed on display 226 .
  • the illustrated point-and-select cursor 848 may accomplish such alterations of the data source display windows 950 ( 1 )( 1 )-(N)(N), although other methods may also be utilized.
  • a keyboard for example, may also be used.
  • multiple data source display windows from a single data source display may be located and viewed on display 226 .
  • FIG. 9 depicts, for instance, data source display windows 950 ( 1 )( 1 ), 950 ( 1 )( 2 ), and 950 ( 1 )(N), all of which correspond to data source display 444 ( 1 ), which again corresponds to data source 102 ( 1 ).
  • data source 102 ( 1 ) were used for streaming broadband entertainment
  • three data source display windows 950 ( 1 )( 1 ), 950 ( 1 )( 2 ), and 950 ( 1 )(N) may comprise three different internet-broadcast television shows or the like.
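  • The two-level numbering of the display windows might be mirrored by a simple nested mapping, as in the hypothetical sketch below:

```python
from collections import defaultdict

# windows[source][window_id] mirrors the 950(i)(j) numbering: the first index identifies
# the data source / data source display, the second the window within it.
windows = defaultdict(dict)
windows["102(1)"]["950(1)(1)"] = {"title": "broadcast stream A", "extracted": False}
windows["102(1)"]["950(1)(2)"] = {"title": "broadcast stream B", "extracted": False}
windows["102(N)"]["950(N)(N)"] = {"title": "browser window", "extracted": False}

def extract_window(source: str, window_id: str) -> None:
    """Pull a window out of its data source display and place it directly on display 226."""
    windows[source][window_id]["extracted"] = True

extract_window("102(1)", "950(1)(2)")
print(dict(windows["102(1)"]))
```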
  • multiple work-related data-source display windows may also be present on display 226 .
  • Volume control, size control, and resolution control, as well as user preferences, of data source display windows 950 ( 1 )( 1 )-(N)(N) may be managed by manager module 110 in many of the same ways as discussed both above and below. It is specifically noted that while independent network data source display 646 is illustrated as a single display, the display 646 may possess N number of windows, which may be arranged and modified in many or all of the same ways as data source display windows 950 ( 1 )( 1 )-(N)(N).
  • Manager module 110 in conjunction with audio system 228 , may be further configured to manage audio signals from data sources 102 ( 1 )-(N) and/or independent network data source 216 . Sound from the multiple data sources may project in unison, singly, or in any user-chosen combination. In some implementations, the sound emanating from one data source will project from the direction of the location of the corresponding data source display. Referring back to FIG. 7 , for example, sound from data source 102 ( 1 )—whose display 444 ( 1 ) is located on the left-hand portion of display 226 —may project from the left-hand location of audio system 228 . As such, the sound may appear to be emanating from the location of the corresponding display.
  • sound may appear to originate from the direction that a user is looking. This may be accomplished by the conjunction of manager module 110 , audio system 228 , as well as camera 232 , which may serve to notify manager module 110 of the user's current head orientation. This implementation may also be accomplished with the help of a user's chair. For instance, the direction from which sound emanates may be related to the current rotation of a user's chair.
  • Manager module 110 and audio system 228 may also manage sound volumes in a multitude of ways.
  • the volume of a data source may be related—possibly directly related—to the size of the corresponding data source display.
  • FIG. 7 depicts that data source display 444 (N) is noticeably smaller than independent network data source display 646 .
  • If data source display 444 (N) broadcasts a news station and independent network data source display 646 comprises a telepresence signal, sound from the latter may dominate the former. Again, this may be because of the relative sizes of each respective display.
  • the size of these displays is user-modifiable, thus making their volumes user-modifiable as well.
  • sounds from data sources may become louder when point-and-select cursor 848 covers that data source's respective display.
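  • Combining these ideas, a sketch of how one source's volume and stereo position might be derived from its display geometry (the formulas are illustrative assumptions only):

```python
def source_audio(display_x_center: float, screen_width: float,
                 display_area: float, screen_area: float, cursor_over: bool) -> dict:
    """Derive a volume and stereo pan for one data source from its display geometry.

    Volume tracks the display's share of the screen (a larger display sounds louder),
    pan tracks its horizontal position, and hovering the meta-cursor adds a small boost.
    """
    volume = display_area / screen_area
    if cursor_over:
        volume = min(1.0, volume + 0.2)
    pan = (display_x_center / screen_width) * 2.0 - 1.0   # -1.0 = far left, +1.0 = far right
    return {"volume": round(volume, 2), "pan": round(pan, 2)}

screen_area = 1920 * 1080
print(source_audio(200, 1920, 0.05 * screen_area, screen_area, cursor_over=False))   # small, far left
print(source_audio(960, 1920, 0.40 * screen_area, screen_area, cursor_over=True))    # large, centered
```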
  • manager module 110 may store user preferences, as discussed above. Each user may have one or more stored preference settings, which may comprise audio or video preferences, or the like. Thus, if sensor 236 , which may comprise a weight-based chair sensor, recognizes User #1, then User #1's preference settings may be activated. Depending on the time of day or possibly User #1's selection, one of a plurality of different preference settings may be selected. For instance, User #1 may have a work preference setting and a recreational preference setting. In the work preference setting, display 226 may enlarge a work-related data source display and may lessen sizes and/or resolutions of others. In a recreational preference setting, all data source displays may be enlarged.
  • the work preference setting may be enabled.
  • User #1 may choose his or her own preference setting, such as recreational. While these implementations involve automatically setting preferences, it is to be understood that preferences may also be manually configured.
  • Other preference settings may be default settings. For instance, when a videoconference call is received, a data source display window associated with the call may increase in size, while others may decrease in size. When such a video call is received, all other data source displays may also disappear, so as to limit the calling party's visual access to the user's data. This may be used, for example, if one or more of data source displays comprise proprietary information. Furthermore, when a video or an audio phone call is received, all other sound coming from other data sources may be muted. It is to be understood that these specific capabilities are but some non-limiting examples of possible configurations.
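  • A sketch of how stored profiles and the incoming-call default might be expressed (profile names and rules are illustrative, not drawn from the patent):

```python
PROFILES = {
    ("User #1", "work"): {"enlarge": ["444(2)"], "shrink_others": True},
    ("User #1", "recreational"): {"enlarge": "all", "shrink_others": False},
}

def select_profile(user: str, mode: str) -> dict:
    return PROFILES.get((user, mode), {})

def incoming_video_call(displays: dict, call_display: str, hide_others: bool) -> dict:
    """Default behavior for an incoming videoconference call: grow the call's window,
    optionally hide every other data source display (e.g. proprietary data), and mute the rest."""
    adjusted = {}
    for name, props in displays.items():
        if name == call_display:
            adjusted[name] = {**props, "size": "large", "muted": False, "visible": True}
        else:
            adjusted[name] = {**props, "size": "small", "muted": True, "visible": not hide_others}
    return adjusted

print(select_profile("User #1", "work"))
print(incoming_video_call({"444(1)": {}, "444(2)": {}, "646": {}}, "646", hide_others=True))
```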
  • FIG. 10 represents an exemplary process 1000 that may be carried out with the tools described above.
  • the process 1000 is illustrated as a collection of blocks in a logical flow graph, which represents a sequence of operations that can be implemented in hardware, software, or a combination thereof.
  • the blocks represent computer instructions that, when executed by one or more processors, perform the recited operations.
  • Operation 1002 represents receiving data including video data from a plurality of local data sources, the plurality of local data sources comprising one or more first local data sources coupled to a first network and one or more second local data sources coupled to a second network that is independent of the first network.
  • Operation 1004 , meanwhile, represents outputting a portion of the video data received from the one or more first local data sources and a portion of the video data received from the one or more second local data sources onto a single display device.
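  • Read together, operations 1002 and 1004 might be sketched as follows (the data source class and its method are hypothetical stand-ins):

```python
from dataclasses import dataclass
from typing import List

@dataclass
class LocalDataSource:
    name: str
    network: str   # "first" or "second"; the second network is independent of the first

    def read_video(self) -> str:
        return f"frame from {self.name} via the {self.network} network"

def process_1000(sources: List[LocalDataSource]) -> dict:
    """Operation 1002: receive video data from local data sources coupled to two
    independent networks. Operation 1004: output a portion from each onto one display."""
    received = [source.read_video() for source in sources]
    return {"single_display": received}

print(process_1000([
    LocalDataSource("data source 102(1)", "first"),
    LocalDataSource("data source 102(2)", "first"),
    LocalDataSource("telepresence source 216", "second"),
]))
```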

Abstract

Methods, systems, and products manage multiple data sources or networks through a single device. A plurality of signals, which may include video signals, is received at the device from multiple local data sources. The device may further include an output module, which may output the plurality of signals onto a single display device.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation of U.S. application Ser. No. 11/611,795, filed Dec. 15, 2006, now issued as U.S. Pat. No. ______, and incorporated herein by reference in its entirety.
  • BACKGROUND
  • Computer users perform a variety of tasks with the use of computing devices and the like. For instance, users may at times use a computer to perform work, surf the web, play games, or watch movies or the like. Some employers may, however, limit their employees' use of the employer's computers or networks. For example, employers may block access to non-work related web sites. This may cause some employee computer users to use multiple computers for multiple different tasks. For instance, an employee computer user may use a work computer or network to perform work, while using a personal computer to navigate the web. Using multiple computers, however, can prove to be confusing and spatially inefficient.
  • These problems may be exacerbated for a teleworking computer user. This is because a teleworking computer user may desire a reliable link into an employer's network, in addition to the ability to perform work-related and non-work related tasks on one or more computers. Currently, the teleworker only has the solution of keeping multiple computers and other desired electronics on her desk in order to accomplish all of her tasks. Again, this solution is less than ideal.
  • The description below addresses these and other shortcomings in the present art.
  • SUMMARY
  • Devices for managing multiple data sources or networks are described herein. One such device may include a data source input module configured to input a plurality of signals, which may include video signals, from multiple local data sources. The device may further include an output module, which may receive the signals from the data source input module and output the signals onto a single display device.
  • Another device may include computer-readable media having instructions for selecting, from multiple data source displays each outputted from a respective local data source and each located on a single display device, one of the multiple data source displays. Further instructions may include accessing the selected local data source to allow for use of the local data source, in response to the selecting of the local data source.
  • A method described herein may include receiving data, including video data, from multiple local data sources. These local data sources may include some first local data sources that couple to a first network and other second local data sources that couple to a second network that is independent of the first network. This method may also include outputting some or all of the video data received from both the first and second local data sources onto a single display device.
  • Other systems, methods, and/or computer program products according to embodiments will be or become apparent to one with skill in the art upon review of the following drawings and detailed description. It is intended that all such additional systems, methods, and/or computer program products be included within this description, be within the scope of the present invention, and be protected by the accompanying claims.
  • BRIEF DESCRIPTIONS OF THE DRAWINGS
  • The teachings herein are described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The use of the same reference numbers in different figures indicates similar or identical items.
  • FIG. 1 is a block diagram of an overall environment in which a manager module may operate.
  • FIG. 2 is a block diagram providing further details of another overall environment in which a manager module may operate, relating to various types of input and output to and from the manager module.
  • FIG. 3 is a block diagram of illustrative components of a manager module.
  • FIG. 4 is a block representation of a display device upon which a manager module may output a plurality of data source displays.
  • FIG. 5 is another block representation of a display device upon which a manager module may output a plurality of data source displays.
  • FIG. 6 is another block representation of a display device upon which a manager module may output a plurality of data source displays as well as an independent network data source display.
  • FIG. 7 is another block representation of a display device upon which a manager module may output a plurality of data source displays as well as an independent network data source display.
  • FIG. 8 is a block representation of a display device upon which a manager module may select and access one of a plurality of data source(s) and/or independent network data source(s).
  • FIG. 9 is a block representation of a display device upon which a manager module may output a plurality of data source display windows from a plurality of data sources.
  • FIG. 10 is a flow diagram that illustrates a process for managing data received from multiple local data sources.
  • DETAILED DESCRIPTION
  • FIG. 1 illustrates an overall environment 100 in which a manager module 110 may operate. Data sources 102(1)-(N) may provide input in the form of data signals or the like to a system 104 and, hence, the manager module 110. System 104 may comprise a processor 106, upon which may run computer-readable medium 108 or the like. Computer-readable medium may comprise, in whole or in part, manager module 110. As illustrated, manager module 110 may be implemented as one or more software modules that, when loaded onto a processor and executed, cause the system to perform any of the functions described herein. Alternatively, manager module 110 may itself comprise hardware, such as a processor or the like. System 104 and/or manager module 110 may output these signals to one or more input and/or output devices 112. Data sources 102(1)-(N) may comprise a plurality of different devices. Illustrative but non-limiting examples include personal computers such as towers and laptops, server devices, personal digital assistants (PDA's), mobile telephones, land-based telephone lines, external memory devices, cameras, or any other suitable electronic device. In some implementations, data sources 102(1)-(N) may comprise local devices. For instance, if data sources 102(1)-(N) are “local data sources”, they may be located locally to a user operating each respective device. These local data sources may receive data from and send data both locally and remotely. Input/output devices 112 may also comprise a plurality of electronic devices or the like, described in further detail with reference to FIG. 2.
  • In some implementations, input/output devices 112 may be located at a single user station 114. As such, manager module 110 may be configured to accept signal inputs from a plurality of data sources 102(1)-(N) and output those signals to single user station 114. In other implementations, devices 112 may be located at a multitude of locations, and may thus not comprise single user station 114. In any instance, the resulting system may thus comprise an environment capable of managing information to and from a plurality of data sources. As described in greater detail with reference to FIG. 2, the managed information may also comprise information from multiple networks, some of which may be independent from one another.
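  • Purely for illustration, the short Python sketch below (the names ManagerModule and DataSource, and the print-based output, are hypothetical and are not part of this disclosure) shows one way the routing of signals from a plurality of data sources to a single user station might be modeled:
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class DataSource:                      # hypothetical stand-in for data sources 102(1)-(N)
    name: str

    def read_signal(self) -> str:
        return f"signal from {self.name}"

@dataclass
class ManagerModule:                   # hypothetical stand-in for manager module 110
    sources: List[DataSource] = field(default_factory=list)

    def collect(self) -> Dict[str, str]:
        # Gather one signal from each registered data source.
        return {s.name: s.read_signal() for s in self.sources}

    def output_to_station(self) -> None:
        # Route every collected signal to a single user station (printed here).
        for name, signal in self.collect().items():
            print(f"user station receives {signal!r} from {name}")

if __name__ == "__main__":
    mm = ManagerModule([DataSource("pc"), DataSource("laptop"), DataSource("pda")])
    mm.output_to_station()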
  • FIG. 2 illustrates another environment 200 in which manager module 110 may operate. Environment 200 may comprise one or more data sources 102(1)-(N), manager module 110, input/output devices 112, and an independent network data source 216. Although the single independent network data source 216 is illustrated, it is to be understood throughout that a plurality of independent network data sources may exist. As illustrated, input/output devices 112 may again be located at single user station 114 in some implementations, and at a multitude of locations in other implementations. Data sources 102(1)-(N) may transmit data to and receive data from a plurality of networks, or the data sources may all transmit data to and receive data from the same network. In the illustrated but non-limiting embodiment, data sources 102(1)-(N) are shown to all transmit data to manager module 110 from a network 218. Network 218 may be a land-line telephone-based dial-up connection, a digital subscriber line (DSL), a cable-based network, a satellite network, or the like. As illustrated, network 218 may connect to a router 220, which may then connect to one or more of data sources 102(1)-(N).
  • Furthermore, data sources 102(1)-(N) may all be put to the same use or such uses may vary. For instance, in some implementations data source 102(1) may be used to carry broadband entertainment signals, while data source 102(2) may be used to carry a virtual private network (VPN) signal. The latter signal may be used, for example, to connect to a work account that requires relatively high security. Meanwhile, in some instances data source 102(N) may be used for a connection to the internet. The internet connection may be an open internet connection, as compared to some instances of the VPN connection over data source 102(2). Because the VPN connection may be used, in some instances, to connect to a work account or the like, the managing employer may limit internet access over that network. The open internet connection of data source 102(N) may remedy this limitation in some implementations.
  • As discussed in reference to FIG. 1, non-limiting examples of data sources 102(1)-(N) may include personal computers such as towers and laptops, server devices, personal digital assistants (PDAs), mobile telephones, land-based telephone lines, external memory devices, cameras, or any other suitable electronic device. For example, data source 102(1) may comprise a personal computer, which may be used to carry broadband entertainment signals as discussed above. Similarly, data source 102(2) may comprise a computer as well, with its purpose being to provide a VPN connection to a user's work account. Data source 102(N), meanwhile, may also comprise a personal computer, such as a laptop computer, with its purpose being to provide an open internet connection for the user's navigating convenience. In some instances such as the one described above, data sources 102(1)-(N) will each comprise a separate housing, although it is possible that more than one data source may occupy a single housing. Furthermore, while the above example depicts all of data sources 102(1)-(N) as computers, the data sources need not always be computers.
  • An independent network data source 216 may also provide input signals to manager module 110. Independent network data source 216 may transmit and receive data via a network 222 that is independent of network 218. If multiple independent network data sources exist, then one or more such independent networks may exist. Due to the independence of network 222 from network 218, independent network 222, and thus independent network data source 216, may be relatively free from any problems or encumbrances that network 218 may experience. For instance, if network 218 congests or crashes, signals traveling via independent network 222 may be unaffected. Thus, the content itself of independent network data source 216 may be relatively more secure as compared to a data source that shares a network with other data sources. Similar to network 218, independent network 222 may comprise a land-line telephone-based dial-up connection, a digital subscriber line (DSL), a cable-based network, a satellite network, or the like.
  • Similar to data sources 102(1)-(N), independent network data source 216 may comprise a plurality of different devices. Such non-limiting examples may include personal computers such as towers and laptops, server devices, personal digital assistants (PDAs), mobile telephones, land-based telephone lines, external memory devices, cameras, or any other electronic device. In some implementations, however, independent network data source 216 may comprise devices that define a telepresence module. Such a telepresence module may operate to provide, to a local or remote location, audio and/or video signals of a user when that user is present at a certain location, such as user station 114. A telepresence module may also operate to receive audio and/or video signals from a remote or local location. When a user is present at the desired location, such as the user station 114, the telepresence module may activate automatically or manually.
  • Stated otherwise, this resulting telepresence module may serve as a reliable telepresence link between two or more locations. For instance, the telepresence link could serve as a link between user station 114, comprising input/output devices 112, and a user's place of work. This place of work may, in some instances, be the user's office located at his or her employer's building. This link may provide audio and/or video signals to the two or more locations, such as user station 114 and the user's office at work. As such, independent network 222 and independent network data source 216 may provide a reliable telepresence signal without concern for any congestion or failure that network 218 may encounter.
  • Furthermore, independent network 222 and independent network data source 216 may provide always-on point-to-point or point-to-multipoint connectivity. Such a reliable and always-on system may serve to avoid not only network errors, such as the congestion and/or failures discussed above, but may also serve to avoid operational error by a user by simplifying the system. An always-on system may also avoid the possibility of a data source entering “sleep mode” at an undesirable time.
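  • As a non-limiting sketch only, the following Python fragment (the TelepresenceLink class and the presence callback are assumptions, not elements of the disclosure) illustrates how a telepresence link over an independent network might be activated upon detecting a user and then kept always on:
class TelepresenceLink:
    """Hypothetical always-on audio/video link carried over an independent network."""

    def __init__(self, local: str, remote: str) -> None:
        self.local, self.remote, self.active = local, remote, False

    def activate(self) -> None:
        if not self.active:
            self.active = True
            print(f"telepresence link up: {self.local} <-> {self.remote}")

    def send(self, audio: bytes, video: bytes) -> None:
        # Streaming is only simulated here; a real link would use the independent network.
        assert self.active, "link must be activated first"
        print(f"streaming {len(audio)} audio bytes and {len(video)} video bytes")

def on_user_presence(detected: bool, link: TelepresenceLink) -> None:
    # e.g. triggered by a chair sensor or camera indicating the user is at the station
    if detected:
        link.activate()

link = TelepresenceLink("user station 114", "work office")
on_user_presence(True, link)
link.send(b"\x00" * 160, b"\x00" * 4096)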
  • Environment 200 may also comprise a telephone line 224, which may also provide input signals to manager module 110. Telephone line 224 may be a traditional land-based telephone line, or it may comprise packet (IP), mobile or satellite technology. Signals from telephone line 224 may also output to some or all of input/output devices 112.
  • Input/output devices 112 may comprise one or more displays 226, an audio system 228, cursor controllers 230, cameras 232, microphones 234 and/or sensors 236. Input/output devices 112 may serve to input or output signals to or from data sources 102(1)-(N) or independent network data source 216. As illustrated, input/output devices 112 may be configured to connect to manager module 110. Furthermore, input/output devices 112 may be located in a plurality of locations, or in some implementations may be located at single user station 114. Some components may also serve more than one purpose, or a single device could comprise all or nearly all of input/output devices 112.
  • One or more displays 226 may receive and display video signals from data sources 102(1)-(N) and/or independent network data source 216. If displays 226 comprise a single display, then the video signals, and hence the data source displays, may be displayed on that single display. This is discussed in detail below with reference to FIGS. 4-9. It is specifically noted that the term “single display device” does not require a single monitor but rather a system that gives a user the sensation of a single display. This may be accomplished by attaching multiple monitors or the like and giving each a portion of a video signal. When combined, the signals form a complete video display.
  • In some implementations, display 226 may comprise a cylindrical or spherical monitor, which may span approximately 180° around a user. Display 226 may comprise a liquid crystal display (LCD) screen or the like, or may comprise a projector screen. If display 226 comprises a projector screen, then input/output devices 112 may further comprise a projector for outputting video signals onto display 226. Alternatively, manager module 110 could itself comprise a projector. If the projector screen is cylindrical or spherical, the projector may be capable of spreading the data source displays across the screen, again as discussed in detail below.
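  • For illustration, a brief Python sketch (the frame width and monitor count are assumed values) of how a single logical display might be composed from several physical monitors, each receiving a portion of the overall video frame:
def split_frame(frame_width: int, monitors: int) -> list:
    """Return (start_x, end_x) pixel ranges, one per monitor, covering the whole frame."""
    slice_width = frame_width // monitors
    ranges = []
    for i in range(monitors):
        start = i * slice_width
        end = frame_width if i == monitors - 1 else start + slice_width
        ranges.append((start, end))
    return ranges

# Three monitors jointly presenting a 5760-pixel-wide logical display.
for idx, (x0, x1) in enumerate(split_frame(5760, 3), start=1):
    print(f"monitor {idx} shows columns {x0}..{x1 - 1}")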
  • As mentioned above, input/output devices 112 may also comprise one or more audio systems 228. Audio system 228 may comprise speakers or the like for outputting audio signals from data sources 102(1)-(N) and/or independent network data source 216. Audio system 228 may further comprise a multi-channel audio reproduction system, which may comprise one or more automated mixers, amps, headphones, or the like. An automated mixer may mix audio signals from a plurality of data sources as well as incoming telephone calls.
  • Input/output devices 112 may also comprise one or more cursor controllers 230. Cursor controllers may include, without limitation, a text input device such as a keyboard, a point-and-select device such as a mouse, and/or a touch screen. Input/output devices 112 may also include one or more cameras 232. Cameras 232 may comprise, for example, one or more video cameras. In some implementations, camera 232 may be configured for use with a telepresence module discussed above. For example, a video camera signal may capture the image of a user and provide the signal to manager module 110. Manager module 110 may then transmit the signal to one or more locations, one of which may be the user's work office. The transmitted video signal may be streaming in some instances, so that an image of a user is projected at all times at one or more locations, such as the user's work office. This image may, for example, be displayed on a display device at one or more locations such as the user's work office. Thus, with the use of the one or more cameras 232, the user may be able to work at a remote user station, such as user station 114, while the user's coworkers or superiors can view the user and his or her activities. Of course, teleworking is merely one example of an activity for which the one or more cameras 232 may be put to use.
  • Input/output devices 112 may further comprise, in some instances, one or more microphones 234 or other multi-channel audio sensing equipment. Microphone 234 may be configured to provide audio input signals to manager module 110. For example, microphone 234 may be configured for use with the telepresence module discussed above. In this implementation, microphone 234 may be configured to capture audio signals from a user and provide these signals to manager module 110. Manager module 110 may then transmit these signals to one or more locations, one of which may again be the user's work office. The transmitted audio signal may be streaming, so that the sounds of the user are audible at one or more locations at all times. Also, microphone 234 may be noise-gated and set with a threshold value. This may be of particular importance in implementations with multiple microphones, so as to only send user-inputted audio signals from a certain microphone to manager module 110 when the user intends to do so. Microphone 234 may also act in conjunction with manager module 110 to enable voice recognition. In some implementations, these components may require that the user's voice be recognized before allowing the user to use microphone 234 and hence manager module 110.
  • Furthermore, implementations with multiple microphones may utilize stereophonic sound techniques. For instance, if a user conducts multiple videoconference and/or teleconference meetings simultaneously, the system may dedicate a microphone to each. As a result, if a user turns and speaks into a left microphone, a recipient represented on the data source display on the left may receive the sound at greater volumes. At the same time, the microphone on the right may receive less input and, hence, the recipient depicted in the data source display on the right may receive the sound at lesser volumes.
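  • The noise gating and per-microphone routing described above might be sketched as follows (the threshold value and the left/right routing table are illustrative assumptions only):
def noise_gate(samples, threshold):
    """Pass a sample through only when its amplitude exceeds the gate threshold."""
    return [s if abs(s) >= threshold else 0.0 for s in samples]

def route_by_microphone(mic_name):
    # Hypothetical mapping: the left microphone feeds the meeting shown on the left, and so on.
    routing = {"left": "videoconference shown on the left display",
               "right": "videoconference shown on the right display"}
    return routing.get(mic_name, "default meeting")

print(noise_gate([0.02, 0.4, -0.6, 0.01], threshold=0.1))   # quiet samples are suppressed
print(route_by_microphone("left"))                          # left input favors the left-hand recipient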
  • In the office example, microphone 234 could be coupled with camera 232 so as to provide the image and sounds of the user at the user's work office at all times. The user's work office may thus not only have a display depicting the user's image, but may also have one or more speakers to output the audio signals captured by microphone 234 and transmitted by manager module 110. Furthermore, in some instances the user's work office (or other exemplary locations) may contain one or more cameras, such as a video camera, and one or more microphones so that two-way audio and visual communication may occur between the office and the remote user station, such as user station 114.
  • In the teleworking example, this may provide adequate availability of the user in the office, despite the fact that the user may be remotely situated. For example, a co-worker may be able to walk over to the user's work office and see that the user is working at a remote user station. The co-worker may confront a monitor or the like displaying the user at a remote user station. The co-worker may also be able to ask the user questions and the like with use of the office microphone(s), and the two may be able to communicate as efficiently or nearly as efficiently as if the user were physically present at the work office. Furthermore, while this telepresence link may be secure, it may also be accessible by others. A user's boss, for instance, may be able to bridge into the telepresence signal rather than have to walk to the user's work office to communicate with the user.
  • Input/output devices 112 may further comprise one or more sensors 236. Sensor 236 may, in some instances, sense when a user is present at a user station, such as user station 114. In some implementations, sensor 236 may comprise a weight-based sensor that may be configured to be situated adjacent to, or integrated with, a user's chair. Thus, in some instances when a user sits down on his or her chair, sensor 236 may detect that the user is present. Furthermore, sensor 236 may be capable of differentiating between users. In the weight-based sensor example, sensor 236 may be capable of differentiating between a first user (User #1) and a second user (User #2) based on each user's weight. It is noted, however, that other sensors are envisioned. For example, camera 232 may serve to sense when a user is present, as may microphone 234.
  • Furthermore, sensor 236 may collect various pieces of information, which it may use to determine the presence of a user. Sensor 236 may also store and/or log this collected information. For instance, sensor 236 may collect information such as the weight or height of a user, as well as other user-specific data. Furthermore, sensor 236 may record the time at which certain data was collected. In some implementations, sensor 236 may calculate and save the amount of time that a user spent at the user station, as well as the amount of time that the user was away. This collected information may be provided only to the user, or it could be made available to others, such as the user's boss in the teleworking examples.
  • Along with sensor 236, input/output devices 112 may also collect various pieces of information. For instance, camera 232 may collect location information pertaining to a present user. Similarly, microphone 234 may collect information regarding the sounds emanating from the present user. Sensor 236, camera 232, microphone 234, and other input/output devices 112 may not only collect the actual content of the user's actions, but may also collect the “recipe” of the content of the user's actions. These devices may create a chronological list of events that comprise the user's actions, which may be sent to remote locations for synthesis. Using this events list or recipe, remote locations may synthesize the user's actions. This may apply to audio, video, smell or any other characteristic that may be present at user station 114 or the like. While this synthesis may not be as realistic as if the remote location merely received and displayed streaming video, audio, or the like, the size of the file transporting this information may be orders of magnitude smaller.
  • For instance, such a chronological events list created by various pieces of information collected by input/output devices 112 may look like the following:
      • At time 21:27:06 GMT, 28 Nov. 2006; weight sensor reading=151.5 pounds; audio amplitude=13.7 decibels; audio content=“I'm working from home today” using Bob Jones' voice font; playback rate=7 words per minute; video content=Bob Jones in chair, display cartoon face item #47 for 50 milliseconds
      • At time 21:27:08 GMT, 28 Nov. 2006; weight sensor reading=000.3 pounds; audio amplitude=0.0 decibels; video content=empty chair, display empty chair
      • At time . . . .
  • As discussed above, this events list may be transmitted to remote locations, which may then synthesize the content according to the recipe. Using the above events list, for example, the remote location may initially show a cartoon face or the like of a generic man, or a more realistic picture of “Bob Jones”. The remote location system, such as a computer at the user's work, may also project a voice, possibly Bob's voice, that says “I'm working from home today”. The projected voice may approximately match the volume at which Bob spoke, and may also approximately match the spoken cadence of the words. Furthermore, at time=21:27:08 GMT, the remote location may display an empty chair, representing that Bob is no longer present at user station 114. Such an events list may serve to decrease the file size being sent along the networks while still maintaining a realistic experience at the remote location.
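  • A minimal Python sketch of such a recipe-based approach, assuming a hypothetical PresenceEvent record whose fields mirror the illustrative events list above (the field names and the weight threshold are assumptions, not part of the disclosure):
from dataclasses import dataclass
from typing import Optional

@dataclass
class PresenceEvent:
    """One entry of the chronological 'recipe' (field names are assumptions)."""
    timestamp: str
    weight_lbs: float
    audio_db: float
    audio_text: Optional[str]
    video_hint: str

def synthesize(event: PresenceEvent) -> None:
    # A remote location reconstructs the scene from the compact recipe rather than
    # from streamed audio/video, which keeps the transmitted file comparatively small.
    if event.weight_lbs > 50:
        print(f"{event.timestamp}: render {event.video_hint}")
        if event.audio_text:
            print(f"  speak {event.audio_text!r} at roughly {event.audio_db} dB")
    else:
        print(f"{event.timestamp}: render empty chair")

synthesize(PresenceEvent("21:27:06 GMT", 151.5, 13.7,
                         "I'm working from home today", "Bob Jones in chair"))
synthesize(PresenceEvent("21:27:08 GMT", 0.3, 0.0, None, "empty chair"))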
  • Sensor 236 may also serve other purposes. In some implementations, sensor 236 may adjust the sizes of data source displays or data source display windows, discussed in detail below, depending on a current angle of rotation of a user's chair. Sound volumes emanating from different data source displays may also be adjusted based on chair rotation, as may microphone volumes. For instance, if a user's chair is currently rotated to the left, indicating that the user's attention is currently focused on the data source display to the left, the volume of that data source may be increased, as may be the window size. It is also envisioned that sensor 236 could make other similar adjustments in this manner.
  • Furthermore, in some implementations, sensor 236 may cause one or more of data sources 102(1)-(N) and/or independent network data source 216 to turn on or off, or perform one or more other functions. In the implementation of the weight-based chair sensor, sensor 236 may cause a telepresence module to turn on when a user sits in his or her chair. This may thereby establish a reliable telepresence link. Again, in some implementations this could automatically turn on a telepresence link to a work office, which may comprise turning on a display, camera, and/or microphone at the work office. In this implementation, sensor 236 may also notify others located at the work office (or other exemplary location) when the user has stepped out of the remote user station, such as user station 114. When the user gets up from his or her chair, for instance, the work office display may present an away message indicating that the user has recently left the user station, but may soon return. Furthermore, sensor 236 may recognize users in order to automatically provide user preferences tailored to the detected user. These user preferences may comprise audio or video preferences or the like, which may comprise the different audio and video capabilities discussed below.
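  • As one hedged illustration (the rotation angle convention and the scale and volume values are assumptions), a chair-rotation reading from a sensor might be mapped to display sizes and volumes roughly as follows:
def adjust_for_rotation(angle_deg, displays):
    """Grow the display (and its volume) the chair currently faces; shrink the rest.

    angle_deg < 0 is taken to mean the chair is rotated left, > 0 right (an assumption).
    """
    focused = "left" if angle_deg < -10 else "right" if angle_deg > 10 else "center"
    for position, props in displays.items():
        if position == focused:
            props["scale"], props["volume"] = 1.5, 1.0
        else:
            props["scale"], props["volume"] = 0.8, 0.4

displays = {"left": {}, "center": {}, "right": {}}
adjust_for_rotation(-25.0, displays)      # the user swivels toward the left display
print(displays)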
  • As noted above, telephone line 224 may input to manager module 110. Thus, input/output devices 112 may comprise a telephone. Also, some or all of input/output devices 112 described above may serve telephone functionality over telephone line 224. For instance, audio system 228 may output sound from telephone line 224, while microphone 234 may input audio signals into telephone line 224. Furthermore, display 226 may exhibit a video signal transmitted over telephone line 224, such as for a video teleconferencing call. Camera 232 may accordingly also provide video signals, such as of the image of the user, over telephone line 224. Other components may be included to achieve full telephone capabilities, such as a speakerphone, a headset, a handset, or the like.
  • FIG. 3 depicts a block diagram of illustrative components of manager module 110. Manager module 110 may comprise a data source input module 338, an output module 340 and a user input module 342. Data source input module 338 may be configured to receive a plurality of signals from data sources 102(1)-(N). The plurality of signals may comprise video and audio signals. In one implementation, the plurality of signals may comprise a telepresence signal as described above. Output module 340 may be configured to receive the plurality of signals from data source input module 338 and output the signals to input/output devices 112. In some instances, output module 340 may be configured to output video signals to one or more displays 226. For instance, output module 340 may output the video signals to a single display 226. Furthermore, output module 340 may be configured to output audio signals to audio system 228. In one implementation, output module 340 may be configured to output audio signals from data sources 102(1)-(N) to a single speaker or set of speakers. In this case, the system may further be capable of managing sound volumes and other respective characteristics of audio signals coming from discrete data sources 102(1)-(N).
  • As mentioned above, manager module 110 may include user input module 342, which may be configured to receive user input signals or other user instructions. In some instances, user input module 342 may be configured to receive input from one or more of the one or more cursor controllers 230, cameras 232, microphones 234, and/or sensors 236. Signals received from cursor controller 230 may, for example, be received for the purpose of accessing a data source or modifying, activating or using a program or application running on a data source. User input module 342 may also be configured to receive signals from camera 232, which may comprise images (e.g. video images) of a user located at user station 114. Similarly, user input module 342 may receive user-inputted audio signals from microphone 234. User input module 342 may further be configured to receive signals from sensor 236, which may, in some instances, serve to indicate that a user is present at user station 114. For instance, user input module 342 may receive a signal from sensor 236 indicating that a user is sitting in the user station chair and/or facing a certain direction as discussed above. All of the aforementioned signals may be relayed from user input module 342, and hence manager module 110, to the respective data source 102(1)-(N) or independent network data source 216 destination.
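  • The three illustrative modules of FIG. 3 could be sketched, very loosely, as the following Python classes (all method names and the print-based behavior are assumptions made for illustration):
class DataSourceInputModule:      # cf. data source input module 338 (sketch only)
    def receive(self, source, payload):
        return (source, payload)

class OutputModule:               # cf. output module 340 (sketch only)
    def output(self, source, payload):
        print(f"display/audio out <- {source}: {payload}")

class UserInputModule:            # cf. user input module 342 (sketch only)
    def relay(self, destination, command):
        print(f"relaying user command {command!r} to {destination}")

inp, out, user = DataSourceInputModule(), OutputModule(), UserInputModule()
out.output(*inp.receive("data source 102(2)", "VPN spreadsheet frame"))
user.relay("data source 102(2)", "scroll down")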
  • Turning next to FIG. 4, in some implementations manager module 110 and its output module 340 may output video signals from one or more data sources 102(1)-(N) to a single display. Video signals from data sources 102(1)-(N) may define data source displays 444(1)-(N). For instance, data source display 444(1) may represent video signals corresponding to data source 102(1), and so on. As such, manager module 110 may be configured to output or project a plurality of data source displays 444(1)-(N) to a single display device 226. In the illustrated implementation of FIG. 4, four data source displays are depicted.
  • FIG. 5 depicts a manner in which a plurality of data source displays 444(1)-(N) may be arranged on the single display device 226. As illustrated, data source displays 444(1)-(N) may be aligned adjacent to one another and may be displayed in equal proportions. Manager module 110 may be further configured to output video signals of independent network data source 216 onto the single display device 226. Video signals of independent network data source 216 may define an independent network data source display 646. The independent network data source display 646 may comprise video images taken at another location by detection means such as one or more cameras, as discussed above. Also as described above, this location may be a work office or the like. As such, a user observing display 226 may be able to see his or her work office from a user station, such as user station 114, containing input/output devices 112. In any event, manager module 110 may, in some instances, output one or more data source displays 444(1)-(N) as well as independent network data source display 646 on a single display, such as the single display device 226. Furthermore, the data source displays 444(1)-(N) and 646 may be arranged adjacent to one another and in equal proportions, as depicted in FIG. 6.
  • Data source displays 444(1)-(N) and/or independent network data source display 646, however, need not be arranged with such uniformity. Reference is thus made to FIG. 7, which depicts an alternative arrangement of the data source displays 444(1)-(N) and 646. In the illustrated and non-limiting example, a user may decide to focus his or her attention mostly upon independent network data source display 646 (or other data source displays 444(1)-(N)). As such, the user may arrange the display 646 in the middle of display 226 and may also choose to enlarge the display 646. Conversely, the user may decide that the current importance of data source display 444(1) is minimal, and may accordingly move the display 444(1) away from the middle of a screen of display 226. In the 180° cylindrical display implementation discussed above, the user may choose to slide the display 444(1) all the way adjacent to the user's left ear, for instance. Note also that, as illustrated, the user may lessen the size of data source display 444(1) as well. Other data source displays 444(2)-(N) may likewise be arranged by location and size. Furthermore, the user may also be able to modify resolutions of the displays and manager module 110 may be capable of providing such modification.
  • Returning to the example discussed above, data source 102(1) may comprise a personal computer, which may be used to carry broadband entertainment signals. Similarly, data source 102(2) may comprise a computer as well, with its purpose being to provide a VPN connection to a user's work account. Data source 102(N), meanwhile, may also comprise a personal computer, such as a laptop computer, with its purpose being to provide an open internet connection for the user's navigating convenience. Furthermore, independent network data source 216 may comprise a telepresence signal so that a user may work at a remote user station, such as user station 114, while broadcasting audio and/or video signals to the user's work office, for instance.
  • Now merging this implementation with FIG. 7, it may be that the user becomes more focused on work than the other tasks being accomplished by data sources 102(1)-(N). Thus, as discussed immediately above, the user may enlarge and center independent network data source display 646. Meanwhile, the user may be watching video from CNN® (CNN® is a registered trademark of Cable News Network, LP) or the like and may choose to lessen this data source display 444(1) and move it away from the center of display 226. Similarly, the user may be accessing a spreadsheet from a work database via the VPN from data source 102(2). At this moment, however, the user may again decide to focus on independent network data source display 646 and may thus decide to lessen data source display 444(2) as well as move it away from the center of display 226. In this example the user may also be surfing the internet with the use of the open internet connection provided by data source 102(N). Again, the user may choose to divert his or her attention from this data source display 444(N) by lessening its size and placing it in a less noticeable location. The user may also decide to provide high resolution to independent network data source display 646. Conversely, the user may choose to lessen a resolution of data source display 444(1) or other data sources.
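  • One possible, purely illustrative way to model the arrangement, sizing, and resolution choices described above (the region names and screen fractions are assumptions):
from dataclasses import dataclass

@dataclass
class DisplayRegion:
    """Placement of one data source display on the single display device."""
    name: str
    center_x: float   # 0.0 = far left of the screen, 1.0 = far right
    width: float      # fraction of the overall screen width
    resolution: str

def focus_on(regions, name):
    # Enlarge and center the chosen display; shrink and demote the others.
    for r in regions:
        if r.name == name:
            r.center_x, r.width, r.resolution = 0.5, 0.6, "high"
        else:
            r.width, r.resolution = min(r.width, 0.15), "low"

regions = [DisplayRegion("444(1) CNN", 0.2, 0.3, "medium"),
           DisplayRegion("444(2) VPN spreadsheet", 0.5, 0.3, "medium"),
           DisplayRegion("646 telepresence", 0.8, 0.3, "medium")]
focus_on(regions, "646 telepresence")
print([(r.name, r.center_x, r.width, r.resolution) for r in regions])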
  • If manager module 110 is to have the capability to dynamically select, arrange, and modify data source displays 444(1)-(N) and/or independent network data source display 646, one or more of data sources 102(1)-(N) or independent network data source 216 may be chosen, selected, highlighted, or the like. In some implementations, cursor controller 230 may help to provide this capability. Reference is thus made to FIG. 8, which provides an exemplary illustration of how this selection may be accomplished.
  • FIG. 8 again depicts display 226, in this case a single display, as well as data source displays 444(1)-(N) and independent network data source display 646. FIG. 8 also depicts a point-and-select cursor 848. Because each of the data sources 102(1)-(N) and 216 may comprise its own point-and-select cursor, point-and-select cursor 848 may comprise a “meta-cursor” for each data source cursor, in some implementations. In other words, each of the data sources 102(1)-(N) and 216 may comprise its own cursor operable by one or more cursor controllers. However, manager module 110 may provide point-and-select cursor 848, which may be configured to navigate display 226. Furthermore, cursor controller 230, such as a mouse or keyboard, may command point-and-select cursor 848. Thus, when cursor controller 230 navigates cursor 848 over a data source display, cursor 848 may then command the data source's own cursor. In some implementations, cursor 848 may thus define a “meta-cursor” for individual data source cursors.
  • Furthermore, point-and-select cursor 848 may serve to navigate over the output of display 226 and select one or more of a plurality of data source displays 444(1)-(N) and 646. A data source, such as the data sources 102(1)-(N) and 216, may be selected by clicking a portion of a cursor controller 230 or by merely moving cursor 848 over a certain data source display 444(1)-(N) and 646. Once selected, the user may have access to that data source (including its video and audio signals) and may also now have the ability to modify and/or arrange the data source display. As illustrated in FIG. 8, the user may select, for example, data source display 444(2) from display 226. At this point, the user may be able to place data source display 444(2) on display 226 in a desired location. Other data source displays may similarly be arranged and re-arranged. Furthermore, the user may now have the ability to change the size and/or resolution of data source display 444(2).
  • Continuing the example discussed immediately above, data source display 444(2) may correspond to data source 102(2), which may be performing work-related VPN operations. For example, the user may have a work-related spreadsheet open on data source 102(2). Nevertheless, the user may choose to focus on the contents of independent network data source 216 and may thus move data source display 444(2), corresponding to the spreadsheet, away from the center of display 226. Again, the user may also choose to lessen the size and possibly the resolution of the display 444(2). Other data source displays, such as the data source displays 444(1), 444(N), and 646 may likewise be selected, arranged, re-arranged, and modified.
  • Furthermore, selecting data source 102(2) may also serve to allow for use or modification of the data source. For example, if the user in the current example selects data source display 444(2) with point-and-select cursor 848, then the user may be able to operate on the work-related spreadsheet. Similarly, selection of another data source display, such as the data source display 444(1), 444(N), or 646, may allow for operation of the selected data source.
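  • A minimal sketch, assuming hypothetical screen-fraction bounds for each data source display, of how a meta-cursor might decide which data source's own cursor should receive an input event:
class MetaCursor:
    """Sketch of a 'meta-cursor' that forwards input to whichever data source
    display it currently hovers over (the region bounds are assumptions)."""

    def __init__(self, regions):
        self.regions = regions    # name -> (left_x, right_x) as fractions of screen width

    def route(self, x):
        for name, (left, right) in self.regions.items():
            if left <= x < right:
                return name
        return "no data source selected"

cursor = MetaCursor({"444(1)": (0.0, 0.33), "444(2)": (0.33, 0.66), "646": (0.66, 1.0)})
print(cursor.route(0.45))   # input here would be forwarded to data source 102(2)'s own cursor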
  • Referring now to FIG. 9, manager module 110 may also be configured to allow for the selection, modification, arrangement, and/or re-arrangement of data source display windows 950(1)(1)-(N)(N). These data source display windows 950(1)(1)-(N)(N) may correspond to programs or applications that may be running in each respective data source 102(1)-(N) and on each respective data source display 444(1)-(N). Thus, for example, data source display window 950(1)(1) may correspond to a first window open in data source display 444(1), which itself may correspond to data source 102(1). Likewise, data source display window 950(N)(N) may correspond to an Nth window open on data source display 444(N), which itself may correspond to data source 102(N).
  • Data source display windows 950(1)(1)-(N)(N) may be modified, selected, arranged, and re-arranged in display 226 in many of the ways discussed above in regards to data source displays 444(1)-(N). For instance, once a data source 102(1)-(N) is selected, such as in the manner depicted in FIG. 8, data source display windows located within the data source displays may be used, modified, selected, arranged or re-arranged. Data source display windows 950(1)(1)-(N)(N) may also be extracted from their respective data source displays 444(1)-(N) and placed on display 226. In some implementations, the illustrated point-and-select cursor 848 may accomplish such alterations of the data source display windows 950(1)(1)-(N)(N), although other methods may also be utilized. A keyboard, for example, may also be used. In any instance, it is noted that multiple data source display windows from a single data source display may be located and viewed on display 226.
  • FIG. 9 depicts, for instance, data source display windows 950(1)(1), 950(1)(2), and 950(1)(N), all of which correspond to data source display 444(1), which again corresponds to data source 102(1). If, as in the example used above, data source 102(1) were used for streaming broadband entertainment, then three data source display windows 950(1)(1), 950(1)(2), and 950(1)(N) may comprise three different internet-broadcast television shows or the like. Similarly, for a work-related data source, multiple work-related data-source display windows may also be present on display 226. Volume, size control, resolution control, as well as user preferences of data source display windows 950(1)(1)-(N)(N) may be managed by manager module 110 in many of the same ways as discussed both above and below. It is specifically noted that while independent network data source display 646 is illustrated as a single display, the display 646 may possess N number of windows, which may be arranged and modified in many or all of the same ways as data source display windows 950(1)(1)-(N)(N).
  • Manager module 110, in conjunction with audio system 228, may be further configured to manage audio signals from data sources 102(1)-(N) and/or independent network data source 216. Sound from the multiple data sources may project in unison, singly, or in any user-chosen combination. In some implementations, the sound emanating from one data source will project from the direction of the location of the corresponding data source display. Referring back to FIG. 7, for example, sound from data source 102(1)—whose display 444(1) is located on the left-hand portion of display 226—may project from the left-hand location of audio system 228. As such, the sound may appear to be emanating from the location of the corresponding display.
  • Alternatively, sound may appear to originate from the direction that a user is looking. This may be accomplished by the conjunction of manager module 110, audio system 228, as well as camera 232, which may serve to notify manager module 110 of the user's current head orientation. This implementation may also be accomplished with the help of a user's chair. For instance, the direction from which sound emanates may be related to the current rotation of a user's chair.
  • Manager module 110 and audio system 228 may also manage sound volumes in a multitude of ways. For instance, the volume of a data source may be related, possibly directly related, to the size of the corresponding data source display. Reference is again made to FIG. 7, which depicts that data source display 444(N) is noticeably smaller than independent network data source display 646. In this implementation, if data source display 444(N) broadcasts a news station and independent network data source display 646 comprises a telepresence signal, any sound from the latter may dominate the former. Again, this may be because of the relative sizes of each respective display. Of course, in some implementations the size of these displays is user-modifiable, thus making their volumes user-modifiable as well. In these implementations, sounds from data sources may become louder when point-and-select cursor 848 covers that data source's respective display.
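  • As a rough illustration only (the scaling factors are assumptions), the volume and left/right balance of a data source's audio might be derived from the size and position of its display as follows:
def mix(display_width, display_center_x):
    """Return (volume, left_gain, right_gain) for one data source's audio.

    Volume scales with the display's width and panning follows its screen position;
    both relationships are illustrative assumptions.
    """
    volume = min(1.0, display_width * 2.0)
    right_gain = display_center_x            # 0.0 = far left, 1.0 = far right
    left_gain = 1.0 - display_center_x
    return volume, volume * left_gain, volume * right_gain

print(mix(0.6, 0.5))    # large, centered telepresence display: loud and centered
print(mix(0.15, 0.1))   # small display near the left edge: quiet, mostly in the left channel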
  • Furthermore, it is noted that manager module 110 may store user preferences, as discussed above. Each user may have one or more stored preference settings, which may comprise audio or video preferences, or the like. Thus, if sensor 236, which may comprise a weight-based chair sensor, recognizes User #1, then User #1's preference settings may be activated. Depending on the time of day or possibly User #1's selection, one of a plurality of different preference settings may be selected. For instance, User #1 may have a work preference setting and a recreational preference setting. In the work preference setting, display 226 may enlarge a work-related data source display and may lessen sizes and/or resolutions of others. In a recreational preference setting, all data source displays may be enlarged. If, for example, sensor 236 detects User #1 during the daytime, then the work preference setting may be enabled. Alternatively, User #1 may choose his or her own preference setting, such as recreational. While these implementations involve automatically setting preferences, it is to be understood that preferences may also be manually configured.
  • Other preference settings may be default settings. For instance, when a videoconference call is received, a data source display window associated with the call may increase in size, while others may decrease in size. When such a video call is received, all other data source displays may also disappear, so as to limit the calling party's visual access to the user's data. This may be used, for example, if one or more of data source displays comprise proprietary information. Furthermore, when a video or an audio phone call is received, all other sound coming from other data sources may be muted. It is to be understood that these specific capabilities are but some non-limiting examples of possible configurations.
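  • A short, purely illustrative sketch of stored preference profiles and a default incoming-call behavior (the profile names, hours, and data structures are assumptions, not part of the disclosure):
PREFERENCES = {            # hypothetical stored settings keyed by recognized user
    "User #1": {"work": {"enlarge": ["646"], "shrink_others": True},
                "recreational": {"enlarge": ["444(1)", "444(2)", "444(N)", "646"],
                                 "shrink_others": False}},
}

def select_profile(user, hour, override=""):
    """Pick a stored preference profile; daytime defaults to the work profile."""
    profile = override or ("work" if 8 <= hour < 18 else "recreational")
    return PREFERENCES[user][profile]

def on_video_call(displays):
    # Default behavior: hide every other display so the calling party cannot see them.
    return {name: (1.0 if name == "incoming call" else 0.0) for name in displays}

print(select_profile("User #1", hour=10))
print(on_video_call({"incoming call": 0.3, "444(2)": 0.3, "646": 0.4}))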
  • Finally, FIG. 10 represents an exemplary process 1000 that may be carried out with the tools described above. The process 1000 is illustrated as a collection of blocks in a logical flow graph, which represents a sequence of operations that can be implemented in hardware, software, or a combination thereof. In the context of software, the blocks represent computer instructions that, when executed by one or more processors, perform the recited operations. Operation 1002 represents receiving data including video data from a plurality of local data sources, the plurality of local data sources comprising one or more first local data sources coupled to a first network and one or more second local data sources coupled to a second network that is independent of the first network. Operation 1004, meanwhile, represents outputting a portion of the video data received from the one or more first local data sources and a portion of the video data received from the one or more second local data sources onto a single display device.
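  • The two illustrated operations of process 1000 might be sketched, under the assumption that video data is represented as simple lists of frames, as:
def process_1000(first_network_video, second_network_video):
    """Sketch of the two illustrated operations: receive video data from local data
    sources coupled to two independent networks (operation 1002), then output portions
    of each onto a single display device (operation 1004). Frames are plain strings here.
    """
    received = first_network_video + second_network_video           # operation 1002
    single_display = [f"render {frame}" for frame in received]      # operation 1004
    return single_display

print(process_1000(["broadband frame", "VPN frame"], ["telepresence frame"]))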
  • It is noted that the various modules shown herein may be implemented in hardware, software, or any combination thereof. Additionally, these modules are shown as separate items only for convenience of reference and description, and these representations do not limit possible implementations of the teachings herein. Instead, various functions described with these modules could be combined or separated as appropriate in a given implementation, without departing from the scope and spirit of the description herein.
  • CONCLUSION
  • Although techniques and devices for managing data from multiple data sources have been described in language specific to certain features and methods, it is to be understood that the features defined in the appended claims are not necessarily limited to the specific features and methods described. Rather, the specific features and methods are disclosed as illustrative forms of implementing the claimed subject matter.

Claims (20)

1. A system, comprising:
a processor; and
memory for storing code that when executed causes the processor to perform operations, the operations comprising:
receiving broadband signals from a broadband connection to a first network;
receiving virtual private network signals over a separate connection to a second network;
determining a current location matches a telepresence location;
activating a telepresence module when the current location matches the telepresence location;
receiving telepresence signals over an independent connection to a third network when the telepresence module is activated;
selecting a single display device of multiple displays communicating with the processor; and
outputting the broadband signals, the virtual private network signals, and the telepresence signals to a single connection to the single display device.
2. The system according to claim 1, wherein the operations further comprise receiving a sensor measurement at the telepresence module.
3. The system according to claim 1, wherein the operations further comprise generating a tiled presentation of the broadband signals, the virtual private network signals, and the telepresence signals on the single display device.
4. The system according to claim 1, wherein the operations further comprise proportionally displaying the broadband signals, the virtual private network signals, and the telepresence signals on the single display device.
5. The system according to claim 1, wherein the operations further comprise generating an aligned presentation of the broadband signals, the virtual private network signals, and the telepresence signals on the single display device.
6. The system according to claim 1, wherein the operations further comprise minimizing a presentation of one of the broadband signals, the virtual private network signals, and the telepresence signals output to the single display device.
7. The system according to claim 1, wherein the operations further comprise expanding a presentation of one of the broadband signals, the virtual private network signals, and the telepresence signals output to the single display device.
8. A method, comprising:
receiving, at a device, broadband signals from a broadband connection to a first network;
receiving, at the device, virtual private network signals over a separate connection to a second network;
determining a current location of the device matches a telepresence location;
activating a telepresence module stored in memory of the device when the current location matches the telepresence location;
receiving, at the device, telepresence signals over an independent connection to a third network when the telepresence module is activated;
receiving a user input at the device that selects a single display device of multiple display devices connected to the device; and
outputting the broadband signals, the virtual private network signals, and the telepresence signals to a single connection from the device to the single display device.
9. The method according to claim 8, further comprising receiving a sensor measurement at the telepresence module.
10. The method according to claim 8, further comprising generating a tiled presentation of the broadband signals, the virtual private network signals, and the telepresence signals on the single display device.
11. The method according to claim 8, further comprising proportionally displaying the broadband signals, the virtual private network signals, and the telepresence signals on the single display device.
12. The method according to claim 8, further comprising generating an aligned presentation of the broadband signals, the virtual private network signals, and the telepresence signals on the single display device.
13. The method according to claim 8, further comprising minimizing a presentation of one of the broadband signals, the virtual private network signals, and the telepresence signals output to the single display device.
14. The method according to claim 8, further comprising expanding a presentation of one of the broadband signals, the virtual private network signals, and the telepresence signals output to the single display device.
15. A computer readable memory storing instructions that when executed cause a processor to perform operations, the operations comprising:
receiving, at a device, broadband signals from a broadband connection to a first network;
receiving, at the device, virtual private network signals over a separate connection to a second network;
determining a current location of the device matches a telepresence location;
activating a telepresence module stored in memory of the device when the current location matches the telepresence location;
receiving, at the device, telepresence signals over an independent connection to a third network when the telepresence module is activated;
receiving a user input at the device that selects a single display device of multiple display devices connected to the device; and
outputting the broadband signals, the virtual private network signals, and the telepresence signals to a single connection from the device to the single display device.
16. The computer readable memory according to claim 15, wherein the operations further comprise receiving a sensor measurement at the telepresence module.
17. The computer readable memory according to claim 15, wherein the operations further comprise generating a tiled presentation of the broadband signals, the virtual private network signals, and the telepresence signals on the single display device.
18. The computer readable memory according to claim 15, wherein the operations further comprise proportionally displaying the broadband signals, the virtual private network signals, and the telepresence signals on the single display device.
19. The computer readable memory according to claim 15, wherein the operations further comprise generating an aligned presentation of the broadband signals, the virtual private network signals, and the telepresence signals on the single display device.
20. The computer readable memory according to claim 15, wherein the operations further comprise minimizing a presentation of one of the broadband signals, the virtual private network signals, and the telepresence signals output to the single display device.
US13/748,621 2006-12-15 2013-01-24 Methods, Systems, and Products for Managing Multiple Data Sources Abandoned US20130138773A1 (en)

Priority and Related Applications

Application Number   Priority Date   Filing Date   Title
US11/611,795         2006-12-15      2006-12-15    Managing multiple data sources (parent application; granted as US8384753B1; status Active, 2031-04-27)
US13/748,621         2006-12-15      2013-01-24    Methods, Systems, and Products for Managing Multiple Data Sources (continuation of US11/611,795; published as US20130138773A1 on 2013-05-30; status Abandoned)

Family ID: 47721232
Country: US

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104144223A (en) * 2014-08-21 2014-11-12 北京奇艺世纪科技有限公司 Data obtaining method and device
US9578079B2 (en) 2013-03-15 2017-02-21 Ricoh Company, Ltd. Distribution control system, distribution system, distribution control method, and computer-readable storage medium
US9648096B2 (en) 2013-03-15 2017-05-09 Ricoh Company, Limited Distribution control system, distribution system, distribution control method, and computer-readable storage medium

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008124820A1 (en) * 2007-04-10 2008-10-16 Reactrix Systems, Inc. Display using a three dimensional vision system
WO2009035705A1 (en) 2007-09-14 2009-03-19 Reactrix Systems, Inc. Processing of gesture-based user interactions
US8159682B2 (en) 2007-11-12 2012-04-17 Intellectual Ventures Holding 67 Llc Lens system
US8259163B2 (en) * 2008-03-07 2012-09-04 Intellectual Ventures Holding 67 Llc Display with built in 3D sensing
US8849680B2 (en) * 2009-01-29 2014-09-30 Intouch Technologies, Inc. Documentation through a remote presence robot
US8539369B2 (en) * 2010-01-06 2013-09-17 La Crosse Technology, Ltd. Central monitoring and measurement system
WO2013144417A1 (en) * 2012-03-29 2013-10-03 Nokia Corporation A method, an apparatus and a computer program for modification of a composite audio signal
US11595527B2 (en) 2021-03-16 2023-02-28 Bank Of America Corporation Dynamic routing for communication systems
US11715056B2 (en) 2021-03-16 2023-08-01 Bank Of America Corporation Performance monitoring for communication systems

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020163572A1 (en) * 2000-11-10 2002-11-07 Center Julian L. Methods of establishing a communications link using perceptual sensing of a user's presence
US6493008B1 (en) * 1999-02-19 2002-12-10 Canon Kabushiki Kaisha Multi-screen display system and method
US20030020671A1 (en) * 1999-10-29 2003-01-30 Ovid Santoro System and method for simultaneous display of multiple information sources
US20040263686A1 (en) * 2003-06-26 2004-12-30 Samsung Electronics, Co., Ltd. Method and apparatus displaying double screen
US7180988B2 (en) * 2003-01-31 2007-02-20 Qwest Communications International Inc. Packet network interface device and systems and methods for its use
US20070040900A1 (en) * 2005-07-13 2007-02-22 Polycom, Inc. System and Method for Configuring Routing of Video from Multiple Sources to Multiple Destinations of Videoconference Using Software Video Switch
US20070120763A1 (en) * 2005-11-23 2007-05-31 Lode De Paepe Display system for viewing multiple video signals
US7364313B2 (en) * 2002-12-27 2008-04-29 Barco N.V. Multiple image projection system and method for projecting multiple selected images adjacent each other
US7679612B2 (en) * 2004-04-30 2010-03-16 Microsoft Corporation Configuration goals via video presenting network

Family Cites Families (93)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6594688B2 (en) 1993-10-01 2003-07-15 Collaboration Properties, Inc. Dedicated echo canceler for a workstation
US5850606A (en) 1996-05-17 1998-12-15 Bellsouth Corporation Method and system for transferring a cellular telephone call between intelligent cell sites
IL132060A0 (en) 1997-03-31 2001-03-19 Broadband Associates Method and system for providing a presentation on a network
JPH10308751A (en) 1997-05-09 1998-11-17 Fujitsu Ltd Communication terminal for information provision system, network device and terminal identification information setting method, information provision system and fixed length cell transmitter/receiver for information communication system
US6757274B1 (en) 1997-12-16 2004-06-29 Bellsouth Intellectual Property Corporation Method and apparatus for allowing selective disposition of an incoming telephone call during an internet session
US7085710B1 (en) 1998-01-07 2006-08-01 Microsoft Corporation Vehicle computer system audio entertainment system
US7394897B1 (en) 1998-11-04 2008-07-01 At&T Delaware Intellectual Property, Inc. Method and system for routing service calls made from resold lines
US6810113B1 (en) 1999-02-26 2004-10-26 Bellsouth Intellectual Property Corporation Methods and systems to make spoken name data available
US6882708B1 (en) 1999-02-26 2005-04-19 Bellsouth Intellectual Property Corporation Region-wide messaging system and methods including validation of transactions
US6628761B1 (en) 1999-02-26 2003-09-30 Bellsouth Intellectual Property Corporation Methods and systems allowing access to a messaging platform through a visited messaging platform
AU2350200A (en) 1999-02-26 2000-09-14 Bellsouth Intellectual Property Corporation Methods and systems to provide a message in a messaging system without revealingan identity of the sending party
US6681257B1 (en) 1999-02-26 2004-01-20 Bellsouth Intellectual Property Corporation Methods and system for determining message routing based on elements of a directory number
US6633633B1 (en) 1999-12-22 2003-10-14 Bellsouth Intellectuel Property Corporation Method and system for providing calling number restoral
US7669051B2 (en) 2000-11-13 2010-02-23 DigitalDoors, Inc. Data security system and method with multiple independent levels of security
US6665388B2 (en) 2000-12-20 2003-12-16 Bellsouth Intellectual Property Corporation System and method for monitoring incoming communications to a telecommunications device
US20050152363A1 (en) 2000-12-21 2005-07-14 Bellsouth Intellectual Property Corporation Disposable communications addresses
US7469043B1 (en) 2000-12-21 2008-12-23 At&T Delaware Intellectual Property, Inc. Disposable telephone numbers
US6724863B1 (en) 2000-12-22 2004-04-20 Bellsouth Intellectual Property Corporation Method and system for message routing
US6842506B1 (en) 2000-12-22 2005-01-11 Bellsouth Intellectual Property Corp. Method and system for message routing
US7388949B2 (en) 2000-12-28 2008-06-17 At&T Delaware Intellectual Property, Inc. System and method for audio caller identification service
US20020143812A1 (en) 2001-03-27 2002-10-03 Bedingfield James C. System and method of automatically updating content on a web site
US8990678B2 (en) 2001-03-27 2015-03-24 At&T Intellectual Property I, L.P. Systems and methods for automatically providing alerts of web site content updates
US6879683B1 (en) 2001-06-28 2005-04-12 Bellsouth Intellectual Property Corp. System and method for providing a call back option for callers to a call center
US7403768B2 (en) 2001-08-14 2008-07-22 At&T Delaware Intellectual Property, Inc. Method for using AIN to deliver caller ID to text/alpha-numeric pagers as well as other wireless devices, for calls delivered to wireless network
US7315614B2 (en) 2001-08-14 2008-01-01 At&T Delaware Intellectual Property, Inc. Remote notification of communications
US7080169B2 (en) 2001-12-11 2006-07-18 Emulex Design & Manufacturing Corporation Receiving data from interleaved multiple concurrent transactions in a FIFO memory having programmable buffer zones
US20050034147A1 (en) 2001-12-27 2005-02-10 Best Robert E. Remote presence recognition information delivery systems and methods
US20040260604A1 (en) 2001-12-27 2004-12-23 Bedingfield James C. Methods and systems for location-based yellow page services
US7206388B2 (en) * 2002-03-18 2007-04-17 Openwave Systems Inc. System and method for providing voice-activated presence information
US7404001B2 (en) * 2002-03-27 2008-07-22 Ericsson Ab Videophone and method for a video call
US7103168B2 (en) 2002-04-16 2006-09-05 Bellsouth Intellectual Property Corporation Methods and systems for implementing personal dialing plans
US6980635B2 (en) 2002-04-30 2005-12-27 Bellsouth Intellectual Property Corporation Methods and systems for automated prepaid service routing
US7095834B2 (en) 2002-04-30 2006-08-22 Bellsouth Intellectual Property Corporation Methods and systems for 1+ prepaid routing service
US6853718B1 (en) 2002-05-29 2005-02-08 Bellsouth Intellectual Property Corporation System and method for efficient telephone call transfer
US7382872B2 (en) 2002-08-01 2008-06-03 At&T Delaware Intellectual Property, Inc. Systems and methods for providing advanced telephony services
WO2004028118A2 (en) 2002-09-17 2004-04-01 Bellsouth Intellectual Property Corporation System and method for providing usage monitoring telephony services
US7127051B2 (en) 2002-09-17 2006-10-24 Bellsouth Intellectual Property Corporation System and method for providing advanced telephony services using a virtual telephone number
US7450945B2 (en) 2002-09-17 2008-11-11 At&T Mobility II LLC System and method for providing advanced wireless telephony services using a wireline telephone number
US7274784B2 (en) 2002-11-05 2007-09-25 At&T Bls Intellectual Property, Inc. Methods, systems, and computer program products for routing calls based on the originating network
US7773982B2 (en) 2002-11-25 2010-08-10 At&T Intellectual Property I, L.P. Methods, systems and storage media to remotely control a wireless unit
US7389089B1 (en) 2002-11-25 2008-06-17 At&T Delaware Intellectual Property, Inc. Methods to remotely control a wireless unit
US7006829B2 (en) 2003-05-14 2006-02-28 Bellsouth Intellectual Property Corporation Method and system for routing a telephone call
US7352855B2 (en) 2003-08-22 2008-04-01 At&T Delaware Intellectual Property, Inc. Method and system for providing a privacy management service
US8180039B2 (en) 2003-08-26 2012-05-15 At&T Intellectual Property I, L.P. Methods, systems, and computer program products for routing calls based on the originating network
US7233656B2 (en) 2003-09-10 2007-06-19 At&T Intellectual Property, Inc. Method and system for identifying telemarketer communications
US7609820B2 (en) 2003-11-12 2009-10-27 At&T Intellectual Property I, L.P. Identification and management of automatically-generated voicemail notifications of voicemail and electronic mail receipt
US20050198336A1 (en) 2004-01-22 2005-09-08 Edward Eytchison Methods and apparatuses for automatic adaptation of different protocols
US7558277B2 (en) 2004-12-15 2009-07-07 At&T Intellectual Property I, Lp Coordinated multi-network data services
US7676753B2 (en) 2005-01-07 2010-03-09 At&T Intellectual Property I, L.P. Methods, systems, devices and computer program products for collecting and sharing selected personal data
US7802205B2 (en) 2005-01-07 2010-09-21 At&T Intellectual Property I, L.P. Graphical chronological path presentation
US7925990B2 (en) 2005-03-31 2011-04-12 At&T Intellectual Property I, L. P. Methods, systems, and products for calendaring applications
US7640507B2 (en) 2005-02-28 2009-12-29 At&T Intellectual Property I, L.P. Methods, systems, and products for calendaring applications
US7809013B2 (en) 2005-03-24 2010-10-05 Intel Corporation Channel scanning
US8024438B2 (en) 2005-03-31 2011-09-20 At&T Intellectual Property I, L.P. Methods, systems, and computer program products for implementing bandwidth management services
US7975283B2 (en) 2005-03-31 2011-07-05 At&T Intellectual Property I, L.P. Presence detection in a bandwidth management system
US8306033B2 (en) 2005-03-31 2012-11-06 At&T Intellectual Property I, L.P. Methods, systems, and computer program products for providing traffic control services
US8335239B2 (en) 2005-03-31 2012-12-18 At&T Intellectual Property I, L.P. Methods, systems, and devices for bandwidth conservation
US8098582B2 (en) 2005-03-31 2012-01-17 At&T Intellectual Property I, L.P. Methods, systems, and computer program products for implementing bandwidth control services
US20060282855A1 (en) * 2005-05-05 2006-12-14 Digital Display Innovations, Llc Multiple remote display system
US7724753B2 (en) 2005-06-24 2010-05-25 Aylus Networks, Inc. Digital home networks having a control point located on a wide area network
US7764960B2 (en) 2005-07-01 2010-07-27 Cisco Technology, Inc. System and method for communication using a wireless handset in wireless and wired networks
US8223938B2 (en) 2005-09-30 2012-07-17 At&T Intellectual Property I, L.P. Methods, systems, and computer program products for providing caller identification services
US7627819B2 (en) 2005-11-01 2009-12-01 At&T Intellectual Property I, L.P. Visual screen indicator
US8112632B2 (en) 2005-11-30 2012-02-07 At&T Intellectual Property I, L.P. Security devices, systems and computer program products
US8255480B2 (en) 2005-11-30 2012-08-28 At&T Intellectual Property I, L.P. Substitute uniform resource locator (URL) generation
US20070124500A1 (en) 2005-11-30 2007-05-31 Bedingfield James C Sr Automatic substitute uniform resource locator (URL) generation
US8595325B2 (en) 2005-11-30 2013-11-26 At&T Intellectual Property I, L.P. Substitute uniform resource locator (URL) form
US8065710B2 (en) * 2006-03-02 2011-11-22 At&T Intellectual Property I, L.P. Apparatuses and methods for interactive communication concerning multimedia content
US20070256008A1 (en) 2006-04-26 2007-11-01 Bedingfield James C Sr Methods, systems, and computer program products for managing audio information
US8701005B2 (en) 2006-04-26 2014-04-15 At&T Intellectual Property I, Lp Methods, systems, and computer program products for managing video information
US8219553B2 (en) 2006-04-26 2012-07-10 At&T Intellectual Property I, Lp Methods, systems, and computer program products for managing audio and/or video information via a web broadcast
US20070256007A1 (en) 2006-04-26 2007-11-01 Bedingfield James C Sr Methods, systems, and computer program products for managing information by annotating a captured information object
US20070263884A1 (en) 2006-05-09 2007-11-15 Bellsouth Intellectual Property Corporation Audio Mixer Apparatus
US8484335B2 (en) 2006-11-06 2013-07-09 At&T Intellectual Property I, L.P. Methods, systems, and computer products for download status notification
US20080110991A1 (en) 2006-11-15 2008-05-15 Bellsouth Intellectual Property Corporation Apparatus and methods for providing active functions using encoded two-dimensional arrays
US8484296B2 (en) 2006-11-17 2013-07-09 At&T Intellectual Property I, L.P. Systems and methods for displaying electronic mail messages
US8233605B2 (en) 2007-01-03 2012-07-31 At&T Intellectual Property I, L.P. Methods, systems, and products for monitoring conferences
US8715082B2 (en) 2007-04-19 2014-05-06 At&T Intellectual Property I, L.P. Systems, methods and computer products for IPTV network game control
US7818396B2 (en) 2007-06-21 2010-10-19 Microsoft Corporation Aggregating and searching profile data from multiple services
US8260366B2 (en) 2007-09-28 2012-09-04 At&T Intellectual Property I, Lp Automatic setting of an alert mode on a wireless device
US7895157B2 (en) 2007-11-19 2011-02-22 At&T Intellectual Property I, Lp Methods, systems and computer program products for playing back previously published content
US20090137298A1 (en) 2007-11-27 2009-05-28 Bedingfield Sr James Carlton Collaborative Virtual Coaching
US8352479B2 (en) 2007-12-10 2013-01-08 At&T Intellectual Property I, L.P. Systems,methods and computer products for content-derived metadata
US8401362B2 (en) 2008-04-23 2013-03-19 At&T Intellectual Property I, L.P. Indication of trickplay availability for selected multimedia stream
US8165446B2 (en) 2008-04-23 2012-04-24 At&T Intellectual Property I, Lp Indication of trickplay availability via remote control device
US8117036B2 (en) 2008-12-03 2012-02-14 At&T Intellectual Property I, L.P. Non-disruptive side conversation information retrieval
US8156054B2 (en) 2008-12-04 2012-04-10 At&T Intellectual Property I, L.P. Systems and methods for managing interactions between an individual and an entity
US8374576B2 (en) 2008-12-04 2013-02-12 At&T Intellectual Property I, L.P. Methods, systems, and computer program products for generating resource utilization alerts through communication terminals
US20100179701A1 (en) 2009-01-13 2010-07-15 At&T Intellectual Property I, L.P. Irrigation system with wireless control
US9667918B2 (en) 2009-02-20 2017-05-30 At&T Intellectual Property I, L.P. Network recording system
US10482428B2 (en) 2009-03-10 2019-11-19 Samsung Electronics Co., Ltd. Systems and methods for presenting metaphors
US9489039B2 (en) 2009-03-27 2016-11-08 At&T Intellectual Property I, L.P. Systems and methods for presenting intermediaries
US8826355B2 (en) 2009-04-30 2014-09-02 At&T Intellectual Property I, Lp System and method for recording a multi-part performance on an internet protocol television network

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6493008B1 (en) * 1999-02-19 2002-12-10 Canon Kabushiki Kaisha Multi-screen display system and method
US20030020671A1 (en) * 1999-10-29 2003-01-30 Ovid Santoro System and method for simultaneous display of multiple information sources
US20020163572A1 (en) * 2000-11-10 2002-11-07 Center Julian L. Methods of establishing a communications link using perceptual sensing of a user's presence
US7364313B2 (en) * 2002-12-27 2008-04-29 Barco N.V. Multiple image projection system and method for projecting multiple selected images adjacent each other
US7180988B2 (en) * 2003-01-31 2007-02-20 Qwest Communications International Inc. Packet network interface device and systems and methods for its use
US20040263686A1 (en) * 2003-06-26 2004-12-30 Samsung Electronics, Co., Ltd. Method and apparatus displaying double screen
US7679612B2 (en) * 2004-04-30 2010-03-16 Microsoft Corporation Configuration goals via video presenting network
US20070040900A1 (en) * 2005-07-13 2007-02-22 Polycom, Inc. System and Method for Configuring Routing of Video from Multiple Sources to Multiple Destinations of Videoconference Using Software Video Switch
US20070120763A1 (en) * 2005-11-23 2007-05-31 Lode De Paepe Display system for viewing multiple video signals

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9578079B2 (en) 2013-03-15 2017-02-21 Ricoh Company, Ltd. Distribution control system, distribution system, distribution control method, and computer-readable storage medium
US9648096B2 (en) 2013-03-15 2017-05-09 Ricoh Company, Limited Distribution control system, distribution system, distribution control method, and computer-readable storage medium
CN104144223A (en) * 2014-08-21 2014-11-12 北京奇艺世纪科技有限公司 Data obtaining method and device

Also Published As

Publication number Publication date
US8384753B1 (en) 2013-02-26

Similar Documents

Publication Publication Date Title
US8384753B1 (en) Managing multiple data sources
US9128592B2 (en) Displaying graphical representations of contacts
US8823768B2 (en) Conference system, event management server, and program
US11444990B1 (en) System and method of enabling a non-host, participant-initiated breakout session in a videoconferencing system utilizing a virtual space, and simultaneously displaying a session view of a videoconferencing session and the participant-initiated breakout session
US8416715B2 (en) Interest determination for auditory enhancement
CA2766503C (en) Systems and methods for switching between computer and presenter audio transmission during conference call
US8700097B2 (en) Method and system for controlling dual-processing of screen data in mobile terminal having projector function
US20120297305A1 (en) Presenting or sharing state in presence
US20130155268A1 (en) Performing Camera Control Using a Remote Control Device
US20040203835A1 (en) Integrated telephony and television system
WO2016168158A1 (en) Presenting a message in a communication session
US20140240446A1 (en) Method for establishing video conference
US20130154923A1 (en) Performing Searching for a List of Entries Using a Remote Control Device
US8922615B2 (en) Customizing input to a videoconference using a remote control device
US20130155171A1 (en) Providing User Input Having a Plurality of Data Types Using a Remote Control Device
US9531981B2 (en) Customized mute in a videoconference based on context
US20140362166A1 (en) Incoming call display method, electronic device, and incoming call display system
JP2003284018A (en) Television conference system, method therefor and television conference server
WO2022007618A1 (en) Video call method and display device
US20110074912A1 (en) Providing an Indication of a Videoconference by a Videoconferencing Device
JP3449772B2 (en) Multipoint conference equipment
JP2008306475A (en) Voice and image conference device
JP2003339034A (en) Network conference system, network conference method, and network conference program
US20130155172A1 (en) User Interface for a Display Using a Simple Remote Control Device
JP3139531U (en) Proxy shooting and recording system

Legal Events

Date Code Title Description
AS Assignment

Owner name: BELLSOUTH INTELLECTUAL PROPERTY CORPORATION, DELAWARE

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:BEDINGFIELD, JAMES CARLTON, SR.;REEL/FRAME:029684/0843

Effective date: 20061214

AS Assignment

Owner name: AT&T DELAWARE INTELLECTUAL PROPERTY, INC., GEORGIA

Free format text: CHANGE OF NAME;ASSIGNOR:AT&T BLS INTELLECTUAL PROPERTY, INC.;REEL/FRAME:030389/0515

Effective date: 20071101

Owner name: AT&T BLS INTELLECTUAL PROPERTY, INC., DELAWARE

Free format text: CHANGE OF NAME;ASSIGNOR:AT&T INTELLECTUAL PROPERTY, INC.;REEL/FRAME:030389/0223

Effective date: 20070727

Owner name: AT&T INTELLECTUAL PROPERTY, INC., TEXAS

Free format text: CHANGE OF NAME;ASSIGNOR:BELLSOUTH INTELLECTUAL PROPERTY CORPORATION;REEL/FRAME:030389/0074

Effective date: 20070427

Owner name: AT&T INTELLECTUAL PROPERTY I, L.P., GEORGIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:AT&T DELAWARE INTELLECTUAL PROPERTY, INC.;REEL/FRAME:030384/0228

Effective date: 20130503

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION