US20100332984A1 - System and method for providing a remote user interface for an application executing on a computing device - Google Patents

System and method for providing a remote user interface for an application executing on a computing device

Info

Publication number
US20100332984A1
Authority
US
United States
Prior art keywords
computing device
commands
capability information
audio
graphics
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/878,848
Inventor
Yoav M. Tzruya
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Exent Tech Ltd
Original Assignee
Exent Tech Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Exent Tech Ltd
Priority to US12/878,848
Assigned to EXENT TECHNOLOGIES, LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: TZRUYA, YOAV M.
Publication of US20100332984A1
Priority to US13/021,631 (published as US20110157196A1)
Status: Abandoned

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • G06F9/452Remote windowing, e.g. X-Window System, desktop virtualisation

Definitions

  • the present invention generally relates to user interfaces for an application executing on a computing device.
  • the present invention relates to a system and method for providing a remote user interface for an application, such as a video game, executing on a computing device.
  • PC personal computers
  • console-based systems such as Microsoft's Xbox® and Sony's Playstation®.
  • These platforms are limited in various respects.
  • a given PC can run only a single video game at a time, since the video game requires exclusive control over both the graphics and audio hardware of the PC as well as the PC's display and sound system. This is true regardless of whether the game is being played on-line (i.e., in connection with a server or other PC over a data communication network) or off-line.
  • an entirely new PC or other gaming platform must be purchased and located elsewhere in the home.
  • the end user is confined to playing the video game in the room in which the PC is located.
  • the present invention provides a system and method for providing a remote user interface for an application, such as a video game, executing on a computing device.
  • the system includes a computing device, such as a personal computer (PC), configured to execute a software application and a remote user interface (UI) communicatively coupled thereto via a data communication network.
  • the remote UI includes a hardware device such as a video, audio or user input/output (I/O) device.
  • the computing device is also configured to emulate a local hardware device and to redirect function calls generated by the software application for the emulated local hardware device to the remote UI for processing therein.
  • the computing device may also be further configured to receive control commands from the remote UI, the control commands originating from a user I/O device, and to redirect the control commands to the software application.
  • multiple remote UIs may be coupled to the computing device via the data communication network, and each of the multiple remote UIs may include one or more hardware devices, such as one or more of a video, audio or user I/O device.
  • an implementation of the present invention permits simultaneous execution of multiple software applications on the computing device. Consequently, a user of a first remote UI can remotely access and interact with a first software application executing on the computing device while a user of a second remote UI remotely accesses and utilizes a second software application executing on the computing device. In this way, more than one user within a home can remotely use, at the same time, different interactive software applications executing on the computing device, each of which would otherwise have exclusively occupied the computing device's resources.
  • An implementation of the present invention provides a low-cost solution to the problem of providing multiple remote user interfaces for using interactive software applications throughout the home.
  • An implementation of the present invention provides additional benefits in that it allows a software application to be executed on its native computing platform while being accessed via a remote UI, without requiring that the software application be programmed to accommodate such remote access. This is achieved through the emulation of local resources by the computing device and the subsequent interception and redirection of commands generated by the software application for those local resources in a manner transparent to the software application. This is in contrast to, for example, conventional X-Windows systems that enable programs running on one computer to be displayed on another computer. To make use of X-Windows technology, a software application must be written specifically to work with the X-Windows protocol.
  • because a remote UI in accordance with an implementation of the present invention need only implement the low-level hardware necessary to process graphics and audio commands transmitted from the computing device, it may be manufactured in a low-cost fashion relative to the cost of manufacturing the computing device.
  • because the remote UI device need only implement such low-level hardware, the remote UI device can be implemented as a mobile device, such as a personal digital assistant (PDA), thereby allowing an end user to roam from place to place within the home, or as an extension to a set-top box, thereby integrating into cable TV and IPTV networks.
  • PDA personal digital assistant
  • because an implementation of the present invention sends graphics and audio commands from the computing device to a remote UI device rather than a high-bandwidth raw video and audio feed, such an implementation provides a low-latency, low-bandwidth alternative to the streaming of raw video and audio content over a data communication network.
  • an implementation of the present invention marks an improvement over conventional “screen-scraping” technologies, such as those implemented in Windows terminal servers, in which graphics output is captured at a low level, converted to a raw video feed and transmitted to a remote device in a fully-textured and fully-rendered form.
  • FIG. 1 is a block diagram illustrating an exemplary system for providing a remote user interface for an application executing on a computing device in accordance with an implementation of the present invention.
  • FIG. 2 is a flowchart of an example process for establishing communication between a computing device and a remote UI and for remotely generating and displaying graphics content via the remote UI in accordance with an implementation of the present invention.
  • FIG. 3 illustrates an example software architecture of a media server in accordance with an implementation of the present invention.
  • FIG. 4 depicts an example computer system that may be utilized to implement a computing device in accordance with an implementation of the present invention.
  • FIG. 1 is a high level block diagram illustrating an exemplary system 100 for providing a remote user interface for an application executing on a computing device.
  • system 100 includes a computing device 102 coupled to one or more remote user interfaces (UIs) 106 a - 106 n via a data communication network 104 .
  • UIs remote user interfaces
  • computing device 102 and remote UIs 106a-106n are all located in a user's home and data communication network 104 comprises a wired and/or wireless local area network (LAN).
  • LAN local area network
  • computing device 102 is located at the central office or point-of-presence of a broadband service provider, remote UIs 106a-106n are located in a user's home, and data communication network 104 includes a wide area network (WAN) such as the Internet.
  • WAN wide area network
  • Computing device 102 is configured to execute a software application 108 , such as a video game, that is programmed to generate graphics and audio commands for respective hardware devices capable of executing those commands.
  • Software application 108 is also programmed to receive and respond to control commands received from a user input/output (I/O) device and/or associated user I/O device interface.
  • Computing device 102 represents the native platform upon which software application 108 was intended to be executed and displayed.
  • computing device 102 will be described as a personal computer (PC) and software application 108 will be described as a software application programmed for execution on a PC.
  • PC personal computer
  • software application 108 will be described as a software application programmed for execution on a PC.
  • computing device 102 may comprise a server, a console, or any other processor-based system capable of executing software applications.
  • graphics and audio commands generated by a software application such as software application 108 would be received by software interfaces also executing on the PC and then processed for execution by local hardware devices, such as a video and audio card connected to the motherboard of the PC.
  • control commands for the software application would be received via one or more local user input/output (I/O) devices coupled to an I/O bus of the PC, such as a keyboard, mouse, game controller or the like, and processed by a locally-executing software interface prior to receipt by the software application.
  • I/O local user input/output
  • software application 108 is executed within a sandbox environment 118 on computing device 102 .
  • Sandbox environment 118 captures the graphics and audio commands generated by software application 108 and selectively redirects them to one of remote UIs 106 a - 106 n via data communication network 104 . This allows software application 108 to be displayed on the remote UI using the hardware of the remote UI, even though software application 108 may not have been programmed to utilize such remote resources.
  • sandbox environment 118 receives control commands from the remote UI via data communication network 104 and processes them for input to software application 108 .
  • remote UI 106 a includes control logic 110 , a graphics device 112 , an audio device 114 , and a user I/O device 116 .
  • Control logic 110 comprises an interface between data communication network 104 and each of graphics device 112 , audio device 114 and user I/O device 116 .
  • control logic 110 is configured to at least perform functions relating to the publication of graphics, audio and user I/O device capability information over data communication network 104 and to facilitate the transfer of graphics, audio and user I/O device commands from computing device 102 to graphics device 112 , audio device 114 , and user I/O device 116 .
  • control logic 110 can be implemented in hardware, software, or as a combination of hardware and software.
  • Graphics device 112 comprises a graphics card or like hardware capable of executing graphics commands to generate image and video content.
  • Audio device 114 comprises an audio card or like hardware capable of executing audio commands to generate audio content.
  • User I/O device 116 comprises a mouse, keyboard, game controller or like hardware capable of receiving user input and generating control commands therefrom.
  • User I/O device 116 may be connected to remote UI 106 a using a direct cable connection or any type of wireless communication.
  • Each of remote UIs 106 a - 106 n can be a device capable of independently displaying the video content, playing the audio content and receiving control commands from a user.
  • Each of remote UIs 106 a - 106 n may operate in conjunction with one or more other devices to perform these functions.
  • the remote UI may comprise a set-top box that operates in conjunction with a television to which it is connected to display video content, play audio content, and in conjunction with a user I/O device to which it is connected to receive control commands from a user.
  • the remote UI may comprise a PC that operates in conjunction with a monitor to which it is connected to display video content, with a sound system or speakers to which it is connected to play audio content, and in conjunction with a user I/O device to which it is connected to receive control commands from a user.
  • FIG. 1 shows only one software application 108 executing within sandbox environment 118 , it is to be appreciated that multiple software applications may be simultaneously executing within multiple corresponding sandbox environments 118 . Consequently, a user of a first remote UI can remotely access and interact with a first software application executing on computing device 102 while a user of a second remote UI remotely accesses and utilizes a second software application executing on computing device 102 , each in accordance with the techniques described herein. In this way, more than one user within a home can use different interactive software applications executing on computing device 102 at the same time.
  • Sandbox environment 118 comprises one or more software modules installed on computing device 102 that operate to isolate software application 108 from other processes executing on computing device 102 and that optionally prevent a user from accessing processes or files associated with software application 108 .
  • sandbox environment 118 includes one or more software modules that capture graphics and audio commands generated by software application 108 for selective transmission to one of remote UIs 106 a - 106 n. The capturing of commands may occur, for example, at the device driver level or hardware abstraction layer (HAL) level of computing device 102 .
  • HAL hardware abstraction layer
  • sandbox environment 118 is configured to receive notifications from the control logic within each of remote UIs 106 a - 106 n.
  • the term “notification” is used in a general sense, and may in fact include the transmission of multiple messages from a remote UI to computing device 102 or the exchange of messages between a remote UI and computing device 102 .
  • the notifications provide a means by which each of remote UIs 106a-106n can publish its capabilities.
  • a device discovery and control protocol such as UPnP is used to allow sandbox environment 118 to automatically discover each of remote UIs 106 a - 106 n and to learn about their capabilities.
  • upon learning about the capabilities of a remote UI, sandbox environment 118 emulates the existence of a device, including device drivers, having similar capabilities. For example, upon receiving information about the capabilities of remote UI 106a, sandbox environment 118 would emulate devices having the respective capabilities of graphics device 112, audio device 114, and user I/O device 116. This would include creating a software stack for each of those devices on computing device 102.
  • the published capabilities of a remote UI may be inherently different than the internal hardware and software capabilities of computing device 102 .
  • the software stacks created on computing device 102 provide an emulated environment which allow software application 108 to operate as if such capabilities existed within computing device 102 .
  • the published capabilities of a remote UI 106a may be significantly different than the capabilities of remote UIs 106b-106n.
  • an implementation of the present invention creates a separate software stack for each such remote UI within a corresponding separate sandbox environment 118 on computing device 102 .
  • Each software stack may be significantly different from each other software stack.
  • a heterogeneous set of remote UIs can be supported by system 100.
  • an emulated device captures commands generated by software application 108 relating to graphics, audio, or user I/O devices, depending on the type of device being emulated.
  • the captured commands are transmitted over data communication network 104 to a selected one of remote UIs 106 a - 106 n.
  • commands generated by software application 108 directed to a DirectX or OpenGL stack may be captured and transmitted over data communication network 104 to one of remote UIs 106a-106n.
  • an implementation of the present invention provides a low-latency, low-bandwidth alternative to the streaming of raw video and audio content over a data communication network.
  • An example of such meta commands includes, but is not limited to, OpenGL commands, DirectX commands or Graphics Device Interface (GDI) commands.
  • sandbox environment 118 generates one or more Pre-Rendering Code (PRC) streams or commands responsive to the receipt of DirectX or OpenGL inputs from software application 108 .
  • PRC Pre-Rendering Code
  • These PRC streams are then transmitted over data communication network 104 to a selected one of remote UIs 106 a - 106 n, where they are received and processed by an output device to generate video and/or audio content.
  • the manner in which the PRC is generated may be related to parameters of the output device which were made known when the remote UI first published its capabilities.
  • Each of remote UIs 106 a - 106 n includes hardware and software stacks for processing graphics commands and generating graphics content therefrom, processing audio commands and generating audio content therefrom, and for receiving user input and generating control commands therefrom.
  • each remote UI 106 a - 106 n publishes its particular set of capabilities to sandbox environment 118 . This may be achieved, for example, by sending a notification to computing device 102 via data communication network 104 or alternatively through the use of a device discovery and control protocol such as UPnP.
  • the software stacks on each remote UI are capable of processing graphics and audio commands transmitted over data communication network 104 by computing device 102 .
  • the processing is performed in adherence with the original command functionality and in a low-latency fashion.
  • the software stacks convert the PRC into video and audio output to feed a presentation device (e.g., video display, speakers) that is integrated with or connected to the remote UI device.
  • FIG. 2 is a flowchart 200 of an example process for establishing communication between computing device 102 and one of remote UIs 106 a - 106 n and for remotely generating and displaying graphics content via the remote UI.
  • the combination of computing device 102 and sandbox environment 118 executing thereon will be collectively referred to as “the media server”, while the remote UI 106 a - 106 n with which it is communicating will simply be referred to as “the remote UI”.
  • the process begins at step 202 , in which an end user requests to run or start a graphics application that is located on the media server for display on the remote UI, or on a device that is connected to the remote UI.
  • the end user may request to run a video game located on the media server, wherein the media server is situated in the basement of the end user's home, in order to view it on a television which is connected to the remote UI in another part of the end user's home.
  • the request may be input by the end user via a user interface located on the remote UI or on a device connected to the remote UI.
  • a network connection is established between the remote UI and the media server via a data communication network.
  • a network protocol can be used in order to set up communication between the remote UI and the media server.
  • the media server is configured to listen and wait for an incoming Internet Protocol (IP) connection and the remote UI is configured to establish a Transmission Control Protocol/Internet Protocol (TCP/IP) connection to the media server when needed.
  • IP Internet Protocol
  • TCP/IP Transmission Control Protocol/Internet Protocol
  • the remote UI publishes or exposes its capabilities to the media server. These capabilities can be published via unidirectional or bidirectional communication between the remote UI and the media server.
  • the establishment of a network connection between the media server and the remote UI as set forth in step 204 and the publication of the capabilities of the remote UI as set forth in step 206 each may be facilitated by the use and extensions of a network discovery and control protocol such as UPnP.
  • the media server determines what functionality required for executing the requested graphic application can be executed on the media server and what functionality can be executed on the remote UI.
  • the decision algorithm executed by the media server to make this determination may be based on the capabilities of both the remote UI and the media server as well as on the hardware and software resources currently available on each at the time the algorithm is executed.
  • the media server is configured to dynamically adjust its allocation of functionality during the execution of the requested graphic application if the capabilities and available resources change.
  • at step 210, after the capabilities of the remote UI have been exposed and the decision algorithm executed by the media server defines what portions of the graphic rendering are to be executed on each of the media server and the remote UI, software hooks are set in the relevant software and operating system (OS) stack on the media server in order to capture the relevant functionality in real time.
  • OS operating system
  • the hooks can be set on interfaces such as DirectX or OpenGL interfaces, or on any other interface.
  • the software hooks capture graphics commands and redirect them to the remote UI.
  • FIG. 3 illustrates an example software architecture 300 of the media server that is useful in understanding step 210 .
  • software architecture 300 comprises two graphics applications 302 and 304, which are identified as 32-bit Microsoft® Windows® applications, executing on the media server.
  • Each application 302 and 304 has a different software stack by which it utilizes graphics hardware 314 .
  • graphics commands generated by application 302 are received by a Direct3D application programming interface (API) 306.
  • Direct3D API 306 processes the graphics commands for input to a device driver interface (DDI) 312 either directly or via a hardware abstraction layer (HAL) device 310.
  • DDI 312 then processes the input and generates commands for graphics hardware 314 .
  • graphics commands generated by application 304 are received by a Microsoft® Windows® Graphics Device Interface (GDI) 308 .
  • GDI 308 processes the graphics commands for input to DDI 312 , which then processes the input and generates commands for graphics hardware 314 .
  • the media server can set software hooks in between any of the depicted layers of the software stacks for applications 302 and 304 , wherein the location of a hook is determined based on the allocation of functionality between the remote UI and the media server.
  • a software hook could be set between application 302 and Direct3D API 306 if the remote UI fully supports Direct3D.
  • a software hook could be set between Direct3D API 306 and HAL device 310, or between Direct3D API 306 and DDI 312, if the remote UI is less powerful and it is determined that some Direct3D processing must be performed on the media server.
  • a software hook could be set between application 304 and GDI 308 or between GDI 308 and DDI 312 depending on the allocation of functionality between the media server and the remote UI.
  • the location of the software hooks is tied to which software layers must be emulated on the media server.
  • the media server emulates those layers just below the software hooks, thereby providing the upper layers with the necessary interfaces to “believe” that the lower levels are fully available on the media server.
  • the emulated layer transmits relevant commands to the remote UI to ensure proper operation of graphics applications 302 and 304 .
  • the graphics application is executed on the media server as shown at step 212 .
  • the function call or command is redirected by the software hooks to the remote UI as shown at step 214 .
  • the function call is redirected using a Remote Procedure Call (RPC)-like communication protocol.
  • RPC Remote Procedure Call
  • the remote UI processes the function calls received from the media server to generate and display graphics content.
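  • “RPC-like” can be read loosely here: each redirected function call travels as a self-describing message naming the function and carrying its serialized arguments. The length-prefixed JSON framing in the Python sketch below is only one of many possible encodings and is an assumption of this illustration, not something the specification prescribes.

    # Hypothetical RPC-like framing for redirected function calls.
    import json
    import struct


    def send_call(sock, function, *args):
        """Media-server side: serialize one redirected call and send it."""
        message = json.dumps({"fn": function, "args": list(args)}).encode("utf-8")
        sock.sendall(struct.pack("!I", len(message)) + message)


    def receive_call(stream):
        """Remote-UI side: read one call from a file-like socket stream."""
        (length,) = struct.unpack("!I", stream.read(4))
        return json.loads(stream.read(length))

    # For example, send_call(sock, "Clear", [0, 0, 0], 1.0) arrives at the
    # remote UI as {"fn": "Clear", "args": [[0, 0, 0], 1.0]} for execution.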
  • steps 204 , 206 , and 208 may be performed prior to receipt of the end user's request to run a graphics application.
  • steps 204 , 206 , and 208 may be performed the first time the media server and the remote UI are both connected to the data communication network.
  • the foregoing method of flowchart 200 is also applicable to the remote generation and playing of the audio content portion of a software application via the remote UI.
  • the media server compares the audio capabilities of the remote UI and the media server and then allocates functionality to each based on a decision algorithm.
  • Software hooks are set in accordance with this allocation.
  • the software hooks redirect audio-related function calls to the remote UI, where they are processed to generate audio content.
  • the audio content is then either played by the remote UI itself, or by a device connected to the remote UI.
  • the same general approach can be used to handle the remote generation and processing of control commands by a user I/O device attached to the remote UI.
  • the media server compares the user I/O device capabilities of the remote UI and the media server and allocates functionality to each based on a decision algorithm.
  • Device drivers that emulate the I/O capabilities of the remote UI are created on the media server in accordance with this allocation.
  • Control commands associated with a user I/O device are unique in that they may be transmitted in both directions—from the remote UI to the media server and from the media server to the remote UI (e.g., as in the case of a force feedback game controller).
  • the software hooks in this case operate both to receive control commands transmitted from the remote UI and to re-direct function calls related to the user I/O device to the remote UI.
  • an RPC-like protocol can be used for communication between the two.
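  • Because user I/O traffic flows in both directions, the channel between the endpoints has to carry input events toward the media server and device commands, such as force-feedback effects, back toward the remote UI. The sketch below shows one hedged way to multiplex both directions over a single connection; the message types and the controller and emulated-device objects are invented for illustration.

    # Illustrative bidirectional control channel. Message types and the
    # controller/emulated-device interfaces are hypothetical.
    import json
    import threading


    def remote_ui_io_loop(sock, controller):
        """Runs on the remote UI beside the physical game controller."""
        def pump_input():
            for event in controller.read_events():  # e.g. button presses
                sock.sendall(json.dumps({"type": "input", "event": event}).encode() + b"\n")

        threading.Thread(target=pump_input, daemon=True).start()

        for line in sock.makefile("r"):  # commands arriving from the media server
            message = json.loads(line)
            if message["type"] == "force_feedback":
                controller.play_effect(message["effect"])  # e.g. a rumble effect


    def media_server_io_loop(sock, emulated_io_device):
        """Runs inside the sandbox; delivers events to the emulated I/O device."""
        for line in sock.makefile("r"):
            message = json.loads(line)
            if message["type"] == "input":
                emulated_io_device.inject(message["event"])  # passed to the application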
  • FIG. 4 depicts an example computer system 400 that may be utilized to implement computing device 102 .
  • computer system 400 is provided by way of example only and is not intended to be limiting. Rather, as noted elsewhere herein, computing device 102 may alternately comprise a server, a console, or any other processor-based system capable of executing software applications.
  • example computer system 400 includes a processor 404 for executing software routines. Although a single processor is shown for the sake of clarity, computer system 400 may also comprise a multi-processor system.
  • Processor 404 is connected to a communication infrastructure 406 for communication with other components of computer system 400 .
  • Communication infrastructure 406 may comprise, for example, a communications bus, cross-bar, or network.
  • Computer system 400 further includes a main memory 408 , such as a random access memory (RAM), and a secondary memory 410 .
  • Secondary memory 410 may include, for example, a hard disk drive 412 and/or a removable storage drive 414 , which may comprise a floppy disk drive, a magnetic tape drive, an optical disk drive, or the like.
  • Removable storage drive 414 reads from and/or writes to a removable storage unit 418 in a well known manner.
  • Removable storage unit 418 may comprise a floppy disk, magnetic tape, optical disk, or the like, which is read by and written to by removable storage drive 414 .
  • removable storage unit 418 includes a computer usable storage medium having stored therein computer software and/or data.
  • secondary memory 410 may include other similar means for allowing computer programs or other instructions to be loaded into computer system 400 .
  • Such means can include, for example, a removable storage unit 422 and an interface 420 .
  • examples of such a removable storage unit 422 and interface 420 include a program cartridge and cartridge interface (such as that found in video game console devices), a removable memory chip (such as an EPROM or PROM) and associated socket, and other removable storage units 422 and interfaces 420 which allow software and data to be transferred from the removable storage unit 422 to computer system 400.
  • Computer system 400 also includes at least one communication interface 424 .
  • Communication interface 424 allows software and data to be transferred between computer system 400 and external devices via a communication path 426 .
  • communication interface 424 permits data to be transferred between computer system 400 and a data communication network, such as a public data or private data communication network.
  • Examples of communication interface 424 can include a modem, a network interface (such as Ethernet card), a communication port, and the like.
  • Software and data transferred via communication interface 424 are in the form of signals which can be electronic, electromagnetic, optical or other signals capable of being received by communication interface 424 . These signals are provided to the communication interface via communication path 426 .
  • computer system 400 further includes a graphics interface 430 , an audio interface 440 , and an I/O device interface 450 .
  • a software application executed by processor 404 generates graphics and audio commands.
  • the graphics commands are received by graphics interface 430 , which processes them to generate video content for display on a local display 432 .
  • the audio commands are received by audio interface 440 , which processes them to generate audio content for playback by one or more local speaker(s) 442 .
  • I/O device interface 450 receives control commands from a local I/O device 452 , such as a keyboard, mouse, game controller or the like, and processes them for handling by the software application being executed by processor 404 .
  • a software application is executed by processor 404 within a sandbox environment.
  • the sandbox environment captures graphics and audio commands generated by the software application and selectively redirects them to a remote UI (not shown) via communications interface 424 .
  • the graphics commands are processed by a graphics interface within the remote UI to generate video content for display on a remote display.
  • the audio commands are processed by an audio interface within the remote UI to generate audio content for playback by one or more remote speaker(s).
  • the sandbox environment receives control commands from the remote UI via communications interface 424 and processes them for input to the software application.
  • the hardware associated with local graphics interface 430, audio interface 440, and I/O device interface 450 is not used to execute the software application. Rather, hardware within (or connected to) the remote UI is used to carry out analogous functions.
  • computer program product may refer, in part, to removable storage unit 418 , removable storage unit 422 , a hard disk installed in hard disk drive 412 , or a carrier wave carrying software over communication path 426 (wireless link or cable) to communication interface 424 .
  • a computer useable medium can include magnetic media, optical media, or other recordable media, or media that transmits a carrier wave or other signal.
  • Computer programs are stored in main memory 408 and/or secondary memory 410 . Computer programs can also be received via communication interface 424 . Such computer programs, when executed, enable the computer system 400 to perform one or more features of the present invention as discussed herein. In particular, the computer programs, when executed, enable the processor 404 to perform features of the present invention. Accordingly, such computer programs represent controllers of the computer system 400 .
  • Software for implementing the present invention may be stored in a computer program product and loaded into computer system 400 using removable storage drive 414 , hard disk drive 412 , or interface 420 .
  • the computer program product may be downloaded to computer system 400 over communications path 426 .
  • the software when executed by the processor 404 , causes the processor 404 to perform functions of the invention as described herein.

Abstract

A system and method for providing a remote user interface for an application, such as a video game, executing on a computing device. The system includes a computing device configured to execute a software application and at least one remote user interface (UI) communicatively coupled to the computing device via a data communication network. The remote UI includes at least one hardware device such as a video, audio or user input/output (I/O) device. The computing device is further configured to emulate the hardware device locally and to redirect function calls generated by the software application for the emulated local hardware device to the remote UI for processing by the hardware device.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation of U.S. patent application Ser. No. 11/204,363, filed Aug. 16, 2005, the entirety of which is incorporated by reference herein.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The present invention generally relates to user interfaces for an application executing on a computing device. In particular, the present invention relates to a system and method for providing a remote user interface for an application, such as a video game, executing on a computing device.
  • 2. Background
  • Currently, the platforms available for playing video games or other real-time software applications in the home include personal computers (PC) and various proprietary console-based systems, such as Microsoft's Xbox® and Sony's Playstation®. These platforms are limited in various respects. For example, a given PC can run only a single video game at a time, since the video game requires exclusive control over both the graphics and audio hardware of the PC as well as the PC's display and sound system. This is true regardless of whether the game is being played on-line (i.e., in connection with a server or other PC over a data communication network) or off-line. To enable multiple end users to play different video games at the same time, an entirely new PC or other gaming platform must be purchased and located elsewhere in the home. Furthermore, the end user is confined to playing the video game in the room in which the PC is located.
  • BRIEF SUMMARY OF THE INVENTION
  • The present invention provides a system and method for providing a remote user interface for an application, such as a video game, executing on a computing device. The system includes a computing device, such as a personal computer (PC), configured to execute a software application and a remote user interface (UI) communicatively coupled thereto via a data communication network. The remote UI includes a hardware device such as a video, audio or user input/output (I/O) device. The computing device is also configured to emulate a local hardware device and to redirect function calls generated by the software application for the emulated local hardware device to the remote UI for processing therein. The computing device may also be further configured to receive control commands from the remote UI, the control commands originating from a user I/O device, and to redirect the control commands to the software application.
  • In accordance with an implementation of the present invention, multiple remote UIs may be coupled to the computing device via the data communication network, and each of the multiple remote UIs may include one or more hardware devices, such as one or more of a video, audio or user I/O device.
  • By off-loading the processing of graphics and/or audio commands to a remote UI, an implementation of the present invention permits simultaneous execution of multiple software applications on the computing device. Consequently, a user of a first remote UI can remotely access and interact with a first software application executing on the computing device while a user of a second remote UI remotely accesses and utilizes a second software application executing on the computing device. In this way, more than one user within a home can remotely use, at the same time, different interactive software applications executing on the computing device, each of which would otherwise have exclusively occupied the computing device's resources.
  • An implementation of the present invention provides a low-cost solution to the problem of providing multiple remote user interfaces for using interactive software applications throughout the home.
  • An implementation of the present invention provides additional benefits in that it allows a software application to be executed on its native computing platform while being accessed via a remote UI, without requiring that the software application be programmed to accommodate such remote access. This is achieved through the emulation of local resources by the computing device and the subsequent interception and redirection of commands generated by the software application for those local resources in a manner transparent to the software application. This is in contrast to, for example, conventional X-Windows systems that enable programs running on one computer to be displayed on another computer. To make use of X-Windows technology, a software application must be written specifically to work with the X-Windows protocol.
  • Furthermore, because a remote UI in accordance with an implementation of the present invention need only implement the low-level hardware necessary to process graphics and audio commands transmitted from the computing device, it may be manufactured in a low-cost fashion relative to the cost of manufacturing the computing device.
  • Indeed, because the remote UI device need only implement such low-level hardware, the remote UI device can be implemented as a mobile device, such as a personal digital assistant (PDA), thereby allowing an end user to roam from place to place within the home, or as an extension to a set-top box, thereby integrating into cable TV and IPTV networks.
  • Additionally, because an implementation of the present invention sends graphics and audio commands from the computing device to a remote UI device rather than a high-bandwidth raw video and audio feed, such an implementation provides a low-latency, low-bandwidth alternative to the streaming of raw video and audio content over a data communication network. Thus, an implementation of the present invention marks an improvement over conventional “screen-scraping” technologies, such as those implemented in Windows terminal servers, in which graphics output is captured at a low level, converted to a raw video feed and transmitted to a remote device in a fully-textured and fully-rendered form.
  • Further features and advantages of the present invention, as well as the structure and operation of various embodiments thereof, are described in detail below with reference to the accompanying drawings. It is noted that the invention is not limited to the specific embodiments described herein. Such embodiments are presented herein for illustrative purposes only. Additional embodiments will be apparent to persons skilled in the relevant art(s) based on the teachings contained herein.
  • BRIEF DESCRIPTION OF THE DRAWINGS/FIGURES
  • The accompanying drawings, which are incorporated herein and form part of the specification, illustrate the present invention and, together with the description, further serve to explain the principles of the invention and to enable a person skilled in the relevant art(s) to make and use the invention.
  • FIG. 1 is a block diagram illustrating an exemplary system for providing a remote user interface for an application executing on a computing device in accordance with an implementation of the present invention.
  • FIG. 2 is a flowchart of an example process for establishing communication between a computing device and a remote UI and for remotely generating and displaying graphics content via the remote UI in accordance with an implementation of the present invention.
  • FIG. 3 illustrates an example software architecture of a media server in accordance with an implementation of the present invention.
  • FIG. 4 depicts an example computer system that may be utilized to implement a computing device in accordance with an implementation of the present invention.
  • The features and advantages of the present invention will become more apparent from the detailed description set forth below when taken in conjunction with the drawings, in which like reference characters identify corresponding elements throughout. In the drawings, like reference numbers generally indicate identical, functionally similar, and/or structurally similar elements. The drawing in which an element first appears is indicated by the leftmost digit(s) in the corresponding reference number.
  • DETAILED DESCRIPTION OF THE INVENTION A. System Architecture
  • FIG. 1 is a high level block diagram illustrating an exemplary system 100 for providing a remote user interface for an application executing on a computing device. As shown in FIG. 1, system 100 includes a computing device 102 coupled to one or more remote user interfaces (UIs) 106a-106n via a data communication network 104. In one exemplary implementation, computing device 102 and remote UIs 106a-106n are all located in a user's home and data communication network 104 comprises a wired and/or wireless local area network (LAN). In an alternative exemplary implementation, computing device 102 is located at the central office or point-of-presence of a broadband service provider, remote UIs 106a-106n are located in a user's home, and data communication network 104 includes a wide area network (WAN) such as the Internet.
  • Computing device 102 is configured to execute a software application 108, such as a video game, that is programmed to generate graphics and audio commands for respective hardware devices capable of executing those commands. Software application 108 is also programmed to receive and respond to control commands received from a user input/output (I/O) device and/or associated user I/O device interface. Computing device 102 represents the native platform upon which software application 108 was intended to be executed and displayed.
  • For the sake of convenience, from this point forward, computing device 102 will be described as a personal computer (PC) and software application 108 will be described as a software application programmed for execution on a PC. However, the present invention is not so limited. For example, computing device 102 may comprise a server, a console, or any other processor-based system capable of executing software applications.
  • In a conventional PC, graphics and audio commands generated by a software application such as software application 108 would be received by software interfaces also executing on the PC and then processed for execution by local hardware devices, such as a video and audio card connected to the motherboard of the PC. Furthermore, control commands for the software application would be received via one or more local user input/output (I/O) devices coupled to an I/O bus of the PC, such as a keyboard, mouse, game controller or the like, and processed by a locally-executing software interface prior to receipt by the software application.
  • In contrast, in accordance with FIG. 1 and as will be described in more detail herein, software application 108 is executed within a sandbox environment 118 on computing device 102. Sandbox environment 118 captures the graphics and audio commands generated by software application 108 and selectively redirects them to one of remote UIs 106 a-106 n via data communication network 104. This allows software application 108 to be displayed on the remote UI using the hardware of the remote UI, even though software application 108 may not have been programmed to utilize such remote resources. Furthermore, sandbox environment 118 receives control commands from the remote UI via data communication network 104 and processes them for input to software application 108.
  • As shown in FIG. 1, remote UI 106a includes control logic 110, a graphics device 112, an audio device 114, and a user I/O device 116. Each of the other remote UIs 106b-106n includes similar features, although this is not shown in FIG. 1 for the sake of brevity. Control logic 110 comprises an interface between data communication network 104 and each of graphics device 112, audio device 114 and user I/O device 116. As will be described in more detail herein, control logic 110 is configured to at least perform functions relating to the publication of graphics, audio and user I/O device capability information over data communication network 104 and to facilitate the transfer of graphics, audio and user I/O device commands from computing device 102 to graphics device 112, audio device 114, and user I/O device 116. As will be appreciated by persons skilled in the relevant art based on the teachings provided herein, control logic 110 can be implemented in hardware, software, or as a combination of hardware and software.
  • Graphics device 112 comprises a graphics card or like hardware capable of executing graphics commands to generate image and video content. Audio device 114 comprises an audio card or like hardware capable of executing audio commands to generate audio content. User I/O device 116 comprises a mouse, keyboard, game controller or like hardware capable of receiving user input and generating control commands therefrom. User I/O device 116 may be connected to remote UI 106 a using a direct cable connection or any type of wireless communication.
  • Each of remote UIs 106 a-106 n can be a device capable of independently displaying the video content, playing the audio content and receiving control commands from a user. Each of remote UIs 106 a-106 n may operate in conjunction with one or more other devices to perform these functions. For example, the remote UI may comprise a set-top box that operates in conjunction with a television to which it is connected to display video content, play audio content, and in conjunction with a user I/O device to which it is connected to receive control commands from a user. As a further example, the remote UI may comprise a PC that operates in conjunction with a monitor to which it is connected to display video content, with a sound system or speakers to which it is connected to play audio content, and in conjunction with a user I/O device to which it is connected to receive control commands from a user.
  • Although FIG. 1 shows only one software application 108 executing within sandbox environment 118, it is to be appreciated that multiple software applications may be simultaneously executing within multiple corresponding sandbox environments 118. Consequently, a user of a first remote UI can remotely access and interact with a first software application executing on computing device 102 while a user of a second remote UI remotely accesses and utilizes a second software application executing on computing device 102, each in accordance with the techniques described herein. In this way, more than one user within a home can use different interactive software applications executing on computing device 102 at the same time.
  • The operation and interaction of sandbox environment 118 and remote UIs 106 a-106 n will now be described in more detail.
  • 1. Sandbox Environment
  • Sandbox environment 118 comprises one or more software modules installed on computing device 102 that operate to isolate software application 108 from other processes executing on computing device 102 and that optionally prevent a user from accessing processes or files associated with software application 108. At a minimum, sandbox environment 118 includes one or more software modules that capture graphics and audio commands generated by software application 108 for selective transmission to one of remote UIs 106 a-106 n. The capturing of commands may occur, for example, at the device driver level or hardware abstraction layer (HAL) level of computing device 102.
  • In particular, sandbox environment 118 is configured to receive notifications from the control logic within each of remote UIs 106a-106n. The term “notification” is used in a general sense, and may in fact include the transmission of multiple messages from a remote UI to computing device 102 or the exchange of messages between a remote UI and computing device 102. The notifications provide a means by which each of remote UIs 106a-106n can publish its capabilities. In one implementation, a device discovery and control protocol such as UPnP is used to allow sandbox environment 118 to automatically discover each of remote UIs 106a-106n and to learn about their capabilities.
  • Upon learning about the capabilities of a remote UI, sandbox environment 118 emulates the existence of a device, including device drivers, having similar capabilities. For example, upon receiving information about the capabilities of remote UI 106 a, sandbox environment 118 would emulate devices having the respective capabilities of graphics device 112, audio device 114, and user I/O device 116. This would include creating a software stack for each of those devices on computing device 102.
  • The published capabilities of a remote UI may be inherently different than the internal hardware and software capabilities of computing device 102. As such, the software stacks created on computing device 102 provide an emulated environment which allow software application 108 to operate as if such capabilities existed within computing device 102.
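  • To make the emulation step concrete, the following Python sketch shows one way a sandbox environment might build an emulated device stack for each capability a remote UI publishes. The class names, capability fields, and transport object are hypothetical illustrations, not part of the specification.

    # Hypothetical sketch: creating emulated devices from a remote UI's
    # published capabilities. All names and fields are illustrative only.
    from dataclasses import dataclass


    @dataclass
    class RemoteCapabilities:
        """Capabilities as published by a remote UI (assumed schema)."""
        graphics_api: str        # e.g. "Direct3D9" or "OpenGL1.5"
        max_resolution: tuple    # e.g. (1280, 720)
        audio_channels: int      # e.g. 2 for stereo
        input_devices: list      # e.g. ["keyboard", "game_controller"]


    class EmulatedDevice:
        """A device emulated inside the sandbox in place of local hardware."""

        def __init__(self, name, transport):
            self.name = name
            self.transport = transport  # forwards captured commands to the remote UI

        def submit(self, command):
            # Instead of driving local hardware, redirect the command.
            self.transport.send(self.name, command)


    class SandboxEnvironment:
        """Creates one emulated software stack per published capability."""

        def __init__(self, transport):
            self.transport = transport
            self.devices = {}

        def emulate_from_capabilities(self, caps: RemoteCapabilities):
            self.devices["graphics"] = EmulatedDevice(caps.graphics_api, self.transport)
            self.devices["audio"] = EmulatedDevice(f"audio_{caps.audio_channels}ch", self.transport)
            for dev in caps.input_devices:
                self.devices[dev] = EmulatedDevice(dev, self.transport)
            return self.devices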
  • Furthermore, the published capabilities of a remote UI 106a may be significantly different than the capabilities of remote UIs 106b-106n. To address this, an implementation of the present invention creates a separate software stack for each such remote UI within a corresponding separate sandbox environment 118 on computing device 102. Each software stack may be significantly different from each other software stack. As a result, a heterogeneous set of remote UIs can be supported by system 100.
  • Once created, an emulated device captures commands generated by software application 108 relating to graphics, audio, or user I/O devices, depending on the type of device being emulated. The captured commands are transmitted over data communication network 104 to a selected one of remote UIs 106a-106n. For example, commands generated by software application 108 directed to a DirectX or OpenGL stack may be captured and transmitted over data communication network 104 to one of remote UIs 106a-106n.
  • As will be appreciated by persons skilled in the art, because sandbox environment 118 captures graphics and audio commands in their “meta” form and transmits them from computing device 102 to a remote UI 106 a-106 n, an implementation of the present invention provides a low-latency, low-bandwidth alternative to the streaming of raw video and audio content over a data communication network. An example of such meta commands includes, but is not limited to, OpenGL commands, DirectX commands or Graphics Device Interface (GDI) commands.
  • In one implementation, sandbox environment 118 generates one or more Pre-Rendering Code (PRC) streams or commands responsive to the receipt of DirectX or OpenGL inputs from software application 108. These PRC streams are then transmitted over data communication network 104 to a selected one of remote UIs 106 a-106 n, where they are received and processed by an output device to generate video and/or audio content. The manner in which the PRC is generated may be related to parameters of the output device which were made known when the remote UI first published its capabilities.
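  • The meta-command idea can be illustrated with a small serializer: each captured graphics call is encoded as a compact record (a rough stand-in for the PRC streams mentioned above) rather than as rendered pixels, and the encoding can take the remote UI's published parameters, such as its maximum resolution, into account. The opcodes, record layout, and clamping rule below are assumptions made for this example only.

    # Illustrative serialization of graphics "meta" commands into a compact,
    # PRC-like record stream. The opcodes and layout are hypothetical.
    import json
    import struct

    OPCODES = {"clear": 1, "draw_triangles": 2, "set_texture": 3, "set_viewport": 4}


    def encode_command(name, payload, target_resolution=(1280, 720)):
        """Encode one captured graphics call as a small binary record.

        target_resolution stands in for parameters the remote UI made known
        when it first published its capabilities, so the stream can be
        tailored to the output device.
        """
        if name == "set_viewport":
            payload = {"w": min(payload["w"], target_resolution[0]),
                       "h": min(payload["h"], target_resolution[1])}
        body = json.dumps(payload).encode("utf-8")
        # 1-byte opcode + 4-byte length prefix + JSON body.
        return struct.pack("!BI", OPCODES[name], len(body)) + body


    # A few hundred bytes of meta commands can describe a frame that would
    # occupy megabytes if streamed as raw, fully-rendered pixels.
    frame = [
        encode_command("clear", {"color": [0, 0, 0]}),
        encode_command("set_viewport", {"w": 1920, "h": 1080}),
        encode_command("draw_triangles", {"vertex_buffer_id": 7, "count": 1200}),
    ]
    print(sum(len(record) for record in frame), "bytes for the whole frame")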
  • 2. Remote UIs
  • Each of remote UIs 106 a-106 n includes hardware and software stacks for processing graphics commands and generating graphics content therefrom, processing audio commands and generating audio content therefrom, and for receiving user input and generating control commands therefrom. As noted above, each remote UI 106 a-106 n publishes its particular set of capabilities to sandbox environment 118. This may be achieved, for example, by sending a notification to computing device 102 via data communication network 104 or alternatively through the use of a device discovery and control protocol such as UPnP.
  • The software stacks on each remote UI are capable of processing graphics and audio commands transmitted over data communication network 104 by computing device 102. The processing is performed in adherence with the original command functionality and in a low-latency fashion. In an implementation where the commands comprise PRC streams (described above), the software stacks convert the PRC into video and audio output to feed a presentation device (e.g., video display, speakers) that is integrated with or connected to the remote UI device.
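  • On the remote UI side, the corresponding work can be pictured as a small interpreter loop: read each record, decode it, and hand it to the local graphics or audio stack. The sketch below assumes the hypothetical record format from the previous example and uses a log statement in place of real rendering or playback calls.

    # Hypothetical remote-UI command interpreter matching the record format
    # sketched above; real hardware calls are replaced with a placeholder.
    import json
    import struct

    OPCODE_NAMES = {1: "clear", 2: "draw_triangles", 3: "set_texture", 4: "set_viewport"}


    def read_exact(sock, n):
        """Read exactly n bytes from the connection (or fail on disconnect)."""
        data = b""
        while len(data) < n:
            chunk = sock.recv(n - len(data))
            if not chunk:
                raise ConnectionError("media server closed the connection")
            data += chunk
        return data


    def interpreter_loop(sock):
        while True:
            opcode, length = struct.unpack("!BI", read_exact(sock, 5))
            payload = json.loads(read_exact(sock, length))
            name = OPCODE_NAMES.get(opcode, "unknown")
            # A real device would submit the call to its local graphics or
            # audio stack here; this sketch only logs what it would execute.
            print(f"executing {name}: {payload}")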
  • B. Example Process
  • FIG. 2 is a flowchart 200 of an example process for establishing communication between computing device 102 and one of remote UIs 106 a-106 n and for remotely generating and displaying graphics content via the remote UI. In the following description, the combination of computing device 102 and sandbox environment 118 executing thereon will be collectively referred to as “the media server”, while the remote UI 106 a-106 n with which it is communicating will simply be referred to as “the remote UI”.
  • As shown in FIG. 2, the process begins at step 202, in which an end user requests to run or start a graphics application that is located on the media server for display on the remote UI, or on a device that is connected to the remote UI. For example, the end user may request to run a video game located on the media server, wherein the media server is situated in the basement of the end user's home, in order to view it on a television which is connected to the remote UI in another part of the end user's home. The request may be input by the end user via a user interface located on the remote UI or on a device connected to the remote UI.
  • At step 204, responsive to the end user's request, a network connection is established between the remote UI and the media server via a data communication network. As will be readily appreciated by persons skilled in the art, any of a variety of network protocols can be used in order to set up communication between the remote UI and the media server. For example, in one implementation, the media server is configured to listen and wait for an incoming Internet Protocol (IP) connection and the remote UI is configured to establish a Transmission Control Protocol/Internet Protocol (TCP/IP) connection to the media server when needed.
  • At step 206, after a connection has been established between the remote UI and the media server, the remote UI publishes or exposes its capabilities to the media server. These capabilities can be published via unidirectional or bidirectional communication between the remote UI and the media server. In one implementation, the establishment of a network connection between the media server and the remote UI as set forth in step 204 and the publication of the capabilities of the remote UI as set forth in step 206 may each be facilitated by the use and extension of a device discovery and control protocol such as UPnP.
  • At step 208, based on the published capabilities of the remote UI, the media server determines what functionality required for executing the requested graphics application can be executed on the media server and what functionality can be executed on the remote UI. The decision algorithm executed by the media server to make this determination may be based on the capabilities of both the remote UI and the media server, as well as on the hardware and software resources available on each at the time the algorithm is executed. In one implementation, the media server is configured to dynamically adjust its allocation of functionality during the execution of the requested graphics application if the capabilities and available resources change.
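  • As a sketch of the kind of decision involved, the following code allocates graphics functionality from a pair of capability records and a server-load figure. The fields, hook levels, and thresholds are invented for illustration only and do not represent a required decision algorithm.

```cpp
// Illustrative sketch only: allocating graphics work between media server and
// remote UI. Capability fields, HookLevel categories, and thresholds are assumptions.
#include <iostream>

struct Caps {
    bool supportsDirect3D;   // can consume Direct3D-level commands
    int  gpuScore;           // coarse measure of available graphics power
};

enum class HookLevel {
    AboveD3D,   // hook between the application and the Direct3D API
    BelowD3D    // hook below the API; some processing stays on the server
};

// Pick a hook level from the published remote capabilities and the resources
// currently free on each side.
HookLevel allocate(const Caps& remote, const Caps& server, double serverLoad) {
    if (remote.supportsDirect3D && remote.gpuScore >= server.gpuScore / 2)
        return HookLevel::AboveD3D;          // remote UI can do the heavy lifting
    if (serverLoad > 0.9)
        return HookLevel::AboveD3D;          // server too busy to pre-process
    return HookLevel::BelowD3D;
}

int main() {
    Caps remote{true, 40}, server{true, 100};
    HookLevel level = allocate(remote, server, /*serverLoad=*/0.3);
    std::cout << (level == HookLevel::AboveD3D ? "hook above Direct3D\n"
                                               : "hook below Direct3D\n");
}
```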
  • At step 210, after the capabilities of the remote UI have been exposed and the decision algorithm executed by the media server has defined what portions of the graphics rendering are to be executed on each of the media server and the remote UI, software hooks are set in the relevant software and operating system (OS) stack on the media server in order to capture the relevant functionality in real time. For example, the hooks can be set on interfaces such as DirectX or OpenGL, or on any other interface. The software hooks capture graphics commands and redirect them to the remote UI.
  • FIG. 3 illustrates an example software architecture 300 of the media server that is useful in understanding step 210. As shown in FIG. 3, software architecture 300 comprises two graphics applications 302 and 304, which are identified as 32-bit Microsoft® Windows® applications, executing on the media server. Each application 302 and 304 has a different software stack by which it utilizes graphics hardware 314.
  • In particular, graphics commands generated by application 302 are received by a Direct3D application programming interface (API) 306. Direct3D API 306 processes the graphics commands for input to a device driver interface (DDI) 312 either directly or via a hardware abstraction layer (HAL) device 310. DDI 312 then processes the input and generates commands for graphics hardware 314. In contrast, graphics commands generated by application 304 are received by a Microsoft® Windows® Graphics Device Interface (GDI) 308. GDI 308 processes the graphics commands for input to DDI 312, which then processes the input and generates commands for graphics hardware 314.
  • In accordance with step 210, the media server can set software hooks in between any of the depicted layers of the software stacks for applications 302 and 304, wherein the location of a hook is determined based on the allocation of functionality between the remote UI and the media server. Thus, for example, with respect to application 302, a software hook could be set between application 302 and Direct3D API 306 if the remote UI fully supports Direct3D. Alternatively, a software hook could be set between Direct3D API 306 and HAL device 310, or between Direct3D API 306 and DDI 312 if the remote UI is less powerful and it is determined that some Direct3D processing must be performed on the media server. With respect to application 304, a software hook could be set between application 304 and GDI 308 or between GDI 308 and DDI 312 depending on the allocation of functionality between the media server and the remote UI.
  • The location of the software hooks is tied to which software layers must be emulated on the media server. In particular, the media server emulates the layers just below the software hooks, thereby providing the upper layers with the interfaces they need to “believe” that the lower levels are fully available on the media server. However, instead of fully implementing the lower levels, the emulated layer transmits the relevant commands to the remote UI to ensure proper operation of graphics applications 302 and 304.
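  • The following sketch models the emulation idea with a simplified, invented IGraphicsLayer interface standing in for a real layer such as Direct3D or the DDI: the upper layers call the same interface either way, but the hooked implementation forwards each call toward the remote UI instead of driving local hardware.

```cpp
// Illustrative sketch only: an "emulated layer" that forwards calls to the
// remote UI. IGraphicsLayer is an invented stand-in, not a real Direct3D/DDI API.
#include <iostream>
#include <string>

struct IGraphicsLayer {                        // interface the upper layers expect
    virtual void clear(unsigned rgba) = 0;
    virtual void drawTriangles(int count) = 0;
    virtual ~IGraphicsLayer() = default;
};

struct LocalLayer : IGraphicsLayer {           // would drive local graphics hardware
    void clear(unsigned rgba) override { std::cout << "local clear " << rgba << "\n"; }
    void drawTriangles(int n) override { std::cout << "local draw " << n << "\n"; }
};

struct RemoteForwardingLayer : IGraphicsLayer {
    // Stand-in for the network send; in the described system this would carry
    // the command to the remote UI over the established connection.
    void send(const std::string& cmd) { std::cout << "-> remote UI: " << cmd << "\n"; }
    void clear(unsigned rgba) override { send("CLEAR " + std::to_string(rgba)); }
    void drawTriangles(int n) override { send("DRAW_TRIANGLES " + std::to_string(n)); }
};

// The application code is unchanged; the hook simply decides which
// implementation sits behind the interface.
void renderFrame(IGraphicsLayer& gfx) {
    gfx.clear(0xFF000000);
    gfx.drawTriangles(128);
}

int main() {
    RemoteForwardingLayer hooked;   // installed when functionality is remoted
    renderFrame(hooked);
}
```

  • In such a sketch, only the forwarding implementation needs any network awareness; the application and the layers above the hook remain unmodified, which mirrors the transparency the hooks are intended to provide.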
  • Returning now to flowchart 200, once the software hooks have been set at step 210, the graphics application is executed on the media server as shown at step 212. During execution of the graphics application, when a function that should be executed on the remote UI is called, the function call or command is redirected by the software hooks to the remote UI as shown at step 214. In an implementation, the function call is redirected using a Remote Procedure Call (RPC)-like communication protocol. It should be noted that, depending on the allocation of functionality between the media server and the remote UI, some function calls may be handled entirely by the media server. In any case, at step 216, the remote UI processes the function calls received from the media server to generate and display graphics content.
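  • A minimal sketch of such an RPC-like exchange is shown below. The length-prefixed framing and the function identifier are assumptions used only to make the redirection concrete; the text requires nothing more specific than an RPC-like communication protocol.

```cpp
// Illustrative sketch only: framing a redirected function call as
// [length][function id][argument bytes]. Layout and id values are assumptions.
#include <cstdint>
#include <cstring>
#include <iostream>
#include <vector>

constexpr uint16_t FN_PRESENT = 7;   // illustrative id for a "present frame" call

std::vector<uint8_t> frameCall(uint16_t fnId, const std::vector<uint8_t>& args) {
    uint32_t len = static_cast<uint32_t>(sizeof(fnId) + args.size());
    std::vector<uint8_t> msg(sizeof(len) + len);
    std::memcpy(msg.data(), &len, sizeof(len));
    std::memcpy(msg.data() + sizeof(len), &fnId, sizeof(fnId));
    std::memcpy(msg.data() + sizeof(len) + sizeof(fnId), args.data(), args.size());
    return msg;
}

// On the remote UI side, the same layout is unpacked and dispatched.
void dispatch(const std::vector<uint8_t>& msg) {
    uint32_t len;
    uint16_t fnId;
    std::memcpy(&len,  msg.data(), sizeof(len));
    std::memcpy(&fnId, msg.data() + sizeof(len), sizeof(fnId));
    if (fnId == FN_PRESENT)
        std::cout << "remote UI: present frame (" << len - sizeof(fnId)
                  << " bytes of arguments)\n";
}

int main() {
    std::vector<uint8_t> args = {1, 2, 3, 4};   // placeholder argument bytes
    dispatch(frameCall(FN_PRESENT, args));
}
```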
  • Note that in an alternate implementation, one or more of steps 204, 206, and 208 (involving the establishment of a network connection, the publication of the capabilities of the remote UI, and the allocation of functionality between the media server and the remote UI) may be performed prior to receipt of the end user's request to run a graphics application. For example, one or more of these steps could be performed the first time the media server and the remote UI are both connected to the data communication network.
  • With minor modifications, the foregoing method of flowchart 200 is also applicable to the remote generation and playing of the audio content portion of a software application via the remote UI. In an audio context, the media server compares the audio capabilities of the remote UI and the media server and then allocates functionality to each based on a decision algorithm. Software hooks are set in accordance with this allocation. The software hooks redirect audio-related function calls to the remote UI, where they are processed to generate audio content. Depending on the implementation, the audio content is then either played by the remote UI itself, or by a device connected to the remote UI.
  • Furthermore, the same general approach can be used to handle the remote generation and processing of control commands by a user I/O device attached to the remote UI. Again, the media server compares the user I/O device capabilities of the remote UI and the media server and allocates functionality to each based on a decision algorithm. Device drivers that emulate the I/O capabilities of the remote UI are created on the media server in accordance with this allocation. Control commands associated with a user I/O device are unique in that they may be transmitted in both directions—from the remote UI to the media server and from the media server to the remote UI (e.g., as in the case of a force feedback game controller). Thus, the software hooks in this case operate both to receive control commands transmitted from the remote UI and to re-direct function calls related to the user I/O device to the remote UI. Once again, an RPC-like protocol can be used for communication between the two.
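  • The sketch below models this two-way path with an invented input-event layout and feedback-command layout: events arriving from the remote UI are handed to the application as if they came from a local controller, while force-feedback requests are redirected back to the remote UI. Both message formats are assumptions made for illustration.

```cpp
// Illustrative sketch only: the bidirectional control-command path between an
// emulated controller driver on the media server and the remote UI.
#include <cstdint>
#include <iostream>

struct InputEvent {            // remote UI -> media server
    uint8_t  buttonMask;       // which controller buttons are pressed
    int16_t  stickX, stickY;   // analog stick position
};

struct FeedbackCommand {       // media server -> remote UI
    uint8_t  motor;            // which rumble motor to drive
    uint8_t  strength;         // 0..255 intensity
};

// The emulated driver hands incoming events to the application exactly as a
// local controller driver would.
void onRemoteInput(const InputEvent& ev) {
    std::cout << "app sees buttons=" << int(ev.buttonMask)
              << " stick=(" << ev.stickX << "," << ev.stickY << ")\n";
}

// When the application requests force feedback, the hook redirects the
// request toward the remote UI instead of local hardware.
void sendFeedbackToRemote(const FeedbackCommand& cmd) {
    std::cout << "-> remote UI: rumble motor " << int(cmd.motor)
              << " at strength " << int(cmd.strength) << "\n";
}

int main() {
    onRemoteInput({0x01, 120, -40});       // event arriving from the remote UI
    sendFeedbackToRemote({0, 200});        // feedback going back to it
}
```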
  • C. Example Computing Device
  • FIG. 4 depicts an example computer system 400 that may be utilized to implement computing device 102. However, the following description of computer system 400 is provided by way of example only and is not intended to be limiting. Rather, as noted elsewhere herein, computing device 102 may alternatively comprise a server, a console, or any other processor-based system capable of executing software applications.
  • As shown in FIG. 4, example computer system 400 includes a processor 404 for executing software routines. Although a single processor is shown for the sake of clarity, computer system 400 may also comprise a multi-processor system. Processor 404 is connected to a communication infrastructure 406 for communication with other components of computer system 400. Communication infrastructure 406 may comprise, for example, a communications bus, cross-bar, or network.
  • Computer system 400 further includes a main memory 408, such as a random access memory (RAM), and a secondary memory 410. Secondary memory 410 may include, for example, a hard disk drive 412 and/or a removable storage drive 414, which may comprise a floppy disk drive, a magnetic tape drive, an optical disk drive, or the like. Removable storage drive 414 reads from and/or writes to a removable storage unit 418 in a well known manner. Removable storage unit 418 may comprise a floppy disk, magnetic tape, optical disk, or the like, which is read by and written to by removable storage drive 414. As will be appreciated by persons skilled in the relevant art(s), removable storage unit 418 includes a computer usable storage medium having stored therein computer software and/or data.
  • In an alternative implementation, secondary memory 410 may include other similar means for allowing computer programs or other instructions to be loaded into computer system 400. Such means can include, for example, a removable storage unit 422 and an interface 420. Examples of a removable storage unit 422 and interface 420 include a program cartridge and cartridge interface (such as that found in video game console devices), a removable memory chip (such as an EPROM or PROM) and associated socket, and other removable storage units 422 and interfaces 420 which allow software and data to be transferred from the removable storage unit 422 to computer system 400.
  • Computer system 400 also includes at least one communication interface 424. Communication interface 424 allows software and data to be transferred between computer system 400 and external devices via a communication path 426. In particular, communication interface 424 permits data to be transferred between computer system 400 and a data communication network, such as a public data or private data communication network. Examples of communication interface 424 can include a modem, a network interface (such as an Ethernet card), a communication port, and the like. Software and data transferred via communication interface 424 are in the form of signals which can be electronic, electromagnetic, optical or other signals capable of being received by communication interface 424. These signals are provided to the communication interface via communication path 426.
  • As shown in FIG. 4, computer system 400 further includes a graphics interface 430, an audio interface 440, and an I/O device interface 450. In a conventional mode of operation, a software application executed by processor 404 generates graphics and audio commands. The graphics commands are received by graphics interface 430, which processes them to generate video content for display on a local display 432. The audio commands are received by audio interface 440, which processes them to generate audio content for playback by one or more local speaker(s) 442. I/O device interface 450 receives control commands from a local I/O device 452, such as a keyboard, mouse, game controller or the like, and processes them for handling by the software application being executed by processor 404.
  • However, as described in more detail elsewhere herein, in accordance with an implementation of the present invention, a software application is executed by processor 404 within a sandbox environment. The sandbox environment captures graphics and audio commands generated by the software application and selectively redirects them to a remote UI (not shown) via communication interface 424. The graphics commands are processed by a graphics interface within the remote UI to generate video content for display on a remote display. The audio commands are processed by an audio interface within the remote UI to generate audio content for playback by one or more remote speaker(s). Additionally, the sandbox environment receives control commands from the remote UI via communication interface 424 and processes them for input to the software application. Thus, in this implementation, the hardware associated with local graphics interface 430, audio interface 440, and I/O device interface 450 is not used to execute the software application. Rather, hardware within (or connected to) the remote UI is used to carry out analogous functions.
  • As used herein, the term “computer program product” may refer, in part, to removable storage unit 418, removable storage unit 422, a hard disk installed in hard disk drive 412, or a carrier wave carrying software over communication path 426 (wireless link or cable) to communication interface 424. A computer useable medium can include magnetic media, optical media, or other recordable media, or media that transmits a carrier wave or other signal. These computer program products are means for providing software to computer system 400.
  • Computer programs (also called computer control logic) are stored in main memory 408 and/or secondary memory 410. Computer programs can also be received via communication interface 424. Such computer programs, when executed, enable the computer system 400 to perform one or more features of the present invention as discussed herein. In particular, the computer programs, when executed, enable the processor 404 to perform features of the present invention. Accordingly, such computer programs represent controllers of the computer system 400.
  • Software for implementing the present invention may be stored in a computer program product and loaded into computer system 400 using removable storage drive 414, hard disk drive 412, or interface 420. Alternatively, the computer program product may be downloaded to computer system 400 over communications path 426. The software, when executed by the processor 404, causes the processor 404 to perform functions of the invention as described herein.
  • D. Conclusion
  • While various embodiments of the present invention have been described above, it should be understood that they have been presented by way of example only, and not limitation. It will be understood by those skilled in the relevant art(s) that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined in the appended claims. Accordingly, the breadth and scope of the present invention should not be limited by any of the above-described exemplary embodiments, but should be defined only in accordance with the following claims and their equivalents.

Claims (42)

1. A method for operating a remote user interface (UI) for a computing device comprising:
publishing graphics capability information to the computing device over a data communication network;
receiving graphics commands from the computing device over the data communication network, wherein the format of the graphics commands received from the computing device is consistent with the graphics capability information; and
processing the graphics commands in a graphics device to generate video content therefrom.
2. The method of claim 1, further comprising:
rendering and displaying the video content.
3. The method of claim 1, wherein publishing graphics capability information comprises publishing graphics capability information in accordance with a UPnP protocol.
4. The method of claim 1, wherein publishing graphics capability information to the computing device over a data communication network comprises publishing graphics capability information to the computing device over a local area network.
5. The method of claim 1, wherein publishing graphics capability information to the computing device over a data communication network comprises publishing graphics capability information to the computing device over a wide area network.
6. The method of claim 1, wherein receiving graphics commands comprises receiving one of OpenGL commands, DirectX commands, or Graphics Device Interface commands.
7. The method of claim 1, wherein receiving graphics commands comprises receiving Pre-Rendering Code (PRC) commands.
8. A remote user interface (UI) for a computing device comprising:
control logic; and
a graphics device coupled to the control logic;
wherein the control logic is configured to publish graphics capability information to the computing device over a data communication network and to receive graphics commands from the computing device over the data communication network, wherein the format of the graphics commands received from the computing device is consistent with the graphics capability information; and
wherein the graphics device processes the graphics commands to generate video content therefrom.
9. The remote UI of claim 8, further comprising:
a display that renders and displays the video content.
10. The remote UI of claim 8, wherein the control logic is configured to publish the graphics capability information in accordance with a UPnP protocol.
11. The remote UI of claim 8, wherein the control logic is configured to publish the graphics capability information to the computing device over a local area network.
12. The remote UI of claim 8, wherein the control logic is configured to publish the graphics capability information to the computing device over a wide area network.
13. The remote UI of claim 8, wherein the control logic is configured to receive one of OpenGL commands, DirectX commands, or Graphics Device Interface commands.
14. The remote UI of claim 8, wherein the control logic is configured to receive Pre-Rendering Code (PRC) commands.
15. A method for operating a remote user interface (UI) for a computing device comprising:
publishing audio capability information to the computing device over a data communication network;
receiving audio commands from the computing device over the data communication network, wherein the format of the audio commands received from the computing device is consistent with the audio capability information; and
processing the audio commands in an audio device to generate audio content therefrom.
16. The method of claim 15, further comprising:
playing the audio content.
17. The method of claim 15, wherein publishing audio capability information comprises publishing audio capability information in accordance with a UPnP protocol.
18. The method of claim 15, wherein publishing audio capability information to the computing device over a data communication network comprises publishing audio capability information to the computing device over a local area network.
19. The method of claim 15, wherein publishing audio capability information to the computing device over a data communication network comprises publishing audio capability information to the computing device over a wide area network.
20. The method of claim 15, wherein receiving audio commands comprises receiving DirectX commands.
21. A remote user interface (UI) for a computing device comprising:
control logic; and
an audio device coupled to the control logic;
wherein the control logic is configured to publish audio capability information to the computing device over a data communication network and to receive audio commands from the computing device over the data communication network, wherein the format of the audio commands received from the computing device is consistent with the audio capability information; and
wherein the audio device processes the audio commands to generate audio content therefrom.
22. The remote UI of claim 21, further comprising:
one or more speakers that play the audio content.
23. The remote UI of claim 21, wherein the control logic is configured to publish the audio capability information in accordance with a UPnP protocol.
24. The remote UI of claim 21, wherein the control logic is configured to publish the audio capability information to the computing device over a local area network.
25. The remote UI of claim 21, wherein the control logic is configured to publish the audio capability information to the computing device over a wide area network.
26. The remote UI of claim 21, wherein the control logic is configured to receive DirectX commands.
27. A method for operating a remote user interface (UI) for a computing device comprising:
publishing user input/output (I/O) device capability information to the computing device over a data communication network;
receiving control commands from the computing device over the data communication network, wherein the format of the control commands received from the computing device is consistent with the user I/O device capability information; and
processing the control commands in a user I/O device to generate output to a user.
28. The method of claim 27, further comprising:
processing input from a user in the user I/O device to generate control commands; and
transmitting the generated control commands to the computing device over the data communication network.
29. The method of claim 27, wherein publishing user I/O device capability information comprises publishing user I/O device capability information in accordance with a UPnP protocol.
30. The method of claim 27, wherein publishing user I/O device capability information to the computing device over a data communication network comprises publishing user I/O device capability information to the computing device over a local area network.
31. The method of claim 27, wherein publishing user I/O device capability information to the computing device over a data communication network comprises publishing user I/O device capability information to the computing device over a wide area network.
32. The method of claim 27, wherein receiving control commands comprises receiving DirectX commands.
33. A remote user interface (UI) for a computing device comprising:
control logic; and
a user input/output (I/O) device coupled to the control logic;
wherein the control logic is configured to publish user I/O device capability information to the computing device over a data communication network and to receive control commands from the computing device over the data communication network, wherein the format of the control commands received from the computing device is consistent with the user I/O device capability information; and
wherein the user I/O device processes the control commands to generate output for a user.
34. The remote UI of claim 33, wherein the user I/O device processes input from a user to generate control commands; and
wherein the control logic is further configured to transmit the generated control commands to the computing device over the data communication network.
35. The remote UI of claim 33, wherein the control logic is configured to publish the user I/O device capability information in accordance with a UPnP protocol.
36. The remote UI of claim 33, wherein the control logic is configured to publish the user I/O device capability information to the computing device over a local area network.
37. The remote UI of claim 33, wherein the control logic is configured to publish the user I/O device capability information to the computing device over a wide area network.
38. The remote UI of claim 33, wherein the control logic is configured to receive DirectX commands.
39. A method for operating a remote user interface (UI) for a computing device comprising:
publishing graphics and audio capability information to the computing device over a data communication network;
receiving graphics and audio commands from the computing device over the data communication network, wherein the format of the graphics commands received from the computing device is consistent with the graphics capability information and the format of the audio commands received from the computing device is consistent with the audio capability information;
processing the graphics commands in a graphics device to generate video content therefrom; and
processing the audio commands in an audio device to generate audio content therefrom.
40. The method of claim 39, further comprising:
publishing user input/output (I/O) device capability information to the computing device over the data communication network; and
receiving control commands from the computing device over the data communication network, wherein the format of the control commands received from the computing device is consistent with the user I/O device capability information; and
processing the control commands in a user I/O device to generate output for a user.
41. A remote user interface (UI) for a computing device comprising:
control logic;
a graphics device coupled to the control logic; and
an audio device coupled to the control logic;
wherein the control logic is configured to publish graphics and audio capability information to the computing device over a data communication network and to receive graphics and audio commands from the computing device over the data communication network, wherein the format of the graphics commands received from the computing device is consistent with the graphics capability information and the format of the audio commands received from the computing device is consistent with the audio capability information;
wherein the graphics device processes the graphics commands to generate video content therefrom; and
wherein the audio device processes the audio commands to generate audio content therefrom.
42. The remote UI of claim 41, further comprising:
a user input/output (I/O) device coupled to the control logic;
wherein the control logic is further configured to publish user I/O device capability information to the computing device over the data communication network and to receive control commands from the computing device over the data communication network, wherein the format of the control commands received from the computing device is consistent with the user I/O device capability information; and
wherein the user I/O device processes the control commands to generate output for a user.
US12/878,848 2005-08-16 2010-09-09 System and method for providing a remote user interface for an application executing on a computing device Abandoned US20100332984A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US12/878,848 US20100332984A1 (en) 2005-08-16 2010-09-09 System and method for providing a remote user interface for an application executing on a computing device
US13/021,631 US20110157196A1 (en) 2005-08-16 2011-02-04 Remote gaming features

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US11/204,363 US7844442B2 (en) 2005-08-16 2005-08-16 System and method for providing a remote user interface for an application executing on a computing device
US12/878,848 US20100332984A1 (en) 2005-08-16 2010-09-09 System and method for providing a remote user interface for an application executing on a computing device

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US11/204,363 Continuation US7844442B2 (en) 2005-08-16 2005-08-16 System and method for providing a remote user interface for an application executing on a computing device

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US13/021,631 Continuation-In-Part US20110157196A1 (en) 2005-08-16 2011-02-04 Remote gaming features

Publications (1)

Publication Number Publication Date
US20100332984A1 true US20100332984A1 (en) 2010-12-30

Family

ID=37734975

Family Applications (2)

Application Number Title Priority Date Filing Date
US11/204,363 Expired - Fee Related US7844442B2 (en) 2005-08-16 2005-08-16 System and method for providing a remote user interface for an application executing on a computing device
US12/878,848 Abandoned US20100332984A1 (en) 2005-08-16 2010-09-09 System and method for providing a remote user interface for an application executing on a computing device

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US11/204,363 Expired - Fee Related US7844442B2 (en) 2005-08-16 2005-08-16 System and method for providing a remote user interface for an application executing on a computing device

Country Status (2)

Country Link
US (2) US7844442B2 (en)
WO (1) WO2007023391A2 (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110119346A1 (en) * 2009-11-13 2011-05-19 Samsung Electronics Co., Ltd. Method and apparatus for providing remote user interface services
US20110157196A1 (en) * 2005-08-16 2011-06-30 Exent Technologies, Ltd. Remote gaming features
US20110271195A1 (en) * 2010-04-30 2011-11-03 Nokia Corporation Method and apparatus for allocating content components to different hardware interfaces
US20110320953A1 (en) * 2009-12-18 2011-12-29 Nokia Corporation Method and apparatus for projecting a user interface via partition streaming
CN102707928A (en) * 2011-02-25 2012-10-03 奥多比公司 Parallelized definition and display of content in a scripting environment
US20130139103A1 (en) * 2011-11-29 2013-05-30 Citrix Systems, Inc. Integrating Native User Interface Components on a Mobile Device
US20140171190A1 (en) * 2012-12-14 2014-06-19 Nvidia Corporation Implementing a remote gaming server on a desktop computer
US8799357B2 (en) 2010-11-08 2014-08-05 Sony Corporation Methods and systems for use in providing a remote user interface

Families Citing this family (80)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6453334B1 (en) * 1997-06-16 2002-09-17 Streamtheory, Inc. Method and apparatus to allow remotely located computer programs and/or data to be accessed on a local computer in a secure, time-limited manner, with persistent caching
US7062567B2 (en) * 2000-11-06 2006-06-13 Endeavors Technology, Inc. Intelligent network streaming and execution system for conventionally coded applications
US8831995B2 (en) * 2000-11-06 2014-09-09 Numecent Holdings, Inc. Optimized server for streamed applications
JP2008527468A (en) * 2004-11-13 2008-07-24 ストリーム セオリー,インコーポレイテッド Hybrid local / remote streaming
US8024523B2 (en) 2007-11-07 2011-09-20 Endeavors Technologies, Inc. Opportunistic block transmission with time constraints
WO2006102621A2 (en) * 2005-03-23 2006-09-28 Stream Theory, Inc. System and method for tracking changes to files in streaming applications
US20060218165A1 (en) * 2005-03-23 2006-09-28 Vries Jeffrey De Explicit overlay integration rules
US7844442B2 (en) * 2005-08-16 2010-11-30 Exent Technologies, Ltd. System and method for providing a remote user interface for an application executing on a computing device
CN101346727A (en) * 2005-12-27 2009-01-14 日本电气株式会社 Program execution control method, device, and execution control program
KR100788693B1 (en) * 2006-01-12 2007-12-26 삼성전자주식회사 Method and apparatus for storing and restoring a state information of remote user interface
US7868893B2 (en) * 2006-03-07 2011-01-11 Graphics Properties Holdings, Inc. Integration of graphical application content into the graphical scene of another application
US20080201751A1 (en) * 2006-04-18 2008-08-21 Sherjil Ahmed Wireless Media Transmission Systems and Methods
US9198084B2 (en) 2006-05-26 2015-11-24 Qualcomm Incorporated Wireless architecture for a traditional wire-based protocol
US20080010482A1 (en) * 2006-06-13 2008-01-10 Microsoft Corporation Remote control of a media computing device
US8261345B2 (en) 2006-10-23 2012-09-04 Endeavors Technologies, Inc. Rule-based application access management
US7949708B2 (en) * 2007-06-08 2011-05-24 Microsoft Corporation Using a remote handheld device as a local device
US8954876B1 (en) * 2007-10-09 2015-02-10 Teradici Corporation Method and apparatus for providing a session status indicator
US8892738B2 (en) 2007-11-07 2014-11-18 Numecent Holdings, Inc. Deriving component statistics for a stream enabled application
TWI450749B (en) * 2007-11-21 2014-09-01 Mstar Semiconductor Inc Game processing apparatus
US8954541B2 (en) * 2007-12-29 2015-02-10 Amx Llc Method, computer-readable medium, and system for discovery and registration of controlled devices associated with self-describing modules
US8433747B2 (en) * 2008-02-01 2013-04-30 Microsoft Corporation Graphics remoting architecture
GB2459335B (en) * 2008-04-25 2013-01-09 Tenomichi Ltd Temporary modification for extending functionality of computer games and software applications
US20090284476A1 (en) * 2008-05-13 2009-11-19 Apple Inc. Pushing a user interface to a remote device
US20100293462A1 (en) * 2008-05-13 2010-11-18 Apple Inc. Pushing a user interface to a remote device
AU2013200021B2 (en) * 2008-05-13 2016-03-10 Apple Inc. Pushing a user interface to a remote device
US9870130B2 (en) 2008-05-13 2018-01-16 Apple Inc. Pushing a user interface to a remote device
US8970647B2 (en) * 2008-05-13 2015-03-03 Apple Inc. Pushing a graphical user interface to a remote device with display rules provided by the remote device
US9311115B2 (en) 2008-05-13 2016-04-12 Apple Inc. Pushing a graphical user interface to a remote device with display rules provided by the remote device
KR101613319B1 (en) 2008-08-14 2016-04-19 삼성전자주식회사 Method and system for inputting service in home network using universal plug and play
US8180891B1 (en) * 2008-11-26 2012-05-15 Free Stream Media Corp. Discovery, access control, and communication with networked services from within a security sandbox
US8572251B2 (en) 2008-11-26 2013-10-29 Microsoft Corporation Hardware acceleration for remote desktop protocol
US9398089B2 (en) 2008-12-11 2016-07-19 Qualcomm Incorporated Dynamic resource sharing among multiple wireless devices
US9317856B2 (en) * 2009-01-19 2016-04-19 Alcatel Lucent System, method and computer readable medium for application placement
US20100199008A1 (en) * 2009-01-30 2010-08-05 Kwang Wee Lee System and method for implementing a remote input device using virtualization techniques for wireless device
US8477082B2 (en) 2009-01-30 2013-07-02 Cassis International Pte Ltd. System and method for implementing a remote display using a virtualization technique
EP2406693B1 (en) * 2009-03-13 2015-09-09 ABB Technology AG A method for control in a process control system implemented in part by one or more computer implemented run-time processes
US8640097B2 (en) * 2009-03-16 2014-01-28 Microsoft Corporation Hosted application platform with extensible media format
US9588803B2 (en) 2009-05-11 2017-03-07 Microsoft Technology Licensing, Llc Executing native-code applications in a browser
US8875033B2 (en) * 2009-05-18 2014-10-28 National Instruments Corporation Static analysis of a graphical program in a browser
US9264248B2 (en) 2009-07-02 2016-02-16 Qualcomm Incorporated System and method for avoiding and resolving conflicts in a wireless mobile display digital interface multicast environment
US9104452B2 (en) * 2009-07-27 2015-08-11 Microsoft Technology Licensing, Llc Hybrid remote sessions
KR101612845B1 (en) * 2009-11-12 2016-04-15 삼성전자주식회사 Method and apparatus for providing remote UI service
US9582238B2 (en) 2009-12-14 2017-02-28 Qualcomm Incorporated Decomposed multi-stream (DMS) techniques for video display systems
US20110154214A1 (en) * 2009-12-18 2011-06-23 Microsoft Corporation Offloading Content Retrieval And Decoding In Pluggable Content-Handling Systems
US8688776B1 (en) * 2009-12-29 2014-04-01 The Directv Group, Inc. Emulation tool and method of using the same for a content distribution system
US20110208506A1 (en) * 2010-02-24 2011-08-25 Sling Media Inc. Systems and methods for emulating network-enabled media components
EP2553561A4 (en) * 2010-04-01 2016-03-30 Citrix Systems Inc Interacting with remote applications displayed within a virtual desktop of a tablet computing device
US9323921B2 (en) * 2010-07-13 2016-04-26 Microsoft Technology Licensing, Llc Ultra-low cost sandboxing for application appliances
EP2616954B1 (en) * 2010-09-18 2021-03-31 Google LLC A method and mechanism for rendering graphics remotely
EP2463772A1 (en) * 2010-11-29 2012-06-13 NEC CASIO Mobile Communications, Ltd. Method for dynamically allocating an external peripheral to device application
US8903705B2 (en) 2010-12-17 2014-12-02 Microsoft Corporation Application compatibility shims for minimal client computers
US10135900B2 (en) 2011-01-21 2018-11-20 Qualcomm Incorporated User input back channel for wireless displays
US9065876B2 (en) 2011-01-21 2015-06-23 Qualcomm Incorporated User input back channel from a wireless sink device to a wireless source device for multi-touch gesture wireless displays
US20130013318A1 (en) 2011-01-21 2013-01-10 Qualcomm Incorporated User input back channel for wireless displays
US9413803B2 (en) 2011-01-21 2016-08-09 Qualcomm Incorporated User input back channel for wireless displays
US9787725B2 (en) 2011-01-21 2017-10-10 Qualcomm Incorporated User input back channel for wireless displays
US8964783B2 (en) 2011-01-21 2015-02-24 Qualcomm Incorporated User input back channel for wireless displays
US10108386B2 (en) * 2011-02-04 2018-10-23 Qualcomm Incorporated Content provisioning for wireless back channel
US9503771B2 (en) 2011-02-04 2016-11-22 Qualcomm Incorporated Low latency wireless display for graphics
US20130039408A1 (en) * 2011-02-07 2013-02-14 Screenovate Technologies Ltd Method for enhancing compression and transmission process of a screen image
US9495183B2 (en) 2011-05-16 2016-11-15 Microsoft Technology Licensing, Llc Instruction set emulation for guest operating systems
CN102867284B (en) * 2011-07-07 2016-10-26 腾讯科技(深圳)有限公司 A kind of graph drawing engine device and its implementation
US8898459B2 (en) 2011-08-31 2014-11-25 At&T Intellectual Property I, L.P. Policy configuration for mobile device applications
US8918841B2 (en) * 2011-08-31 2014-12-23 At&T Intellectual Property I, L.P. Hardware interface access control for mobile applications
US9389933B2 (en) 2011-12-12 2016-07-12 Microsoft Technology Licensing, Llc Facilitating system service request interactions for hardware-protected applications
US9413538B2 (en) 2011-12-12 2016-08-09 Microsoft Technology Licensing, Llc Cryptographic certification of secure hosted execution environments
US9525998B2 (en) 2012-01-06 2016-12-20 Qualcomm Incorporated Wireless display with multiscreen service
BR102012000848B1 (en) * 2012-01-13 2020-07-14 Mirakulo Software Ltda SYSTEM AND METHODS FOR INTEGRATING PORTABLE DEVICES WITH DIGITAL TV SYSTEMS
US20130263278A1 (en) 2012-03-28 2013-10-03 Ae Squared Ltd. Method and apparatus for controlling operations performed by a mobile co
ES2439804B1 (en) * 2012-04-19 2014-10-29 Universitat Politècnica De Catalunya Procedure, system and piece of executable code to virtualize a hardware resource associated with a computer system
CN102736936B (en) * 2012-05-31 2015-01-28 东南大学 Method for remotely interacting console programs
US20140184613A1 (en) * 2013-01-01 2014-07-03 Doron Exterman Method for offloading graphic processing unit (gpu) processing tasks to remote computers
CN105450701A (en) * 2014-08-28 2016-03-30 冠捷投资有限公司 System for remotely and dynamically managing display device
US10756985B2 (en) 2015-01-27 2020-08-25 Nutanix, Inc. Architecture for implementing user interfaces for centralized management of a computing environment
GB2537814B (en) * 2015-04-14 2017-10-18 Avecto Ltd Computer device and method for controlling untrusted access to a peripheral device
US10599459B2 (en) 2016-02-12 2020-03-24 Nutanix, Inc. Entity database distributed replication
US10380038B2 (en) * 2017-08-24 2019-08-13 Re Mago Holding Ltd Method, apparatus, and computer-readable medium for implementation of a universal hardware-software interface
US10700991B2 (en) 2017-11-27 2020-06-30 Nutanix, Inc. Multi-cluster resource management
US10599444B2 (en) * 2018-01-09 2020-03-24 Microsoft Technology Licensing, Llc Extensible input stack for processing input device data
US10819817B2 (en) * 2019-02-04 2020-10-27 Dell Products L.P. HTML5 multimedia redirection

Citations (41)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4356545A (en) * 1979-08-02 1982-10-26 Data General Corporation Apparatus for monitoring and/or controlling the operations of a computer from a remote location
US5440699A (en) * 1991-06-24 1995-08-08 Compaq Computer Corporation System by which a remote computer receives screen images from and transmits commands to a host computer
US5546538A (en) * 1993-12-14 1996-08-13 Intel Corporation System for processing handwriting written by user of portable computer by server or processing by the computer when the computer no longer communicate with server
US5627977A (en) * 1994-04-19 1997-05-06 Orchid Systems, Inc. Trainable user interface translator
US6052120A (en) * 1996-10-01 2000-04-18 Diamond Multimedia Systems, Inc. Method of operating a portable interactive graphics display tablet and communications systems
US6084584A (en) * 1996-10-01 2000-07-04 Diamond Multimedia Systems, Inc. Computer system supporting portable interactive graphics display tablet and communications systems
US6085247A (en) * 1998-06-08 2000-07-04 Microsoft Corporation Server operating system for supporting multiple client-server sessions and dynamic reconnection of users to previous sessions using different computers
US6166734A (en) * 1996-10-01 2000-12-26 Diamond Multimedia Systems, Inc. Portable interactive graphics display tablet and communications system
US6219695B1 (en) * 1997-09-16 2001-04-17 Texas Instruments Incorporated Circuits, systems, and methods for communicating computer video output to a remote location
US6243772B1 (en) * 1997-01-31 2001-06-05 Sharewave, Inc. Method and system for coupling a personal computer with an appliance unit via a wireless communication link to provide an output display presentation
US20010009424A1 (en) * 2000-01-24 2001-07-26 Kiyonori Sekiguchi Apparatus and method for remotely operating plurality of information devices connected to a network provided with plug-and-play function
US20020029285A1 (en) * 2000-05-26 2002-03-07 Henry Collins Adapting graphical data, processing activity to changing network conditions
US20020045484A1 (en) * 2000-09-18 2002-04-18 Eck Charles P. Video game distribution network
US20020107072A1 (en) * 2001-02-07 2002-08-08 Giobbi John J. Centralized gaming system with modifiable remote display terminals
US20030101294A1 (en) * 2001-11-20 2003-05-29 Ylian Saint-Hilaire Method and architecture to support interaction between a host computer and remote devices
US20030218632A1 (en) * 2002-05-23 2003-11-27 Tony Altwies Method and architecture of an event transform oriented operating environment for a personal mobile display system
US20030232648A1 (en) * 2002-06-14 2003-12-18 Prindle Joseph Charles Videophone and videoconferencing apparatus and method for a video game console
US20030234809A1 (en) * 2002-06-19 2003-12-25 Parker Kathryn L. Method and system for remotely operating a computer
US20040073908A1 (en) * 2002-10-10 2004-04-15 International Business Machines Corporation Apparatus and method for offloading and sharing CPU and RAM utilization in a network of machines
US6732067B1 (en) * 1999-05-12 2004-05-04 Unisys Corporation System and adapter card for remote console emulation
US20040172486A1 (en) * 1997-01-31 2004-09-02 Cirrus Logic, Inc. Method and apparatus for incorporating an appliance unit into a computer system
US20040189677A1 (en) * 2003-03-25 2004-09-30 Nvidia Corporation Remote graphical user interface support using a graphics processing unit
US6874009B1 (en) * 2000-02-16 2005-03-29 Raja Tuli Portable high speed internet device with user fees
US20050091607A1 (en) * 2003-10-24 2005-04-28 Matsushita Electric Industrial Co., Ltd. Remote operation system, communication apparatus remote control system and document inspection apparatus
US20050104889A1 (en) * 2002-03-01 2005-05-19 Graham Clemie Centralised interactive graphical application server
US6897833B1 (en) * 1999-09-10 2005-05-24 Hewlett-Packard Development Company, L.P. Portable user interface
US6904519B2 (en) * 1998-06-12 2005-06-07 Microsoft Corporation Method and computer program product for offloading processing tasks from software to hardware
US6915327B1 (en) * 2000-10-30 2005-07-05 Raja Singh Tuli Portable high speed communication device peripheral connectivity
US6924790B1 (en) * 1995-10-16 2005-08-02 Nec Corporation Mode switching for pen-based computer systems
US6928461B2 (en) * 2001-01-24 2005-08-09 Raja Singh Tuli Portable high speed internet access device with encryption
US20050278455A1 (en) * 2004-06-11 2005-12-15 Seiko Epson Corporation Image transfer using drawing command hooking
US7043697B1 (en) * 2000-05-15 2006-05-09 Intel Corporation Virtual display driver
US20060282855A1 (en) * 2005-05-05 2006-12-14 Digital Display Innovations, Llc Multiple remote display system
US20070033653A1 (en) * 2005-08-08 2007-02-08 Klein Edward E System and method for managing sensitive information
US7274368B1 (en) * 2000-07-31 2007-09-25 Silicon Graphics, Inc. System method and computer program product for remote graphics processing
US7404014B2 (en) * 1995-07-05 2008-07-22 Microsoft Corporation Method and system for transmitting and determining the effects of display orders from shared application between a host and shadow computer
US20080222165A9 (en) * 2001-02-01 2008-09-11 Microsoft Corporation Method and system for providing universal remote control of computing devices
US7430681B1 (en) * 2005-03-30 2008-09-30 Teradici Corporation Methods and apparatus for interfacing a drawing memory with a remote display controller
US7694324B2 (en) * 2004-08-13 2010-04-06 Microsoft Corporation Rendering graphics/image data using dynamically generated video streams
US7844442B2 (en) * 2005-08-16 2010-11-30 Exent Technologies, Ltd. System and method for providing a remote user interface for an application executing on a computing device
US20110157196A1 (en) * 2005-08-16 2011-06-30 Exent Technologies, Ltd. Remote gaming features

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5036484A (en) 1988-11-23 1991-07-30 International Business Machines Corporation Personal computer/host emulation system for handling host data with personal computer application programs at personal computers
US5646740A (en) 1995-12-06 1997-07-08 Xerox Corporation Partial or untimed production trees to specify diagnostics operations requiring multiple module cooperation
JP3210603B2 (en) 1997-07-04 2001-09-17 インターナショナル・ビジネス・マシーンズ・コーポレーション Image processing method, server and recording medium
AU2002321997A1 (en) 2001-03-05 2002-12-23 Anysoft Limited Partnership Technique for integrating information from one or more remotely located sources

Patent Citations (43)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4356545A (en) * 1979-08-02 1982-10-26 Data General Corporation Apparatus for monitoring and/or controlling the operations of a computer from a remote location
US5440699A (en) * 1991-06-24 1995-08-08 Compaq Computer Corporation System by which a remote computer receives screen images from and transmits commands to a host computer
US5546538A (en) * 1993-12-14 1996-08-13 Intel Corporation System for processing handwriting written by user of portable computer by server or processing by the computer when the computer no longer communicate with server
US5627977A (en) * 1994-04-19 1997-05-06 Orchid Systems, Inc. Trainable user interface translator
US7404014B2 (en) * 1995-07-05 2008-07-22 Microsoft Corporation Method and system for transmitting and determining the effects of display orders from shared application between a host and shadow computer
US6924790B1 (en) * 1995-10-16 2005-08-02 Nec Corporation Mode switching for pen-based computer systems
US6052120A (en) * 1996-10-01 2000-04-18 Diamond Multimedia Systems, Inc. Method of operating a portable interactive graphics display tablet and communications systems
US6166734A (en) * 1996-10-01 2000-12-26 Diamond Multimedia Systems, Inc. Portable interactive graphics display tablet and communications system
US6084584A (en) * 1996-10-01 2000-07-04 Diamond Multimedia Systems, Inc. Computer system supporting portable interactive graphics display tablet and communications systems
US6243772B1 (en) * 1997-01-31 2001-06-05 Sharewave, Inc. Method and system for coupling a personal computer with an appliance unit via a wireless communication link to provide an output display presentation
US20040172486A1 (en) * 1997-01-31 2004-09-02 Cirrus Logic, Inc. Method and apparatus for incorporating an appliance unit into a computer system
US6219695B1 (en) * 1997-09-16 2001-04-17 Texas Instruments Incorporated Circuits, systems, and methods for communicating computer video output to a remote location
US6085247A (en) * 1998-06-08 2000-07-04 Microsoft Corporation Server operating system for supporting multiple client-server sessions and dynamic reconnection of users to previous sessions using different computers
US6904519B2 (en) * 1998-06-12 2005-06-07 Microsoft Corporation Method and computer program product for offloading processing tasks from software to hardware
US6732067B1 (en) * 1999-05-12 2004-05-04 Unisys Corporation System and adapter card for remote console emulation
US6897833B1 (en) * 1999-09-10 2005-05-24 Hewlett-Packard Development Company, L.P. Portable user interface
US20010009424A1 (en) * 2000-01-24 2001-07-26 Kiyonori Sekiguchi Apparatus and method for remotely operating plurality of information devices connected to a network provided with plug-and-play function
US6874009B1 (en) * 2000-02-16 2005-03-29 Raja Tuli Portable high speed internet device with user fees
US7043697B1 (en) * 2000-05-15 2006-05-09 Intel Corporation Virtual display driver
US20020029285A1 (en) * 2000-05-26 2002-03-07 Henry Collins Adapting graphical data, processing activity to changing network conditions
US7274368B1 (en) * 2000-07-31 2007-09-25 Silicon Graphics, Inc. System method and computer program product for remote graphics processing
US20020045484A1 (en) * 2000-09-18 2002-04-18 Eck Charles P. Video game distribution network
US6915327B1 (en) * 2000-10-30 2005-07-05 Raja Singh Tuli Portable high speed communication device peripheral connectivity
US6928461B2 (en) * 2001-01-24 2005-08-09 Raja Singh Tuli Portable high speed internet access device with encryption
US20080222165A9 (en) * 2001-02-01 2008-09-11 Microsoft Corporation Method and system for providing universal remote control of computing devices
US20020107072A1 (en) * 2001-02-07 2002-08-08 Giobbi John J. Centralized gaming system with modifiable remote display terminals
US20060282514A1 (en) * 2001-11-20 2006-12-14 Ylian Saint-Hilaire Method and architecture to support interaction between a host computer and remote devices
US20030101294A1 (en) * 2001-11-20 2003-05-29 Ylian Saint-Hilaire Method and architecture to support interaction between a host computer and remote devices
US20050104889A1 (en) * 2002-03-01 2005-05-19 Graham Clemie Centralised interactive graphical application server
US20030218632A1 (en) * 2002-05-23 2003-11-27 Tony Altwies Method and architecture of an event transform oriented operating environment for a personal mobile display system
US20030232648A1 (en) * 2002-06-14 2003-12-18 Prindle Joseph Charles Videophone and videoconferencing apparatus and method for a video game console
US20030234809A1 (en) * 2002-06-19 2003-12-25 Parker Kathryn L. Method and system for remotely operating a computer
US20040073908A1 (en) * 2002-10-10 2004-04-15 International Business Machines Corporation Apparatus and method for offloading and sharing CPU and RAM utilization in a network of machines
US20040189677A1 (en) * 2003-03-25 2004-09-30 Nvidia Corporation Remote graphical user interface support using a graphics processing unit
US20050091607A1 (en) * 2003-10-24 2005-04-28 Matsushita Electric Industrial Co., Ltd. Remote operation system, communication apparatus remote control system and document inspection apparatus
US20050278455A1 (en) * 2004-06-11 2005-12-15 Seiko Epson Corporation Image transfer using drawing command hooking
US20110043531A1 (en) * 2004-06-11 2011-02-24 Seiko Epson Corporation Image transfer using drawing command hooking
US7694324B2 (en) * 2004-08-13 2010-04-06 Microsoft Corporation Rendering graphics/image data using dynamically generated video streams
US7430681B1 (en) * 2005-03-30 2008-09-30 Teradici Corporation Methods and apparatus for interfacing a drawing memory with a remote display controller
US20060282855A1 (en) * 2005-05-05 2006-12-14 Digital Display Innovations, Llc Multiple remote display system
US20070033653A1 (en) * 2005-08-08 2007-02-08 Klein Edward E System and method for managing sensitive information
US7844442B2 (en) * 2005-08-16 2010-11-30 Exent Technologies, Ltd. System and method for providing a remote user interface for an application executing on a computing device
US20110157196A1 (en) * 2005-08-16 2011-06-30 Exent Technologies, Ltd. Remote gaming features

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110157196A1 (en) * 2005-08-16 2011-06-30 Exent Technologies, Ltd. Remote gaming features
US20110119346A1 (en) * 2009-11-13 2011-05-19 Samsung Electronics Co., Ltd. Method and apparatus for providing remote user interface services
US11381415B2 (en) 2009-11-13 2022-07-05 Samsung Electronics Co., Ltd. Method and apparatus for providing remote user interface services
US10454701B2 (en) 2009-11-13 2019-10-22 Samsung Electronics Co., Ltd. Method and apparatus for providing remote user interface services
US10951432B2 (en) 2009-11-13 2021-03-16 Samsung Electronics Co., Ltd. Method and apparatus for providing remote user interface services
US20110320953A1 (en) * 2009-12-18 2011-12-29 Nokia Corporation Method and apparatus for projecting a user interface via partition streaming
US20110271195A1 (en) * 2010-04-30 2011-11-03 Nokia Corporation Method and apparatus for allocating content components to different hardware interfaces
US11108848B2 (en) 2010-11-08 2021-08-31 Saturn Licensing Llc Methods and systems for use in providing a remote user interface
US8799357B2 (en) 2010-11-08 2014-08-05 Sony Corporation Methods and systems for use in providing a remote user interface
CN102707928A (en) * 2011-02-25 2012-10-03 奥多比公司 Parallelized definition and display of content in a scripting environment
US20130139103A1 (en) * 2011-11-29 2013-05-30 Citrix Systems, Inc. Integrating Native User Interface Components on a Mobile Device
US9612724B2 (en) * 2011-11-29 2017-04-04 Citrix Systems, Inc. Integrating native user interface components on a mobile device
US20140171190A1 (en) * 2012-12-14 2014-06-19 Nvidia Corporation Implementing a remote gaming server on a desktop computer
US10118095B2 (en) * 2012-12-14 2018-11-06 Nvidia Corporation Implementing a remote gaming server on a desktop computer

Also Published As

Publication number Publication date
US7844442B2 (en) 2010-11-30
WO2007023391A2 (en) 2007-03-01
WO2007023391A3 (en) 2007-10-04
US20070043550A1 (en) 2007-02-22

Similar Documents

Publication Publication Date Title
US7844442B2 (en) System and method for providing a remote user interface for an application executing on a computing device
US11909820B2 (en) Method and apparatus for execution of applications in a cloud system
US7707606B2 (en) Content and application download based on a home network system configuration profile
US8463912B2 (en) Remote displays in mobile communication networks
US8903897B2 (en) System and method for providing interactive content to non-native application environments
US7444438B2 (en) Method and architecture to support interaction between a host computer and remote devices
EP2261809A1 (en) Distributed network game system
WO2017124860A1 (en) Distributed wireless multi-screen virtual machine service system
CN102158553A (en) Method and device for playing multi-media files for remote desktop
US8645559B2 (en) Redirection of multiple remote devices
US20030110217A1 (en) Method and apparatus for a networked projection system
US20010047431A1 (en) HAVi-VHN bridge solution
KR20140106838A (en) Cloud service provide apparatus and method using game flatform based on streaming
US7788392B2 (en) Mechanism for universal media redirection control
JP7193181B2 (en) Distributed system of Android online game application supporting multiple terminals and multiple networks
CN116636223A (en) Meta-universe stream transmission system and method
KR100675130B1 (en) Method for providing contents to set-top box by third party's action and system thereof
WO2001091482A1 (en) Remote displays in mobile communication networks
JP2001236525A (en) Device, method and system for processing information, and recording medium
US20140141875A1 (en) Temporary modification for extending functionality of computer games and software applications.
US10130877B2 (en) Remote gaming and projection
US20060142991A1 (en) Remote USB network device control
CN113949746A (en) Internet of things virtual sensor implementation method and device and intelligent terminal
CN104468692B (en) Communication management device, terminal, system, method, program and information storage medium
TWI823146B (en) Edge side rendering operation method and system for real-time mr interactive application

Legal Events

Date Code Title Description
AS Assignment

Owner name: EXENT TECHNOLOGIES, LTD., ISRAEL

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:TZRUYA, YOAV M.;REEL/FRAME:025007/0744

Effective date: 20060615

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION