US20080184128A1 - Mobile device user interface for remote interaction - Google Patents
- Publication number
- US20080184128A1 (application US 12/020,472)
- Authority
- US
- United States
- Prior art keywords
- webpage
- viewing application
- user
- overlay
- server
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
- H04N21/478—Supplemental services, e.g. displaying phone caller identification, shopping application
- H04N21/4782—Web browsing, e.g. WebTV
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/14—Display of multiple viewports
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/102—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
- H04N19/103—Selection of coding mode or of prediction mode
- H04N19/107—Selection of coding mode or of prediction mode between spatial and temporal predictive coding, e.g. picture refresh
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/134—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
- H04N19/146—Data rate or code amount at the encoder output
- H04N19/15—Data rate or code amount at the encoder output by monitoring actual compressed data size at the memory before deciding storage at the transmission buffer
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/134—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
- H04N19/156—Availability of hardware or computational resources, e.g. encoding based on power-saving criteria
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/134—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
- H04N19/164—Feedback from the receiver or from the transmission channel
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/134—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
- H04N19/167—Position within a video image, e.g. region of interest [ROI]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/169—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
- H04N19/17—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object
- H04N19/172—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object the region being a picture, frame or field
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/50—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
- H04N19/59—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving spatial sub-sampling or interpolation, e.g. alteration of picture size or resolution
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/85—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using pre-processing or post-processing specially adapted for video compression
- H04N19/87—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using pre-processing or post-processing specially adapted for video compression involving scene cut or scene change detection in combination with video compression
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/234—Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs
- H04N21/2343—Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
- H04N21/234336—Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements by media transcoding, e.g. video is transformed into a slideshow of still pictures or audio is converted into text
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/414—Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
- H04N21/41407—Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance embedded in a portable device, e.g. video client on a mobile phone, PDA, laptop
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/81—Monomedia components thereof
- H04N21/816—Monomedia components thereof involving special video data, e.g. 3D video
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/83—Generation or processing of protective or descriptive data associated with content; Content structuring
- H04N21/845—Structuring of content, e.g. decomposing content into time segments
- H04N21/8451—Structuring of content, e.g. decomposing content into time segments using Advanced Video Coding [AVC]
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/04—Changes in size, position or resolution of an image
- G09G2340/0407—Resolution change, inclusive of the use of different resolutions for different screen areas
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/06—Colour space transformation
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/14—Solving problems related to the presentation of information to be displayed
- G09G2340/145—Solving problems related to the presentation of information to be displayed related to small screens
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2370/00—Aspects of data communication
- G09G2370/02—Networking aspects
- G09G2370/022—Centralised management of display operation, e.g. in a server instead of locally
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2370/00—Aspects of data communication
- G09G2370/04—Exchange of auxiliary data, i.e. other than image data, between monitor and graphics controller
- G09G2370/042—Exchange of auxiliary data, i.e. other than image data, between monitor and graphics controller for monitor identification
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2370/00—Aspects of data communication
- G09G2370/10—Use of a protocol of communication by packets in interfaces along the display data pipeline
Definitions
- the present invention relates to a system for displaying a web browsing session.
- the present invention is related to a viewing application for displaying a web browsing session from a separate device.
- Access to applications, including web browsers, is provided in various client-server environments. Placing a web browser on a server for delivery to a client raises many issues, including how to deliver the browsing experience to the client user, for example, handling interactive objects within a web page. For interaction with handheld clients, such as cellular phones, bandwidth and display-size constraints pose additional challenges in delivering a satisfactory web browsing experience from a server, including managing any system latency evident to the end user.
- a system includes a web browsing engine residing on a first device, with a viewing application residing on a second device and operatively coupled to the web browsing engine, where the viewing application is adapted to display a portion of a webpage rendered by the web browsing engine and an overlay graphical component.
- the system also includes a recognition engine adapted to identify an element on the webpage and communicate information regarding the element to the viewing application.
- a system for displaying a web browsing session includes a recognition engine further adapted to communicate information regarding the element to the viewing application responsive to the viewing application passing user input to the recognition engine.
- user input comprises a selection of a location on the webpage displayed on the second device, the location corresponding to the element's location on the webpage.
- a system for displaying a web browsing session includes a recognition engine further adapted to communicate information regarding the element to the viewing application with the portion of the webpage.
- the system further comprises a state manager engine operatively coupled to the viewing application and the web browsing engine, the state manager engine adapted to synchronize user input displayed locally on the viewing application with user input transferred to the web browsing engine.
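The recognition exchange summarized above can be illustrated with a minimal sketch: the viewing application forwards a selected webpage location, and a server-side recognition engine returns information about the element at that location. All class names, method names, and the element representation below are illustrative assumptions, not taken from the patent.

```python
# Hypothetical sketch of the viewing-application/recognition-engine exchange.
from dataclasses import dataclass

@dataclass
class Element:
    kind: str   # e.g. "text_input" or "link" (assumed categories)
    x: int      # top-left corner on the rendered webpage
    y: int
    w: int
    h: int

class RecognitionEngine:
    """Server side: identifies elements on the rendered webpage."""
    def __init__(self, elements):
        self._elements = elements

    def element_at(self, x, y):
        """Return the element whose bounds contain the webpage location, if any."""
        for el in self._elements:
            if el.x <= x < el.x + el.w and el.y <= y < el.y + el.h:
                return el
        return None

class ViewingApplication:
    """Client side: passes a selected location, receives element information."""
    def __init__(self, engine):
        self._engine = engine

    def on_select(self, x, y):
        el = self._engine.element_at(x, y)
        return None if el is None else el.kind

engine = RecognitionEngine([Element("text_input", 100, 40, 200, 24)])
viewer = ViewingApplication(engine)
print(viewer.on_select(150, 50))   # location falls inside the text input
```

In a real deployment the selection would cross the network and the coordinates would first be translated from the client's screen to the webpage, but the request/response shape is the same.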
- FIG. 1 is a block diagram illustrating some aspects of a client-server architecture of the present invention, according to one embodiment.
- FIG. 2 is a block diagram illustrating some aspects of the present invention in connection with a server, according to one embodiment.
- FIG. 3 is a block diagram illustrating some aspects of an architectural overview of the present invention, including a server, an audio server and a client, according to one embodiment.
- FIG. 4 is a block diagram illustrating some aspects of the present invention in connection with a client, according to one embodiment.
- FIG. 5 is a diagram illustrating some aspects of multiple-user software architecture, according to one embodiment.
- FIG. 6 is a flowchart illustrating some supporting aspects of capturing a succession of video frames, according to one embodiment.
- FIG. 7 is a flowchart illustrating some supporting aspects of sending a succession of video frames, according to one embodiment.
- FIG. 8 is a diagram illustrating some aspects of a client-server exchange, according to one embodiment.
- FIG. 9 is a diagram illustrating some aspects of a client-server exchange, including an accompanying exchange within the server, according to one embodiment.
- FIG. 10 is a diagram illustrating some aspects of viewport move operations and related state management, according to one embodiment.
- FIG. 11 is a diagram illustrating some aspects of a client-server exchange with respect to state management, according to one embodiment.
- FIG. 12 is a diagram illustrating some aspects of a client-server exchange, including an accompanying exchange between a server and network storage, according to one embodiment.
- FIG. 13 is a diagram illustrating some aspects of displaying a web browsing session from a server to a cellular phone, according to one embodiment.
- FIG. 14 is a diagram illustrating some aspects of a user interface display, according to one embodiment.
- FIG. 15 is a diagram illustrating some aspects of a user interface display, according to one embodiment.
- FIG. 16 illustrates an example computer system suitable for use in association with a client-server architecture for remote interaction, according to one embodiment.
- embodiments of the invention are described in connection with a server or a mobile client device, such as an example mobile client device.
- Various specific details are set forth herein regarding embodiments with respect to servers and mobile client devices to aid in understanding the present invention. However, such specific details are intended to be illustrative, and are not intended to restrict in any way the scope of the present invention as claimed herein.
- the invention can be used in connection with a wide variety of contexts, including, for example, client devices operating in a wired network.
- embodiments of the invention are described in connection with a web browsing application, but such descriptions are intended to be illustrative and examples, and in no way limit the scope of the invention as claimed.
- Various embodiments of the invention may be used in connection with many different types of programs, including an operating system (OS), a wide variety of applications, including word processing, spreadsheet, presentation, and database applications, and so forth.
- the present invention is implemented at least partially in a conventional server computer system running an OS, such as a Microsoft OS, available from Microsoft Corporation; various versions of Linux; various versions of UNIX; a MacOS, available from Apple Computer Inc.; and/or other operating systems.
- the present invention is implemented in a conventional personal computer system running an OS such as Microsoft Windows Vista or XP (or another Windows version), MacOS X (or another MacOS version), various versions of Linux, various versions of UNIX, or any other OS designed to generally manage operations on a computing device.
- the present invention can be implemented on, or in connection with, devices other than personal computers, such as, for example, personal digital assistants (PDAs), cell phones, computing devices in which one or more computing resources is located remotely and accessed via a network, running on a variety of operating systems.
- the invention may be included as add-on software, or it may be a feature of an application that is bundled with a computer system or sold separately, or it may even be implemented as functionality embedded in hardware.
- Output generated by the invention can be displayed on a screen, transmitted to a remote device, stored in a database or other storage mechanism, printed, or used in any other way.
- the invention makes use of input provided to the computer system via input devices such as a keyboard (screen-based or physical, in a variety of forms), scroll wheels, number pads, stylus-based inputs, a touchscreen or touchpad, etc.
- Such components including their operation and interactions with one another and with a central processing unit of the personal computer, are well known in the art of computer systems and therefore are not depicted here.
- FIG. 1 is a block diagram illustrating some aspects of system 100 of the present invention, according to one embodiment.
- System 100 employs a client-server architecture that includes a number of server application instances running on server 200 , including server application 1 ( 102 ), server application 2 ( 104 ), server application 3 ( 106 ), and a wide-ranging number of additional server applications (represented by ellipsis 108 ), up to server application n ( 110 ).
- the term "server application" is used herein to denote a server-side application, i.e., an application running on one or more servers.
- Server application n ( 110 ) represents the number of server application instances that happen to be running in system 100 at any given point.
- Server 200 also includes user manager module 502 , which serves to manage multiple users among the multiple server application instances 102 - 110 .
- User manager module 502 is described herein in FIG. 5 , and represents one of potential multiple user managers running on server 200 .
- Server 200 is running one instance of an OS underlying server applications 102 - 110 . In another embodiment, server 200 may run multiple instances of an OS, each OS instance including one or more application instances.
- Server 200 also includes provision manager module 1205 , which is described herein in FIG. 12 .
- While FIG. 1 illustrates multiple server applications 102 - 110 , a number of different types of programs may alternatively be used, including, for instance, an OS.
- Server applications 102 - 110 illustrated in FIG. 1 may run on one server 200 or any number of servers, as, for example, in one or more server farm environments.
- Server applications 102 - 110 may each comprise instances of different server applications, or may all comprise an instance of one server application.
- each server application 102 - 110 could comprise a separate instance of a web browsing application.
- server application 1 ( 102 ) includes application 112 , plugin 114 , state manager module 115 , recognition module 117 , audio data generator 116 , audio encoder module 120 , video encoder module 124 , and command process module 126 .
- Video encoder module 124 makes use of feedback parameter 125 .
- Video encoder module 124 is operatively coupled to application 112 , and is adapted to receive a succession of captures ( 122 ) of the user interface (UI) of application 112 for encoding into video frames for transmission via network 128 .
- the succession of captures ( 122 ) of the UI comprise data that is captured and transferred from application 112 to video encoder 124 by a separate module, described and illustrated in FIG. 2 (image management module 216 ).
- State manager module 115 manages state information, as will be described in relation to subsequent Figures.
- Recognition module 117 identifies elements related to output of application 112 , as will be described in relation to subsequent Figures.
- user interface refers to all or a portion of any user interface associated with a wide variety of computer programs.
- the encoding of application UI captures ( 122 ) is not limited to any particular encoding or video compression format, and may include a wide variety of video compression techniques, ranging from the use of a video compression standard, such as H.264, to an entirely customized form of video compression, to a modified version of a video compression standard, and so forth.
- Audio encoder module 120 is operatively coupled to audio data generator 116 of application 112 , and is adapted to transform audio captures 118 (e.g., an audio stream) of audio data generator 116 into an encoded audio stream for transmission via network 128 . Audio captures 118 comprise data being transferred from audio data generator 116 to audio encoder module 120 .
- Audio data generator 116 is operatively coupled to application 112 , and is adapted to generate the audio data accompanying application 112 .
- Plugin 114 is operatively coupled to application 112 and command process module 126 .
- Plugin 114 is adapted to facilitate the interface between application 112 and command process module 126 .
- Server 200 is further described herein in FIG. 2 .
- System 100 includes a number of clients, including client 1 ( 400 ), client 2 ( 132 ), client 3 ( 134 ), and a wide-ranging number of additional clients (represented by ellipsis 136 ), up to client n ( 138 ), with client n ( 138 ) representing the number of clients that happen to be engaged in the system at any given point.
- the different clients comprise different, non-related client devices.
- client 1 ( 400 ) may include audio decoder module 142 , video decoder module 144 , command process module 146 , viewing application 148 , state manager module 149 , and speaker 150 .
- Video decoder module 144 may be adapted to decode the succession of video frames encoded by video encoder module 124 , where the successive video frames have been transmitted across network 128 for reception by client 1 ( 400 ).
- Video decoder module 144 may be operatively coupled to viewing application 148 , and adapted to communicate the decoded video frames to viewing application 148 for display of the video frames on client 1 ( 400 ).
- State manager module 149 manages state information, as will be described in relation to subsequent Figures.
- Client 1 ( 400 ) includes speaker 150 , and audio decoder module 142 is operatively coupled to speaker 150 .
- Audio decoder module 142 is adapted to decode the audio captures encoded by audio encoder module 120 , where the encoded audio has been transmitted across network 128 for reception by client 1 ( 400 ). After decoding the audio stream, audio decoder module 142 may communicate the decoded audio to speaker 150 for audio output from client 1 ( 400 ).
- Viewing application 148 is adapted to receive user input and communicate the user input to command process module 146 .
- Command process module 146 is adapted to communicate the user input back to command process module 126 of application 102 via network 128 .
- Command process module 126 is adapted to communicate the user input to application 112 via plugin 114 .
- Plugin 114 facilitates the remote interactive use of application 112 via the system 100 described in FIG. 1 .
- Plugin 114 may also be an extension.
- application 112 may be customized for use with the client-server architecture of this invention to the extent that a special plugin is not needed.
- neither a plugin nor special application modifications may be needed.
- Command process module 146 is adapted to communicate one or more feedback parameters 125 to command process module 126 .
- Command process module 126 is adapted to communicate the one or more feedback parameters 125 to video encoder module 124 and audio encoder module 120 for their respective encoding of the succession of application UI captures 122 and audio captures 118 .
- the one or more feedback parameters 125 may comprise one or more of a wide range of parameters, including a bandwidth parameter relating to at least a portion of network 128 , a device parameter of client 1 ( 400 ) or a user input for client 1 ( 400 ).
- the one or more feedback parameters 125 may comprise a bandwidth parameter, which may include any estimated or measured bandwidth data point.
- An example bandwidth parameter may include estimated bandwidth based on measurements of certain packets traversing between server 200 and client 1 ( 400 ), (e.g., how much data sent divided by traversal time to obtain a throughput value), or other bandwidth information obtained from, or in conjunction with, network 128 , including from a network protocol.
- the one or more feedback parameters 125 may comprise user input for client 1 ( 400 ), including, for example, a user request for encoding performed in a certain format or manner, with such a request being requested and communicated by viewing application 148 .
- the one or more feedback parameters 125 may comprise a display resolution of client 1 ( 400 ) (e.g., CGA, QVGA, VGA, NTSC, PAL, WVGA, SVGA, XGA, etc.).
- the one or more feedback parameters 125 may comprise other screen parameters (e.g., screen size, refresh capabilities, backlighting capabilities, screen technology, etc.) or other parameters of the client device (e.g., device processor, available memory for use in storing video frames, location if GPS or other location technology-enabled, etc.). None of the example feedback parameters discussed above are meant to exclude their combined use with each other, or other feedback parameters.
- video encoder module 124 may be adapted to at least partially base its video sample rate on the one or more feedback parameters 125 .
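The bandwidth-driven adaptation described above can be sketched in a few lines: estimate throughput as data sent divided by traversal time (as the example bandwidth parameter suggests), then cap the encoder's sample rate at what the estimated link can sustain. This is a simplified illustration under assumed units and a fixed per-frame bit budget, not the patent's implementation.

```python
# Minimal sketch of a feedback loop from bandwidth estimate to video sample rate.

def estimate_bandwidth_bps(bytes_sent: int, traversal_seconds: float) -> float:
    """Throughput estimate: data sent divided by traversal time, in bits per second."""
    return (bytes_sent * 8) / traversal_seconds

def choose_sample_rate(bandwidth_bps: float, bits_per_frame: int,
                       max_fps: int = 30) -> int:
    """Pick a frame rate the estimated bandwidth can carry, between 1 and max_fps."""
    return max(1, min(max_fps, int(bandwidth_bps // bits_per_frame)))

# 250 kB measured over 1 second -> 2 Mbit/s estimated throughput.
bw = estimate_bandwidth_bps(bytes_sent=250_000, traversal_seconds=1.0)
print(choose_sample_rate(bw, bits_per_frame=100_000))  # → 20
```

A production encoder would smooth the estimate over many measurements and also vary quantization, not just frame rate, but the feedback path from measurement to encoder parameter is the point illustrated here.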
- each client may potentially comprise a different type of client device, each with its own one or more feedback parameters.
- Client 1 ( 400 ) is further described herein in FIG. 4 .
- client-server architecture illustrated in FIG. 1 is merely an example, and that the invention may be practiced and implemented using many other architectures and environments.
- FIG. 2 is a block diagram illustrating some aspects of the present invention in connection with server 200 , according to one embodiment.
- Server 200 includes user manager module 502 , provision manager module 1205 , server application 1 ( 102 ), application 112 , plugin 114 , state manager module 115 , recognition module 117 , audio data generator 116 , audio encoder module 120 , image management module 216 , memory 218 , video encoder module 124 (which includes feedback parameter 125 ), command process module 126 , and align module 224 .
- Command process module 126 includes client interpreter sub-module 228 , and plugin 114 includes client implementer sub-module 208 .
- the components illustrated in FIG. 2 with the same numbers as components illustrated in FIG. 1 correspond to those respective components of FIG. 1 .
- server application 102 is illustrated as a representative instance of multiple server applications running on server 200 , each of the multiple server applications being associated with its own distinct client (clients are not shown in this illustration).
- user manager module 502 represents one of potential multiple user managers running on server 200 .
- Image management module 216 serves to capture the UI of application 112 (as the UI would appear on a screen) and save the capture in memory 218 . Any capture process such as screen-scraping may be used, and image management module 216 may perform this capture at any desired rate. Image management module 216 also compares the last prior capture of the application UI to the current capture to determine whether any changes have occurred in a particular area of the application UI. Any image/video frame matching process may be used for this comparison operation. Image management module 216 serves to repetitively perform this function.
- image management module 216 detects any change in the particular area of interest, a delta flag is set to indicate that the area of interest has changed.
- image management module 216 serves to convert the native format of the UI rendered data to a video frame format more suited for compression and transmission to the client device (e.g., color space transformation, data format transformation, etc.).
- Image management module 216 serves to resize the image for the reformatted video frame.
- In one embodiment, multiple parameters of the applicable client device are included in the one or more feedback parameters 125 , allowing image management module 216 to perform the reformatting and resizing based on client device parameters (the relevant parameters having been communicated to image management module 216 ).
- Image management module 216 periodically checks (based on its sample interval) if the delta flag has been set. If the delta flag is detected as set during a check, the reformatted/resized video frame in memory 218 is encoded by video encoder module 124 for transmission to the client device.
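The capture-and-delta behavior described above for image management module 216 can be sketched as follows. This is a minimal illustration under assumed data shapes (opaque values stand in for captured frames), not the implementation disclosed here:

```python
# Sketch of the delta-flag logic described for image management module 216.
class ImageManager:
    def __init__(self):
        self.prev_frame = None   # last prior capture of the application UI
        self.delta_flag = False  # set when the area of interest has changed

    def capture_and_compare(self, frame):
        """Store the new capture and set the delta flag if it differs."""
        if self.prev_frame is not None and frame != self.prev_frame:
            self.delta_flag = True
        self.prev_frame = frame

    def take_if_update_needed(self):
        """Encoder-side periodic check: consume the flag if it was set."""
        if self.delta_flag:
            self.delta_flag = False
            return self.prev_frame  # frame to hand to the video encoder
        return None
```

The encoder-side call corresponds to the periodic delta-flag check performed at the module's sample interval.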
- Client interpreter sub-module 228 of command process module 126 serves to interpret data received from client device 400 and to translate this data for use in connection with video encoder module 124 , audio encoder module 120 and application 112 (e.g., user commands, etc.). Client interpreter sub-module 228 serves to pass the feedback parameters 125 to video encoder module 124 and audio encoder 120 for use in encoding.
- Client interpreter sub-module 228 of command process module 126 serves to translate client-received data for use in connection with plugin 114 and its client implementer sub-module 208 .
- The client device passes coordinates (of a cursor, etc.) relative to the client device's screen to command process module 126 .
- Client interpreter sub-module 228 serves to determine the corresponding location in relation to the viewport of the client device and the application UI.
- Client interpreter sub-module 228 then communicates the translated coordinates to plugin 114 for use by its client implementer sub-module 208 .
- Client implementer sub-module 208 serves to translate from conventional user input to a format appropriate for application 112 , and then to directly inject the translated input into application 112 .
- Align module 224 correlates and cross-stamps video frames encoded by video encoder module 124 and audio encoded by audio encoder module 120 , so that the audio stream and the video frames associated with the UI of application 112 may be readily matched at client device 400 .
- Image management module 216 may also serve to time-stamp all images, and the operation of capturing audio from audio data generator 116 may also serve to timestamp the audio stream, both for down-stream alignment by align module 224 , as would be appreciated by one skilled in the art.
- In one embodiment, all alignment/matching of audio and video frames may be performed at the client device.
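As a minimal sketch of the timestamp matching that the cross-stamping enables (whether performed server-side or at the client device), each video frame can be paired with the audio chunk whose timestamp is nearest. The `(timestamp, payload)` shape is an assumption for illustration:

```python
# Pair each time-stamped video frame with the nearest time-stamped audio chunk.
def match_streams(video, audio):
    """video/audio: non-empty lists of (timestamp_ms, payload) tuples.
    Returns a list of (video_payload, audio_payload) pairs."""
    pairs = []
    for v_ts, v_frame in video:
        # nearest audio chunk by absolute timestamp distance
        a_ts, a_chunk = min(audio, key=lambda a: abs(a[0] - v_ts))
        pairs.append((v_frame, a_chunk))
    return pairs
```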
- FIG. 2 is merely an example, and the invention may be practiced and implemented in many other ways.
- FIG. 3 is a functional block diagram 300 illustrating some aspects of an architectural overview of the present invention, including a server, an audio server and a client, according to one embodiment.
- In this embodiment, audio is sent to the client from dedicated audio server 304 .
- Functional block diagram 300 includes server 302 , audio server 304 and client 306 , with client 306 operatively linked to server 302 and audio server 304 via network 310 (via connections 312 , 314 and 316 ).
- Server 302 is operatively linked to audio server 304 via connection 308 .
- Server 302 includes application 318 , plugin 322 , state manager module 323 , recognition module 327 , audio data generator 320 , video encoder module 324 (including feedback parameter 325 ), command process module 326 , audio interceptor module 330 , PID (process identifier) manager module 332 , and time-stamp manager module 334 .
- Video encoder module 324 operates as described in FIG. 2 , being analogous to video encoder module 124 (and likewise, for feedback parameter 325 with respect to feedback parameter 125 ). Video encoder module 324 operates to encode application UI captures 328 and to communicate the encoded video frames for transmission to client 306 . In the process of obtaining application UI captures, the resulting UI captures are time-stamped. Time-stamp manager module 334 facilitates the time-stamping of the UI captures. Command process module 326 operates as described in FIG. 2 , being analogous to command process module 126 .
- Audio data generator 320 renders an audio stream (not shown) for application 318 .
- Audio interceptor module 330 intercepts or traps this audio stream for redirection to audio server 304 , and may timestamp the audio stream.
- Time-stamp manager module 334 may facilitate the time-stamping of the audio stream.
- Audio interceptor module 330 may make use of a customized DLL to facilitate such a redirection of the audio stream.
- PID manager module 332 serves to detect and manage the different process IDs of the multiple applications running on server 302 .
- PID manager module 332 may stamp each audio stream redirected to audio server 304 with the process ID of its associated application.
- Audio server 304 includes audio stream processing module 336 and PID authentication module 338 .
- Audio stream processing module 336 serves to encode the audio streams received from the applications running on server 302 , and perform any conversion desired (e.g., conversion of sample rates, bit depths, channel counts, buffer size, etc.).
- In one embodiment, User Datagram Protocol (UDP) ports (not shown) are used to direct each audio stream to its destination client device; other protocols may be used in other embodiments.
- Audio stream processing module 336 directs each audio stream to the port associated with the audio stream's corresponding client device (i.e., the client device displaying the video frames corresponding to the audio stream). Audio stream processing module 336 may work in association with PID authentication module 338 to verify and direct the multiple audio streams streaming from server 302 to the appropriate port.
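The PID-to-port routing described for audio stream processing module 336 and PID authentication module 338 can be sketched as below. The registry structure and stream format are assumptions for illustration:

```python
# Sketch: route each PID-stamped audio stream to its client's registered port.
class AudioRouter:
    def __init__(self):
        self.pid_to_port = {}  # application process ID -> client UDP port

    def register(self, pid, port):
        """Associate an application's process ID with a client port."""
        self.pid_to_port[pid] = port

    def route(self, pid, encoded_chunk):
        """Return (port, chunk) if the PID is recognized, else None."""
        port = self.pid_to_port.get(pid)
        if port is None:
            return None  # PID authentication failed: unknown application
        return (port, encoded_chunk)
```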
- Client 306 includes video decoder module 340 , audio decoder module 342 , command process module 344 and audio/video sync module 346 . After client 306 receives and decodes the applicable audio and video streams from server 302 (i.e., the audio and video streams of the application instantiated for client 306 ), audio/video sync module 346 correlates the time-stamps on both streams and works in conjunction with audio decoder module 342 and video decoder module 340 to synchronize output to speaker 348 and viewing application 350 , respectively. Client 306 also includes state manager module 351 to manage state information.
- FIG. 3 is merely an example, and the invention may be practiced and implemented in many other ways.
- FIG. 4 is a block diagram illustrating some aspects of the present invention in connection with a client, according to one embodiment.
- Client 400 includes video decoder module 144 , audio decoder module 142 , audio/video sync module 406 , command process module 146 , speaker 150 , viewing application 148 , state manager module 149 , and connections 410 , 412 and 414 .
- Video decoder module 144 receives encoded video frames via connection 412 , while audio decoder module 142 receives an encoded audio stream via connection 414 .
- Audio/video sync module 406 serves to match time-stamps or another type of identifier on the audio stream and the video frames for synced output via speaker 150 and viewing application 148 , respectively.
- Audio decoder module 142 , video decoder module 144 and viewing application 148 all may serve to provide feedback to command process module 146 , to communicate feedback parameters (not illustrated in FIG. 4 ) back to the server-side application, including parameters to vary the sample rate and/or compression of the video encoding, the audio encoding, etc.
- Command process module 146 serves to pass feedback parameters of client 400 for use in video and/or audio encoding upon initiation of a session or during a session.
- Such feedback parameters may include one or more of the following parameters: display resolution, screen size, processor identification or capabilities, memory capabilities/parameters, speaker capabilities, and so forth.
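For illustration only, one way a client might serialize such feedback parameters at session initiation is sketched below; the field names are assumptions, not part of this disclosure:

```python
import json

# Hypothetical serialization of the feedback parameters listed above.
def build_feedback(display_w, display_h, screen_size_in, cpu, mem_mb, speakers):
    """Return a JSON message describing the client device's capabilities."""
    return json.dumps({
        "display_resolution": [display_w, display_h],
        "screen_size_inches": screen_size_in,
        "processor": cpu,
        "memory_mb": mem_mb,
        "speaker_capable": speakers,
    })
```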
- Viewing application 148 displays the succession of video frames of a portion of the server-side application's UI. Viewing application 148 serves to facilitate communicating user input control, including user commands, to command process module 146 for transmission back to the server. Client user input control passed back to the server may include, for example, input from: a keyboard (screen-based or physical, in a variety of forms), scroll wheels, number pads, stylus-based inputs, a touchscreen or touchpad, etc. Viewing application 148 serves to aggregate certain user input for sending, such as opening up a local text box for text entry. State manager module 149 manages state information, as will be described in relation to subsequent Figures.
- In one embodiment, viewing application 148 comprises an application based on the Java Platform, Micro Edition, and portable to BREW (Binary Runtime Environment for Wireless). In other embodiments, viewing application 148 comprises an application based on the Symbian platform, a Linux-based platform (e.g., Android), a Palm platform, a Pocket PC/Microsoft Smartphone platform, or another platform capable of supporting the functionality described herein. In some embodiments, viewing application 148 runs as a stand-alone Java application, while in other embodiments, viewing application 148 runs as a plugin to a standard mobile browser, as would be appreciated by one skilled in the art. In one embodiment, viewing application 148 includes a number of modules to facilitate implementing the functionality described herein, including modules to facilitate browser navigation, client I/O and viewport tracking (not shown), as would be appreciated by one of skill in the art.
- FIG. 4 is merely an example, and the invention may be practiced and implemented in many other ways.
- FIG. 5 is a diagram 500 illustrating some aspects of a multiple-user software architecture, according to one embodiment.
- User manager module 502 includes worker thread 504 , worker thread 506 , and a variable number of additional worker threads (represented by ellipsis 508 ), with 'worker thread n' being represented by worker thread 510 .
- Worker thread 510 represents the nth worker thread, where n is the total number of worker threads running in the system at a given point. Each worker thread corresponds to a list of active users, and the lists of active users may each comprise a different number of active users. As illustrated in FIG. 5 :
- worker thread 504 corresponds to thread cycle 512
- worker thread 506 corresponds to thread cycle 514
- the variable number of worker threads represented by ellipsis 508 corresponds to the same variable number of thread cycles represented by ellipsis 518
- worker thread 510 corresponds to thread cycle 516 . As further illustrated in FIG. 5 :
- worker thread 504 cycles through user 1 ( 520 ), user 2 ( 522 ), user 3 ( 524 ) and user 4 ( 526 ); worker thread 506 cycles through user 5 ( 528 ), user 6 ( 530 ) and user 7 ( 532 ); and worker thread 510 cycles through user 8 ( 534 ), user 9 ( 536 ), user 10 ( 538 ), user 11 ( 540 ) and user 12 ( 542 ).
- the number of users supported by the worker threads illustrated in FIG. 5 is meant to represent a snapshot at an arbitrary point in time, as the number of users supported by any given thread is dynamic.
- User manager module 502 may be set to instantiate a finite number of worker threads before instantiating additional worker threads to manage further users added to the system.
- the number of worker threads in the overall architecture illustrated by FIG. 5 will vary according to various embodiments.
- the parameters regarding the number of active users assigned per worker thread will also vary according to various embodiments.
- User manager module 502 runs on a server (as illustrated in FIG. 1 ) where multiple instances of applications (as illustrated in FIG. 1 ) are also running. User manager module 502 thus serves to manage multiple users in an environment of multiple application instances. When a new user is introduced into the overall system of FIG. 1 , the new user is assigned to a worker thread ( 504 - 510 ) to facilitate the interaction between a specific client and a specific server-side application.
- FIG. 5 illustrates multiple users being assigned to a single thread
- a single user may be assigned to their own single thread.
- a user may be assigned to either a shared thread or a dedicated thread depending on one or more factors, such as the current loading/usage of the overall system, the user's service policy with the provider of the respective service operating an embodiment of this invention, and so forth.
- User manager module 502 facilitates load balancing of multiple users in a number of ways, as each worker thread cycles through its respective list of active users and processes one active event for each user.
- the active events that may be processed include: (a) send one video frame update to the client or (b) update state information pertaining to the client's viewing application and the server-side application/UI.
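The one-active-event-per-user cycling described above can be sketched as follows, under assumed data structures (a per-user queue of pending events):

```python
# Sketch: a worker thread's single pass over its active user list,
# processing at most one pending event per user per cycle.
def cycle_once(active_users, events):
    """events: dict mapping user -> list of pending events (mutated in place).
    Returns the (user, event) pairs processed this cycle, in order."""
    processed = []
    for user in active_users:
        queue = events.get(user, [])
        if queue:
            processed.append((user, queue.pop(0)))
    return processed
```

Because each user yields at most one event per pass, no single user can monopolize the thread, which matches the inherent load balancing described below.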
- As illustrated, worker thread 504 will time-slice between the respective video encoder/command process modules of users 1 ( 520 ) through 4 ( 526 ) to perform (a) and (b) above, with video encoder module 124 and command process module 126 comprising the video encoder/command process modules of user 1 ( 520 ).
- In one embodiment, a separate thread would be operatively coupled to audio encoder module 120 to continuously send the encoded audio stream to the client, when audio data is present to send.
- In a processing-intensive session (e.g., for high-resolution, high-frame-rate video), more than one active event may be processed per user.
- a variable number of active events may be processed per user, based on a wide range of factors, including the level of underlying application activity for the applicable user, the user's service policy with the provider of the respective service operating an embodiment of this invention, and so forth.
- User manager module 502 may move specific users among different worker threads to load balance among processing-intensive users and processing-light users. For example, if multiple users being serviced by one worker thread are in need of being serviced with high frame rate video, user manager module 502 may move one or more such users to another thread servicing only, or predominantly, processing-light users. Another thread may also be instantiated for this purpose. As one skilled in the art would appreciate, a user may be treated as an object, and moved to another thread as objects are transferred among threads. The timing of such object moves may take place at specific junctures in the display of video frames by a client device's viewing application, in order to minimize disruption of a user's experience.
- FIG. 5 is merely an example, and the invention may be practiced and implemented in many other ways.
- FIG. 6 is a flowchart 600 illustrating some supporting aspects of capturing a succession of video frames, according to one embodiment.
- Operation 602 (Render Application UI to memory) is performed initially, either by a plugin to an application or by an application itself.
- Operation 602 serves to capture the UI of an application as the UI would appear on a screen and save the capture in a memory buffer ( 218 of FIG. 2 ); actual display of the UI on a screen is not required, but may be used.
- Operation 604 (Any delta from prior Application UI capture?) then serves to compare the last prior capture of the application UI to the current capture to determine whether any changes have occurred.
- This delta checking operation may be performed in a wide number of ways, including, for example, hashing pixel blocks of the current UI capture and comparing the hash values to an analogous pixel-hash table generated from the prior UI capture.
- The hash values may then also be available for potential use in any compression method utilized, e.g., matching blocks of successive video frames, matching blocks against a reference frame, etc.
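A minimal sketch of the pixel-block hashing approach mentioned above, using horizontal bands of rows as the "blocks" and strings standing in for pixel data; the band size and hash choice are illustrative assumptions:

```python
import hashlib

def block_hashes(rows, block_h=2):
    """Hash the frame in horizontal bands of block_h rows each."""
    hashes = []
    for i in range(0, len(rows), block_h):
        band = "".join(rows[i:i + block_h]).encode()
        hashes.append(hashlib.md5(band).hexdigest())
    return hashes

def changed_blocks(prev_hashes, curr_hashes):
    """Indices of bands whose hash differs from the prior capture."""
    return [i for i, (p, c) in enumerate(zip(prev_hashes, curr_hashes)) if p != c]
```

The per-band hashes computed here are the values that could be reused by a downstream compression method, as noted above.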
- In another embodiment, the application may notify the server when a change occurs, prior to operation 602 .
- If operation 604 is determined in the negative, operation 606 (Delay), which may be implemented by a timer (not shown), is performed before operation 602 is repeated. If operation 604 is determined in the affirmative, operation 608 (Convert to appropriate format) is then performed. Operation 608 serves to convert the native format of the UI rendered data to another format more suited for compression and transmission over a network for display on a client device (e.g., color space transformation, data format transformation, etc.).
- Operation 610 (Resize) is then performed. Operation 610 serves to resize the native screen size inherent to the UI rendered data to another size more suited for display on a client device (e.g., a cellular phone, a handheld computer, etc.). Operation 610 may make use of one or more feedback parameters (not shown) of the client device communicated to the server-side application and its accompanying video encoder instantiation. Operation 612 (Store in memory for encoder) then follows, storing the converted video frame for use by a video encoder. Operation 614 (Flag video update needed) then follows, setting an indication for use by an operation determining if a video update is needed (see operation 712 (Video update needed?) of FIG. 7 ). Operation 616 (Delay) then follows, and may be implemented by a timer (not shown), before operation 602 is repeatedly performed.
- FIG. 6 is merely an example, and the invention may be practiced and implemented in many other ways.
- FIG. 7 is a flowchart 700 illustrating some supporting aspects of sending a succession of video frames, according to one embodiment.
- 'Update User N' ( 702 ) represents the start of a sequence of steps, with the sequence of steps representing the steps undertaken by the worker threads of FIG. 5 for each user in the applicable worker thread's list of active users.
- A worker thread initially performs operation 704 (Read back channel).
- Operation 706 (Any data to process for user?) then follows, where it is determined whether anything pertinent for User N came in from the network that needs to be processed. Responsive to data pertinent to User N being detected, operation 708 (Process network events) then follows.
- Incoming data pertinent to User N may comprise, for example, information regarding user input to the client device, such as attempting to zoom in on a particular part of the server-side application UI (as shown by a video frame of the server-side application UI displayed on a viewing application running on the client device).
- Operation 708 may include communicating such processed information to its next destination, e.g., if a zoom command had been sent from the client, the zoom command would be appropriately processed and forwarded to the server-side application before the worker thread proceeded to the next applicable operation.
- Operation 710 (Update needed?) is then performed. Operation 710 may depend on a counter (not shown) being set when the last video frame for the applicable user was sent, or, more specifically, when operation 712 (Video update needed?) was last performed. If the counter has not yet reached its endpoint, the worker thread performing the operations will proceed to operation 718 (Increment N) to commence the sequence of steps illustrated in FIG. 7 for the next applicable user.
- The counter controls the frame rate for the succession of video frames being communicated from the server to the client, or, more specifically, the allowable frame rate, as further described below in relation to operation 712 (Video update needed?). For example, for an allowable frame rate of ten frames per second, the counter would be set to count 100 milliseconds (e.g., from 100 milliseconds down to zero, or vice versa).
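The counter's gating of the allowable frame rate can be sketched as a simple elapsed-time check; the function shape is illustrative, not part of the disclosure:

```python
# Sketch: allow the next 'Video update needed?' check only after the
# per-frame interval has elapsed (e.g., 100 ms for 10 frames per second).
def update_allowed(last_check_ms, now_ms, fps=10):
    interval_ms = 1000 // fps  # allowable-frame-rate interval
    return now_ms - last_check_ms >= interval_ms
```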
- If the counter has reached its endpoint, operation 712 (Video update needed?) is performed. Operation 712 may comprise checking a 'video update needed' flag, as was described in relation to FIG. 6 , or some similar operation. Operation 712 serves to determine whether anything has changed in the portion of the server-side application being displayed by the client in the client's viewing application. If operation 712 is determined in the affirmative, operation 714 (Grab needed video info) is then performed. Operation 714 serves to obtain any video frame information needed to update the video frame of an application UI from the last transmitted video frame of the UI, and may make use of a wide range of video frame/compression techniques, including video frame/compression standards, customized methods, and combinations thereof.
- Operation 716 (Send network event) is then performed. Operation 716 serves to send to the client one video frame update, or updated state information pertaining to the client's viewing application and the server-side application/UI. Operation 716 may post the applicable data for User N to the queue for User N's network connection to effectuate this.
- If operation 712 is determined in the negative, operation 716 (Send network event) may still be performed if there is updated state information pertaining to the client's viewing application and the server-side application/UI.
- In this manner, the worker thread servicing multiple users may cycle through and serve all of the users on its active user list without any one user significantly consuming the worker thread's time to the detriment of any other particular user.
- As the worker thread's load across all of its supported users increases, servicing times for all of the worker thread's active users will gradually increase. Load balancing is thus inherently embedded in this manner. Additional load balancing techniques are described in connection with FIG. 5 .
- FIG. 7 is merely an example, and the invention may be practiced and implemented in many other ways.
- FIG. 8 is a diagram illustrating some aspects of client-server exchange 800 , according to one embodiment.
- Client-server exchange 800 depicts a session exchange between server 802 and client device 804 .
- Server 802 may refer to any server-side machine, and may include a number of servers, either located in one facility or geographically dispersed, operating in conjunction to facilitate the operations described in FIG. 8 . These servers may include, for example, authentication servers, database servers, etc.
- Client device 804 initiates client-server exchange 800 with operation 806 , with a user launching a viewing application on client device 804 .
- The viewing application then opens a connection to server 802 over a network via operation 808 .
- Operation 808 makes use of one or more standard Internet protocols, or variations thereof, as would be appreciated by one of skill in the art. Operation 808 serves to pass the user's identity (e.g., by telephone number, carrier account, etc.) and client device's ( 804 ) display resolution and size to server 802 , for use by server 802 in the session.
- Server 802 then performs operation 810 , which launches and provisions a server application instance, with the server application customized based on the user's preferences.
- The user's preferences are fetched from a database (not illustrated) where they have been associated with the user's identity.
- In one embodiment, the server application renders in a virtual frame buffer. In another embodiment, the server application may render to a screen.
- Operation 812 then follows, where audio/video encoder modules, and an accompanying command process module, are launched and provisioned with client device's ( 804 ) display resolution and size for customized encoding for client device 804 .
- The command process and encoder modules may also be provisioned with a level of service associated with the user's identity, providing the particular user with a priority level relative to other users of the system.
- The video encoder may convert and encode video frames of the server application's UI (e.g., converting the video frames rendered by the server application from the server application's native rendering resolution to QVGA resolution, because client device 804 supports QVGA resolution).
- The video encoder module also resizes the server application's native UI size rendering to suitably fit within, or work together with, client device's ( 804 ) screen size.
- The audio encoder module encodes an audio stream output of the server application based on the speaker capabilities of client device 804 (e.g., if client device 804 is known to be a cellular phone, the audio may be encoded such that the encoded quality does not exceed the particular phone's speaker capabilities, or a default level used for cellular phones).
- Arrow 816 illustrates the communication of the encoded audio and video to client device 804 .
- Operation 818 subsequently follows, where the viewing application's decoder modules (audio and video) decode the audio and video frames received.
- The video frames may be displayed in the viewing application on the client, and the audio may be output to client device's ( 804 ) speakers (if audio is present and client device 804 is not on mute), as depicted by operation 820 .
- Operation 822 subsequently follows, depicting an on-going series of interactions (represented by arrows 826 and 828 ) between server 802 and client device 804 , also represented by operation 824 on the client side.
- Operation 822 depicts the server application only sending information to the encoder modules when the UI or audio output change, with the encoders encoding this information for transmittal to client device 804 .
- A video encoder of server 802 thus asynchronously communicates video frames based on changes in the UI. Video is sent as video frames to display UI changes, and not as commands outside of a video frame format.
- Operation 824 depicts the user interacting with the virtual application, with the user's inputs being transformed into parameters and being passed back to the server application. Operation 822 further depicts the server-side command process module translating user input parameters to operate on the server application UI, with the server application UI accordingly changing. Operation 824 completes the cyclical sequence by further depicting the encoded audio and video resulting from user inputs to the virtual application being received, decoded and displayed in the viewing application.
- FIG. 8 is merely an example, and the invention may be practiced and implemented in many other ways.
- FIG. 9 is a diagram illustrating some aspects of client-server exchange 900 , according to one embodiment.
- Client-server exchange 900 depicts a session exchange between server 903 and client 904 , with an accompanying exchange between encoder/command process modules 902 and application 906 (both running on server 903 ) also being illustrated.
- Application 906 includes state management module 907 and client 904 includes state management module 905 , which will be discussed in relation to FIG. 10 .
- Application 906 comprises a web browsing application in this embodiment.
- Encoder/command process modules 902 comprise audio and video encoder modules and a command process module. References to exchanges with encoder/command process modules 902 may only specifically comprise an exchange with one of these modules, as would be appreciated by one skilled in the art.
- In another embodiment, a functional element similarly situated to encoder/command process modules 902 may comprise a video encoder module and a command process module, but not an audio encoder module.
- Server 903 may refer to any server-side machine, and may include a number of servers, either located in one facility or geographically dispersed, operating in conjunction to facilitate the operations described in FIG. 9 . These servers may include, for example, authentication servers, database servers, etc.
- Client 904 initiates client-server exchange 900 with operation 908 , open connection.
- Server 903 responds with operation 910 , connection confirmed.
- Client 904 then sends its capabilities to encoder/command process modules 902 , including screen size and other device parameters, via operation 912 .
- The device parameters may include a wide variety of characteristics, including processor, memory, screen characteristics, etc.
- Client 904 then sends a URL via operation 914 , which may comprise a saved URL (e.g., a homepage) or a URL entered by the user of client 904 .
- Encoder/command process modules 902 in turn communicate the URL to application 906 via operation 916 , and application 906 then loads the URL via operation 918 .
- Application 906 also passes the width (w) and height (h) of the web page associated with the URL to encoder/command process modules 902 via operation 920 .
- Encoder/command process modules 902 then communicate the web page size to client 904 , as well as the viewport visible on the client screen, including parameters characterizing the viewport of the client, e.g., a corner coordinate (x, y) and an associated zoom factor (z), via operation 922 .
- The parameters characterizing the viewport of the client may comprise absolute position information, relative position information, etc.
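For illustration, a hypothetical helper showing how a corner coordinate (x, y) and zoom factor (z), together with the client screen size, might determine the region of the w-by-h webpage that is in view. The convention that a larger zoom factor shows a smaller page region is an assumption:

```python
# Hypothetical viewport-to-page-region mapping (conventions assumed).
def viewport_region(x, y, z, screen_w, screen_h, page_w, page_h):
    """Return (left, top, right, bottom) of the page region in view,
    clamped to the page's width and height."""
    view_w = screen_w / z  # higher zoom -> smaller visible region
    view_h = screen_h / z
    right = min(x + view_w, page_w)
    bottom = min(y + view_h, page_h)
    return (x, y, right, bottom)
```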
- A screen capture of the webpage viewport (the portion of the browser UI that the viewport has been associated with) then takes place via operation 924 , in accordance with a number of techniques known in the art.
- A video frame of the web page visible through the viewport is then communicated to client 904 via operation 926 .
- A subsequent screen capture 930 then takes place after a variable sample interval 928 , with the associated video frame being communicated via operation 932 .
- Arrow symbol 929 , commonly used to indicate a variable element, is illustrated crossing variable sample interval 928 to indicate this novel feature.
- An asynchronous feedback channel provides feedback via operation 934 .
- This feedback may be used to vary the sample interval 928 based on one or more feedback parameters, including client device parameters, user input parameters, and/or estimated bandwidth parameters, such as bandwidth parameters based on measurements of the packets traversing back and forth between server 903 and client 904 .
- The RTCP protocol, or a similar such protocol (standardized or customized), may be used in connection with providing such feedback, as illustrated by operation 936 .
- Ellipsis 938 and cycle 940 illustrate the repetitive nature of the interaction in which server 903 sends video frames to client 904 .
- Sample interval 928 may also be at least partially varied based on the rate of change of the underlying webpage being viewed. For example, if little to no change is detected in the underlying webpage being viewed by client 904 , then the frame sample interval may be adjusted upward. Likewise, for a very dynamic webpage, or content within a webpage, the frame sample interval may be adjusted downward.
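The interval adaptation described above might be sketched as follows; the doubling/halving step and the bounds are illustrative assumptions, not disclosed values:

```python
# Sketch: lengthen the sample interval for a static page, shorten it
# for a dynamic one, within assumed bounds.
def adapt_interval(interval_ms, page_changed, min_ms=50, max_ms=2000):
    if page_changed:
        return max(min_ms, interval_ms // 2)  # dynamic page: sample faster
    return min(max_ms, interval_ms * 2)       # static page: sample slower
```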
- The user of client 904 may move the viewport from which a webpage is being viewed, to view another portion of the webpage, as depicted in operation 942 , with x′ and y′ comprising new parameters of the viewport.
- The new portion of the webpage that matches the new viewport will then be captured via operation 944 , and a video frame of the new viewport will be communicated to client 904 via operation 946 .
- The user of client 904 may again move the viewport, as depicted in operation 948 , with x′′ and y′′ comprising new parameters of the viewport. This time, the new viewport extends beyond what would be displayed on the server browser window, and thus the browser itself must scroll to capture the desired portion of the webpage, as depicted in operation 950 . Having appropriately scrolled, as depicted via operation 952 , a screen capture of the new viewport will then be obtained, as illustrated in operation 954 , with the resulting video frame communicated via operation 956 .
- the user of client 904 may also use a mouse or mouse-equivalent (e.g., finger tap/motion on a touchscreen, multi-directional button, trackpoint, stylus moving a cursor, etc.), as shown via operation 958 , where a mouse down motion is made, with the new coordinates of the mouse being passed as (a, b).
- Client 904 will pass coordinates relative to the client device's screen back to encoder/command process modules 902 in such an operation, with encoder/command process modules 902 determining the corresponding location in relation to the viewport and underlying webpage.
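The coordinate translation performed by encoder/command process modules 902 can be sketched as below. The function and parameter names are hypothetical; the arithmetic simply assumes the viewport is defined by an origin (x, y) on the webpage and a zoom factor, as in FIG. 10.

```python
# Sketch (hypothetical names): map a client tap at screen coordinates
# (a, b) back to a location on the underlying webpage, given the
# viewport's origin (x, y) on the webpage and the current zoom factor.
# zoom > 1 means the webpage appears magnified on the client screen.

def client_to_webpage(a, b, viewport_x, viewport_y, zoom):
    """Translate client-screen coordinates into webpage coordinates."""
    page_x = viewport_x + a / zoom
    page_y = viewport_y + b / zoom
    return (page_x, page_y)
```

The server-side modules would then inject a mouse event at the resulting webpage location, as described next.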
- server 903 is running an underlying Windows OS, permitting the injection of a mouse message with the appropriate location information to the window associated with browser 906 (whether there is an actual screen being used for rendering or not). This is illustrated via operation 960 , and the screen cursor would resultantly move in application 906 , and be communicated back in a video frame to client 904 as described above.
- similar such functions may be used if available, or some analogous other such techniques, as would be appreciated by one skilled in the art.
- mouse-driven event is used broadly herein to include input control events triggered by a wide variety of mouse or mouse-equivalent control inputs on a variety of devices (e.g., finger tap/motion on a touchscreen, multi-directional button, trackpoint, stylus moving a cursor, etc.).
- Other input driven control events may work in the same manner as well.
- FIG. 9 is merely an example, and the invention may be practiced and implemented in many other ways.
- FIG. 10 is a diagram 1000 illustrating some aspects of viewport move operations and related state management, according to one embodiment.
- FIG. 9 is referenced throughout the description of FIG. 10 , as the two figures are related.
- Diagram 1000 includes webpage 1002 , with a width of w′ 1004 , and a height of h′ 1006 .
- Webpage 1002 is illustrated partially rendered in FIG. 10 , with rendered webpage portion 1008 having a width of w 1010 , and a height of h 1012 .
- Webpage portions 1014 , 1020 , and 1026 are illustrated, each having a left corner coordinates of (x, y) 1016 , (x′, y′) 1022 , and (x′′, y′′) 1028 , respectively, and associated zoom factors of 1018 , 1024 , and 1030 , respectively.
- Webpage portion 1026 includes cursor position (a, b) 1032 .
- Webpage portions 1014 , 1020 , and 1026 relate to operations 922 , 942 and 948 , respectively, of FIG. 9 .
- Webpage portion 1014 corresponds to a portion of webpage 1002 sent for remote viewing, which comprises a viewport of client 904 (being indicated by (x, y) 1016 and zoom factor 1018 ).
- State manager module 907 of server 903 , having previously identified webpage portion 1014 as the current state, updates its current state webpage portion to webpage portion 1020 .
- State manager module 905 of client 904 does likewise upon client 904 displaying webpage portion 1020 .
- State manager modules 907 and 905 then identify webpage portion 1014 as the prior state.
- Client 904 may request prior state webpage portion 1014 from server 903 , such as, for example, via a back icon (not shown).
- client 904 may display a locally cached version of the prior state webpage portion while server 903 is in the process of obtaining/sending the current version of the prior state webpage portion.
- Webpage portion 1026 likewise becomes the next current viewport of client 904 per operations 948 - 956 , with state manager modules 907 and 905 likewise updating the current state webpage portion to webpage portion 1026 , with the addition of internal application scrolling operations 950 and 952 because part of webpage portion 1026 is not on rendered webpage portion 1008 .
- the entire applicable webpage is rendered (analogous to h′ 1006 by w′ 1004 of FIG. 10 ), and thus there are no internal scrolling operations to perform.
- Cursor position sub-states are maintained relative to viewport views and each viewport view's corresponding webpage portion.
- For example, webpage portion 1026 comprises the current state and includes cursor position (a, b) 1032 . As cursor position (a, b) is updated, so too is its corresponding sub-state maintained by state manager module 907 .
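The state transitions described for modules 905 and 907 can be sketched as a small data structure. This is an illustrative assumption about how such a module might be organized, not the patent's implementation.

```python
# Sketch (assumption): a minimal state manager that tracks the current
# viewport state, demotes it to the prior state on each viewport move,
# and keeps a cursor-position sub-state attached to the current viewport.

class ViewportState:
    def __init__(self, x, y, zoom):
        self.x, self.y, self.zoom = x, y, zoom
        self.cursor = None  # (a, b) cursor sub-state, if any

class StateManager:
    def __init__(self):
        self.current = None
        self.prior = None

    def move_viewport(self, x, y, zoom):
        # The former current state becomes the prior state.
        self.prior = self.current
        self.current = ViewportState(x, y, zoom)

    def update_cursor(self, a, b):
        # Cursor sub-state is kept relative to the current viewport.
        self.current.cursor = (a, b)
```

Matching instances on client and server would apply the same transitions, keeping the two sides' notions of current and prior state aligned.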
- FIG. 10 is merely an example, and the invention may be practiced and implemented in many other ways.
- FIG. 11 is a diagram illustrating some aspects of a client-server exchange 1100 with respect to state management, according to one embodiment.
- Client-server exchange 1100 depicts a session exchange between client 1106 and server 1102 .
- server 1102 may refer to any server-side machine, and may include a number of servers, either located in one facility or geographically dispersed, operating in conjunction to facilitate the operations described in FIG. 11 . These servers may include authentication servers, database servers, etc.
- Server 1102 includes state manager module 1104 to manage state information, and client 1106 includes state manager module 1108 to manage state information.
- State manager module 1104 and state manager module 1108 operate as described herein; their common name describes their general function and is not meant to imply they are instances of the same module. In another embodiment, these two modules may be instances of the same module. In yet another embodiment, state manager modules 1104 and 1108 may operate as one logical software unit.
- Operation 1110 illustrates state manager module 1104 identifying a portion of a webpage sent for remote viewing, with the portion comprising a viewport view of client 1106 .
- Operation 1114 illustrates state manager module 1104 identifying this webpage portion as the current state webpage portion.
- Operation 1112 illustrates state manager module 1108 identifying a portion of a webpage being displayed, with operation 1116 illustrating state manager module 1108 identifying this webpage portion as the current state webpage portion.
- the webpage portions are defined areas within a webpage, and thus in another embodiment, a current state checker module (not shown) may be used to periodically verify the current states are uniform across server 1102 and client 1106 .
- a common table (not shown) may be shared among state manager modules 1104 and 1108 , where states are defined in terms of location identifiers of a specific portion of a specific webpage.
- Operation 1118 illustrates a user of client 1106 selecting a second portion of the webpage.
- the user can make such a selection in a wide variety of ways, including moving a navigation toggle or a scroll wheel, using a touch screen or keypad, etc.
- the webpage portions further comprise an area of the webpage surrounding the viewport of client 1106 .
- the second portion of the webpage may already reside on client 1106 .
- Operation 1120 illustrates checking if the second portion is already in local memory.
- the webpage portions match the viewport of client 1106 , and a local memory check, like operation 1120 , may check if the desired webpage portion had been previously loaded and was still resident in local memory. Determining whether an object of interest is in local memory can be performed in a number of ways, as would be appreciated by one of skill in the art.
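One of the many ways such a local memory check could be performed is a small keyed cache, sketched below. The key shape, capacity, and eviction policy are illustrative assumptions.

```python
# Sketch (hypothetical cache): before requesting a webpage portion from
# the server (operation 1122), the client checks whether the portion is
# already resident in local memory (operation 1120). Portions are keyed
# by the parameters that define them.

class PortionCache:
    def __init__(self, capacity=8):
        self.capacity = capacity
        self._cache = {}  # (url, x, y, zoom) -> video frame bytes

    def get(self, url, x, y, zoom):
        """Return the cached frame, or None if not resident (operation 1120)."""
        return self._cache.get((url, x, y, zoom))

    def put(self, url, x, y, zoom, frame):
        if len(self._cache) >= self.capacity:
            # Evict the oldest entry (dicts preserve insertion order).
            self._cache.pop(next(iter(self._cache)))
        self._cache[(url, x, y, zoom)] = frame
```

A miss would fall through to communicating the portion parameters to the server, as in operation 1122; a hit could be displayed immediately, possibly as an intermediate step while a fresher copy is fetched.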
- operation 1120 was determined in the negative, and parameters for the second portion of the webpage are thus communicated via operation 1122 .
- FIGS. 9 and 10 provide more detail regarding passing parameters.
- a second portion of a webpage is identified by server 1102 and sent to client 1106 , as illustrated by operations 1124 and 1126 .
- State manager modules 1104 and 1108 accordingly identify the second portion as the current state portion via operations 1128 and 1130 .
- State manager modules 1104 and 1108 then identify the former current state webpage portion as the prior state in operations 1132 and 1134 .
- operation 1120 may also be determined in the positive (not shown).
- the second webpage portion may not be requested by, and provided to, client 1106 .
- Client 1106 may simply display the second webpage portion and relay parameters regarding the second webpage portion in order for state manager module 1104 of server 1102 to update its state information. Displaying the second webpage portion may also comprise an intermediate step before obtaining a more current version of the second webpage portion from server 1102 .
- This plurality of states may comprise an ordered succession of states, as may be recorded in a browsing session using an embodiment of the invention.
- This ordered succession of states may be made available for toggling through the website portion views of a user session, saved as a series of bookmarks, etc.
- a feature of managing state information as described herein includes enabling the sharing of a webpage portion with another user, in accordance with one embodiment.
- a webpage portion may be included in an identifier that can be placed in a message for sending, with the identifier adapted to facilitate transport to the applicable webpage portion.
- the message could be an email message, an instant-messaging message, a messaging feature used in a social network, etc.
- the identifier may route the recipient of the message into their system account (an embodiment client-server system account), where the webpage portion included in the identifier can be accessed like a prior webpage portion state of the user.
- a database may be used in conjunction with such identifiers.
- other ways may be used to augment a webpage portion identifier so that it may be transferred among different users, both within and outside of embodiment client-server system accounts.
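One concrete way to build such a transferable identifier is to serialize the webpage-portion state into a compact, message-safe token, sketched below. The encoding, field names, and functions are assumptions for illustration; the patent does not specify a format.

```python
# Sketch (assumption): encode a webpage-portion state as a compact
# identifier suitable for embedding in an email or instant message.
# The recipient's client resolves the identifier back into a URL plus
# viewport parameters, to be accessed like a prior-state webpage portion.

import base64
import json

def make_portion_identifier(url, x, y, zoom):
    state = {"url": url, "x": x, "y": y, "zoom": zoom}
    raw = json.dumps(state, sort_keys=True).encode("utf-8")
    return base64.urlsafe_b64encode(raw).decode("ascii")

def resolve_portion_identifier(identifier):
    raw = base64.urlsafe_b64decode(identifier.encode("ascii"))
    return json.loads(raw)
```

A database-backed variant could instead store the state server-side and place only a short lookup key in the message, which also allows access control through the recipient's system account.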
- FIG. 11 is merely an example, and the invention may be practiced and implemented in many other ways.
- FIG. 12 is a diagram illustrating some aspects of a client-server exchange, including an accompanying exchange between a server and network storage, according to one embodiment.
- Client-server exchange 1200 depicts a session exchange between client 1202 and server 1204 , with an accompanying exchange between server 1204 and network storage 1206 .
- server 1204 may refer to any server-side machine, and may include a number of servers, either located in one facility or geographically dispersed, operating in conjunction to facilitate the operations described in FIG. 12 .
- Network storage 1206 may refer to any one or more storage units, operating in combination or any other way, that are networked to server 1204 .
- Server 1204 includes provision manager module 1205 and user manager module 502 (of FIG. 5 ).
- Operations 1208 , 1210 and 1212 serve to initiate a session.
- Operation 1214 launches a unique browser instance for client 1202 .
- Provision manager module 1205 uses the user identifier passed in operation 1210 (e.g., a telephone number, account number, etc.), or another identifier associated with the user identifier, to fetch browser state information for the user from network storage 1206 via operations 1216 and 1218 .
- Provision manager module 1205 likewise fetches a customer profile for the user from network storage 1206 via operations 1216 and 1218 .
- the customer profile and the browser state information may reside in different, unrelated portions of network storage 1206 .
- Browser state information for the user may include any type of user information associated with a browser, including bookmarks, cookies, caches, etc.
- provision manager module 1205 provisions the unique browser instance launched for the user as would be appreciated by one skilled in the art (e.g., by using automated directory copies, automated provisioning techniques, etc.).
- provision manager module 1205 works in conjunction with user manager module 502 (described in relation to FIG. 5 ) to provision resources based on the user's customer profile.
- the user's customer profile could include the user's past usage history (e.g., categorized as bandwidth low, medium, or heavy), encoding usage (also possibly categorized), etc.
- Provision manager module 1205 and user manager module 502 together operate to provide different levels of service to a user, based on a level of service provider plan, etc.
- a wide range of parameters may be configured, depending on the particular embodiment, including peak/average bandwidth Quality of Service (QoS), video compression settings for motion video (i.e., quality), video frame rate, video image size limits and refitting, server memory, CPU and/or disk usage.
- the parameters configured could be set statically per user/profile, or the parameters could dynamically change, based on a number of factors, including the type of URL being used. In this way, the service provider could provide higher quality video for low-bandwidth pages (e.g., news websites, etc.) and place more restrictions on bandwidth-intensive websites. As described in relation to FIG. 5 , provisioning a user among worker threads may also be used to implement tiered levels of service.
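The static-tier-plus-dynamic-adjustment scheme above can be sketched as follows. The tier names, parameter values, and URL markers are invented for illustration; an actual deployment would draw them from the user's customer profile.

```python
# Sketch (hypothetical tiers and values): select per-session encoding
# parameters from a service tier, then tighten them dynamically for
# bandwidth-intensive URLs, per the provisioning discussion.

TIERS = {
    "basic":   {"peak_kbps": 256,  "frame_rate": 5,  "quality": 40},
    "premium": {"peak_kbps": 1024, "frame_rate": 15, "quality": 70},
}

# Markers that flag a bandwidth-intensive website (illustrative only).
BANDWIDTH_INTENSIVE = ("video", "stream")

def provision(tier, url):
    params = dict(TIERS[tier])
    if any(marker in url for marker in BANDWIDTH_INTENSIVE):
        # Restrict heavy sites: halve the frame rate, lower the quality.
        params["frame_rate"] = max(1, params["frame_rate"] // 2)
        params["quality"] = max(20, params["quality"] - 20)
    return params
```

Low-bandwidth pages (e.g., news websites) thus keep the full tier settings, while bandwidth-intensive websites are throttled within the same plan.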
- Arrows 1224 and 1226 depict a web browsing session.
- the browser state information will be updated based on the user's use during the session (e.g., the addition of cookies based on websites visited, modifications to bookmarks during the sessions, etc.).
- the user's customer profile will also be tracked for updating the user's customer profile on terminating the session.
- user manager module 502 may perform a state copy of a user session to transfer a user to another thread, either on the same server or a different server.
- User manager module 502 may operate in conjunction with provision manager module 1205 to facilitate such a move. While moving a user from one thread to another is discussed in relation to FIG. 5 , provision manager module 1205 may also facilitate such a state copy by updating state information during a user session.
- User manager module 502 may then pause a browsing session, while provision manager module 1205 provisions another browser instance with the updated state information at the desired destination.
- User manager module 502 may then move the user session to the desired destination by performing a state copy (which may include a directory copy, etc.).
- the applicable user web browsing session may then resume. Such transfers may be timed to take place at specific junctures in the display of video frames by a client device's viewing application, in order to minimize disruption of a user's experience, such as when a user makes a request for a URL.
- the user may close their virtual browser, as shown in operation 1232 , and accompanying operation 1234 .
- the user's browser end state will also be captured by provision manager module 1205 , as shown via operation 1236 , via a directory save, etc.
- the user's customer profile will be updated based on the usage information captured, as shown in operation 1238 . For example, the user may visit mostly text-based websites during a session, and have their average bandwidth and encoding usage figures lowered by such a resource-light session.
- Provision manager module 1205 will update the user's state information and customer profile based on the session, saving the updated versions in network storage 1206 , as server 1204 closes the unique server browser instantiated at the beginning of the session.
- FIG. 12 is merely an example, and the invention may be practiced and implemented in many other ways.
- FIG. 13 is a diagram illustrating some aspects of displaying a web browsing session from a server to a cellular phone, according to one embodiment.
- Server 1302 is illustrated, upon which a web browsing engine (not shown) executes, rendering webpage 1304 .
- Webpage 1304 is rendered into a buffer on server 1302 in the embodiment discussed in relation to FIG. 13 , but webpage 1304 may be also rendered to a screen in another embodiment.
- a recognition engine (not shown) also runs on server 1302 , which identifies elements of interest on webpage 1304 , such as hyperlink 1306 .
- The recognition engine may use any number of techniques known to one skilled in the art to identify elements of interest with respect to webpage 1304 , including parsing through the HTML/XHTML/Document Object Model data to obtain access to the rendered webpage data/contents of webpage 1304 (including location information of the contents of webpage 1304 ).
- other techniques for parsing webpage contents may be used, including a tabbing function to cycle through components of a webpage.
- optical character recognition techniques known to those skilled in the art may additionally or alternatively be used.
- the information regarding the elements of interest may be communicated along with webpage 1304 , such as in the form of a polygon map.
- a viewing application (not shown) running on cellular phone 1310 may then match such a polygon map with the video of webpage 1304 being displayed.
- the information regarding the elements of interest may be communicated from server 1302 to cellular phone 1310 in combination with the video frames in a wide variety of ways, as would be appreciated by one of skill in the art regarding sending two data sets to be matched at the destination.
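The client-side matching of the element map against a user selection can be sketched as a hit test. For simplicity this sketch uses axis-aligned rectangles rather than general polygons; the map structure and function name are assumptions.

```python
# Sketch (assumption): the element-of-interest map is represented here
# as axis-aligned rectangles rather than general polygons. The viewing
# application hit-tests a user selection against the map to learn which
# webpage element (hyperlink, textbox, etc.) lies under the tap.

def hit_test(element_map, px, py):
    """element_map: list of (kind, x, y, width, height) tuples.

    Returns the kind of the first element containing (px, py), or None.
    """
    for kind, x, y, w, h in element_map:
        if x <= px < x + w and y <= py < y + h:
            return kind
    return None
```

A hit on a "hyperlink" entry, for instance, could drive the cursor-to-hand-icon transformation described for FIG. 13.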
- the information regarding the elements of interest may be communicated in response to the viewing application passing user input back to the recognition engine on server 1302 .
- a user may select a location on the image being displayed on cellular phone 1310 corresponding to a textbox (not shown) on webpage 1304 , for the user to input text into the text box.
- the location of the user selection may be passed back to server 1302 .
- The recognition engine on server 1302 may then communicate to the viewing application that the location selected by the user corresponds to a textbox.
- the viewing application may then locally render a generic textbox for the input of text by the user, sending the text entered by the user back to the recognition engine once the user has completed the entry.
- the recognition engine may send information regarding the textbox to the viewing application for the viewing application to customize the locally-rendered textbox, including, for example, information regarding acceptable string length, acceptable character entries, string-formatting, etc.
- This information may be used by the viewing application to locally police acceptable input for a given textbox, as well as for formatting the display of input (e.g., causing password textbox entries to appear as “*******” instead of cleartext).
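The local policing and display formatting described above can be sketched as follows. The metadata field names (`max_length`, `allowed`, `password`) are hypothetical stand-ins for whatever the recognition engine actually sends.

```python
# Sketch (hypothetical metadata fields): locally police textbox input
# against metadata sent by the recognition engine, and mask the display
# for password fields, before any round trip to the backend server.

def validate_entry(text, meta):
    """meta: dict with optional 'max_length' and 'allowed' (set of chars)."""
    if len(text) > meta.get("max_length", 255):
        return False
    allowed = meta.get("allowed")
    if allowed is not None and any(ch not in allowed for ch in text):
        return False
    return True

def display_text(text, meta):
    # Password textbox entries appear as "*", not cleartext.
    return "*" * len(text) if meta.get("password") else text
```

Rejecting bad input locally avoids a server round trip for entries that the underlying webpage would refuse anyway.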
- a textbox, radio button, etc. will already contain text (e.g., have an initial state with initial text), be selected, etc., and this information will be sent to the client to appear in the locally rendered textbox, radio button, etc.
- a state manager on the backend server will also track the state of interactive objects on a webpage, with the state manager serving to synchronize client user input with the web browsing engine on the backend server.
- the state manager may also serve to react to events on the server that occur while a client control is activated (such as a control losing focus suddenly), and may alert a user in some embodiments if a synchronization was unsuccessful, including, for example, providing a reentry prompt.
- Such state may include, e.g., the current selection(s) in a list box, or the current state of a radio button.
- Cellular phone 1310 is illustrated in FIG. 13 as the destination of webpage 1304 (labeled webpage portion 1308 with respect to a portion being displayed on cellular phone 1310 ), after the webpage has been transformed into video frames as discussed in regard to other Figures herein.
- the viewing application executing on cellular phone 1310 displays webpage portion 1308 along with address bar 1312 (displaying URL 1340 ), toolbar 1318 , and an overlay graphical component, cursor 1314 , with address bar 1312 , toolbar 1318 and cursor 1314 being rendered locally on cellular phone 1310 .
- Address bar 1312 and toolbar 1318 may appear in a number of locations together with a portion of webpage 1304 , according to a variety of embodiments.
- address bar 1312 and toolbar 1318 may be user-configurable to appear in other display locations, while in some embodiments, address bar 1312 and toolbar 1318 may automatically adjust their locations based on positioning of cellular phone 1310 for cellular phones with motion detection ability.
- Hyperlink 1316 (analogous to hyperlink 1306 rendered on server 1302 ) is displayed in the video image displayed on cellular phone 1310 .
- the viewing application may be adapted to transform cursor icon 1314 into hand icon 1315 (or another icon to denote an underlying hyperlink) while cursor icon 1314 overlays hyperlink 1316 .
- Address bar 1312 , comprising a roll-out address bar in the embodiment illustrated in FIG. 13 , may include any number of functional user interface elements.
- Address bar 1312 rolls out when “A” icon 1320 is tapped, and rolls back into main toolbar 1318 when “A” icon 1320 is tapped again while address bar 1312 is rolled out, or upon inactivity for a defined time period.
- Address bar 1312 displays the current URL and may also wrap around to another line when a user is entering a long URL.
- Home icon 1334 allows for a user to set a home webpage by a number of means, either through cellular phone 1310 or through an import function through a specialized portal, as would be appreciated by one of skill in the art.
- Favorites icon 1336 allows for a user to save URLs for easy reference, and may also be set up through cellular phone 1310 or through an import function through a specialized portal, as would be appreciated by one of skill in the art.
- “V” icon 1338 allows for a voice/speech URL entry. In some embodiments, viewing application may interface with speech recognition capabilities of cellular phone 1310 for such voice/speech URL entries.
- Back and Forward icons 1322 allow for a user to scroll through webpages for previously entered URLs. In some embodiments, Back and Forward icons 1322 may allow for a user to scroll through webpage portion views. In some embodiments, such icons may be configurable to either scroll through prior webpages, or through prior webpage portion views, based on user selection of the two options.
- “X” stop icon 1326 allows for stopping the loading of a portion of a webpage, and thus appears while a portion of a webpage is being loaded. In some embodiments, “X” stop icon 1326 may display during the loading of, and permit cessation of the loading of, an entire webpage.
- Once loading is complete, “X” stop icon 1326 is replaced by “R” refresh icon 1324 , to allow for a reloading of the portion of the webpage currently being viewed, or the reloading of an entire webpage, depending on the embodiment.
- Zoom in/out icons 1328 allow for zoom functionality and control. Zoom in/out icons 1328 may allow for multiple functionality based on the duration of selecting the icon. For example, for zooming out, each tap may incrementally zoom out, while a long press may zoom out to a full page view. Scroll icons 1330 and 1332 allow for scrolling in the client window.
- the icons discussed herein may be touch-based for cellular phones with touch screen functionality, may include a menu-based user interface (softkeys), and/or may be selectable via scroll/selection buttons or any other input features of cellular phones.
- A specialized touch pattern (e.g., a drag, etc.) may invoke right-button mouse options, which may be locally rendered by the viewing application.
- web browsing engine on server 1302 may make use of small screen rendering techniques to render all, or part of, webpage 1304 to fit within the width of the screen of cellular phone 1310 .
- the viewing application on cellular phone 1310 may allow for user selection of a small screen rendering view, allowing the user to switch between multiple view formats known in the art.
- Intra-web page views displayed on cellular phone 1310 may be tracked in some embodiments, allowing the center of a view to be maintained across multiple view formats.
- FIG. 13 is merely an example, and the invention may be practiced and implemented in many other ways.
- FIG. 14 is a diagram illustrating some aspects of a user interface display, according to one embodiment.
- Webpage portion 1402 illustrates a collection of textboxes of checkout section 1401 , specifically shipping information section 1403 .
- Webpage portion 1402 appears as if displayed on a mobile device (not shown), such as a cellular phone, with toolbar 1439 appearing on the bottom of webpage portion 1402 .
- toolbar 1439 may appear elsewhere or may roll out of view, either by default or by user configuration.
- Shipping information section 1403 includes checkbox 1415 for selecting the ‘use billing address’ option (illustrated as unchecked), textbox 1416 for first name information 1404 , textbox 1418 for last name information 1406 , textbox 1420 for address information 1408 , textbox 1422 for city information 1410 , dropdown menu 1428 for state information 1412 , textbox 1424 for zip-code information 1425 , textbox 1426 for email information 1414 , and radio buttons 1434 and 1438 , indicating “yes” 1432 or “no” 1436 responses to question 1430 .
- information regarding the interactive elements was communicated along with webpage portion 1402 to the mobile device displaying webpage portion 1402 .
- information regarding an interactive element may be communicated to the viewing application once the user has selected a location on the webpage being displayed corresponding to the interactive element.
- Webpage portion 1440 will be used to illustrate further operations with respect to webpage portion 1402 .
- When a user selects checkbox 1415 , a check appearing in checkbox 1415 would be rendered on the backend server, in accordance with one embodiment.
- the viewing application may locally render checkbox 1415 and the check, or just the check, depending on the embodiment.
- When a user selects textbox 1416 to enter their first name, the viewing application displaying the portion of webpage 1402 locally renders textbox outline 1442 and rolls out overlay textbox 1444 (the rolling feature indicated by arrow 1446 for illustration purposes). As the user inputs their name, the letters entered by the user display in overlay textbox 1444 .
- Once the user has completed the entry, the viewing application will communicate the information to a state manager on the backend server, with the state manager serving to synchronize client user input with the web browsing engine on the backend server.
- an overlay textbox for character entry may be locally rendered directly where textbox 1416 is shown.
- toolbar 1448 may transform to an overlay textbox for character entry.
- the viewing application received information regarding dropdown menu 1428 from the recognition engine residing on the backend server, including the available menu options (e.g., items listed, the initial selection, or state, of the drop-down menu), together with the information to display webpage 1402 .
- The viewing application then locally renders dropdown menu 1428 , and will locally render an expanded menu upon the user selecting dropdown menu 1428 .
- viewing application will communicate the information to a state manager on the backend server, with the state manager serving to synchronize client user input with the web browsing engine on the backend server.
- other implementations may be used to locally render the selections available from a dropdown menu.
- the viewing application received information regarding radio buttons 1434 and 1438 from the recognition engine residing on the backend server, together with the information to display webpage 1402 .
- The viewing application then locally renders radio buttons 1434 and 1438 , and locally renders radio button 1434 as selected upon the user tapping radio button 1434 ( FIG. 14 illustrating an embodiment where the mobile device includes a touch screen).
- viewing application will communicate the information to a state manager on the backend server, with the state manager serving to synchronize client user input with the web browsing engine on the backend server.
- other implementations may remotely render selection of a radio button.
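The synchronization step that recurs in the textbox, dropdown, and radio-button examples can be sketched as a simple state-update message from the viewing application to the backend state manager. The message format and function names are assumptions; the patent leaves the wire format unspecified.

```python
# Sketch (hypothetical message format): when a locally rendered control
# changes, the viewing application sends a state-update message that the
# backend state manager applies to the corresponding webpage element,
# keeping client user input synchronized with the web browsing engine.

import json

def control_update(element_id, kind, value):
    """Serialize one control change on the client side."""
    return json.dumps(
        {"element": element_id, "kind": kind, "value": value},
        sort_keys=True,
    )

def apply_update(server_state, message):
    """Apply a client control update to the server's element-state table."""
    update = json.loads(message)
    server_state[update["element"]] = update["value"]
    return server_state
```

The state manager could also push updates the other way, e.g., when a server-side event changes a control while the client has it activated.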
- FIG. 14 is merely an example, and the invention may be practiced and implemented in many other ways.
- one or more of the above interactive elements of a webpage may be rendered completely on the backend server.
- a state manager module within the viewing application may facilitate collecting user input and synchronizing such user input with user input provided to the backend web browsing engine.
- FIG. 15 is a diagram illustrating some aspects of a user interface display, according to one embodiment.
- Webpage portion 1502 appears as if displayed on a mobile device (not shown), such as a cellular phone, with toolbar 1504 appearing on the side of webpage portion 1502 . In other embodiments, toolbar 1504 may appear elsewhere or may roll out of view, either by default or by user configuration.
- Webpage portion 1502 illustrates news section 1506 , including headline 1508 , text section 1510 and webpage video 1512 .
- information regarding the interactive element of webpage video 1512 was communicated along with webpage portion 1502 to the mobile device displaying webpage portion 1502 .
- information regarding webpage video 1512 may be communicated to the viewing application once the user has selected a location on the image displayed corresponding to webpage video 1512 .
- Full screen video view 1518 will be used to illustrate an operation with respect to webpage portion 1502 .
- the viewing application locally rendering cursor icon 1514 will transform cursor icon 1514 to hand icon 1516 based on the information communicated from the backend server to the viewing application.
- the viewing application may not make such a transformation.
- a play icon may be displayed on webpage video 1512 , either placed there by the viewing application or as displayed on the underlying webpage.
- Full screen video view 1518 includes a video-control user interface 1522 , the length of which corresponds to the length of webpage video 1520 , viewed-portion segment 1526 , and pause/play icon 1524 , which may be alternatively used for pausing or playing webpage video 1520 , depending on the current play state of webpage video 1520 .
- the viewing application may not have a specialized video-control user interface for playing webpage video 1520 .
- the viewing application may have a different specialized video-control user interface for playing webpage video 1520 .
- the viewing application may play video 1512 as shown in webpage portion 1502 , and not switch to full screen video view 1518 . In some embodiments, whether video 1512 switches to full screen video view 1518 may be user-configurable.
- FIG. 15 is merely an example, and the invention may be practiced and implemented in many other ways.
- FIG. 16 illustrates an example computer system suitable for use in association with a client-server architecture for remote interaction, according to one embodiment.
- computer system 1600 may represent either a computer operating as a server, or a computer operating as a client, with the general components illustrated in FIG. 16 potentially varying with each respective representation, as would be appreciated by one of skill in the art.
- Computer system 1600 may include one or more processors 1602 and may include system memory 1604 .
- Computer system 1600 may include storage 1606 in the form of one or more devices (such as a hard drive, an optical or another type of disk, electronic memory, including flash memory, and so forth), input/output devices 1608 (such as a keyboard (screen-based or physical, in a variety of forms), scroll wheels, number pads, stylus-based inputs, a touchscreen or touchpad, etc.), and communication interfaces 1610 (to connect to a LAN, a WAN, a wired or wireless network, and so forth).
- The elements may be coupled to each other via system bus 1612, which may represent one or more buses. In the case where system bus 1612 represents multiple buses, the multiple buses may be bridged by one or more bus bridges (not shown).
- Processor(s) 1602 may comprise a controller, and system memory 1604 and storage 1606 may comprise one cohesive memory component.
- Computing system 1600 may be at least partially incorporated in a larger computing system.
- System memory 1604 and storage 1606 may be employed to store a working copy and a permanent copy of the programming instructions implementing various aspects of the one or more earlier described embodiments of the present invention.
- Any software portions described herein need not include discrete software modules. Any software configuration described above is meant only by way of example; other configurations are contemplated by and within the scope of various embodiments of the present invention.
- The term, engine, is used herein to denote any software or hardware configuration, or combination thereof, that performs the function or functions referenced.
- The term, web browsing engine, is used herein to describe any software or hardware configuration, or combination thereof, that performs a web browsing function.
- Modules have been described to implement various functions. Part or all of the modules may be implemented in hardware, for example, using one or more Application Specific Integrated Circuits (ASICs) instead.
- Certain aspects of the present invention include process steps and instructions described herein in the form of an algorithm. It should be noted that the process steps and instructions of the present invention could be embodied in software, firmware or hardware, and when embodied in software, could be downloaded to reside on and be operated from different platforms used by real time network operating systems.
- The present invention also relates to an apparatus for performing the operations herein.
- This apparatus may be specially constructed for the required purposes, or it may include a computer (including any type of computer, depending on various embodiments, including a server, personal computer, tablet device, handheld computer, PDA, cellular phone, etc.) selectively activated or reconfigured by a computer program stored on a computer readable medium that can be accessed by the computer.
- Such a computer program may be stored in a computer readable storage medium, such as, but not limited to, any type of disk, including floppy disks, optical disks, CD-ROMs, magnetic-optical disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, application specific integrated circuits (ASICs), or any type of media suitable for storing electronic instructions, each coupled to a computer system bus.
- The computers referred to in the specification may include a single processor or may be architectures employing multiple processor designs, including multi-core designs, for increased computing capability.
- The present invention is well suited to a wide variety of computer network systems over numerous topologies.
- The configuration and management of large networks include storage devices and computers that are communicatively coupled to dissimilar computers and storage devices over a network, such as the Internet.
Description
- The present application claims priority under 35 U.S.C. § 119(e) to U.S. Provisional Patent Application No. 60/886,577, filed on Jan. 25, 2007, entitled “SYSTEM FOR WEB BROWSING VIDEO,” which is incorporated by reference in its entirety.
- The present invention relates to a system for displaying a web browsing session. In particular, the present invention is related to a viewing application for displaying a web browsing session from a separate device.
- Access to applications, including web browsers, is provided for in various client-server environments. Placing a web browser on a server for delivery to a client presents a large number of issues, including issues with the delivery of the browsing experience to the client user, such as handling interactive objects within a web page. For interaction with handheld clients, such as cellular phones, bandwidth and display size constraints pose additional challenges in delivering a satisfactory web browsing experience from a server, including dealing with any latency within the system evident to the end user.
- There exists a need to support full-featured web browsing sessions on a diverse cross-section of bandwidth and capability-limited mobile devices in a way that addresses these challenges and advantageously utilizes a client-server environment, as well as to support the use of other applications in this same manner. Embodiments of this invention will address other needs as well.
- In various embodiments, the present invention provides systems and methods pertaining to displaying a web browsing session. In one embodiment, a system includes a web browsing engine residing on a first device, with a viewing application residing on a second device and operatively coupled to the web browsing engine, where the viewing application is adapted to display a portion of a webpage rendered by the web browsing engine and an overlay graphical component. In the same embodiment, the system also includes a recognition engine adapted to identify an element on the webpage and communicate information regarding the element to the viewing application.
- In another embodiment, a system for displaying a web browsing session includes a recognition engine further adapted to communicate information regarding the element to the viewing application responsive to the viewing application passing user input to the recognition engine. In yet another embodiment, such user input comprises a selection of a location on the webpage displayed on the second device, the location corresponding to the element's location on the webpage.
- In still yet another embodiment, a system for displaying a web browsing session includes a recognition engine further adapted to communicate information regarding the element to the viewing application with the portion of the webpage.
- In a further embodiment, the system further comprises a state manager engine operatively coupled to the viewing application and the web browsing engine, the state manager engine adapted to synchronize user input displaying locally on the viewing application with user input transferred to the web browsing engine.
- One skilled in the art will recognize that the present invention can be implemented in a wide variety of ways, and many different kinds of apparatus and systems may implement various embodiments of the invention.
FIG. 1 is a block diagram illustrating some aspects of a client-server architecture of the present invention, according to one embodiment. -
FIG. 2 is a block diagram illustrating some aspects of the present invention in connection with a server, according to one embodiment. -
FIG. 3 is a block diagram illustrating some aspects of an architectural overview of the present invention, including a server, an audio server and a client, according to one embodiment. -
FIG. 4 is a block diagram illustrating some aspects of the present invention in connection with a client, according to one embodiment. -
FIG. 5 is a diagram illustrating some aspects of multiple-user software architecture, according to one embodiment. -
FIG. 6 is a flowchart illustrating some supporting aspects of capturing a succession of video frames, according to one embodiment. -
FIG. 7 is a flowchart illustrating some supporting aspects of sending a succession of video frames, according to one embodiment. -
FIG. 8 is a diagram illustrating some aspects of a client-server exchange, according to one embodiment. -
FIG. 9 is a diagram illustrating some aspects of a client-server exchange, including an accompanying exchange within the server, according to one embodiment. -
FIG. 10 is a diagram illustrating some aspects of viewport move operations and related state management, according to one embodiment. -
FIG. 11 is a diagram illustrating some aspects of a client-server exchange with respect to state management, according to one embodiment. -
FIG. 12 is a diagram illustrating some aspects of a client-server exchange, including an accompanying exchange between a server and network storage, according to one embodiment. -
FIG. 13 is a diagram illustrating some aspects of displaying a web browsing session from a server to a cellular phone, according to one embodiment. -
FIG. 14 is a diagram illustrating some aspects of a user interface display, according to one embodiment. -
FIG. 15 is a diagram illustrating some aspects of a user interface display, according to one embodiment. -
FIG. 16 illustrates an example computer system suitable for use in association with a client-server architecture for remote interaction, according to one embodiment. - One skilled in the art will recognize that these Figures are merely examples of the operation of the invention according to one or some embodiments, and that other architectures, method steps, exchanges and modes of operation can be used without departing from the essential characteristics of the invention.
- The present invention is now described more fully with reference to the accompanying Figures, in which one or some embodiments of the invention are shown. The present invention may be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather these embodiments are provided so that this disclosure will be complete and will fully convey principles of the invention to those skilled in the art.
- For illustrative purposes, embodiments of the invention are described in connection with a server or a mobile client device, such as an example mobile client device. Various specific details are set forth herein regarding embodiments with respect to servers and mobile client devices to aid in understanding the present invention. However, such specific details are intended to be illustrative, and are not intended to restrict in any way the scope of the present invention as claimed herein. In particular, one skilled in the art will recognize that the invention can be used in connection with a wide variety of contexts, including, for example, client devices operating in a wired network. In addition, embodiments of the invention are described in connection with a web browsing application, but such descriptions are intended to be illustrative and examples, and in no way limit the scope of the invention as claimed. Various embodiments of the invention may be used in connection with many different types of programs, including an operating system (OS), a wide variety of applications, including word processing, spreadsheet, presentation, and database applications, and so forth.
- In some embodiments, the present invention is implemented at least partially in a conventional server computer system running an OS, such as a Microsoft OS, available from Microsoft Corporation; various versions of Linux; various versions of UNIX; a MacOS, available from Apple Computer Inc.; and/or other operating systems. In some embodiments, the present invention is implemented in a conventional personal computer system running an OS such as Microsoft Windows Vista or XP (or another Windows version), MacOS X (or another MacOS version), various versions of Linux, various versions of UNIX, or any other OS designed to generally manage operations on a computing device.
- In addition, the present invention can be implemented on, or in connection with, devices other than personal computers, such as, for example, personal digital assistants (PDAs), cell phones, computing devices in which one or more computing resources is located remotely and accessed via a network, running on a variety of operating systems. The invention may be included as add-on software, or it may be a feature of an application that is bundled with a computer system or sold separately, or it may even be implemented as functionality embedded in hardware.
- Output generated by the invention can be displayed on a screen, transmitted to a remote device, stored in a database or other storage mechanism, printed, or used in any other way. In addition, in some embodiments, the invention makes use of input provided to the computer system via input devices such as a keyboard (screen-based or physical, in a variety of forms), scroll wheels, number pads, stylus-based inputs, a touchscreen or touchpad, etc. Such components, including their operation and interactions with one another and with a central processing unit of the personal computer, are well known in the art of computer systems and therefore are not depicted here.
- Any software portions described herein with reference to modules need not include discrete software modules. Any software configuration described herein is meant only by way of example; other configurations are contemplated by and within the scope of various embodiments of the present invention. The term, engine, is used herein to denote any software or hardware configuration, or combination thereof, that performs the function or functions referenced.
- Reference in the specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the invention. The appearance of the phrase “in one embodiment” in various places in the specification does not necessarily refer to the same embodiment. The appearances of the phrase “in some embodiments” in various places in the specification are not necessarily all referring to the same set of embodiments. The appearances of the phrase “in various embodiments” in various places in the specification are not necessarily all referring to the same set of embodiments.
FIG. 1 is a block diagram illustrating some aspects of system 100 of the present invention, according to one embodiment. System 100 employs a client-server architecture that includes a number of server application instances running on server 200, including server application 1 (102), server application 2 (104), server application 3 (106), and a wide-ranging number of additional server applications (represented by ellipsis 108), up to server application n (110). The term “server application” is used herein to denote a server-side application, i.e., an application running on one or more servers. Server application n (110) represents the number of server application instances that happen to be running in system 100 at any given point. Server 200 also includes user manager module 502, which serves to manage multiple users among the multiple server application instances 102-110. User manager module 502 is described herein in FIG. 5, and represents one of potential multiple user managers running on server 200. Server 200 is running one instance of an OS underlying server applications 102-110. In another embodiment, server 200 may run multiple instances of an OS, each OS instance including one or more application instances. Server 200 also includes provision manager module 1205, which is described herein in FIG. 12. - While
FIG. 1 illustrates multiple server applications 102-110, in other embodiments, a number of different types of programs may be alternately used, including, for instance, an OS. Server applications 102-110 illustrated in FIG. 1 may run on one server 200 or any number of servers, as, for example, in one or more server farm environments. Server applications 102-110 may each comprise instances of different server applications, or may all comprise an instance of one server application. For example, each server application 102-110 could comprise a separate instance of a web browsing application. - Describing server application 1 (102) in further detail, as an example server application instance, server application 1 (102) includes
application 112, plugin 114, state manager module 115, recognition module 117, audio data generator 116, audio encoder module 120, video encoder module 124, and command process module 126. Video encoder module 124 makes use of feedback parameter 125. -
Video encoder module 124 is operatively coupled to application 112, and is adapted to receive a succession of captures (122) of the user interface (UI) of application 112 for encoding into video frames for transmission via network 128. The succession of captures (122) of the UI comprise data that is captured and transferred from application 112 to video encoder 124 by a separate module, described and illustrated in FIG. 2 (image management module 216). State manager module 115 manages state information, as will be described in relation to subsequent Figures. Recognition module 117 identifies elements related to output of application 112, as will be described in relation to subsequent Figures. The term, user interface, as used throughout this disclosure, refers to all or a portion of any user interface associated with a wide variety of computer programs.
-
Audio encoder module 120 is operatively coupled to audio data generator 116 of application 112, and is adapted to transform audio captures 118 (e.g., an audio stream) of audio data generator 116 into an encoded audio stream for transmission via network 128. Audio captures 118 comprises data being transferred from audio data generator 116 to audio encoder module 120. -
Audio data generator 116 is operatively coupled to application 112, and is adapted to generate the audio data accompanying application 112. Plugin 114 is operatively coupled to application 112 and command process module 126. Plugin 114 is adapted to facilitate the interface between application 112 and command process module 126. -
Server 200 is further described herein in FIG. 2. -
System 100 includes a number of clients, including client 1 (400), client 2 (132), client 3 (134), and a wide-ranging number of additional clients (represented by ellipsis 136), up to client n (138), with client n (138) representing the number of clients that happen to be engaged in the system at any given point. As illustrated in FIG. 1, the different clients comprise different, non-related client devices. - Describing client 1 (400) in further detail, as an example client, client 1 (400) may include
audio decoder module 142, video decoder module 144, command process module 146, viewing application 148, state manager module 149, and speaker 150. Video decoder module 144 may be adapted to decode the succession of video frames encoded by video encoder module 124, where the successive video frames have been transmitted across network 128 for reception by client 1 (400). Video decoder module 144 may be operatively coupled to viewing application 148, and adapted to communicate the decoded video frames to viewing application 148 for display of the video frames on client 1 (400). State manager module 149 manages state information, as will be described in relation to subsequent Figures. - Client 1 (400) includes
speaker 150, and audio decoder module 142 is operatively coupled to speaker 150. Audio decoder module 142 is adapted to decode the audio captures encoded by audio encoder module 120, where the encoded audio has been transmitted across network 128 for reception by client 1 (400). After decoding the audio stream, audio decoder module 142 may communicate the decoded audio to speaker 150 for audio output from client 1 (400). -
Viewing application 148 is adapted to receive user input and communicate the user input to command process module 146. Command process module 146 is adapted to communicate the user input back to command process module 126 of application 102 via network 128. Command process module 126 is adapted to communicate the user input to application 112 via plugin 114. -
Plugin 114 facilitates the remote interactive use of application 112 via the system 100 described in FIG. 1. Plugin 114 may also be an extension. In another embodiment, application 112 may be customized for use with the client-server architecture of this invention to the extent that a special plugin is not needed. In yet another embodiment, neither a plugin nor special application modifications may be needed. -
Command process module 146 is adapted to communicate one or more feedback parameters 125 to command process module 126. Command process module 126 is adapted to communicate the one or more feedback parameters 125 to video encoder module 124 and audio encoder module 120 for their respective encoding of the succession of application UI captures 122 and audio captures 118. The one or more feedback parameters 125 may comprise one or more of a wide range of parameters, including a bandwidth parameter relating to at least a portion of network 128, a device parameter of client 1 (400) or a user input for client 1 (400). - The one or
more feedback parameters 125 may comprise a bandwidth parameter, which may include any estimated or measured bandwidth data point. An example bandwidth parameter may include estimated bandwidth based on measurements of certain packets traversing between server 200 and client 1 (400) (e.g., how much data sent divided by traversal time to obtain a throughput value), or other bandwidth information obtained from, or in conjunction with, network 128, including from a network protocol. The one or more feedback parameters 125 may comprise user input for client 1 (400), including, for example, a user request for encoding performed in a certain format or manner, with such a request being requested and communicated by viewing application 148. The one or more feedback parameters 125 may comprise a display resolution of client 1 (400) (e.g., CGA, QVGA, VGA, NTSC, PAL, WVGA, SVGA, XGA, etc.). The one or more feedback parameters 125 may comprise other screen parameters (e.g., screen size, refresh capabilities, backlighting capabilities, screen technology, etc.) or other parameters of the client device (e.g., device processor, available memory for use in storing video frames, location if GPS or other location technology-enabled, etc.). None of the example feedback parameters discussed above are meant to exclude their combined use with each other, or other feedback parameters. In some embodiments, video encoder module 124 may be adapted to at least partially base its video sample rate on the one or more feedback parameters 125. - The multiple clients depicted in
FIG. 1 are illustrated to indicate that each client may potentially comprise a different type of client device, each with its own one or more feedback parameters. - Client 1 (400) is further described herein in
FIG. 4. - One skilled in the art will recognize that the client-server architecture illustrated in
FIG. 1 is merely an example, and that the invention may be practiced and implemented using many other architectures and environments. -
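One bandwidth feedback parameter described above is a throughput estimate obtained by dividing how much data was sent by its traversal time. The following is a minimal sketch of that calculation; the function name, units, and probe size are illustrative assumptions, not details from the patent:

```python
def estimate_throughput(bytes_sent, traversal_seconds):
    """Estimate throughput in bits per second from a measured transfer:
    amount of data sent divided by the time it took to traverse."""
    if traversal_seconds <= 0:
        raise ValueError("traversal time must be positive")
    return (bytes_sent * 8) / traversal_seconds

# e.g., a hypothetical 64 KB probe observed to take 0.5 s end to end
bps = estimate_throughput(64 * 1024, 0.5)
```

Such an estimate could then be reported back as one of the feedback parameters used to tune video and audio encoding.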
FIG. 2 is a block diagram illustrating some aspects of the present invention in connection with server 200, according to one embodiment. Server 200 includes user manager module 502, provision manager module 1205, server application 1 (102), application 112, plugin 114, state manager module 115, recognition module 117, audio data generator 116, audio encoder module 120, image management module 216, memory 218, video encoder module 124 (which includes feedback parameter 125), command process module 126, and align module 224. Command process module 126 includes client interpreter sub-module 228, and plugin 114 includes client implementer sub-module 208. The components illustrated in FIG. 2 with the same numbers as components illustrated in FIG. 1 correspond to those respective components of FIG. 1, and thus their general operation will not be repeated. While one running application is illustrated with respect to server 200, server application 102 is illustrated as a representative instance of multiple server applications running on server 200, each of the multiple server applications being associated with its own distinct client (clients are not shown in this illustration). Additionally, user manager module 502 represents one of potential multiple user managers running on server 200. -
Image management module 216 serves to capture the UI of application 112 (as the UI would appear on a screen) and save the capture in memory 218. Any capture process such as screen-scraping may be used, and image management module 216 may perform this capture at any desired rate. Image management module 216 also compares the last prior capture of the application UI to the current capture to determine whether any changes have occurred in a particular area of the application UI. Any image/video frame matching process may be used for this comparison operation. Image management module 216 serves to repetitively perform this function. - If
image management module 216 detects any change in the particular area of interest, a delta flag is set to indicate that the area of interest has changed. Upon detecting a change, image management module 216 serves to convert the native format of the UI rendered data to a video frame format more suited for compression and transmission to the client device (e.g., color space transformation, data format transformation, etc.). Image management module 216 serves to resize the image for the reformatted video frame. In the embodiment of FIG. 2, multiple parameters of the applicable client device were included in the one or more feedback parameters 125, allowing image management module 216 to perform the reformatting and resizing based on client device parameters (the relevant parameters having been communicated to image management module 216). -
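The capture-compare-flag cycle attributed to image management module 216 can be sketched as below. This is an illustrative simplification under stated assumptions: the class and method names are invented, captures are treated as raw byte buffers, and the real reformatting, resizing, and encoding steps are represented by a pass-through:

```python
class ImageManager:
    """Hypothetical sketch of a delta-flag capture cycle."""

    def __init__(self):
        self.last_capture = None    # prior UI capture, for comparison
        self.delta_flag = False     # set when the area of interest changed
        self.current_frame = None   # stand-in for the reformatted frame

    def capture(self, ui_pixels):
        # Compare the new capture against the last prior capture; on any
        # change, set the delta flag and prepare a frame for the encoder.
        if ui_pixels != self.last_capture:
            self.delta_flag = True
            self.current_frame = ui_pixels  # reformat/resize would occur here
        self.last_capture = ui_pixels

    def poll_and_encode(self):
        # Called on the encoder's sample interval: only a flagged change
        # produces a frame for transmission; otherwise nothing is sent.
        if self.delta_flag:
            self.delta_flag = False
            return self.current_frame  # stand-in for an encoded video frame
        return None
```

Under this model, unchanged captures produce no traffic, which matches the bandwidth-conscious design the section describes.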
Image management module 216 periodically checks (based on its sample interval) if the delta flag has been set. If the delta flag is detected as set during a check, the reformatted/resized video frame in memory 218 is encoded by video encoder module 124 for transmission to the client device. - Client interpreter sub-module 228 of
command process module 126 serves to interpret data received from client device 400 and to translate this data for use in connection with video encoder module 124, audio encoder module 120 and application 112 (e.g., user commands, etc.). Client interpreter sub-module 228 serves to pass the feedback parameters 125 to video encoder module 124 and audio encoder 120 for use in encoding. - Client interpreter sub-module 228 of
command process module 126 serves to translate client-received data for use in connection with plugin 114 and its client implementer sub-module 208. In communicating back user input, the client device passes coordinates (of a cursor, etc.) relative to the client device's screen to command process 126. Client interpreter sub-module 228 serves to determine the corresponding location in relation to the viewport of the client device and the application UI. Client interpreter sub-module 228 then communicates the translated coordinates to plugin 114 for use by its client implementer sub-module 208. Client implementer sub-module 208 serves to translate from conventional user input to a format appropriate for application 112, and then to directly inject the translated input into application 112. -
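As a hedged sketch of the coordinate translation described above, a client screen point can be mapped through the viewport's position within the application UI. The simple offset-plus-scale model and all names here are assumptions; the patent does not specify the arithmetic:

```python
def client_to_app_coords(x, y, viewport_x, viewport_y, scale):
    """Map a client screen point to application-UI coordinates.

    (viewport_x, viewport_y): top-left corner of the viewport within
    the application UI (hypothetical model).
    scale: client pixels per application-UI pixel.
    """
    app_x = viewport_x + x / scale
    app_y = viewport_y + y / scale
    return app_x, app_y
```

For instance, a tap at (100, 50) on a client viewing the UI at half resolution, with the viewport scrolled to (200, 300), would land at (400.0, 400.0) in the application UI under this model.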
Align module 224 correlates and cross-stamps video frames encoded by video encoder module 124 and audio encoded by audio encoder module 120, so that the audio stream and the video frames associated with the UI of application 112 may be readily matched at client device 400. Image management module 216 may also serve to time-stamp all images, and the operation of capturing audio from audio data generator 116 may also serve to timestamp the audio stream, both for down-stream alignment by align module 224, as would be appreciated by one skilled in the art. In another embodiment, all alignment/matching of audio and video frames may be performed at the client device. - One skilled in the art will recognize that the illustration of
FIG. 2 is merely an example, and that the invention may be practiced and implemented in many other ways. -
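The time-stamping described above for align module 224 suggests a simple pairing rule at whichever end performs the matching: associate each time-stamped video frame with the audio chunk whose timestamp is closest. The sketch below is illustrative only; nearest-timestamp matching is an assumption, not the patent's stated algorithm:

```python
def match_streams(video_frames, audio_chunks):
    """Pair each (timestamp, frame) with the audio chunk whose timestamp
    is nearest; both inputs are lists of (timestamp, payload) tuples."""
    pairs = []
    for v_ts, frame in video_frames:
        # Nearest-neighbor match on the timestamp axis (assumed rule).
        nearest = min(audio_chunks, key=lambda a: abs(a[0] - v_ts))
        pairs.append((frame, nearest[1]))
    return pairs
```

A production decoder would additionally buffer and drop late data, but the core correlation of cross-stamped streams reduces to a matching step of this kind.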
FIG. 3 is a functional block diagram 300 illustrating some aspects of an architectural overview of the present invention, including a server, an audio server and a client, according to one embodiment. In this embodiment, audio is sent to the client from dedicated audio server 304. Functional block diagram 300 includes server 302, audio server 304 and client 306, with client 306 operatively linked to server 302 and audio server 304 via network 310 (via connections 312 and 316). Server 302 is operatively linked to audio server 304 via connection 308. Server 302 includes application 318, plugin 322, state manager module 323, recognition module 327, audio data generator 320, video encoder module 324 (including feedback parameter 325), command process module 326, audio interceptor module 330, PID (process identifier) manager module 332, and time-stamp manager module 334. -
Video encoder module 324 operates as described in FIG. 2, being analogous to video encoder module 124 (and likewise, for feedback parameter 325 with respect to feedback parameter 125). Video encoder module 324 operates to encode application UI captures 328 and to communicate the encoded video frames for transmission to client 306. In the process of obtaining application UI captures, the resulting UI captures are time-stamped. Time-stamp manager module 334 facilitates the time-stamping of the UI captures. Command process module 326 operates as described in FIG. 2, being analogous to command process module 126. -
server 302,application 318 is illustrated as a representative instance of multiple applications running onserver 302, each of the multiple applications having its own video encoder and command process modules, and being associated with its own distinct client.Audio data generator 320 renders an audio stream (not shown) forapplication 318.Audio interceptor module 330 intercepts or traps this audio stream for redirection toaudio server 304, and may timestamp the audio stream. Time-stamp manager module 334 may facilitate the time-stamping of the audio stream.Audio interceptor module 330 may make use of a customized DLL to facilitate such a redirection of the audio stream.PID manager module 332, serves to detect and manage the different process IDs of the multiple applications running onserver 302.PID manager module 332 may stamp each audio stream redirected to audio server with the process ID of its associated application. -
Audio server 304 includes audio stream processing module 336 and PID authentication module 338. Audio stream processing module 336 serves to encode the audio streams received from the applications running on server 302, and to perform any conversion desired (e.g., conversion of sample rates, bit depths, channel counts, buffer size, etc.). In the embodiment of FIG. 3, User Datagram Protocol ports are used (not shown) to direct each audio stream to its destination client device; other protocols may be used in other embodiments. Audio stream processing module 336 directs each audio stream to the port associated with the audio stream's corresponding client device (i.e., the client device displaying the video frames corresponding to the audio stream). Audio stream processing module 336 may work in association with PID authentication module 338 to verify and direct the multiple audio streams streaming from server 302 to the appropriate port. -
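The PID-based direction of audio streams to per-client ports can be sketched as a lookup from process ID to UDP port, with unverified PIDs rejected, loosely mirroring the described roles of audio stream processing module 336 and PID authentication module 338. The data model and names below are assumptions for illustration only:

```python
class AudioRouter:
    """Hypothetical PID-to-port director for redirected audio streams."""

    def __init__(self):
        self.pid_to_port = {}  # process ID -> client's UDP port

    def register(self, pid, udp_port):
        # Associate an application's process ID with its client's port.
        self.pid_to_port[pid] = udp_port

    def route(self, pid, encoded_audio):
        """Return (port, payload) for a PID-stamped stream, refusing
        process IDs that have not been registered (authentication stand-in)."""
        if pid not in self.pid_to_port:
            raise KeyError(f"unauthenticated process id: {pid}")
        return self.pid_to_port[pid], encoded_audio
```

Actual delivery over UDP sockets is omitted; the sketch only shows the verification-and-direction step the passage attributes to the audio server.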
Client 306 includes video decoder module 340, audio decoder module 342, command process module 344 and audio/video sync module 346. After client 306 receives and decodes the applicable audio and video streams from server 302 (i.e., the audio and video streams of the application instantiated for client 306), audio/video sync module 346 correlates the time-stamps on both streams and works in conjunction with audio decoder module 342 and video decoder module 340 to synchronize output to speaker 348 and viewing application 350, respectively. Client 306 also includes state manager module 351 to manage state information. - One skilled in the art will recognize that the illustration of
FIG. 3 is merely an example, and that the invention may be practiced and implemented in many other ways. -
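The timestamp correlation performed by an audio/video sync module, as described for FIG. 3, can be illustrated with a minimal sketch, assuming millisecond timestamps on both streams and an illustrative pairing tolerance:

```python
def pair_by_timestamp(video_frames, audio_chunks, tolerance_ms=40):
    # Each item is (timestamp_ms, payload). Pair each decoded video frame
    # with the audio chunk whose timestamp is closest, within a tolerance,
    # so speaker output and viewing-application output stay in step.
    pairs = []
    for v_ts, v_payload in video_frames:
        best = min(audio_chunks, key=lambda a: abs(a[0] - v_ts), default=None)
        if best is not None and abs(best[0] - v_ts) <= tolerance_ms:
            pairs.append((v_ts, v_payload, best[1]))
    return pairs
```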
FIG. 4 is a block diagram illustrating some aspects of the present invention in connection with a client, according to one embodiment. Client 400 includes video decoder module 144, audio decoder module 142, audio/video sync module 406, command process module 146, speaker 150, viewing application 148, state manager module 149, and connections -
Video decoder module 144 receives encoded video frames via connection 412, while audio decoder module 142 receives an encoded audio stream via connection 414. Audio/video sync module 406 serves to match time-stamps or another type of identifier on the audio stream and the video frames for synced output via speaker 150 and viewing application 148, respectively. Audio decoder module 142, video decoder module 144 and viewing application 148 all may serve to provide feedback to command process module 146, to communicate feedback parameters (not illustrated in FIG. 4) back to the server-side application, including parameters to vary the sample rate and/or compression of the video encoding, the audio encoding, etc. -
Command process module 146 serves to pass feedback parameters of client 400 for use in video and/or audio encoding upon initiation of a session or during a session. Such feedback parameters may include one or more of the following: display resolution, screen size, processor identification or capabilities, memory capabilities/parameters, speaker capabilities, and so forth. -
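A hypothetical container for such feedback parameters, with illustrative field names and simple derived encoder hints (the field set and derivation are assumptions, not the patent's definitions):

```python
from dataclasses import dataclass

@dataclass
class ClientFeedback:
    # Illustrative client capabilities passed at session initiation.
    display_width: int
    display_height: int
    processor_id: str
    memory_mb: int
    speaker_channels: int

    def encoder_hints(self):
        # Derive simple encoding hints: cap the encoded resolution at the
        # client's display size and the audio at its speaker channel count.
        return {
            "max_width": self.display_width,
            "max_height": self.display_height,
            "audio_channels": min(2, self.speaker_channels),
        }
```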
Viewing application 148 displays the succession of video frames of a portion of the server-side application's UI. Viewing application 148 serves to facilitate communicating user input control, including user commands, to command process module 146 for transmission back to the server. Client user input control passed back to the server may include, for example, input from: a keyboard (screen-based or physical, in a variety of forms), scroll wheels, number pads, stylus-based inputs, a touchscreen or touchpad, etc. Viewing application 148 also serves to aggregate certain user input for sending, such as by opening a local text box for text entry. State manager module 149 manages state information, as will be described in relation to subsequent Figures. - In one embodiment,
viewing application 148 comprises an application based on the Java Platform, Micro Edition, and portable to BREW (Binary Runtime Environment for Wireless). In other embodiments, viewing application 148 comprises an application based on the Symbian platform, a Linux-based platform (e.g., Android), a Palm platform, a Pocket PC/Microsoft Smartphone platform, or another platform capable of supporting the functionality described herein. In some embodiments, viewing application 148 runs as a standalone Java application, while in other embodiments, viewing application 148 runs as a plugin to a standard mobile browser, as would be appreciated by one skilled in the art. In one embodiment, viewing application 148 includes a number of modules to facilitate implementing the functionality described herein, including modules to facilitate browser navigation, client I/O and viewport tracking (not shown), as would be appreciated by one of skill in the art. - One skilled in the art will recognize that the illustration of
FIG. 4 is merely an example, and that the invention may be practiced and implemented in many other ways. -
FIG. 5 is a diagram 500 illustrating some aspects of a multiple-user software architecture, according to one embodiment. User manager module 502 includes worker thread 504, worker thread 506, and a wide-ranging number of additional worker threads (represented by ellipsis 508), with 'worker thread n' being represented by worker thread 510, where n represents the total number of worker threads that happen to be running in the system at any given point. Each worker thread corresponds to a list of active users, and the lists of active users may potentially comprise different numbers of active users. As illustrated in FIG. 5, worker thread 504 corresponds to thread cycle 512, worker thread 506 corresponds to thread cycle 514, the variable number of worker threads represented by ellipsis 508 corresponds to the same variable number of thread cycles represented by ellipsis 518, and worker thread 510 corresponds to thread cycle 516. As illustrated in FIG. 5, worker thread 504 cycles through user 1 (520), user 2 (522), user 3 (524) and user 4 (526); worker thread 506 cycles through user 5 (528), user 6 (530) and user 7 (532); and worker thread 510 cycles through user 8 (534), user 9 (536), user 10 (538), user 11 (540) and user 12 (542). The number of users supported by the worker threads illustrated in FIG. 5 is meant to represent a snapshot at an arbitrary point in time, as the number of users supported by any given thread is dynamic. -
User manager module 502 may be set to instantiate a finite number of worker threads before instantiating additional worker threads to manage further users added to the system. The number of worker threads in the overall architecture illustrated by FIG. 5 will vary according to various embodiments. The parameters regarding the number of active users assigned per worker thread will also vary according to various embodiments. -
User manager module 502 runs on a server (as illustrated in FIG. 1) where multiple instances of applications (as illustrated in FIG. 1) are also running. User manager module 502 thus serves to manage multiple users in an environment of multiple application instances. When a new user is introduced into the overall system of FIG. 1, the new user is assigned to a worker thread (504-510) to facilitate the interaction between a specific client and a specific server-side application. - While the embodiment illustrated in
FIG. 5 illustrates multiple users being assigned to a single thread, in other embodiments, a single user may be assigned a dedicated thread. In still other embodiments, a user may be assigned to either a shared thread or a dedicated thread depending on one or more factors, such as the current loading/usage of the overall system, the user's service policy with the provider of the respective service operating an embodiment of this invention, and so forth. -
User manager module 502 facilitates load balancing of multiple users in a number of ways, as each worker thread cycles through its respective list of active users and processes one active event for each user. The active events that may be processed include: (a) sending one video frame update to the client or (b) updating state information pertaining to the client's viewing application and the server-side application/UI. For illustration purposes, in the case in which user 1 is associated with server application 1 (102) of FIG. 1, worker thread 504 will time-slice among the respective video encoder/command process modules of users 1 (520) through 4 (526) to perform (a) and (b) above, with video encoder module 124 and command process module 126 comprising the video encoder/command process modules of user 1 (520). A separate thread (not shown) would be operatively coupled to audio encoder module 120 to continuously send the encoded audio stream to the client, when audio data is present to send. - By operating in this manner, no single user suffers as a result of a processing-intensive session (e.g., for a high resolution, high frame rate video) of another user being serviced by the same thread. This will be further described below with reference to
FIG. 7 . In other embodiments, more than one active event may be processed per user. In other embodiments, a variable number of active events may be processed per user, based on a wide range of factors, including the level of underlying application activity for the applicable user, the user's service policy with the provider of the respective service operating an embodiment of this invention, and so forth. -
User manager module 502 may move specific users among different worker threads to load balance between processing-intensive users and processing-light users. For example, if multiple users being serviced by one worker thread are in need of being serviced with high frame rate video, user manager module 502 may move one or more such users to another thread servicing only, or predominantly, processing-light users. Another thread may also be instantiated for this purpose. As one skilled in the art would appreciate, a user may be treated as an object, and moved to another thread as objects are transferred among threads. The timing of such object moves may take place at specific junctures in the display of video frames by a client device's viewing application, in order to minimize disruption of a user's experience. - One skilled in the art will recognize that the illustration of
FIG. 5 is merely an example, and that the invention may be practiced and implemented in many other ways. -
FIG. 6 is a flowchart 600 illustrating some supporting aspects of capturing a succession of video frames, according to one embodiment. Operation 602 (Render Application UI to memory) is performed initially, either by a plugin to an application or by an application itself. Operation 602 serves to capture the UI of an application as the UI would appear on a screen and save the capture in a memory buffer (218 of FIG. 2); actual display of the UI on a screen is not required, but may be used. Operation 604 (Any delta from prior Application UI capture?) then serves to compare the last prior capture of the application UI to the current capture to determine whether any changes have occurred. This delta checking operation may be performed in a wide number of ways, including, for example, hashing pixel blocks of the current UI capture and comparing the hash values to an analogous pixel-hash table generated from the prior UI capture. The hash values may then also be available for potential use in any compression method utilized, e.g., matching blocks of successive video frames, matching blocks against a reference frame, etc. Alternatively, for example, the application may notify the server when a change occurs prior to operation 602. - If
operation 604 is determined in the negative, then Operation 606 (Delay) may be implemented by a timer (not shown), before operation 602 is repeatedly performed. If operation 604 is determined in the affirmative, Operation 608 (Convert to appropriate format) is then performed. Operation 608 serves to convert the native format of the UI rendered data to another format more suited for compression and transmission over a network for display on a client device (e.g., color space transformation, data format transformation, etc.). - Operation 610 (Resize) is then performed.
Operation 610 serves to resize the native screen size inherent to the UI rendered data to another size more suited for display on a client device (e.g., a cellular phone, a handheld computer, etc.). Operation 610 may make use of one or more feedback parameters (not shown) of the client device communicated to the server-side application and its accompanying video encoder instantiation. Operation 612 (Store in memory for encoder) then follows, storing the converted video frame for use by a video encoder. Operation 614 (Flag video update needed) then follows, setting an indication for use by an operation determining whether a video update is needed (see operation 712 (Video update needed?) of FIG. 7). Operation 616 (Delay) then follows, and may be implemented by a timer (not shown), before operation 602 is repeatedly performed. - One skilled in the art will recognize that the illustration of
FIG. 6 is merely an example, and that the invention may be practiced and implemented in many other ways. -
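The pixel-block hashing mentioned for operation 604 can be sketched as follows; for brevity this hashes fixed-size runs of a flattened pixel buffer rather than true two-dimensional blocks, and the use of MD5 is an illustrative choice.

```python
import hashlib

def block_hashes(pixels, block_size=8):
    # Hash fixed-size runs of a flattened pixel buffer (a list of byte
    # values); this stands in for hashing 2-D pixel blocks of a UI capture.
    return [
        hashlib.md5(bytes(pixels[i:i + block_size])).hexdigest()
        for i in range(0, len(pixels), block_size)
    ]

def changed_blocks(prev_hashes, curr_hashes):
    # Indices of blocks whose hash differs from the prior UI capture; an
    # empty result means no delta, so no new video frame is needed.
    return [i for i, (p, c) in enumerate(zip(prev_hashes, curr_hashes)) if p != c]
```

The surviving hash values could then feed the compression step, e.g., matching unchanged blocks against a reference frame.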
FIG. 7 is a flowchart 700 illustrating some supporting aspects of sending a succession of video frames, according to one embodiment. Update User N, 702, represents the start of a sequence of steps, with the sequence representing the steps undertaken by the worker threads of FIG. 5 for each user in the applicable worker thread's list of active users. A worker thread initially performs operation 704 (Read back channel). Operation 706 (Any data to process for user?) then follows, where it is determined whether anything pertinent for User N came in from the network that needs to be processed. Responsive to data pertinent to User N being detected, operation 708 (Process network events) then follows. Incoming data pertinent to User N may comprise, for example, information regarding user input to the client device, such as attempting to zoom in on a particular part of the server-side application UI (as shown by a video frame of the server-side application UI displayed on a viewing application running on the client device). Operation 708 may include communicating such processed information to its next destination, e.g., if a zoom command had been sent from the client, the zoom command would be appropriately processed and forwarded to the server-side application before the worker thread proceeded to the next applicable operation. - Either after operation 708 (Process network events) or a negative determination of operation 706 (Any data to process for user?), operation 710 (Update needed?) is then performed.
Operation 710 may depend on a counter (not shown) being set when the last video frame for the applicable user was sent, or, more specifically, when operation 712 (Video update needed?) was last performed. If the counter has not yet reached its endpoint, then the worker thread performing the operations will proceed to operation 718 (Increment N) to commence the sequence of steps illustrated in FIG. 7 for the next applicable user. The counter controls the frame rate for the succession of video frames being communicated from the server to the client, or, more specifically, the allowable frame rate, as will be further described below in relation to operation 712 (Video update needed?). For example, for an allowable frame rate of ten frames per second, the counter would be set to count to 100 milliseconds (e.g., from 100 milliseconds down to zero, or vice-versa). - If operation 710 (Update needed?) is determined in the affirmative, then operation 712 (Video update needed?) is performed.
Operation 712 may comprise checking a 'video update needed' flag, as was described in relation to FIG. 6, or some similar operation. Operation 712 serves to determine whether anything has changed in the portion of the server-side application being displayed by the client in the client's viewing application. If operation 712 is determined in the affirmative, operation 714 (Grab needed video info) is then performed. Operation 714 serves to obtain any video frame information needed to update the video frame of an application UI from the last transmitted video frame of the UI, and may make use of a wide range of video frame/compression techniques, including video frame/compression standards, customized methods, and combinations thereof. - Once operation 714 (Grab needed video info) has been performed, operation 716 (Send network event) is then performed.
Operation 716 serves to send to the client one video frame update, or updated state information pertaining to the client's viewing application and the server-side application/UI. Operation 716 may post the applicable data for User N to the queue for User N's network connection to effectuate this. - If
operation 712 is determined in the negative, operation 716 (Send network event) may still be performed if there is updated state information pertaining to the client's viewing application and the server-side application/UI. - By transmitting only one video frame update (or state information update) per user, the worker thread servicing multiple users may cycle through and serve all of the users on its active user list without any one user significantly consuming the worker thread's time to the detriment of any other particular user. As the worker thread's load across all of its supported users increases, servicing times for all of the worker thread's active users will gradually increase. Load balancing is thus inherently embedded in this manner. Additional load balancing techniques are described in connection with
FIG. 5 . - One skilled in the art will recognize that the illustration of
FIG. 7 is merely an example, and that the invention may be practiced and implemented in many other ways. -
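The per-user counter described for operation 710 can be sketched as a simple rate gate, where a 100 ms interval corresponds to an allowable frame rate of ten frames per second; the class name and interface are illustrative.

```python
class FrameRateGate:
    # Permits at most one video-update check per interval, mirroring the
    # per-user counter of operation 710 (Update needed?).
    def __init__(self, interval_ms, now_ms=0):
        self.interval_ms = interval_ms
        self.next_allowed_ms = now_ms

    def update_allowed(self, now_ms):
        # True when the counter has reached its endpoint; re-arms the
        # counter so the next check must wait a full interval.
        if now_ms >= self.next_allowed_ms:
            self.next_allowed_ms = now_ms + self.interval_ms
            return True
        return False
```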
FIG. 8 is a diagram illustrating some aspects of client-server exchange 800, according to one embodiment. Client-server exchange 800 depicts a session exchange between server 802 and client device 804. As described herein, server 802 may refer to any server-side machine, and may include a number of servers, either located in one facility or geographically dispersed, operating in conjunction to facilitate the operations described in FIG. 8. These servers may include authentication servers, database servers, etc. -
Client device 804 initiates client-server exchange 800 with operation 806, with a user launching a viewing application on client device 804. (The term "client device" is used with respect to FIG. 8 simply because the client device's parameters are discussed throughout; the term "client" could also be used interchangeably.) The viewing application then facilitates opening a connection to server 802 via a network connection via operation 808. Operation 808 makes use of one or more standard Internet protocols, or variations thereof, as would be appreciated by one of skill in the art. Operation 808 serves to pass the user's identity (e.g., by telephone number, carrier account, etc.) and client device's (804) display resolution and size to server 802, for use by server 802 in the session. Server 802 then performs operation 810, which launches and provisions a server application instance, with the server application customized based on the user's preferences. In the present embodiment, the user's preferences are fetched from a database (not illustrated) where they have been associated with the user's identity. Per operation 810, the server application renders in a virtual frame buffer. In another embodiment, the server application may render to a screen. -
Operation 812 then follows, where audio/video encoder modules, and an accompanying command process module, are launched and provisioned with client device's (804) display resolution and size for customized encoding for client device 804. The command process and encoder modules may also be provisioned with a level of service associated with the user's identity, providing the particular user with a priority level with regard to other users using the system. Subsequently, as depicted in operation 814, the video encoder may convert and encode video frames of the server application's UI (e.g., converting the video frames rendered by the server application to QVGA resolution from the server application's native rendering resolution because client device 804 supports QVGA resolution). The video encoder module also resizes the server application's native UI size rendering to suitably fit within, or work together with, client device's (804) screen size. The audio encoder module encodes an audio stream output of the server application based on the speaker capabilities of client device 804 (e.g., if the client device is known to be a cellular phone, the audio may be encoded such that the encoded quality does not exceed the particular phone's speaker capabilities, or a default level used for cellular phones). Arrow 816 illustrates the communication of the encoded audio and video to client device 804. -
Operation 818 subsequently follows, where the viewing application's decoder modules (audio and video) decode the audio and video frames received. The video frames may be displayed in the viewing application on the client, and the audio may be output to client device's (804) speakers (if audio is present and client 804 is not on mute), as depicted by operation 820. -
Operation 822 subsequently follows, depicting an on-going series of interactions (represented by arrows 826 and 828) between server 802 and client device 804, also represented by operation 824 on the client side. Operation 822 depicts the server application only sending information to the encoder modules when the UI or audio output changes, with the encoders encoding this information for transmittal to client device 804. Thus, if nothing changes regarding the server application's UI or audio, then audio/video information is not encoded and sent to client device 804. A video encoder of server (802) thus asynchronously communicates video frames based on changes in the UI. Video is sent as video frames to display UI changes, and not as commands outside of a video frame format. -
Operation 824 depicts the user interacting with the virtual application, with the user's inputs being transformed into parameters and passed back to the server application. Operation 822 further depicts the server-side command process module translating user input parameters to operate on the server application UI, with the server application UI changing accordingly. Operation 824 completes the cyclical sequence by further depicting the encoded audio and video resulting from user inputs to the virtual application being received, decoded and displayed in the viewing application. - One skilled in the art will recognize that the illustration of
FIG. 8 is merely an example, and that the invention may be practiced and implemented in many other ways. -
FIG. 9 is a diagram illustrating some aspects of client-server exchange 900, according to one embodiment. Client-server exchange 900 depicts a session exchange between server 903 and client 904, with an accompanying exchange between encoder/command process modules 902 and application 906 (both running on server 903) also being illustrated. Application 906 includes state management module 907 and client 904 includes state management module 905, which will be discussed in relation to FIG. 10. Application 906 comprises a web browsing application in this embodiment. Encoder/command process modules 902 comprise audio and video encoder modules and a command process module. References to exchanges with encoder/command process modules 902 may only specifically comprise an exchange with one of these modules, as would be appreciated by one skilled in the art. In another embodiment, a functional element similarly situated as encoder/command process modules 902 may comprise a video encoder module and a command process module, but not an audio encoder module. As described herein, server 903 may refer to any server-side machine, and may include a number of servers, either located in one facility or geographically dispersed, operating in conjunction to facilitate the operations described in FIG. 9. These servers may include authentication servers, database servers, etc. -
Client 904 initiates client-server exchange 900 with operation 908, open connection. Server 903 responds with operation 910, connection confirmed. Client 904 then sends its capabilities to encoder/command process modules 902, including screen size and other device parameters, via operation 912. The device parameters may include a wide variety of device parameters, including a device processor, memory, screen characteristics, etc. Client 904 then sends a URL via operation 914, which may comprise a saved URL (e.g., a homepage) or a URL entered by the user of client 904. Encoder/command process modules 902 in turn communicate the URL to application 906 via operation 916, and application 906 then loads the URL via operation 918. Application 906 also passes the width (w) and height (h) of the web page associated with the URL to encoder/command process modules 902 via operation 920. Encoder/command process modules 902 then communicate the web page size to client 904, as well as the viewport visible on the client screen, including parameters characterizing the viewport of the client, e.g., a corner coordinate (x, y) and an associated zoom factor (z), via operation 922. The parameters characterizing the viewport of the client may comprise absolute position information, relative position information, etc. - A screen capture of the webpage viewport (the portion of the browser UI that the viewport has been associated with) then takes place via
operation 924, in accordance with a number of techniques known in the art. A video frame of the web page visible through the viewport is then communicated to client 904 via operation 926. A subsequent screen capture 930 then takes place after a variable sample interval 928, with the associated video frame being communicated via operation 932. Arrow symbol 929, commonly used to indicate a variable element, is illustrated crossing variable sample interval 928 to indicate this novel feature. -
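The mapping from the client's viewport parameters (corner coordinate (x, y) and zoom factor z) to the webpage region captured in operation 924 might look like the following sketch; the zoom semantics and clamping behavior are assumptions.

```python
def viewport_region(x, y, zoom, screen_w, screen_h, page_w, page_h):
    # Map a client viewport, given its corner (x, y) and zoom factor, to
    # the rectangle of the rendered webpage that must be captured. Here a
    # zoom factor of 2 means each page pixel covers 2 screen pixels, so
    # the captured region shrinks accordingly; the region is clamped so
    # it never extends past the page bounds.
    w = min(int(screen_w / zoom), page_w - x)
    h = min(int(screen_h / zoom), page_h - y)
    return (x, y, w, h)
```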
operation 934. This feedback may be used to vary thesample interval 928 based on one or more feedback parameters, including client device parameters, user input parameters, and/or estimated bandwidth parameters, such as bandwidth parameters based on measurements of the packets traversing back and forth betweenserver 903 andclient 904. RTCP protocol, or a similar such protocol (standardized or customized) may be used in connection with providing such feedback, as illustrated byoperation 936.Ellipsis 938 andcycle 940 illustrate the repetitive nature of the interaction betweenserver 903 sending video frames toclient 904. -
Sample interval 928 may also be at least partially varied based on the rate of change of the underlying webpage being viewed. For example, if little to no change is detected in the underlying webpage being viewed by client 904, then the frame sample interval may be adjusted upward. Likewise, for a very dynamic webpage, or dynamic content within a webpage, the frame sample interval may be adjusted downward. - The user of
client 904 may move the viewport from which a webpage is being viewed, to view another portion of the webpage, as depicted in operation 942, with x′ and y′ comprising new parameters of the viewport. The new portion of the webpage that matches the new viewport will then be captured via operation 944, and a video frame of the new viewport will be communicated to client 904 via operation 946. - The user of
client 904 may again move the viewport, as depicted in operation 948, with x″ and y″ comprising new parameters of the viewport. This time, the new viewport extends beyond what would be displayed on the server browser window, and thus the browser itself must scroll to capture the desired portion of the webpage, as depicted in operation 950. Having appropriately scrolled, as depicted via operation 952, a screen capture of the new viewport will then be obtained, as illustrated in operation 954, with the resulting video frame communicated via operation 956. - The user of
client 904 may also use a mouse or mouse-equivalent (e.g., finger tap/motion on a touchscreen, multi-directional button, trackpoint or stylus moving a cursor, etc.), as shown via operation 958, where a mouse-down motion is made, with the new coordinates of the mouse being passed as (a, b). Client 904 will pass coordinates relative to the client device's screen back to encoder/command process modules 902 in such an operation, with encoder/command process modules 902 determining the corresponding location in relation to the viewport and underlying webpage. In the embodiment being described in FIG. 9, server 903 is running an underlying Windows OS, permitting the injection of a mouse message with the appropriate location information to the window associated with browser 906 (whether there is an actual screen being used for rendering or not). This is illustrated via operation 960, and the screen cursor would resultantly move in application 906, and be communicated back in a video frame to client 904 as described above. In other embodiments being used in conjunction with other operating systems, similar functions may be used if available, or other analogous techniques, as would be appreciated by one skilled in the art. -
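Translating the client-relative mouse coordinates (a, b) to a location on the underlying webpage, as the encoder/command process modules must do before injecting the mouse message, can be sketched as follows (assuming, consistent with the earlier viewport sketch, that the zoom factor scales page pixels to screen pixels):

```python
def client_to_page(a, b, viewport_x, viewport_y, zoom):
    # Scale the screen-relative coordinates by the inverse of the zoom
    # factor, then offset by the viewport's corner on the webpage. The
    # server can then inject a mouse event at this page position.
    return (viewport_x + int(a / zoom), viewport_y + int(b / zoom))
```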
Operations ellipsis 980 and cycle 982 serve to illustrate on-going interactions as long as the session between client 904 and server 903 continues. - One skilled in the art will recognize that the illustration of
FIG. 9 is merely an example, and that the invention may be practiced and implemented in many other ways. -
FIG. 10 is a diagram 1000 illustrating some aspects of viewport move operations and related state management, according to one embodiment. FIG. 9 is referenced throughout the description of FIG. 10, as the two figures are related. Diagram 1000 includes webpage 1002, with a width of w′ 1004 and a height of h′ 1006. Webpage 1002 is illustrated partially rendered in FIG. 10, with rendered webpage portion 1008 having a width of w 1010 and a height of h 1012. Webpage portions are illustrated within webpage 1002; webpage portion 1026 includes cursor position (a, b) 1032. The webpage portions correspond to operations of FIG. 9, as described below. - Webpage portion 1014 corresponds to a portion of
webpage 1002 sent for remote viewing, which comprises a viewport of the client (indicated by (x, y) 1016 and zoom factor 1018). Following or performed concurrently with either of the corresponding operations of FIG. 9, state manager module 907 of server 903, having previously identified webpage portion 1014 as the current state, updates its current state webpage portion to webpage portion 1020. State manager module 905 of client 904 does likewise upon client 904 displaying webpage portion 1020. State manager modules 905 and 907 may also maintain prior states. Client 904 may request prior state webpage portion 1020 from server 903, such as, for example, via a back icon (not shown). As an intermediate step, client 904 may display a locally cached version of the prior state webpage portion while server 903 is in the process of obtaining/sending the current version of the prior state webpage portion. -
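A minimal sketch of a state manager that tracks the current viewport state and retains prior states for back navigation; the class name and the (x, y, zoom) tuple layout are assumptions for illustration.

```python
class ViewportStateManager:
    # Tracks the current viewport state and a history of prior states,
    # so a "back" request can restore the previous webpage portion.
    def __init__(self, initial_state):
        self.current = initial_state
        self.history = []

    def update(self, new_state):
        # Push the outgoing state so a later back navigation can restore it.
        self.history.append(self.current)
        self.current = new_state

    def back(self):
        # Restore the most recent prior state, if any.
        if self.history:
            self.current = self.history.pop()
        return self.current
```

Keeping one such manager on the server and one on the client, updated on the same events, keeps the two sides' notions of the current webpage portion in step.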
Webpage portion 1026 likewise becomes the next current viewport of client 904 per operations 948-956, and state manager modules 905 and 907 likewise update their current (and past) states to webpage portion 1026, with the addition of internal application scrolling operations 950 and 952, as webpage portion 1026 extends beyond rendered webpage portion 1008. In another embodiment, the entire applicable webpage is rendered (analogous to h′ 1006 by w′ 1004 of FIG. 10), and thus there are no internal scrolling operations to perform. -
webpage portion 1026 comprises the current state, webpage portion 1026 includes cursor position (a, b) 1032. As described in relation to operations 958-964 in FIG. 9, as cursor position (a, b) is updated, so too is its corresponding sub-state maintained by state manager module 907. - One skilled in the art will recognize that the illustration of
FIG. 10 is merely an example, and that the invention may be practiced and implemented in many other ways. -
FIG. 11 is a diagram illustrating some aspects of a client-server exchange 1100 with respect to state management, according to one embodiment. Client-server exchange 1100 depicts a session exchange between server 1102 and client 1106. As described herein, server 1102 may refer to any server-side machine, and may include a number of servers, either located in one facility or geographically dispersed, operating in conjunction to facilitate the operations described in FIG. 11. These servers may include authentication servers, database servers, etc. -
Server 1102 includes state manager module 1104 to manage state information, and client 1106 includes state manager module 1108 to manage state information. State manager module 1104 and state manager module 1108 operate as described herein; their common name describes their general function and is not meant to imply they are instances of the same module. In another embodiment, these two modules may be instances of the same module. In yet another embodiment, state manager modules 1104 and 1108 may be implemented in other configurations. -
Operation 1110 illustrates state manager module 1104 identifying a portion of a webpage sent for remote viewing, with the portion comprising a viewport view of client 1106. Operation 1114 illustrates state manager module 1104 identifying this webpage portion as the current state webpage portion. Operation 1112 illustrates state manager module 1108 identifying a portion of a webpage being displayed, and identifying this webpage portion as the current state webpage portion. The webpage portions are defined areas within a webpage, and thus in another embodiment, a current state checker module (not shown) may be used to periodically verify that the current states are uniform across server 1102 and client 1106. In yet another embodiment, a common table (not shown) may be shared among state manager modules 1104 and 1108. -
Operation 1118 illustrates a user of client 1106 selecting a second portion of the webpage. The user can make such a selection in a wide variety of ways, including moving a navigation toggle or a scroll wheel, using a touch screen or keypad, etc. In the embodiment described in relation to FIG. 11, the webpage portions further comprise an area of the webpage surrounding the viewport of client 1106. Thus, in the instance of a selection by the user of a limited scroll-down for operation 1118, the second portion of the webpage may already reside on client 1106. Operation 1120 illustrates checking if the second portion is already in local memory. In another embodiment, the webpage portions match the viewport of client 1106, and a local memory check, like operation 1120, may check whether the desired webpage portion had been previously loaded and was still resident in local memory. Determining whether an object of interest is in local memory can be performed in a number of ways, as would be appreciated by one of skill in the art. - In the example illustrated,
operation 1120 was determined in the negative, and parameters for the second portion of the webpage are thus communicated via operation 1122. FIGS. 9 and 10 provide more detail regarding passing parameters. Similarly to the operations described above, the second portion of the webpage is obtained by server 1102 and sent to client 1106, as illustrated by the corresponding operations. State manager modules 1104 and 1108 then update their respective current states accordingly, as likewise illustrated. - In the embodiment illustrated in
FIG. 11, operation 1120 may also be determined in the positive (not shown). In the case of the second webpage portion residing completely in the local memory of client 1106, the second webpage portion may not be requested by, and provided to, client 1106. Client 1106 may simply display the second webpage portion and relay parameters regarding the second webpage portion in order for state manager module 1104 of server 1102 to update its state information. Displaying the second webpage portion may also comprise an intermediate step before obtaining a more current version of the second webpage portion from server 1102. - The operations discussed herein apply not only to present and prior states, but also to a plurality of webpage portion states across a plurality of websites. This plurality of states may comprise an ordered succession of states, as may be recorded in a browsing session using an embodiment of the invention. This ordered succession of states may be made available by toggling through the website portion views of a user session, saved as a series of bookmarks, etc.
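The local-memory check of operation 1120 — display from the local store on a hit, send parameters to the server only on a miss — can be sketched as a keyed cache lookup. A minimal sketch; `PortionCache` and the key layout are assumptions for illustration, not names from the specification.

```python
from typing import Callable, Dict, Optional, Tuple

# Key for one webpage portion: (url, x, y, zoom), per the viewport parameters.
PortionKey = Tuple[str, int, int, float]

class PortionCache:
    """Client-side store of previously loaded webpage portions."""
    def __init__(self) -> None:
        self._store: Dict[PortionKey, bytes] = {}

    def put(self, key: PortionKey, frame: bytes) -> None:
        self._store[key] = frame

    def get(self, key: PortionKey) -> Optional[bytes]:
        return self._store.get(key)

def request_portion(cache: PortionCache, key: PortionKey,
                    fetch_from_server: Callable[[PortionKey], bytes]):
    """Operation 1120 sketch: check local memory; fetch only on a miss."""
    cached = cache.get(key)
    if cached is not None:
        return cached, False            # hit: no server round trip needed
    frame = fetch_from_server(key)      # miss: communicate parameters (op. 1122)
    cache.put(key, frame)
    return frame, True

cache = PortionCache()
key = ("http://example.com", 0, 240, 1.0)
frame, fetched = request_portion(cache, key, lambda k: b"video-frame")
assert fetched is True
frame2, fetched2 = request_portion(cache, key, lambda k: b"unused")
assert fetched2 is False and frame2 == b"video-frame"
```

A refresh of a cached portion (displaying it immediately while a more current version is fetched) would layer on top of the same lookup.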
- Though not illustrated, a feature of managing state information as described herein includes enabling the sharing of a webpage portion with another user, in accordance with one embodiment. A webpage portion may be included in an identifier that can be placed in a message for sending, with the identifier adapted to facilitate transport to the applicable webpage portion. The message could be an email message, an instant-messaging message, a messaging feature used in a social network, etc. The identifier may route the recipient of the message into their system account (an embodiment client-server system account), where the webpage portion included in the identifier can be accessed like a prior webpage portion state of the user. A database may be used in conjunction with such identifiers. As will be appreciated by those skilled in the art, other ways may be used to augment a webpage portion identifier so that it may be transferred among different users, both within and outside of embodiment client-server system accounts.
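The shareable webpage-portion identifier described above can be sketched as a compact token carrying the portion's parameters, decodable on the recipient's side into the same kind of state record. The JSON-plus-base64 encoding is purely an assumption for illustration; the specification does not fix an identifier format.

```python
import base64
import json

def encode_portion_id(url: str, x: int, y: int, zoom: float) -> str:
    """Pack one webpage-portion state into a URL-safe token for a message."""
    payload = json.dumps({"url": url, "x": x, "y": y, "zoom": zoom})
    return base64.urlsafe_b64encode(payload.encode("utf-8")).decode("ascii")

def decode_portion_id(token: str) -> dict:
    """Recover the portion parameters on the recipient's side, where the
    portion can then be accessed like a prior webpage portion state."""
    return json.loads(base64.urlsafe_b64decode(token.encode("ascii")))

token = encode_portion_id("http://example.com/article", 40, 980, 2.0)
state = decode_portion_id(token)
assert state == {"url": "http://example.com/article", "x": 40, "y": 980, "zoom": 2.0}
```

A database-backed variant would store the parameters server-side and place only an opaque lookup key in the message.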
- One skilled in the art will recognize that the illustration of
FIG. 11 is merely an example, and that the invention may be practiced and implemented in many other ways. -
FIG. 12 is a diagram illustrating some aspects of a client-server exchange, including an accompanying exchange between a server and network storage, according to one embodiment. Client-server exchange 1200 depicts a session exchange between client 1202 and server 1204, with an accompanying exchange between server 1204 and network storage 1206. As described herein, server 1204 may refer to any server-side machine, and may include a number of servers, either located in one facility or geographically dispersed, operating in conjunction to facilitate the operations described in FIG. 12. Network storage 1206 may refer to any one or more storage units, operating in combination or any other way, that are networked to server 1204. Server 1204 includes provision manager module 1205 and user manager module 502 (of FIG. 5). -
Operation 1214 launches a unique browser instance for client 1202. Provision manager module 1205 uses the user identifier passed in operation 1210 (e.g., a telephone number, account number, etc.), or another identifier associated with the user identifier, to fetch browser state information for the user from network storage 1206. Provision manager module 1205 likewise fetches a customer profile for the user from network storage 1206. Browser state information for the user may include any type of user information associated with a browser, including bookmarks, cookies, caches, etc. Via operation 1220, provision manager module 1205 provisions the unique browser instance launched for the user as would be appreciated by one skilled in the art (e.g., by using automated directory copies, automated provisioning techniques, etc.). - Via
operation 1222, provision manager module 1205 works in conjunction with user manager module 502 (described in relation to FIG. 5) to provision resources based on the user's customer profile. The user's customer profile could include the user's past usage history (e.g., categorized as low, medium, or heavy bandwidth), encoding usage (also possibly categorized), etc. Provision manager module 1205 and user manager module 502 together operate to provide different levels of service to a user, based on a level of service provider plan, etc. A wide range of parameters may be configured, depending on the particular embodiment, including peak/average bandwidth quality of service (QoS), video compression settings for motion video (i.e., quality), video frame rate, video image size limits and refitting, and server memory, CPU, and/or disk usage. The parameters configured could be set statically per user/profile, or the parameters could change dynamically based on a number of factors, including the type of URL being visited. In this way, the service provider could provide higher quality video for low-bandwidth pages (e.g., news websites, etc.) and place more restrictions on bandwidth-intensive websites. As described in relation to FIG. 5, provisioning a user among worker threads may also be used to implement tiered levels of service. -
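The tiered provisioning just described — static parameters chosen per service tier, with dynamic adjustment by URL type — can be sketched as below. The tier names, parameter values, and profile fields are all illustrative assumptions, not values from the specification.

```python
def select_encoding_params(profile: dict, url: str) -> dict:
    """Choose video-encoding parameters from a service tier and page type.

    `profile` is a customer-profile stand-in; "basic"/"premium" tiers and
    the numeric settings are hypothetical examples of configurable values
    (frame rate, quality, peak bandwidth QoS).
    """
    tiers = {
        "basic":   {"frame_rate": 10, "quality": 40, "peak_kbps": 256},
        "premium": {"frame_rate": 24, "quality": 70, "peak_kbps": 1024},
    }
    params = dict(tiers[profile.get("tier", "basic")])
    # Dynamic adjustment by URL type: tighter limits for sites the profile
    # flags as bandwidth-intensive, richer video everywhere else.
    if any(host in url for host in profile.get("heavy_sites", [])):
        params["quality"] -= 20
    return params

profile = {"tier": "premium", "heavy_sites": ["video.example"]}
p_news = select_encoding_params(profile, "http://news.example/story")
assert p_news["frame_rate"] == 24 and p_news["quality"] == 70
p_video = select_encoding_params(profile, "http://video.example/clip")
assert p_video["quality"] == 50
```

Per-user worker-thread assignment, as discussed in relation to FIG. 5, would be a second lever layered on top of these per-session parameters.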
Arrows in FIG. 12 illustrate the ongoing session exchange. Via operation 1228, the browser state information is updated based on the user's activity during the session (e.g., the addition of cookies based on websites visited, modifications to bookmarks during the session, etc.). Via operation 1230, at least some of the user's usage information is also tracked for updating the user's customer profile upon terminating the session. - During a session,
user manager module 502 may perform a state copy of a user session to transfer a user to another thread, either on the same server or a different server. User manager module 502 may operate in conjunction with provision manager module 1205 to facilitate such a move. While moving a user from one thread to another is discussed in relation to FIG. 5, provision manager module 1205 may also facilitate such a state copy by updating state information during a user session. User manager module 502 may then pause a browsing session, while provision manager module 1205 provisions another browser instance with the updated state information at the desired destination. User manager module 502 may then move the user session to the desired destination by performing a state copy (which may include a directory copy, etc.). The applicable user web browsing session may then resume. Such transfers may be timed to take place at specific junctures in the display of video frames by a client device's viewing application, such as when a user requests a URL, in order to minimize disruption of the user's experience. - When the user desires to end the session, the user may close their virtual browser, as shown in
operation 1232, and accompanying operation 1234. The user's browser end state will also be captured by provision manager module 1205, as shown via operation 1236, via a directory save, etc. The user's customer profile will be updated based on the usage information captured, as shown in operation 1238. For example, the user may visit mostly text-based websites during a session, and have their average bandwidth and encoding figures lowered by such a resource-light session. Provision manager module 1205 will update the user's state information and customer profile based on the session, saving the updated versions in network storage 1206, as server 1204 closes the unique server browser instantiated at the beginning of the session. - One skilled in the art will recognize that the illustration of
FIG. 12 is merely an example, and that the invention may be practiced and implemented in many other ways. -
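The pause / provision / state-copy / resume sequence described for FIG. 12 can be sketched as a small transfer routine. This is a minimal sketch under stated assumptions: plain dictionaries stand in for worker-thread pools and for the session's directory-backed state, and all names are hypothetical.

```python
import copy

def transfer_session(session: dict, src_pool: dict, dst_pool: dict) -> dict:
    """Move one user session between worker pools via a state copy.

    Sequence sketched: pause the browsing session, copy its state
    (standing in for a directory copy), provision the copy at the
    destination, retire the source, and resume at the destination.
    """
    session["paused"] = True                 # pause the browsing session
    moved = copy.deepcopy(session)           # state copy of the session
    dst_pool[moved["id"]] = moved            # provision at the destination
    del src_pool[session["id"]]              # retire the source instance
    moved["paused"] = False                  # resume at the destination
    return moved

src, dst = {}, {}
src["s1"] = {"id": "s1", "url": "http://example.com", "paused": False}
moved = transfer_session(src["s1"], src, dst)
assert "s1" not in src
assert dst["s1"]["paused"] is False
```

Timing the call at a frame boundary or URL request, as the text notes, is what keeps the copy invisible to the user.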
FIG. 13 is a diagram illustrating some aspects of displaying a web browsing session from a server to a cellular phone, according to one embodiment. Server 1302 is illustrated, upon which a web browsing engine (not shown) executes, rendering webpage 1304. Webpage 1304 is rendered into a buffer on server 1302 in the embodiment discussed in relation to FIG. 13, but webpage 1304 may also be rendered to a screen in another embodiment. A recognition engine (not shown) also runs on server 1302, which identifies elements of interest on webpage 1304, such as hyperlink 1306. The recognition engine may use any number of techniques known to one skilled in the art to identify elements of interest with respect to webpage 1304, including parsing the HTML/XHTML/Document Object Model data to obtain access to the rendered webpage data/contents of webpage 1304 (including location information of the contents of webpage 1304). In some embodiments, other techniques for parsing webpage contents may be used, including a tabbing function to cycle through components of a webpage. In some embodiments, optical character recognition techniques known to those skilled in the art may additionally or alternatively be used. - In some embodiments, the information regarding the elements of interest may be communicated along with
webpage 1304, such as in the form of a polygon map. A viewing application (not shown) running on cellular phone 1310 may then match such a polygon map with the video of webpage 1304 being displayed. In various embodiments, the information regarding the elements of interest may be communicated from server 1302 to cellular phone 1310 in combination with the video frames in a wide variety of ways, as would be appreciated by one of skill in the art regarding sending two data sets to be matched at the destination. - In one embodiment, the information regarding the elements of interest may be communicated in response to the viewing application passing user input back to the recognition engine on
server 1302. For example, a user may select a location on the image being displayed on cellular phone 1310 corresponding to a textbox (not shown) on webpage 1304, in order to input text into the textbox. As described elsewhere in relation to other Figures, the location of the user selection may be passed back to server 1302. The recognition engine on server 1302 may then communicate to the viewing application that the location selected by the user corresponds to a textbox. In one embodiment, the viewing application may then locally render a generic textbox for the input of text by the user, sending the text entered by the user back to the recognition engine once the user has completed the entry. - In another embodiment, the recognition engine may send information regarding the textbox to the viewing application for the viewing application to customize the locally rendered textbox, including, for example, information regarding acceptable string length, acceptable character entries, string formatting, etc. This information may be used by the viewing application to locally police acceptable input for a given textbox, as well as for formatting the display of input (e.g., causing password textbox entries to appear as “*******” instead of cleartext). In some instances, a textbox, radio button, etc., will already contain text (e.g., have an initial state with initial text), be selected, etc., and this information will be sent to the client to appear in the locally rendered textbox, radio button, etc. A state manager on the backend server will also track the state of interactive objects on a webpage, with the state manager serving to synchronize client user input with the web browsing engine on the backend server.
The state manager may also serve to react to events on the server that occur while a client control is activated (such as a control suddenly losing focus), and in some embodiments may alert the user if a synchronization was unsuccessful, including, for example, by providing a reentry prompt.
- This likewise applies to element state, e.g., the current selection(s) in a list box or the current state of a radio button.
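The client-side matching of an element-of-interest map against a user's selection point can be sketched as a hit test. The specification describes a polygon map; the sketch below uses axis-aligned rectangles as a simplified stand-in, and `ElementMap` is an assumed name.

```python
from typing import List, Optional, Tuple

Rect = Tuple[int, int, int, int]  # (left, top, right, bottom) in frame pixels

class ElementMap:
    """Client-side map of elements of interest, matched to the video frame.

    Simplified stand-in for the polygon map described above: each region
    carries the element type the recognition engine identified there.
    """
    def __init__(self) -> None:
        self._elements: List[Tuple[Rect, str]] = []

    def add(self, rect: Rect, kind: str) -> None:
        self._elements.append((rect, kind))

    def hit_test(self, x: int, y: int) -> Optional[str]:
        """Return the element type under (x, y), if any."""
        for (left, top, right, bottom), kind in self._elements:
            if left <= x <= right and top <= y <= bottom:
                return kind
        return None

emap = ElementMap()
emap.add((10, 10, 110, 30), "hyperlink")   # e.g. the region of hyperlink 1306
emap.add((10, 60, 210, 84), "textbox")
assert emap.hit_test(50, 20) == "hyperlink"
assert emap.hit_test(5, 5) is None
```

The same lookup supports both behaviors described here: changing the cursor to a hand icon over a hyperlink, and deciding that a tapped location warrants a locally rendered textbox.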
-
Cellular phone 1310 is illustrated in FIG. 13 as the destination of webpage 1304 (labeled webpage portion 1308 with respect to the portion being displayed on cellular phone 1310), after the webpage has been transformed into video frames as discussed in regard to other Figures herein. The viewing application executing on cellular phone 1310 displays webpage portion 1308 along with address bar 1312 (displaying URL 1340), toolbar 1318, and an overlay graphical component, cursor 1314, with address bar 1312, toolbar 1318, and cursor 1314 being rendered locally on cellular phone 1310. Address bar 1312 and toolbar 1318 may appear in a number of locations together with a portion of webpage 1304, according to a variety of embodiments. In some embodiments, address bar 1312 and toolbar 1318 may be user-configurable to appear in other display locations, while in some embodiments, address bar 1312 and toolbar 1318 may automatically adjust their locations based on the positioning of cellular phone 1310, for cellular phones with motion detection ability. - Hyperlink 1316 (analogous to hyperlink 1306 rendered on server 1302) is displayed in the video image displayed on
cellular phone 1310. In some of the embodiments in which information regarding elements of interest is communicated along with webpage 1304, the viewing application may be adapted to transform cursor icon 1314 into hand icon 1315 (or another icon to denote an underlying hyperlink) while cursor icon 1314 overlays hyperlink 1316. Address bar 1312, comprising a roll-out address bar in the embodiment illustrated in FIG. 13, may include any number of functional user interface elements. Address bar 1312 rolls out when “A” icon 1320 is tapped, and rolls back into main toolbar 1318 when “A” icon 1320 is tapped again while address bar 1312 is rolled out, or upon inactivity for a defined time period. Address bar 1312 displays the current URL and may also wrap around to another line when a user is entering a long URL. Home icon 1334 allows a user to set a home webpage by a number of means, either through cellular phone 1310 or through an import function through a specialized portal, as would be appreciated by one of skill in the art. Favorites icon 1336 allows a user to save URLs for easy reference, and may also be set up through cellular phone 1310 or through an import function through a specialized portal, as would be appreciated by one of skill in the art. “V” icon 1338 allows for voice/speech URL entry. In some embodiments, the viewing application may interface with speech recognition capabilities of cellular phone 1310 for such voice/speech URL entries. - Back and
Forward icons 1322 allow a user to scroll through webpages for previously entered URLs. In some embodiments, Back and Forward icons 1322 may allow a user to scroll through webpage portion views. In some embodiments, such icons may be configurable to scroll either through prior webpages or through prior webpage portion views, based on user selection of the two options. “X” stop icon 1326 allows for stopping the loading of a portion of a webpage, and thus appears while a portion of a webpage is being loaded. In some embodiments, “X” stop icon 1326 may display during the loading of, and permit cessation of the loading of, an entire webpage. When a portion of a webpage, or a webpage, has loaded, “X” stop icon 1326 is replaced by “R” refresh icon 1324, to allow for a reloading of the portion of the webpage currently being viewed, or the reloading of an entire webpage, depending on the embodiment. - Zoom in/out
icons 1328 allow for zoom functionality and control. Zoom in/out icons 1328 may allow for multiple functions based on the duration of selecting the icon. For example, for zooming out, each tap may incrementally zoom out, while a long press may zoom out to a full-page view. Scroll icons 130 and 132 allow for scrolling in the client window. - Depending on the embodiment and functionality of
cellular phone 1310, the icons discussed herein may be touch-based for cellular phones with touch screen functionality, may be part of a menu-based user interface (softkeys), and/or may be selectable via scroll/selection buttons or any other input features of cellular phones. In some embodiments, a specialized touch pattern, e.g., a drag, etc., may be equated to a right-button mouse click, with such a parameter being passed to server 1302 for right-button mouse options to appear on webpage 1304 (and webpage portion 1308). In some embodiments, right-button mouse options may be locally rendered by the viewing application. - In some embodiments, the web browsing engine on
server 1302 may make use of small-screen rendering techniques to render all, or part, of webpage 1304 to fit within the width of the screen of cellular phone 1310. In some embodiments, the viewing application on cellular phone 1310 may allow for user selection of a small-screen rendering view, allowing the user to switch between multiple view formats known in the art. Intra-webpage views displayed on cellular phone 1310 may be tracked in some embodiments, allowing the center of a view to be maintained across multiple view formats. - One skilled in the art will recognize that the illustration of
FIG. 13 is merely an example, and that the invention may be practiced and implemented in many other ways. -
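The tap-versus-long-press zoom behavior described for zoom in/out icons 1328 can be sketched as a duration-to-action mapping. The threshold, zoom step, and full-page zoom value are illustrative assumptions only.

```python
def zoom_out_action(press_duration_ms: int, current_zoom: float,
                    step: float = 0.25, long_press_ms: int = 600,
                    full_page_zoom: float = 0.5) -> float:
    """Map press duration on a zoom-out icon to a new zoom factor.

    A quick tap zooms out by one increment; a long press jumps straight
    to a full-page view. All numeric values are hypothetical defaults.
    """
    if press_duration_ms >= long_press_ms:
        return full_page_zoom
    return max(full_page_zoom, current_zoom - step)

assert zoom_out_action(120, 1.5) == 1.25   # quick tap: one increment
assert zoom_out_action(800, 1.5) == 0.5    # long press: full-page view
```

The resulting zoom factor would then be relayed to the server as part of the viewport parameters, as described for the state managers above.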
FIG. 14 is a diagram illustrating some aspects of a user interface display, according to one embodiment. Webpage portion 1402 illustrates a collection of textboxes of checkout section 1401, specifically shipping information section 1403. Webpage portion 1402 appears as if displayed on a mobile device (not shown), such as a cellular phone, with toolbar 1439 appearing at the bottom of webpage portion 1402. In other embodiments, toolbar 1439 may appear elsewhere or may roll out of view, either by default or by user configuration. Shipping information section 1403 includes checkbox 1415 for selecting the ‘use billing address’ option (illustrated as unchecked), textbox 1416 for first name information 1404, textbox 1418 for last name information 1406, textbox 1420 for address information 1408, textbox 1422 for city information 1410, dropdown menu 1428 for state information 1412, textbox 1424 for zip-code information 1425, textbox 1426 for email information 1414, and radio buttons, including radio button 1434, for question 1430. - In the embodiment illustrated in
FIG. 14, information regarding the interactive elements was communicated along with webpage portion 1402 to the mobile device displaying webpage portion 1402. In another embodiment, information regarding an interactive element may be communicated to the viewing application once the user has selected a location on the webpage being displayed corresponding to the interactive element. Webpage portion 1440 will be used to illustrate further operations with respect to webpage portion 1402. In the case of a user bypassing the address-related textboxes shown in FIG. 14 by selecting checkbox 1415 (e.g., by tapping the location of the screen where checkbox 1415 is being displayed), a check appearing in checkbox 1415 (not shown) would be rendered on the backend server, in accordance with one embodiment. In another embodiment, the viewing application may locally render checkbox 1415 and the check, or just the check, depending on the embodiment. - When a user selects
textbox 1416 to enter their first name, the viewing application displaying webpage portion 1402 locally renders textbox outline 1442 and rolls out overlay textbox 1444 (the rolling feature indicated by arrow 1446 for illustration purposes). As the user inputs their name, the letters entered by the user are displayed in overlay textbox 1444. Once the user has completed the entry, for example by pressing an enter key on the mobile device or selecting another textbox on webpage portion 1440, the viewing application will communicate the information to a state manager on the backend server, with the state manager serving to synchronize client user input with the web browsing engine on the backend server. In another embodiment, an overlay textbox for character entry may be locally rendered directly where textbox 1416 is shown. In another embodiment, toolbar 1448 may transform into an overlay textbox for character entry. - In the embodiment illustrated in
FIG. 14, the viewing application received information regarding dropdown menu 1428 from the recognition engine residing on the backend server, including the available menu options (e.g., the items listed and the initial selection, or state, of the dropdown menu), together with the information to display webpage 1402. The viewing application then locally rendered dropdown menu 1428, and will locally render an expanded menu upon the user selecting dropdown menu 1428. Once the user makes a selection from the available menu options, the viewing application will communicate the information to a state manager on the backend server, with the state manager serving to synchronize client user input with the web browsing engine on the backend server. In other embodiments, other implementations may be used to locally render the selections available from a dropdown menu. - In the embodiment illustrated in
FIG. 14, the viewing application received information regarding the radio buttons together with the information to display webpage 1402. The viewing application then locally rendered the radio buttons, and will locally render radio button 1434 as selected upon the user tapping radio button 1434 (FIG. 14 illustrating an embodiment in which the mobile device includes a touch screen). Once a radio button selection is made, the viewing application will communicate the information to a state manager on the backend server, with the state manager serving to synchronize client user input with the web browsing engine on the backend server. In other embodiments, other implementations may remotely render the selection of a radio button. - One skilled in the art will recognize that the illustration of
FIG. 14 is merely an example, and that the invention may be practiced and implemented in many other ways. In various embodiments, one or more of the above interactive elements of a webpage (for example, a textbox, a radio button, a dropdown menu, etc.) may be rendered completely on the backend server. Additionally, in some embodiments, a state manager module within the viewing application may facilitate collecting user input and synchronizing such user input with user input provided to the backend web browsing engine. -
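The locally rendered interactive elements described for FIG. 14 — an overlay textbox whose committed contents are relayed to the backend state manager, and a dropdown rendered from server-supplied options — can be sketched together as below. All class and field names are assumptions for illustration, as is the dictionary message format sent to the server.

```python
from typing import Callable, List

class OverlayTextbox:
    """Overlay textbox synced to the backend state manager on commit.

    The max-length and password constraints mirror the kind of
    information the recognition engine may send for local policing.
    """
    def __init__(self, field_id: str, send_to_server: Callable[[dict], None],
                 max_len: int = 64, password: bool = False) -> None:
        self.field_id = field_id
        self._send = send_to_server
        self.max_len = max_len
        self.password = password
        self.buffer = ""

    def key(self, ch: str) -> None:
        # Locally police acceptable input before anything reaches the server.
        if len(self.buffer) < self.max_len:
            self.buffer += ch

    def display_text(self) -> str:
        # Password entries display as asterisks instead of cleartext.
        return "*" * len(self.buffer) if self.password else self.buffer

    def commit(self) -> None:
        # E.g. enter pressed or another textbox selected: synchronize
        # with the web browsing engine via the backend state manager.
        self._send({"field": self.field_id, "value": self.buffer})

class LocalDropdown:
    """Dropdown rendered locally from options sent with the page."""
    def __init__(self, field_id: str, options: List[str], initial_index: int,
                 send_to_server: Callable[[dict], None]) -> None:
        self.field_id = field_id
        self.options = options
        self.index = initial_index          # initial state from the server
        self._send = send_to_server
        self.expanded = False

    def tap(self) -> None:
        self.expanded = not self.expanded   # expand/collapse locally

    def select(self, index: int) -> None:
        self.index = index
        self.expanded = False
        self._send({"field": self.field_id, "selected": self.options[index]})

sent: List[dict] = []
box = OverlayTextbox("first_name", sent.append, password=True)
for ch in "Ada":
    box.key(ch)
assert box.display_text() == "***"
box.commit()
menu = LocalDropdown("state", ["AL", "AK", "AZ"], 0, sent.append)
menu.tap()
menu.select(2)
assert sent == [{"field": "first_name", "value": "Ada"},
                {"field": "state", "selected": "AZ"}]
```

Only committed values cross the network; keystroke-by-keystroke interaction stays local, which is the point of rendering these widgets on the device.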
FIG. 15 is a diagram illustrating some aspects of a user interface display, according to one embodiment. Webpage portion 1502 appears as if displayed on a mobile device (not shown), such as a cellular phone, with toolbar 1504 appearing on the side of webpage portion 1502. In other embodiments, toolbar 1504 may appear elsewhere or may roll out of view, either by default or by user configuration. Webpage portion 1502 illustrates news section 1506, including headline 1508, text section 1510, and webpage video 1512. - In the embodiment illustrated in
FIG. 15, information regarding the interactive element of webpage video 1512 was communicated along with webpage portion 1502 to the mobile device displaying webpage portion 1502. In another embodiment, information regarding webpage video 1512 may be communicated to the viewing application once the user has selected a location on the displayed image corresponding to webpage video 1512. Full screen video view 1518 will be used to illustrate an operation with respect to webpage portion 1502. When the user moves cursor icon 1514 over webpage video 1512, the viewing application locally rendering cursor icon 1514 will transform cursor icon 1514 to hand icon 1516 based on the information communicated from the backend server to the viewing application. In another embodiment, the viewing application may not make such a transformation. In other embodiments, a play icon may be displayed on webpage video 1512, either placed there by the viewing application or as displayed on the underlying webpage. - Once the user has selected
webpage video 1512, the viewing application illustrated in FIG. 15 will switch to full screen video view 1518, allowing for larger viewing of webpage video 1512 (shown as larger webpage video 1520 in full screen video view 1518). In the embodiment illustrated, the same underlying video transport, as discussed with respect to other Figures, is used both for displaying webpage portion 1502 and full screen video view 1518. Full screen video view 1518 includes video-control user interface 1522, the length of which corresponds to the length of webpage video 1520, viewed-portion segment 1526, and pause/play icon 1524, which may be used alternately for pausing or playing webpage video 1520, depending on the current play state of webpage video 1520. In another embodiment, the viewing application may not have a specialized video-control user interface for playing webpage video 1520. In other embodiments, the viewing application may have a different specialized video-control user interface for playing webpage video 1520. - In some embodiments, the viewing application may play
video 1512 as shown in webpage portion 1502, and not switch to full screen video view 1518. In some embodiments, whether video 1512 switches to full screen video view 1518 may be user-configurable. - One skilled in the art will recognize that the illustration of
FIG. 15 is merely an example, and that the invention may be practiced and implemented in many other ways. -
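The video-control user interface described for FIG. 15 — a pause/play icon whose action depends on the current play state, and a viewed-portion segment proportional to the video's length — can be sketched as a small controller. A hypothetical sketch; the class, field names, and time units are assumptions.

```python
class VideoControl:
    """Pause/play toggle and viewed-portion tracking for a video view."""
    def __init__(self, duration_s: float) -> None:
        self.duration_s = duration_s
        self.playing = False
        self.position_s = 0.0

    def toggle(self) -> bool:
        # Pause/play icon 1524 analogue: the action taken depends on
        # the current play state; returns the new state.
        self.playing = not self.playing
        return self.playing

    def advance(self, seconds: float) -> None:
        # Advance playback position only while playing.
        if self.playing:
            self.position_s = min(self.duration_s, self.position_s + seconds)

    def viewed_fraction(self) -> float:
        # Length of the viewed-portion segment relative to the control bar.
        return self.position_s / self.duration_s

ctrl = VideoControl(duration_s=120.0)
assert ctrl.toggle() is True       # first tap: play
ctrl.advance(30.0)
assert ctrl.viewed_fraction() == 0.25
assert ctrl.toggle() is False      # second tap: pause
ctrl.advance(30.0)                 # no effect while paused
assert ctrl.viewed_fraction() == 0.25
```

Because the same video transport carries both the page view and the full-screen view, only this local control state, not the transport, changes on the switch.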
FIG. 16 illustrates an example computer system suitable for use in association with a client-server architecture for remote interaction, according to one embodiment. As shown, computer system 1600 may represent either a computer operating as a server or a computer operating as a client, with the general components illustrated in FIG. 16 potentially varying with each respective representation, as would be appreciated by one of skill in the art. Computer system 1600 may include one or more processors 1602 and may include system memory 1604. Additionally, computer system 1600 may include storage 1606 in the form of one or more devices (such as a hard drive, an optical or another type of disk, or electronic memory, including flash memory, and so forth), input/output devices 1608 (such as a keyboard (screen-based or physical, in a variety of forms), scroll wheels, number pads, stylus-based inputs, a touchscreen or touchpad, etc.), and communication interfaces 1610 (to connect to a LAN, a WAN, a wired or wireless network, and so forth). The elements may be coupled to each other via system bus 1612, which may represent one or more buses. In the case where system bus 1612 represents multiple buses, the multiple buses may be bridged by one or more bus bridges (not shown). When representing client devices in some embodiments, processor(s) 1602 may comprise a controller, and system memory 1604 and storage 1606 may comprise one cohesive memory component. - These elements each perform their conventional functions known in the art. In various embodiments,
computing system 1600 may be at least partially incorporated in a larger computing system. System memory 1604 and storage 1606 may be employed to store a working copy and a permanent copy of the programming instructions implementing various aspects of the one or more earlier described embodiments of the present invention. Any software portions described herein need not include discrete software modules. Any software configuration described above is meant only by way of example; other configurations are contemplated by and within the scope of various embodiments of the present invention. The term "engine" is used herein to denote any software or hardware configuration, or combination thereof, that performs the function or functions referenced. In particular, the term "web browsing engine" is used herein to describe any software or hardware configuration, or combination thereof, that performs a web browsing function. - With respect to some embodiments of the invention, modules have been described to implement various functions. In alternate embodiments, part or all of the modules may be implemented in hardware, for example, using one or more Application Specific Integrated Circuits (ASICs) instead.
- In all of the foregoing, it is appreciated that such embodiments are stated only for the purpose of example, and that other embodiments could equally be provided without departing from the essential characteristics of the present invention.
- The present invention has been described in particular detail with respect to one possible embodiment. Those of skill in the art will appreciate that the invention may be practiced in other embodiments. First, the particular naming of the components, capitalization of terms, the attributes, data structures, or any other programming or structural aspect is not mandatory or significant, and the mechanisms that implement the invention or its features may have different names, formats, or protocols. Further, the system may be implemented via a combination of hardware and software, as described, or entirely in hardware elements. Also, the particular division of functionality between the various system components described herein is merely by way of example, and not mandatory; functions performed by a single system component may instead be performed by multiple components, and functions performed by multiple components may instead be performed by a single component.
- Some portions of the above description present the features of the present invention in terms of algorithms and symbolic representations of operations on information. These algorithmic descriptions and representations are the means used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. These operations, while described functionally or logically, are understood to be implemented by computer programs. Furthermore, it has also proven convenient at times to refer to these arrangements of operations as modules or by functional names, without loss of generality.
- Unless specifically stated otherwise as apparent from the above discussion, it is appreciated that throughout the description, discussions utilizing terms such as “determining” or “displaying” or the like, refer to the action and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the computer system memories or registers or other such information storage, transmission or display devices.
- Certain aspects of the present invention include process steps and instructions described herein in the form of an algorithm. It should be noted that the process steps and instructions of the present invention could be embodied in software, firmware or hardware, and when embodied in software, could be downloaded to reside on and be operated from different platforms used by real time network operating systems.
- The present invention also relates to an apparatus for performing the operations herein. This apparatus may be specially constructed for the required purposes, or it may include a computer (including any type of computer, depending on various embodiments, including a server, personal computer, tablet device, handheld computer, PDA, cellular phone, etc.) selectively activated or reconfigured by a computer program stored on a computer readable medium that can be accessed by the computer. Such a computer program may be stored in a computer readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, magnetic-optical disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, application specific integrated circuits (ASICs), or any type of media suitable for storing electronic instructions, each coupled to a computer system bus. Furthermore, the computers referred to in the specification may include a single processor or may be architectures employing multiple processor designs, including multi-core designs, for increased computing capability.
- The algorithms and operations presented herein are not inherently related to any particular computer or other apparatus. Various general-purpose systems may also be used with programs in accordance with the teachings herein, or it may prove convenient to construct more specialized apparatus to perform the required method steps. The required structure for a variety of these systems will be apparent to those of skill in the art, along with equivalent variations. In addition, the present invention is not described with reference to any particular programming language. It is appreciated that a variety of programming languages may be used to implement the teachings of the present invention as described herein.
- The present invention is well suited to a wide variety of computer network systems over numerous topologies. Within this field, the configuration and management of large networks include storage devices and computers that are communicatively coupled to dissimilar computers and storage devices over a network, such as the Internet.
- Finally, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the inventive subject matter. Accordingly, the disclosure of the present invention is intended to be illustrative, but not limiting, of the scope of the invention, which is set forth in the following claims.
Claims (25)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/020,472 US20080184128A1 (en) | 2007-01-25 | 2008-01-25 | Mobile device user interface for remote interaction |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US88657707P | 2007-01-25 | 2007-01-25 | |
US12/020,472 US20080184128A1 (en) | 2007-01-25 | 2008-01-25 | Mobile device user interface for remote interaction |
Publications (1)
Publication Number | Publication Date |
---|---|
US20080184128A1 true US20080184128A1 (en) | 2008-07-31 |
Family
ID=39645203
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/020,361 Active 2031-08-31 US8630512B2 (en) | 2007-01-25 | 2008-01-25 | Dynamic client-server video tiling streaming |
US12/020,472 Abandoned US20080184128A1 (en) | 2007-01-25 | 2008-01-25 | Mobile device user interface for remote interaction |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/020,361 Active 2031-08-31 US8630512B2 (en) | 2007-01-25 | 2008-01-25 | Dynamic client-server video tiling streaming |
Country Status (2)
Country | Link |
---|---|
US (2) | US8630512B2 (en) |
WO (2) | WO2008092131A2 (en) |
Cited By (93)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080222520A1 (en) * | 2007-03-08 | 2008-09-11 | Adobe Systems Incorporated | Event-Sensitive Content for Mobile Devices |
US20100042678A1 (en) * | 2008-08-12 | 2010-02-18 | Samsung Electronics Co., Ltd. | Method and apparatus for providing/receiving user interface using user interface directory |
US20100100735A1 (en) * | 2008-10-17 | 2010-04-22 | Qualcomm Incorporated | Apparatus and method for providing a portable broadband service using a wireless convergence platform |
US20100115458A1 (en) * | 2008-10-26 | 2010-05-06 | Adam Marano | Panning a native display on a mobile computing device to a window, interpreting a gesture-based instruction to scroll contents of the window, and wrapping text on the window |
US7743339B1 (en) * | 2007-02-01 | 2010-06-22 | Adobe Systems Incorporated | Rendering text in a brew device |
US20100281156A1 (en) * | 2009-05-04 | 2010-11-04 | Kies Jonathan K | System and method of recording and sharing mobile application activities |
US20100325565A1 (en) * | 2009-06-17 | 2010-12-23 | EchoStar Technologies, L.L.C. | Apparatus and methods for generating graphical interfaces |
US20110106874A1 (en) * | 2009-11-03 | 2011-05-05 | Oto Technologies, Llc | System and method for redirecting client-side storage operations |
US20110145723A1 (en) * | 2009-12-16 | 2011-06-16 | Oto Technologies, Llc | System and method for redirecting client-side storage operations |
WO2011123859A1 (en) * | 2010-04-02 | 2011-10-06 | Skyfire Labs, Inc. | Assisted hybrid mobile browser |
US20110258530A1 (en) * | 2010-04-19 | 2011-10-20 | Se Yoon Jang | Mobile terminal and controlling method thereof |
US20110310011A1 (en) * | 2010-06-22 | 2011-12-22 | Hsni, Llc | System and method for integrating an electronic pointing device into digital image data |
US20120089926A1 (en) * | 2010-10-06 | 2012-04-12 | International Business Machines Corporation | Instant Messaging with Browser Collaboration |
US20120096072A1 (en) * | 2010-10-15 | 2012-04-19 | Samsung Electronics Co., Ltd. | Method and apparatus for updating user interface |
US8239773B1 (en) | 2008-10-28 | 2012-08-07 | United Services Automobile Association (Usaa) | Systems and methods for co-browsing on a mobile device |
US20120221682A1 (en) * | 2009-10-28 | 2012-08-30 | Nec Corporation | Remote mobile communication system and remote mobile communication method |
US20130007625A1 (en) * | 2011-03-21 | 2013-01-03 | Victor Lozinski | Apparatus, system, and method for connecting mobile devices to a backend server in an enterprise software environment and initiating a business process |
US20130058213A1 (en) * | 2010-05-10 | 2013-03-07 | Nec Corporation | Remote mobile communication system, server device, and remote mobile communication system control method |
US20130066950A1 (en) * | 2010-06-01 | 2013-03-14 | Zte Corporation | Service Development Platform, System and Method Thereof |
US20130104032A1 (en) * | 2011-10-19 | 2013-04-25 | Jiyoun Lee | Mobile terminal and method of controlling the same |
US20130121586A1 (en) * | 2010-07-26 | 2013-05-16 | Koninklijke Philips Electronics N.V. | Determining representative images for a video |
US8577963B2 (en) | 2011-06-30 | 2013-11-05 | Amazon Technologies, Inc. | Remote browsing session between client browser and network based browser |
US8589385B2 (en) | 2011-09-27 | 2013-11-19 | Amazon Technologies, Inc. | Historical browsing session management |
US8615431B1 (en) | 2011-09-29 | 2013-12-24 | Amazon Technologies, Inc. | Network content message placement management |
US8627195B1 (en) | 2012-01-26 | 2014-01-07 | Amazon Technologies, Inc. | Remote browsing and searching |
WO2014018581A1 (en) * | 2012-07-27 | 2014-01-30 | Microsoft Corporation | Web browser having user-configurable address bar button |
US20140092084A1 (en) * | 2012-08-28 | 2014-04-03 | Tencent Technology (Shenzhen) Company Limited | Webpage display method and apparatus |
US8706860B2 (en) | 2011-06-30 | 2014-04-22 | Amazon Technologies, Inc. | Remote browsing session management |
US8799412B2 (en) | 2011-06-30 | 2014-08-05 | Amazon Technologies, Inc. | Remote browsing session management |
US8832288B1 (en) * | 2012-07-13 | 2014-09-09 | Google Inc. | Transitions between remotely cached and live versions of a webpage |
US8839087B1 (en) | 2012-01-26 | 2014-09-16 | Amazon Technologies, Inc. | Remote browsing and searching |
US8849802B2 (en) | 2011-09-27 | 2014-09-30 | Amazon Technologies, Inc. | Historical browsing session management |
US8914514B1 (en) | 2011-09-27 | 2014-12-16 | Amazon Technologies, Inc. | Managing network based content |
US20140379857A1 (en) * | 2013-06-24 | 2014-12-25 | Samsung Electronics Co., Ltd. | Method and apparatus for providing content with streaming |
US20150007024A1 (en) * | 2013-06-28 | 2015-01-01 | Samsung Electronics Co., Ltd. | Method and apparatus for generating image file |
US8943197B1 (en) | 2012-08-16 | 2015-01-27 | Amazon Technologies, Inc. | Automated content update notification |
US8972477B1 (en) | 2011-12-01 | 2015-03-03 | Amazon Technologies, Inc. | Offline browsing session management |
US9002982B2 (en) | 2013-03-11 | 2015-04-07 | Amazon Technologies, Inc. | Automated desktop placement |
US9009334B1 (en) | 2011-12-09 | 2015-04-14 | Amazon Technologies, Inc. | Remote browsing session management |
US9037975B1 (en) | 2012-02-10 | 2015-05-19 | Amazon Technologies, Inc. | Zooming interaction tracking and popularity determination |
US9037696B2 (en) | 2011-08-16 | 2015-05-19 | Amazon Technologies, Inc. | Managing information associated with network resources |
US9087024B1 (en) | 2012-01-26 | 2015-07-21 | Amazon Technologies, Inc. | Narration of network content |
US9092405B1 (en) | 2012-01-26 | 2015-07-28 | Amazon Technologies, Inc. | Remote browsing and searching |
US20150220502A1 (en) * | 2014-01-31 | 2015-08-06 | Yahoo! Inc. | Compressed serialization of data for communication from a client-side application |
US20150227548A1 (en) * | 2010-01-22 | 2015-08-13 | Microsoft Technology Licensing, Llc | Storing temporary state data in separate containers |
US9117002B1 (en) | 2011-12-09 | 2015-08-25 | Amazon Technologies, Inc. | Remote browsing session management |
US9118761B1 (en) | 2010-12-20 | 2015-08-25 | United Services Automobile Association (Usaa) | Computing device assistance for phone based customer service representative interaction |
US9137210B1 (en) | 2012-02-21 | 2015-09-15 | Amazon Technologies, Inc. | Remote browsing session management |
US9148350B1 (en) | 2013-03-11 | 2015-09-29 | Amazon Technologies, Inc. | Automated data synchronization |
US20150279336A1 (en) * | 2014-04-01 | 2015-10-01 | Seiko Epson Corporation | Bidirectional display method and bidirectional display device |
US9152970B1 (en) | 2011-09-27 | 2015-10-06 | Amazon Technologies, Inc. | Remote co-browsing session management |
US9164963B2 (en) | 2006-12-05 | 2015-10-20 | Adobe Systems Incorporated | Embedded document within an application |
US9178955B1 (en) | 2011-09-27 | 2015-11-03 | Amazon Technologies, Inc. | Managing network based content |
US9183258B1 (en) | 2012-02-10 | 2015-11-10 | Amazon Technologies, Inc. | Behavior based processing of content |
US9195768B2 (en) | 2011-08-26 | 2015-11-24 | Amazon Technologies, Inc. | Remote browsing session management |
US9208316B1 (en) | 2012-02-27 | 2015-12-08 | Amazon Technologies, Inc. | Selective disabling of content portions |
US9298843B1 (en) * | 2011-09-27 | 2016-03-29 | Amazon Technologies, Inc. | User agent information management |
US9307004B1 (en) | 2012-03-28 | 2016-04-05 | Amazon Technologies, Inc. | Prioritized content transmission |
US9313100B1 (en) | 2011-11-14 | 2016-04-12 | Amazon Technologies, Inc. | Remote browsing session management |
US9330188B1 (en) | 2011-12-22 | 2016-05-03 | Amazon Technologies, Inc. | Shared browsing sessions |
US9336321B1 (en) | 2012-01-26 | 2016-05-10 | Amazon Technologies, Inc. | Remote browsing and searching |
US9374244B1 (en) | 2012-02-27 | 2016-06-21 | Amazon Technologies, Inc. | Remote browsing session management |
US9383958B1 (en) | 2011-09-27 | 2016-07-05 | Amazon Technologies, Inc. | Remote co-browsing session management |
US9460220B1 (en) | 2012-03-26 | 2016-10-04 | Amazon Technologies, Inc. | Content selection based on target device characteristics |
US20160342585A1 (en) * | 2015-05-18 | 2016-11-24 | Google Inc. | Coordinated user word selection for translation and obtaining of contextual information for the selected word |
US9509783B1 (en) | 2012-01-26 | 2016-11-29 | Amazon Technologies, Inc. | Customized browser images |
US20160353118A1 (en) * | 2015-06-01 | 2016-12-01 | Apple Inc. | Bandwidth Management in Devices with Simultaneous Download of Multiple Data Streams |
US9578137B1 (en) | 2013-06-13 | 2017-02-21 | Amazon Technologies, Inc. | System for enhancing script execution performance |
US9621406B2 (en) | 2011-06-30 | 2017-04-11 | Amazon Technologies, Inc. | Remote browsing session management |
US9635041B1 (en) | 2014-06-16 | 2017-04-25 | Amazon Technologies, Inc. | Distributed split browser content inspection and analysis |
US9641637B1 (en) | 2011-09-27 | 2017-05-02 | Amazon Technologies, Inc. | Network resource optimization |
US9680897B2 (en) | 2014-01-31 | 2017-06-13 | Yahoo! Inc. | Throttled scanning for optimized compression of network communicated data |
US9772979B1 (en) | 2012-08-08 | 2017-09-26 | Amazon Technologies, Inc. | Reproducing user browsing sessions |
US20180275765A1 (en) * | 2013-11-18 | 2018-09-27 | Amazon Technologies, Inc. | Account management services for load balancers |
US10089403B1 (en) | 2011-08-31 | 2018-10-02 | Amazon Technologies, Inc. | Managing network based storage |
US10142406B2 (en) | 2013-03-11 | 2018-11-27 | Amazon Technologies, Inc. | Automated data center selection |
US10152463B1 (en) | 2013-06-13 | 2018-12-11 | Amazon Technologies, Inc. | System for profiling page browsing interactions |
US10296558B1 (en) | 2012-02-27 | 2019-05-21 | Amazon Technologies, Inc. | Remote generation of composite content pages |
US10313345B2 (en) | 2013-03-11 | 2019-06-04 | Amazon Technologies, Inc. | Application marketplace for virtual desktops |
US10623243B2 (en) | 2013-06-26 | 2020-04-14 | Amazon Technologies, Inc. | Management of computing sessions |
US10664538B1 (en) | 2017-09-26 | 2020-05-26 | Amazon Technologies, Inc. | Data security and data access auditing for network accessible content |
US10686646B1 (en) | 2013-06-26 | 2020-06-16 | Amazon Technologies, Inc. | Management of computing sessions |
US10693991B1 (en) | 2011-09-27 | 2020-06-23 | Amazon Technologies, Inc. | Remote browsing session management |
US10726095B1 (en) | 2017-09-26 | 2020-07-28 | Amazon Technologies, Inc. | Network content layout using an intermediary system |
US10754242B2 (en) | 2017-06-30 | 2020-08-25 | Apple Inc. | Adaptive resolution and projection format in multi-direction video |
US10769353B2 (en) | 2014-01-31 | 2020-09-08 | Oath Inc. | Dynamic streaming content provided by server and client-side tracking application |
KR20200126949A (en) * | 2020-10-26 | 2020-11-09 | 삼성전자주식회사 | Image file generating method and apparatus thereof |
US10924747B2 (en) | 2017-02-27 | 2021-02-16 | Apple Inc. | Video coding techniques for multi-view video |
US10999602B2 (en) | 2016-12-23 | 2021-05-04 | Apple Inc. | Sphere projected motion estimation/compensation and mode decision |
US11093752B2 (en) | 2017-06-02 | 2021-08-17 | Apple Inc. | Object tracking in multi-view video |
US11259046B2 (en) | 2017-02-15 | 2022-02-22 | Apple Inc. | Processing of equirectangular object data to compensate for distortion by spherical projections |
US20230215583A1 (en) * | 2021-06-14 | 2023-07-06 | Preh Holding, Llc | Connected body surface care module |
US11880425B2 (en) | 2021-04-02 | 2024-01-23 | Content Square SAS | System and method for identifying and correcting webpage zone target misidentifications |
Families Citing this family (54)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100788698B1 (en) * | 2006-07-13 | 2007-12-26 | 삼성전자주식회사 | Display service method and network device capable of performing the method, and storage medium thereof |
US9846750B2 (en) * | 2007-11-30 | 2017-12-19 | Apple Inc. | Adding tiles to a graphical user interface |
US20100050221A1 (en) * | 2008-06-20 | 2010-02-25 | Mccutchen David J | Image Delivery System with Image Quality Varying with Frame Rate |
US20090320081A1 (en) * | 2008-06-24 | 2009-12-24 | Chui Charles K | Providing and Displaying Video at Multiple Resolution and Quality Levels |
US11647243B2 (en) * | 2009-06-26 | 2023-05-09 | Seagate Technology Llc | System and method for using an application on a mobile device to transfer internet media content |
US9361130B2 (en) | 2010-05-03 | 2016-06-07 | Apple Inc. | Systems, methods, and computer program products providing an integrated user interface for reading content |
US20110298689A1 (en) * | 2010-06-03 | 2011-12-08 | Microsoft Corporation | Device for Sharing Photographs in Social Settings |
US20110311144A1 (en) * | 2010-06-17 | 2011-12-22 | Microsoft Corporation | Rgb/depth camera for improving speech recognition |
US8850075B2 (en) * | 2011-07-06 | 2014-09-30 | Microsoft Corporation | Predictive, multi-layer caching architectures |
US8819296B2 (en) | 2011-11-17 | 2014-08-26 | Nokia Corporation | Apparatus, a method and a computer program |
US9064447B2 (en) * | 2011-12-13 | 2015-06-23 | Vmware, Inc. | Methods and devices for filtering and displaying data |
US8959431B2 (en) * | 2012-01-16 | 2015-02-17 | Microsoft Corporation | Low resolution placeholder content for document navigation |
KR20140055132A (en) * | 2012-10-30 | 2014-05-09 | 삼성전자주식회사 | Method and apparatus for processing webpage in terminal using cloud server |
US10095663B2 (en) | 2012-11-14 | 2018-10-09 | Amazon Technologies, Inc. | Delivery and display of page previews during page retrieval events |
US20150128030A1 (en) * | 2013-11-01 | 2015-05-07 | M/s. MobileMotion Technologies Private Limited | Method for inline image resizing |
US9582160B2 (en) | 2013-11-14 | 2017-02-28 | Apple Inc. | Semi-automatic organic layout for media streams |
US9489104B2 (en) | 2013-11-14 | 2016-11-08 | Apple Inc. | Viewable frame identification |
CN103646656B (en) * | 2013-11-29 | 2016-05-04 | 腾讯科技(成都)有限公司 | Sound effect treatment method, device, plugin manager and audio plug-in unit |
GB2521407B (en) * | 2013-12-18 | 2019-02-27 | Displaylink Uk Ltd | Display system |
US20150254806A1 (en) * | 2014-03-07 | 2015-09-10 | Apple Inc. | Efficient Progressive Loading Of Media Items |
US11169666B1 (en) | 2014-05-22 | 2021-11-09 | Amazon Technologies, Inc. | Distributed content browsing system using transferred hardware-independent graphics commands |
GB2530751A (en) * | 2014-09-30 | 2016-04-06 | Sony Corp | Video data encoding and decoding |
US9785332B1 (en) | 2014-12-05 | 2017-10-10 | Amazon Technologies, Inc. | Conserving processing resources by controlling updates to damaged tiles of a content page |
US10546038B2 (en) * | 2014-12-08 | 2020-01-28 | Amazon Technologies, Inc. | Intelligent browser-based display tiling |
US10410398B2 (en) * | 2015-02-20 | 2019-09-10 | Qualcomm Incorporated | Systems and methods for reducing memory bandwidth using low quality tiles |
US10735512B2 (en) * | 2015-02-23 | 2020-08-04 | MyGnar, Inc. | Managing data |
US10506244B2 (en) * | 2015-03-06 | 2019-12-10 | Qualcomm Incorporated | Method and apparatus for video coding using adaptive tile sizes |
USD812076S1 (en) | 2015-06-14 | 2018-03-06 | Google Llc | Display screen with graphical user interface for monitoring remote video camera |
USD803241S1 (en) | 2015-06-14 | 2017-11-21 | Google Inc. | Display screen with animated graphical user interface for an alert screen |
US9361011B1 (en) | 2015-06-14 | 2016-06-07 | Google Inc. | Methods and systems for presenting multiple live video feeds in a user interface |
US10133443B2 (en) | 2015-06-14 | 2018-11-20 | Google Llc | Systems and methods for smart home automation using a multifunction status and entry point icon |
US10242119B1 (en) * | 2015-09-28 | 2019-03-26 | Amazon Technologies, Inc. | Systems and methods for displaying web content |
GB2543064B (en) | 2015-10-06 | 2018-08-22 | Displaylink Uk Ltd | Managing display data |
EP3182302B1 (en) * | 2015-12-14 | 2021-12-15 | Samsung Electronics Co., Ltd. | Apparatus and method for sharing state information of web browser in electronic device |
FR3049142A1 (en) * | 2016-03-16 | 2017-09-22 | Orange | ACQUIRING EXTRACTS FROM A MULTIMEDIA STREAM ON A TERMINAL |
TWI559753B (en) * | 2016-03-16 | 2016-11-21 | 晶睿通訊股份有限公司 | Method for transmitting a video on demand |
EP3446219B1 (en) * | 2016-04-22 | 2022-11-09 | Vertigo Media, Inc. | System and method for enhancing data handling in a network environment |
AU2017252566B2 (en) * | 2016-04-22 | 2022-01-27 | Sgph, Llc | System and method for enhancing data handling in a network environment |
ES2728292T3 (en) * | 2016-05-17 | 2019-10-23 | Nolve Dev S L | Server and method to provide secure access to network-based services |
USD882583S1 (en) | 2016-07-12 | 2020-04-28 | Google Llc | Display screen with graphical user interface |
US10263802B2 (en) | 2016-07-12 | 2019-04-16 | Google Llc | Methods and devices for establishing connections with remote cameras |
US10386999B2 (en) | 2016-10-26 | 2019-08-20 | Google Llc | Timeline-video relationship presentation for alert events |
USD843398S1 (en) | 2016-10-26 | 2019-03-19 | Google Llc | Display screen with graphical user interface for a timeline-video relationship presentation for alert events |
US11238290B2 (en) * | 2016-10-26 | 2022-02-01 | Google Llc | Timeline-video relationship processing for alert events |
EP3334164B1 (en) * | 2016-12-09 | 2019-08-21 | Nokia Technologies Oy | A method and an apparatus and a computer program product for video encoding and decoding |
US10362241B2 (en) * | 2016-12-30 | 2019-07-23 | Microsoft Technology Licensing, Llc | Video stream delimiter for combined frame |
US10819921B2 (en) | 2017-05-25 | 2020-10-27 | Google Llc | Camera assembly having a single-piece cover element |
US10683962B2 (en) | 2017-05-25 | 2020-06-16 | Google Llc | Thermal management for a compact electronic device |
US10972685B2 (en) | 2017-05-25 | 2021-04-06 | Google Llc | Video camera assembly having an IR reflector |
CN108833976B (en) * | 2018-06-27 | 2020-01-24 | 深圳看到科技有限公司 | Method and device for evaluating picture quality after dynamic cut-stream of panoramic video |
US10880434B2 (en) * | 2018-11-05 | 2020-12-29 | Nice Ltd | Method and system for creating a fragmented video recording of events on a screen using serverless computing |
US11388288B2 (en) | 2018-11-05 | 2022-07-12 | Nice Ltd. | Systems and methods for parallel recording of events on a screen of a computer |
US10939139B2 (en) | 2018-11-29 | 2021-03-02 | Apple Inc. | Adaptive coding and streaming of multi-directional video |
US11956295B2 (en) | 2019-09-27 | 2024-04-09 | Apple Inc. | Client-end enhanced view prediction for multi-view video streaming exploiting pre-fetched data and side information |
Citations (87)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5708511A (en) * | 1995-03-24 | 1998-01-13 | Eastman Kodak Company | Method for adaptively compressing residual digital image data in a DPCM compression system |
US5727159A (en) * | 1996-04-10 | 1998-03-10 | Kikinis; Dan | System in which a Proxy-Server translates information received from the Internet into a form/format readily usable by low power portable computers |
US5821915A (en) * | 1995-10-11 | 1998-10-13 | Hewlett-Packard Company | Method and apparatus for removing artifacts from scanned halftone images |
US6008847A (en) * | 1996-04-08 | 1999-12-28 | Connectix Corporation | Temporal compression and decompression for video |
US6038257A (en) * | 1997-03-12 | 2000-03-14 | Telefonaktiebolaget L M Ericsson | Motion and still video picture transmission and display |
US6266817B1 (en) * | 1995-04-18 | 2001-07-24 | Sun Microsystems, Inc. | Decoder for a software-implemented end-to-end scalable video delivery system |
US6275534B1 (en) * | 1997-03-19 | 2001-08-14 | Nec Corporation | Moving picture transmission system and moving picture transmission apparatus used therein |
US6282240B1 (en) * | 1997-09-03 | 2001-08-28 | Oki Electric Industry Co., Ltd. | Picture coder, picture decoder, and transmission system |
US6285791B1 (en) * | 1996-12-09 | 2001-09-04 | Telecom Finland Oy | Transmission method for video or moving pictures by compressing block differences |
US6292834B1 (en) * | 1997-03-14 | 2001-09-18 | Microsoft Corporation | Dynamic bandwidth selection for efficient transmission of multimedia streams in a computer network |
US20020015532A1 (en) * | 1997-07-28 | 2002-02-07 | Physical Optics Corporation | Method of isomorphic singular manifold projection still/video imagery compression |
US6366298B1 (en) * | 1999-06-03 | 2002-04-02 | Netzero, Inc. | Monitoring of individual internet usage |
US20020041629A1 (en) * | 2000-06-30 | 2002-04-11 | Miska Hannuksela | Video error resilience |
US20020059368A1 (en) * | 2000-01-07 | 2002-05-16 | Soneticom, Inc. | Wireless remote computer interface system |
US6397230B1 (en) * | 1996-02-09 | 2002-05-28 | Geo Interactive Media Group, Ltd. | Real-time multimedia transmission |
US20020067353A1 (en) * | 2000-12-04 | 2002-06-06 | Kenyon Jeremy A. | Method and apparatus for distributing and displaying maps electronically |
US20020122491A1 (en) * | 2001-01-03 | 2002-09-05 | Marta Karczewicz | Video decoder architecture and method for using same |
US20020146074A1 (en) * | 2001-02-20 | 2002-10-10 | Cute Ltd. | Unequal error protection of variable-length data packets based on recursive systematic convolutional coding |
US6496203B1 (en) * | 1998-05-27 | 2002-12-17 | Microsoft Corporation | Standardized and application-independent graphical user interface components implemented with web technology |
US20030020722A1 (en) * | 2001-07-06 | 2003-01-30 | Mikio Miura | Image display apparatus |
US20030039312A1 (en) * | 2001-08-23 | 2003-02-27 | Michael Horowitz | System and method for video error concealment |
US6529552B1 (en) * | 1999-02-16 | 2003-03-04 | Packetvideo Corporation | Method and a device for transmission of a variable bit-rate compressed video bitstream over constant and variable capacity networks |
US20030046708A1 (en) * | 2001-08-28 | 2003-03-06 | Jutzi Curtis E. | Error correction for regional and dynamic factors in communications |
US20030079222A1 (en) * | 2000-10-06 | 2003-04-24 | Boykin Patrick Oscar | System and method for distributing perceptually encrypted encoded files of music and movies |
US6563517B1 (en) * | 1998-10-02 | 2003-05-13 | International Business Machines Corp. | Automatic data quality adjustment to reduce response time in browsing |
US6578201B1 (en) * | 1998-11-20 | 2003-06-10 | Diva Systems Corporation | Multimedia stream incorporating interactive support for multiple types of subscriber terminals |
US6584493B1 (en) * | 1999-03-02 | 2003-06-24 | Microsoft Corporation | Multiparty conferencing and collaboration system utilizing a per-host model command, control and communication structure |
US20030122954A1 (en) * | 1998-01-01 | 2003-07-03 | Kassatly L. Samuel Anthony | Video camera and method and device for capturing video, audio and data signals |
US20030132957A1 (en) * | 2002-01-15 | 2003-07-17 | International Business Machines Corporation | System for recording world wide web browsing sessions navigation on a real-time basis and for subsequently displaying the recorded sessions as surrogate browsing sessions with user enabled real-time modification |
US20030138050A1 (en) * | 2001-03-30 | 2003-07-24 | Yoshihisa Yamada | Dynamic image receiver and dynamic image transmitter |
US20030177269A1 (en) * | 2002-03-14 | 2003-09-18 | Robinson Ian N. | Method and system that tailors format of transmission to suit client capabilities and link characteristics |
US20030198184A1 (en) * | 2001-08-31 | 2003-10-23 | Joe Huang | Method of dynamically determining real-time multimedia streaming rate over a communications networks |
US20030227977A1 (en) * | 2002-05-29 | 2003-12-11 | Canon Kabushiki Kaisha | Method and device for selecting a transcoding method from a set of transcoding methods |
US20040022322A1 (en) * | 2002-07-19 | 2004-02-05 | Meetrix Corporation | Assigning prioritization during encode of independently compressed objects |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6624761B2 (en) * | 1998-12-11 | 2003-09-23 | Realtime Data, Llc | Content independent data compression method and system |
US6601104B1 (en) * | 1999-03-11 | 2003-07-29 | Realtime Data Llc | System and methods for accelerated data storage and retrieval |
US6604158B1 (en) * | 1999-03-11 | 2003-08-05 | Realtime Data, Llc | System and methods for accelerated data storage and retrieval |
US6711294B1 (en) * | 1999-03-31 | 2004-03-23 | International Business Machines Corporation | Method and apparatus for reducing image data storage and processing based on device supported compression techniques |
US7095782B1 (en) * | 2000-03-01 | 2006-08-22 | Koninklijke Philips Electronics N.V. | Method and apparatus for streaming scalable video |
JP2003259310A (en) | 2002-02-28 | 2003-09-12 | Mitsubishi Electric Corp | Menu apparatus for video/audio recording and reproducing medium |
GB0303888D0 (en) * | 2003-02-19 | 2003-03-26 | Sec Dep Acting Through Ordnanc | Image streaming |
JP4998775B2 (en) | 2004-02-24 | 2012-08-15 | 日本電気株式会社 | Information distribution system and method, information distribution apparatus, receiving terminal, and information relay apparatus |
EP1585326A1 (en) * | 2004-03-30 | 2005-10-12 | Matsushita Electric Industrial Co., Ltd. | Motion vector estimation at image borders for frame rate conversion |
JP2006270690A (en) | 2005-03-25 | 2006-10-05 | Funai Electric Co Ltd | Data transmission system |
- 2008-01-25 US US12/020,361 patent/US8630512B2/en active Active
- 2008-01-25 US US12/020,472 patent/US20080184128A1/en not_active Abandoned
- 2008-01-25 WO PCT/US2008/052129 patent/WO2008092131A2/en active Application Filing
- 2008-01-25 WO PCT/US2008/052092 patent/WO2008092104A2/en active Application Filing
Patent Citations (90)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5708511A (en) * | 1995-03-24 | 1998-01-13 | Eastman Kodak Company | Method for adaptively compressing residual digital image data in a DPCM compression system |
US6266817B1 (en) * | 1995-04-18 | 2001-07-24 | Sun Microsystems, Inc. | Decoder for a software-implemented end-to-end scalable video delivery system |
US5821915A (en) * | 1995-10-11 | 1998-10-13 | Hewlett-Packard Company | Method and apparatus for removing artifacts from scanned halftone images |
US6397230B1 (en) * | 1996-02-09 | 2002-05-28 | Geo Interactive Media Group, Ltd. | Real-time multimedia transmission |
US6008847A (en) * | 1996-04-08 | 1999-12-28 | Connectix Corporation | Temporal compression and decompression for video |
US5727159A (en) * | 1996-04-10 | 1998-03-10 | Kikinis; Dan | System in which a Proxy-Server translates information received from the Internet into a form/format readily usable by low power portable computers |
US6285791B1 (en) * | 1996-12-09 | 2001-09-04 | Telecom Finland Oy | Transmission method for video or moving pictures by compressing block differences |
US6038257A (en) * | 1997-03-12 | 2000-03-14 | Telefonaktiebolaget L M Ericsson | Motion and still video picture transmission and display |
US6292834B1 (en) * | 1997-03-14 | 2001-09-18 | Microsoft Corporation | Dynamic bandwidth selection for efficient transmission of multimedia streams in a computer network |
US6275534B1 (en) * | 1997-03-19 | 2001-08-14 | Nec Corporation | Moving picture transmission system and moving picture transmission apparatus used therein |
US20020015532A1 (en) * | 1997-07-28 | 2002-02-07 | Physical Optics Corporation | Method of isomorphic singular manifold projection still/video imagery compression |
US6282240B1 (en) * | 1997-09-03 | 2001-08-28 | Oki Electric Industry Co., Ltd. | Picture coder, picture decoder, and transmission system |
US20030122954A1 (en) * | 1998-01-01 | 2003-07-03 | Kassatly L. Samuel Anthony | Video camera and method and device for capturing video, audio and data signals |
US7257158B1 (en) * | 1998-05-18 | 2007-08-14 | Kendyl A. Román | System for transmitting video images over a computer network to a remote receiver |
US6496203B1 (en) * | 1998-05-27 | 2002-12-17 | Microsoft Corporation | Standardized and application-independent graphical user interface components implemented with web technology |
US6563517B1 (en) * | 1998-10-02 | 2003-05-13 | International Business Machines Corp. | Automatic data quality adjustment to reduce response time in browsing |
US6578201B1 (en) * | 1998-11-20 | 2003-06-10 | Diva Systems Corporation | Multimedia stream incorporating interactive support for multiple types of subscriber terminals |
US6529552B1 (en) * | 1999-02-16 | 2003-03-04 | Packetvideo Corporation | Method and a device for transmission of a variable bit-rate compressed video bitstream over constant and variable capacity networks |
US6584493B1 (en) * | 1999-03-02 | 2003-06-24 | Microsoft Corporation | Multiparty conferencing and collaboration system utilizing a per-host model command, control and communication structure |
US6366298B1 (en) * | 1999-06-03 | 2002-04-02 | Netzero, Inc. | Monitoring of individual internet usage |
US20050283734A1 (en) * | 1999-10-29 | 2005-12-22 | Surfcast, Inc., A Delaware Corporation | System and method for simultaneous display of multiple information sources |
US20040083236A1 (en) * | 1999-11-18 | 2004-04-29 | Rust David Bradley | System and method for application viewing through collaborative web browsing session |
US20020059368A1 (en) * | 2000-01-07 | 2002-05-16 | Soneticom, Inc. | Wireless remote computer interface system |
US20050100233A1 (en) * | 2000-06-06 | 2005-05-12 | Noriko Kajiki | Method and system for compressing motion image information |
US20050132286A1 (en) * | 2000-06-12 | 2005-06-16 | Rohrabaugh Gary B. | Resolution independent vector display of internet content |
US20020041629A1 (en) * | 2000-06-30 | 2002-04-11 | Miska Hannuksela | Video error resilience |
US7116843B1 (en) * | 2000-07-24 | 2006-10-03 | Quark, Inc. | Method and system using non-uniform image blocks for rapid interactive viewing of digital images over a network |
US20060210196A1 (en) * | 2000-07-24 | 2006-09-21 | Quark, Inc. | Method and system using non-uniform image blocks for rapid interactive viewing of digital image over a network |
US6704024B2 (en) * | 2000-08-07 | 2004-03-09 | Zframe, Inc. | Visual content browsing using rasterized representations |
US7054365B2 (en) * | 2000-09-27 | 2006-05-30 | Electronics And Telecommunications Research Institute | Method for providing variable bit rate in streaming service |
US20030079222A1 (en) * | 2000-10-06 | 2003-04-24 | Boykin Patrick Oscar | System and method for distributing perceptually encrypted encoded files of music and movies |
US20020067353A1 (en) * | 2000-12-04 | 2002-06-06 | Kenyon Jeremy A. | Method and apparatus for distributing and displaying maps electronically |
US7043745B2 (en) * | 2000-12-29 | 2006-05-09 | Etalk Corporation | System and method for reproducing a video session using accelerated frame recording |
US20020122491A1 (en) * | 2001-01-03 | 2002-09-05 | Marta Karczewicz | Video decoder architecture and method for using same |
US20020146074A1 (en) * | 2001-02-20 | 2002-10-10 | Cute Ltd. | Unequal error protection of variable-length data packets based on recursive systematic convolutional coding |
US20030138050A1 (en) * | 2001-03-30 | 2003-07-24 | Yoshihisa Yamada | Dynamic image receiver and dynamic image transmitter |
US7016963B1 (en) * | 2001-06-29 | 2006-03-21 | Glow Designs, Llc | Content management and transformation system for digital content |
US20030020722A1 (en) * | 2001-07-06 | 2003-01-30 | Mikio Miura | Image display apparatus |
US6990534B2 (en) * | 2001-07-20 | 2006-01-24 | Flowfinity Wireless, Inc. | Method for a proactive browser system for implementing background frame maintenance and asynchronous frame submissions |
US20060168101A1 (en) * | 2001-07-20 | 2006-07-27 | Dmytro Mikhailov | Proactive browser system |
US20030039312A1 (en) * | 2001-08-23 | 2003-02-27 | Michael Horowitz | System and method for video error concealment |
US20030046708A1 (en) * | 2001-08-28 | 2003-03-06 | Jutzi Curtis E. | Error correction for regional and dynamic factors in communications |
US20030198184A1 (en) * | 2001-08-31 | 2003-10-23 | Joe Huang | Method of dynamically determining real-time multimedia streaming rate over a communications networks |
US6909753B2 (en) * | 2001-12-05 | 2005-06-21 | Koninklijke Philips Electronics, N.V. | Combined MPEG-4 FGS and modulation algorithm for wireless video transmission |
US7088398B1 (en) * | 2001-12-24 | 2006-08-08 | Silicon Image, Inc. | Method and apparatus for regenerating a clock for auxiliary data transmitted over a serial link with video data |
US20030132957A1 (en) * | 2002-01-15 | 2003-07-17 | International Business Machines Corporation | System for recording world wide web browsing sessions navigation on a real-time basis and for subsequently displaying the recorded sessions as surrogate browsing sessions with user enabled real-time modification |
US20030177269A1 (en) * | 2002-03-14 | 2003-09-18 | Robinson Ian N. | Method and system that tailors format of transmission to suit client capabilities and link characteristics |
US20030227977A1 (en) * | 2002-05-29 | 2003-12-11 | Canon Kabushiki Kaisha | Method and device for selecting a transcoding method from a set of transcoding methods |
US20040109005A1 (en) * | 2002-07-17 | 2004-06-10 | Witt Sarah Elizabeth | Video processing |
US20040022322A1 (en) * | 2002-07-19 | 2004-02-05 | Meetrix Corporation | Assigning prioritization during encode of independently compressed objects |
US20040067041A1 (en) * | 2002-10-02 | 2004-04-08 | Seo Kang Soo | Recording medium having a data structure for managing reproduction of graphic data and recording and reproducing methods and apparatuses |
US20090245668A1 (en) * | 2002-10-25 | 2009-10-01 | Sony Corporation | Picture encoding apparatus and method, program and recording medium |
US7483575B2 (en) * | 2002-10-25 | 2009-01-27 | Sony Corporation | Picture encoding apparatus and method, program and recording medium |
US20060150224A1 (en) * | 2002-12-31 | 2006-07-06 | Othon Kamariotis | Video streaming |
US20060098738A1 (en) * | 2003-01-09 | 2006-05-11 | Pamela Cosman | Video encoding methods and devices |
US20040184523A1 (en) * | 2003-02-25 | 2004-09-23 | Dawson Thomas Patrick | Method and system for providing reduced bandwidth for picture in picture video transmissions |
US20050052294A1 (en) * | 2003-09-07 | 2005-03-10 | Microsoft Corporation | Multi-layer run level encoding and decoding |
US20050081158A1 (en) * | 2003-10-08 | 2005-04-14 | Samsung Electronics Co., Ltd. | Apparatus and method for remote controlling |
US20050089092A1 (en) * | 2003-10-22 | 2005-04-28 | Yasuhiro Hashimoto | Moving picture encoding apparatus |
US20050147247A1 (en) * | 2003-11-14 | 2005-07-07 | Westberg Thomas E. | Interactive television systems having POD modules and methods for use in the same |
US20050105619A1 (en) * | 2003-11-19 | 2005-05-19 | Institute For Information Industry | Transcoder system for adaptively reducing frame-rate |
US8018850B2 (en) * | 2004-02-23 | 2011-09-13 | Sharp Laboratories Of America, Inc. | Wireless video transmission system |
US20050195899A1 (en) * | 2004-03-04 | 2005-09-08 | Samsung Electronics Co., Ltd. | Method and apparatus for video coding, predecoding, and video decoding for video streaming service, and image filtering method |
US20050232359A1 (en) * | 2004-04-14 | 2005-10-20 | Samsung Electronics Co., Ltd. | Inter-frame prediction method in video coding, video encoder, video decoding method, and video decoder |
US20080158333A1 (en) * | 2004-04-30 | 2008-07-03 | Worldgate Service, Inc. | Adaptive Video Telephone System |
US20050257167A1 (en) * | 2004-05-11 | 2005-11-17 | International Business Machines Corporation | Embedded Web dialog |
US20050267779A1 (en) * | 2004-05-31 | 2005-12-01 | Samsung Electronics Co., Ltd. | Method, apparatus, and medium for servicing clients in remote areas |
US20060018378A1 (en) * | 2004-07-09 | 2006-01-26 | Stmicroelectronics S.R.L. | Method and system for delivery of coded information streams, related network and computer program product therefor |
US20060069797A1 (en) * | 2004-09-10 | 2006-03-30 | Microsoft Corporation | Systems and methods for multimedia remoting over terminal server connections |
US20060078051A1 (en) * | 2004-10-12 | 2006-04-13 | Yi Liang | Adaptive intra-refresh for digital video encoding |
US20060095944A1 (en) * | 2004-10-30 | 2006-05-04 | Demircin Mehmet U | Sender-side bandwidth estimation for video transmission with receiver packet buffer |
US20060233246A1 (en) * | 2004-12-06 | 2006-10-19 | Park Seung W | Method and apparatus for encoding, transmitting, and decoding a video signal |
US20060174026A1 (en) * | 2005-01-05 | 2006-08-03 | Aaron Robinson | System and method for a remote user interface |
US20060184614A1 (en) * | 2005-02-03 | 2006-08-17 | The Trustees Of Columbia University In The City Of New York | Thin-client network computing method and system |
US20060174614A1 (en) * | 2005-02-08 | 2006-08-10 | Xingen Dong | Control devices for swashplate type variable displacement piston pump |
US20060218285A1 (en) * | 2005-03-25 | 2006-09-28 | Vanish Talwar | Remote desktop performance model for assigning resources |
US20060282855A1 (en) * | 2005-05-05 | 2006-12-14 | Digital Display Innovations, Llc | Multiple remote display system |
US7821953B2 (en) * | 2005-05-13 | 2010-10-26 | Yahoo! Inc. | Dynamically selecting CODECS for managing an audio message |
US20060277478A1 (en) * | 2005-06-02 | 2006-12-07 | Microsoft Corporation | Temporary title and menu bar |
US20060285594A1 (en) * | 2005-06-21 | 2006-12-21 | Changick Kim | Motion estimation and inter-mode prediction |
US20060291561A1 (en) * | 2005-06-24 | 2006-12-28 | Samsung Electronics Co., Ltd. | Motion error detector, motion error compensator comprising the same, and method for detecting and compensating motion error using the motion error compensator |
US20090219992A1 (en) * | 2005-08-04 | 2009-09-03 | Charles Chunaming Wang | Compensating delay of channel state information between receiver and transmitter during adaptive video delivery |
US20070098283A1 (en) * | 2005-10-06 | 2007-05-03 | Samsung Electronics Co., Ltd. | Hybrid image data processing system and method |
US20070116117A1 (en) * | 2005-11-18 | 2007-05-24 | Apple Computer, Inc. | Controlling buffer states in video compression coding to enable editing and distributed encoding |
US20070121720A1 (en) * | 2005-11-30 | 2007-05-31 | Kenji Yamane | Image coding apparatus, image decoding apparatus and image processing system |
US20070250711A1 (en) * | 2006-04-25 | 2007-10-25 | Phonified Llc | System and method for presenting and inputting information on a mobile device |
US20070277109A1 (en) * | 2006-05-24 | 2007-11-29 | Chen You B | Customizable user interface wrappers for web applications |
US20080062322A1 (en) * | 2006-08-28 | 2008-03-13 | Ortiva Wireless | Digital video content customization |
US20080065980A1 (en) * | 2006-09-08 | 2008-03-13 | Opera Software Asa | Modifying a markup language document which includes a clickable image |
US20080071857A1 (en) * | 2006-09-20 | 2008-03-20 | Opera Software Asa | Method, computer program, transcoding server and computer system for modifying a digital document |
Cited By (152)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9164963B2 (en) | 2006-12-05 | 2015-10-20 | Adobe Systems Incorporated | Embedded document within an application |
US10163088B2 (en) | 2006-12-05 | 2018-12-25 | Adobe Systems Incorporated | Embedded document within an application |
US9582478B2 (en) | 2006-12-05 | 2017-02-28 | Adobe Systems Incorporated | Embedded document within an application |
US7743339B1 (en) * | 2007-02-01 | 2010-06-22 | Adobe Systems Incorporated | Rendering text in a brew device |
US8443299B1 (en) | 2007-02-01 | 2013-05-14 | Adobe Systems Incorporated | Rendering text in a brew device |
US8589779B2 (en) | 2007-03-08 | 2013-11-19 | Adobe Systems Incorporated | Event-sensitive content for mobile devices |
US20080222520A1 (en) * | 2007-03-08 | 2008-09-11 | Adobe Systems Incorporated | Event-Sensitive Content for Mobile Devices |
US20100042678A1 (en) * | 2008-08-12 | 2010-02-18 | Samsung Electronics Co., Ltd. | Method and apparatus for providing/receiving user interface using user interface directory |
US20100100735A1 (en) * | 2008-10-17 | 2010-04-22 | Qualcomm Incorporated | Apparatus and method for providing a portable broadband service using a wireless convergence platform |
KR101255658B1 (en) | 2008-10-17 | 2013-04-17 | 퀄컴 인코포레이티드 | Apparatus and method for providing a portable broadband service using a wireless convergence platform |
US9084282B2 (en) * | 2008-10-17 | 2015-07-14 | Qualcomm Incorporated | Apparatus and method for providing a portable broadband service using a wireless convergence platform |
US20100115458A1 (en) * | 2008-10-26 | 2010-05-06 | Adam Marano | Panning a native display on a mobile computing device to a window, interpreting a gesture-based instruction to scroll contents of the window, and wrapping text on the window |
US8239773B1 (en) | 2008-10-28 | 2012-08-07 | United Services Automobile Association (Usaa) | Systems and methods for co-browsing on a mobile device |
US9386443B2 (en) | 2009-05-04 | 2016-07-05 | Qualcomm Incorporated | System and method of recording and sharing mobile application activities |
US8346915B2 (en) * | 2009-05-04 | 2013-01-01 | Qualcomm Incorporated | System and method of recording and sharing mobile application activities |
US20100281156A1 (en) * | 2009-05-04 | 2010-11-04 | Kies Jonathan K | System and method of recording and sharing mobile application activities |
US20100325565A1 (en) * | 2009-06-17 | 2010-12-23 | EchoStar Technologies, L.L.C. | Apparatus and methods for generating graphical interfaces |
US20120221682A1 (en) * | 2009-10-28 | 2012-08-30 | Nec Corporation | Remote mobile communication system and remote mobile communication method |
US8738711B2 (en) * | 2009-11-03 | 2014-05-27 | Oto Technologies, Llc | System and method for redirecting client-side storage operations |
US20110106874A1 (en) * | 2009-11-03 | 2011-05-05 | Oto Technologies, Llc | System and method for redirecting client-side storage operations |
US20110145723A1 (en) * | 2009-12-16 | 2011-06-16 | Oto Technologies, Llc | System and method for redirecting client-side storage operations |
US10346365B2 (en) * | 2010-01-22 | 2019-07-09 | Microsoft Technology Licensing, Llc | Storing temporary state data in separate containers |
US20150227548A1 (en) * | 2010-01-22 | 2015-08-13 | Microsoft Technology Licensing, Llc | Storing temporary state data in separate containers |
US8468130B2 (en) | 2010-04-02 | 2013-06-18 | Skyfire Labs, Inc. | Assisted hybrid mobile browser |
WO2011123859A1 (en) * | 2010-04-02 | 2011-10-06 | Skyfire Labs, Inc. | Assisted hybrid mobile browser |
US20110258530A1 (en) * | 2010-04-19 | 2011-10-20 | Se Yoon Jang | Mobile terminal and controlling method thereof |
US8887041B2 (en) * | 2010-04-19 | 2014-11-11 | Lg Electronics Inc. | Displaying a call function within a web browser |
US20130058213A1 (en) * | 2010-05-10 | 2013-03-07 | Nec Corporation | Remote mobile communication system, server device, and remote mobile communication system control method |
US9118953B2 (en) * | 2010-05-10 | 2015-08-25 | Nec Corporation | Remote mobile communication system, server device, and remote mobile communication system control method |
US20130066950A1 (en) * | 2010-06-01 | 2013-03-14 | Zte Corporation | Service Development Platform, System and Method Thereof |
US20110310011A1 (en) * | 2010-06-22 | 2011-12-22 | Hsni, Llc | System and method for integrating an electronic pointing device into digital image data |
US9094707B2 (en) | 2010-06-22 | 2015-07-28 | Hsni Llc | System and method for integrating an electronic pointing device into digital image data |
US9294556B2 (en) | 2010-06-22 | 2016-03-22 | Hsni, Llc | System and method for integrating an electronic pointing device into digital image data |
US8717289B2 (en) * | 2010-06-22 | 2014-05-06 | Hsni Llc | System and method for integrating an electronic pointing device into digital image data |
US9948701B2 (en) | 2010-06-22 | 2018-04-17 | Hsni, Llc | System and method for integrating an electronic pointing device into digital image data |
WO2011163364A1 (en) * | 2010-06-22 | 2011-12-29 | Hsni, Llc | System and method for integrating an electronic pointing device into digital image data |
US10270844B2 (en) | 2010-06-22 | 2019-04-23 | Hsni, Llc | System and method for integrating an electronic pointing device into digital image data |
US9135509B2 (en) * | 2010-07-26 | 2015-09-15 | Koninklijke Philips N.V. | Determining representative images for a video |
US20130121586A1 (en) * | 2010-07-26 | 2013-05-16 | Koninklijke Philips Electronics N.V. | Determining representative images for a video |
US9253129B2 (en) * | 2010-10-06 | 2016-02-02 | International Business Machines Corporation | Instant messaging with browser collaboration |
US9253128B2 (en) * | 2010-10-06 | 2016-02-02 | International Business Machines Corporation | Instant messaging with browser collaboration |
US20120089926A1 (en) * | 2010-10-06 | 2012-04-12 | International Business Machines Corporation | Instant Messaging with Browser Collaboration |
US20120240057A1 (en) * | 2010-10-06 | 2012-09-20 | International Business Machines Corporation | Instant Messaging with Browser Collaboration |
US8793310B2 (en) * | 2010-10-15 | 2014-07-29 | Samsung Electronics Co., Ltd. | Method and apparatus for updating user interface |
US9712596B2 (en) | 2010-10-15 | 2017-07-18 | Samsung Electronics Co., Ltd | Method and apparatus for updating user interface |
US20120096072A1 (en) * | 2010-10-15 | 2012-04-19 | Samsung Electronics Co., Ltd. | Method and apparatus for updating user interface |
US9503580B1 (en) | 2010-12-20 | 2016-11-22 | United Services Automobile Association (Usaa) | Computing device assistance for phone based customer service representative interaction |
US9118761B1 (en) | 2010-12-20 | 2015-08-25 | United Services Automobile Association (Usaa) | Computing device assistance for phone based customer service representative interaction |
US20130007625A1 (en) * | 2011-03-21 | 2013-01-03 | Victor Lozinski | Apparatus, system, and method for connecting mobile devices to a backend server in an enterprise software environment and initiating a business process |
US8799412B2 (en) | 2011-06-30 | 2014-08-05 | Amazon Technologies, Inc. | Remote browsing session management |
US10506076B2 (en) | 2011-06-30 | 2019-12-10 | Amazon Technologies, Inc. | Remote browsing session management with multiple content versions |
US8577963B2 (en) | 2011-06-30 | 2013-11-05 | Amazon Technologies, Inc. | Remote browsing session between client browser and network based browser |
US10116487B2 (en) | 2011-06-30 | 2018-10-30 | Amazon Technologies, Inc. | Management of interactions with representations of rendered and unprocessed content |
US9621406B2 (en) | 2011-06-30 | 2017-04-11 | Amazon Technologies, Inc. | Remote browsing session management |
US8706860B2 (en) | 2011-06-30 | 2014-04-22 | Amazon Technologies, Inc. | Remote browsing session management |
US9037696B2 (en) | 2011-08-16 | 2015-05-19 | Amazon Technologies, Inc. | Managing information associated with network resources |
US9870426B2 (en) | 2011-08-16 | 2018-01-16 | Amazon Technologies, Inc. | Managing information associated with network resources |
US9195768B2 (en) | 2011-08-26 | 2015-11-24 | Amazon Technologies, Inc. | Remote browsing session management |
US10063618B2 (en) | 2011-08-26 | 2018-08-28 | Amazon Technologies, Inc. | Remote browsing session management |
US10089403B1 (en) | 2011-08-31 | 2018-10-02 | Amazon Technologies, Inc. | Managing network based storage |
US9641637B1 (en) | 2011-09-27 | 2017-05-02 | Amazon Technologies, Inc. | Network resource optimization |
US8589385B2 (en) | 2011-09-27 | 2013-11-19 | Amazon Technologies, Inc. | Historical browsing session management |
US8849802B2 (en) | 2011-09-27 | 2014-09-30 | Amazon Technologies, Inc. | Historical browsing session management |
US10693991B1 (en) | 2011-09-27 | 2020-06-23 | Amazon Technologies, Inc. | Remote browsing session management |
US9383958B1 (en) | 2011-09-27 | 2016-07-05 | Amazon Technologies, Inc. | Remote co-browsing session management |
US9298843B1 (en) * | 2011-09-27 | 2016-03-29 | Amazon Technologies, Inc. | User agent information management |
US9152970B1 (en) | 2011-09-27 | 2015-10-06 | Amazon Technologies, Inc. | Remote co-browsing session management |
US9253284B2 (en) | 2011-09-27 | 2016-02-02 | Amazon Technologies, Inc. | Historical browsing session management |
US8914514B1 (en) | 2011-09-27 | 2014-12-16 | Amazon Technologies, Inc. | Managing network based content |
US9178955B1 (en) | 2011-09-27 | 2015-11-03 | Amazon Technologies, Inc. | Managing network based content |
US8615431B1 (en) | 2011-09-29 | 2013-12-24 | Amazon Technologies, Inc. | Network content message placement management |
US20130104032A1 (en) * | 2011-10-19 | 2013-04-25 | Jiyoun Lee | Mobile terminal and method of controlling the same |
US9313100B1 (en) | 2011-11-14 | 2016-04-12 | Amazon Technologies, Inc. | Remote browsing session management |
US8972477B1 (en) | 2011-12-01 | 2015-03-03 | Amazon Technologies, Inc. | Offline browsing session management |
US10057320B2 (en) | 2011-12-01 | 2018-08-21 | Amazon Technologies, Inc. | Offline browsing session management |
US9009334B1 (en) | 2011-12-09 | 2015-04-14 | Amazon Technologies, Inc. | Remote browsing session management |
US9866615B2 (en) | 2011-12-09 | 2018-01-09 | Amazon Technologies, Inc. | Remote browsing session management |
US9479564B2 (en) | 2011-12-09 | 2016-10-25 | Amazon Technologies, Inc. | Browsing session metric creation |
US9117002B1 (en) | 2011-12-09 | 2015-08-25 | Amazon Technologies, Inc. | Remote browsing session management |
US9330188B1 (en) | 2011-12-22 | 2016-05-03 | Amazon Technologies, Inc. | Shared browsing sessions |
US9087024B1 (en) | 2012-01-26 | 2015-07-21 | Amazon Technologies, Inc. | Narration of network content |
US9898542B2 (en) | 2012-01-26 | 2018-02-20 | Amazon Technologies, Inc. | Narration of network content |
US8839087B1 (en) | 2012-01-26 | 2014-09-16 | Amazon Technologies, Inc. | Remote browsing and searching |
US9195750B2 (en) | 2012-01-26 | 2015-11-24 | Amazon Technologies, Inc. | Remote browsing and searching |
US9336321B1 (en) | 2012-01-26 | 2016-05-10 | Amazon Technologies, Inc. | Remote browsing and searching |
US9529784B2 (en) | 2012-01-26 | 2016-12-27 | Amazon Technologies, Inc. | Remote browsing and searching |
US9092405B1 (en) | 2012-01-26 | 2015-07-28 | Amazon Technologies, Inc. | Remote browsing and searching |
US8627195B1 (en) | 2012-01-26 | 2014-01-07 | Amazon Technologies, Inc. | Remote browsing and searching |
US10104188B2 (en) | 2012-01-26 | 2018-10-16 | Amazon Technologies, Inc. | Customized browser images |
US9509783B1 (en) | 2012-01-26 | 2016-11-29 | Amazon Technologies, Inc. | Customized browser images |
US10275433B2 (en) | 2012-01-26 | 2019-04-30 | Amazon Technologies, Inc. | Remote browsing and searching |
US9037975B1 (en) | 2012-02-10 | 2015-05-19 | Amazon Technologies, Inc. | Zooming interaction tracking and popularity determination |
US9183258B1 (en) | 2012-02-10 | 2015-11-10 | Amazon Technologies, Inc. | Behavior based processing of content |
US9137210B1 (en) | 2012-02-21 | 2015-09-15 | Amazon Technologies, Inc. | Remote browsing session management |
US10567346B2 (en) | 2012-02-21 | 2020-02-18 | Amazon Technologies, Inc. | Remote browsing session management |
US10296558B1 (en) | 2012-02-27 | 2019-05-21 | Amazon Technologies, Inc. | Remote generation of composite content pages |
US9374244B1 (en) | 2012-02-27 | 2016-06-21 | Amazon Technologies, Inc. | Remote browsing session management |
US9208316B1 (en) | 2012-02-27 | 2015-12-08 | Amazon Technologies, Inc. | Selective disabling of content portions |
US9460220B1 (en) | 2012-03-26 | 2016-10-04 | Amazon Technologies, Inc. | Content selection based on target device characteristics |
US9723067B2 (en) | 2012-03-28 | 2017-08-01 | Amazon Technologies, Inc. | Prioritized content transmission |
US9307004B1 (en) | 2012-03-28 | 2016-04-05 | Amazon Technologies, Inc. | Prioritized content transmission |
US8832288B1 (en) * | 2012-07-13 | 2014-09-09 | Google Inc. | Transitions between remotely cached and live versions of a webpage |
WO2014018581A1 (en) * | 2012-07-27 | 2014-01-30 | Microsoft Corporation | Web browser having user-configurable address bar button |
US9182954B2 (en) | 2012-07-27 | 2015-11-10 | Microsoft Technology Licensing, Llc | Web browser having user-configurable address bar button |
US9772979B1 (en) | 2012-08-08 | 2017-09-26 | Amazon Technologies, Inc. | Reproducing user browsing sessions |
US8943197B1 (en) | 2012-08-16 | 2015-01-27 | Amazon Technologies, Inc. | Automated content update notification |
US9830400B2 (en) | 2012-08-16 | 2017-11-28 | Amazon Technologies, Inc. | Automated content update notification |
US20140092084A1 (en) * | 2012-08-28 | 2014-04-03 | Tencent Technology (Shenzhen) Company Limited | Webpage display method and apparatus |
US9754391B2 (en) * | 2012-08-28 | 2017-09-05 | Tencent Technology (Shenzhen) Company Limited | Webpage display method and apparatus |
US9002982B2 (en) | 2013-03-11 | 2015-04-07 | Amazon Technologies, Inc. | Automated desktop placement |
US9148350B1 (en) | 2013-03-11 | 2015-09-29 | Amazon Technologies, Inc. | Automated data synchronization |
US10616129B2 (en) | 2013-03-11 | 2020-04-07 | Amazon Technologies, Inc. | Automated desktop placement |
US10313345B2 (en) | 2013-03-11 | 2019-06-04 | Amazon Technologies, Inc. | Application marketplace for virtual desktops |
US9552366B2 (en) | 2013-03-11 | 2017-01-24 | Amazon Technologies, Inc. | Automated data synchronization |
US10142406B2 (en) | 2013-03-11 | 2018-11-27 | Amazon Technologies, Inc. | Automated data center selection |
US9515954B2 (en) | 2013-03-11 | 2016-12-06 | Amazon Technologies, Inc. | Automated desktop placement |
US9288262B2 (en) | 2013-03-11 | 2016-03-15 | Amazon Technologies, Inc. | Automated desktop placement |
US10152463B1 (en) | 2013-06-13 | 2018-12-11 | Amazon Technologies, Inc. | System for profiling page browsing interactions |
US9578137B1 (en) | 2013-06-13 | 2017-02-21 | Amazon Technologies, Inc. | System for enhancing script execution performance |
US20140379857A1 (en) * | 2013-06-24 | 2014-12-25 | Samsung Electronics Co., Ltd. | Method and apparatus for providing content with streaming |
US10623243B2 (en) | 2013-06-26 | 2020-04-14 | Amazon Technologies, Inc. | Management of computing sessions |
US10686646B1 (en) | 2013-06-26 | 2020-06-16 | Amazon Technologies, Inc. | Management of computing sessions |
KR20150002180A (en) * | 2013-06-28 | 2015-01-07 | 삼성전자주식회사 | Image file generating method and apparatus thereof |
US20150007024A1 (en) * | 2013-06-28 | 2015-01-01 | Samsung Electronics Co., Ltd. | Method and apparatus for generating image file |
US11836436B2 (en) | 2013-06-28 | 2023-12-05 | Samsung Electronics Co., Ltd. | Method and apparatus for generating image file |
KR102172354B1 (en) | 2013-06-28 | 2020-10-30 | 삼성전자주식회사 | Image file generating method and apparatus thereof |
US10936078B2 (en) * | 2013-11-18 | 2021-03-02 | Amazon Technologies, Inc. | Account management services for load balancers |
US20180275765A1 (en) * | 2013-11-18 | 2018-09-27 | Amazon Technologies, Inc. | Account management services for load balancers |
US9779069B2 (en) * | 2014-01-31 | 2017-10-03 | Yahoo Holdings, Inc. | Model traversing based compressed serialization of user interaction data and communication from a client-side application |
US10769353B2 (en) | 2014-01-31 | 2020-09-08 | Oath Inc. | Dynamic streaming content provided by server and client-side tracking application |
US9680897B2 (en) | 2014-01-31 | 2017-06-13 | Yahoo! Inc. | Throttled scanning for optimized compression of network communicated data |
US20150220502A1 (en) * | 2014-01-31 | 2015-08-06 | Yahoo! Inc. | Compressed serialization of data for communication from a client-side application |
US20150279336A1 (en) * | 2014-04-01 | 2015-10-01 | Seiko Epson Corporation | Bidirectional display method and bidirectional display device |
CN104978079A (en) * | 2014-04-01 | 2015-10-14 | 精工爱普生株式会社 | Bidirectional display method and bidirectional display device |
US9635041B1 (en) | 2014-06-16 | 2017-04-25 | Amazon Technologies, Inc. | Distributed split browser content inspection and analysis |
US10164993B2 (en) | 2014-06-16 | 2018-12-25 | Amazon Technologies, Inc. | Distributed split browser content inspection and analysis |
US20160342585A1 (en) * | 2015-05-18 | 2016-11-24 | Google Inc. | Coordinated user word selection for translation and obtaining of contextual information for the selected word |
US10140293B2 (en) * | 2015-05-18 | 2018-11-27 | Google Llc | Coordinated user word selection for translation and obtaining of contextual information for the selected word |
US10575008B2 (en) * | 2015-06-01 | 2020-02-25 | Apple Inc. | Bandwidth management in devices with simultaneous download of multiple data streams |
US20160353118A1 (en) * | 2015-06-01 | 2016-12-01 | Apple Inc. | Bandwidth Management in Devices with Simultaneous Download of Multiple Data Streams |
US11818394B2 (en) | 2016-12-23 | 2023-11-14 | Apple Inc. | Sphere projected motion estimation/compensation and mode decision |
US10999602B2 (en) | 2016-12-23 | 2021-05-04 | Apple Inc. | Sphere projected motion estimation/compensation and mode decision |
US11259046B2 (en) | 2017-02-15 | 2022-02-22 | Apple Inc. | Processing of equirectangular object data to compensate for distortion by spherical projections |
US10924747B2 (en) | 2017-02-27 | 2021-02-16 | Apple Inc. | Video coding techniques for multi-view video |
US11093752B2 (en) | 2017-06-02 | 2021-08-17 | Apple Inc. | Object tracking in multi-view video |
US10754242B2 (en) | 2017-06-30 | 2020-08-25 | Apple Inc. | Adaptive resolution and projection format in multi-direction video |
US10726095B1 (en) | 2017-09-26 | 2020-07-28 | Amazon Technologies, Inc. | Network content layout using an intermediary system |
US10664538B1 (en) | 2017-09-26 | 2020-05-26 | Amazon Technologies, Inc. | Data security and data access auditing for network accessible content |
KR20200126949A (en) * | 2020-10-26 | 2020-11-09 | 삼성전자주식회사 | Image file generating method and apparatus thereof |
KR102247353B1 (en) | 2020-10-26 | 2021-05-03 | 삼성전자주식회사 | Image file generating method and apparatus thereof |
US11880425B2 (en) | 2021-04-02 | 2024-01-23 | Content Square SAS | System and method for identifying and correcting webpage zone target misidentifications |
US20230215583A1 (en) * | 2021-06-14 | 2023-07-06 | Preh Holding, Llc | Connected body surface care module |
Also Published As
Publication number | Publication date |
---|---|
WO2008092104A3 (en) | 2008-09-25 |
WO2008092131A3 (en) | 2008-10-02 |
US8630512B2 (en) | 2014-01-14 |
WO2008092131A2 (en) | 2008-07-31 |
US20080181498A1 (en) | 2008-07-31 |
WO2008092104A2 (en) | 2008-07-31 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20080184128A1 (en) | Mobile device user interface for remote interaction | |
US8375304B2 (en) | Maintaining state of a web page | |
US8443398B2 (en) | Architecture for delivery of video content responsive to remote interaction | |
US8468469B1 (en) | Zooming user interface interactions | |
US10506077B2 (en) | Image-based and predictive browsing | |
CN103210672B (en) | According to terminal by virtualized for self adaptation picture method and system | |
KR101367718B1 (en) | Method and apparatus for providing mobile device interoperability | |
WO2015055081A1 (en) | Method, apparatus and mobile terminal for browser based video playback | |
US20180307767A1 (en) | Method and apparatus for notifying a user of updated content for a webpage | |
US8001213B2 (en) | Method, apparatus and computer program product for providing unrestricted content on a user terminal | |
US20110078593A1 (en) | Web browser transmission server and method of controlling operation of same | |
US20110066678A1 (en) | Webpage browsing system, server, webpage browsing method, program and recording medium for the same | |
CN110224920B (en) | Sharing method and terminal equipment | |
US9396165B2 (en) | Information display system, information display apparatus, information display method, information display program, information providing apparatus, and recording medium | |
US20150195156A1 (en) | Method and system for providing page visibility information | |
US7076523B2 (en) | Interaction interface for a composite device computing environment | |
WO2017096812A1 (en) | Webpage displaying method, mobile terminal, intelligent terminal, computer program, and storage medium | |
KR101001512B1 (en) | System for transmitting/receiving contents connected in link structure in internet page and control method thereof, and browsing apparatus used in the system | |
JP2003162471A (en) | Client, data download method, program and recording medium | |
CN107133306B (en) | Information stream transcoding device and method | |
KR100974783B1 (en) | Web viewer server producing capture image of web page variably, and web viewer service system using the web viewer server and control method thereof | |
KR100873415B1 (en) | Internet browser to provide full internet service for mobile equipment | |
CN117806687A (en) | Terminal, server and data updating method | |
KR20100012347A (en) | Web browsing system for saving image corresponding a markup page and control method thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: DVC LABS, INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SWENSON, ERIK R.;BHANDARI, NITIN;VINCENT, ALEXANDER JAMES;AND OTHERS;REEL/FRAME:020419/0452 Effective date: 20080125 |
|
AS | Assignment |
Owner name: SKYFIRE LABS, INC., CALIFORNIA Free format text: CHANGE OF NAME;ASSIGNOR:DVC LABS, INC.;REEL/FRAME:020667/0965 Effective date: 20080123 |
|
AS | Assignment |
Owner name: COMERICA BANK, MICHIGAN Free format text: SECURITY AGREEMENT;ASSIGNOR:SKYFIRE LABS, INC.;REEL/FRAME:025082/0390 Effective date: 20100903 |
|
AS | Assignment |
Owner name: SKYFIRE LABS, INC., CALIFORNIA Free format text: CHANGE OF ADDRESS;ASSIGNOR:SKYFIRE LABS, INC.;REEL/FRAME:026817/0504 Effective date: 20100510 |
|
AS | Assignment |
Owner name: SKYFIRE LABS, INC., CALIFORNIA Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:COMERICA BANK;REEL/FRAME:030141/0182 Effective date: 20130402 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |
|
AS | Assignment |
Owner name: OPERA SOFTWARE IRELAND LIMITED, IRELAND Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SKYFIRE LABS, INC.;REEL/FRAME:032827/0175 Effective date: 20140428 |
|
AS | Assignment |
Owner name: PERFORMANCE AND PRIVACY IRELAND LTD., IRELAND Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:OPERA SOFTWARE IRELAND LTD.;REEL/FRAME:042076/0473 Effective date: 20170228 |
|
AS | Assignment |
Owner name: OTELLO CORPORATION ASA, NORWAY Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:PERFORMANCE AND PRIVACY IRELAND LIMITED;REEL/FRAME:063188/0195 Effective date: 20230228 |