WO2004092863A2 - Method and apparatus for exploiting video streaming services of mobile terminals via proximity connections - Google Patents

Method and apparatus for exploiting video streaming services of mobile terminals via proximity connections

Info

Publication number
WO2004092863A2
WO2004092863A2 (PCT/IB2004/001258)
Authority
WO
WIPO (PCT)
Prior art keywords
mobile terminal
video
connection
image
content
Prior art date
Application number
PCT/IB2004/001258
Other languages
French (fr)
Other versions
WO2004092863A3 (en)
Inventor
Timo Termo
Marko Leukkunen
Original Assignee
Nokia Corporation
Nokia, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nokia Corporation, Nokia, Inc. filed Critical Nokia Corporation
Publication of WO2004092863A2 publication Critical patent/WO2004092863A2/en
Publication of WO2004092863A3 publication Critical patent/WO2004092863A3/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41 Structure of client; Structure of client peripherals
    • H04N21/414 Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
    • H04N21/41407 Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance embedded in a portable device, e.g. video client on a mobile phone, PDA, laptop
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/72409 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories
    • H04M1/72412 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories using two-way short-range wireless interfaces
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41 Structure of client; Structure of client peripherals
    • H04N21/4104 Peripherals receiving signals from specially adapted client devices
    • H04N21/4113 PC
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/436 Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
    • H04N21/4363 Adapting the video or multiplex stream to a specific local network, e.g. a IEEE 1394 or Bluetooth® network
    • H04N21/43637 Adapting the video or multiplex stream to a specific local network, e.g. a IEEE 1394 or Bluetooth® network involving a wireless protocol, e.g. Bluetooth, RF or wireless LAN [IEEE 802.11]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 End-user applications
    • H04N21/478 Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4781 Games
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 End-user applications
    • H04N21/478 Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4788 Supplemental services, e.g. displaying phone caller identification, shopping application communicating with other users, e.g. chatting
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60 Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
    • H04N21/61 Network physical structure; Signal processing
    • H04N21/6106 Network physical structure; Signal processing specially adapted to the downstream path of the transmission network
    • H04N21/6125 Network physical structure; Signal processing specially adapted to the downstream path of the transmission network involving transmission via Internet
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/14 Systems for two-way working
    • H04N7/141 Systems for two-way working between two video terminals, e.g. videophone
    • H04N7/147 Communication arrangements, e.g. identifying the communication as a video-communication, intermediate storage of the signals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/14 Systems for two-way working
    • H04N7/141 Systems for two-way working between two video terminals, e.g. videophone
    • H04N7/148 Interfacing a video terminal to a particular transmission medium, e.g. ISDN
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/16 Analogue secrecy systems; Analogue subscription systems
    • H04N7/173 Analogue secrecy systems; Analogue subscription systems with two-way working, e.g. subscriber sending a programme selection signal
    • H04N7/17309 Transmission or handling of upstream communications
    • H04N7/17318 Direct or substantially direct transmission and handling of requests
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/72427 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality for supporting games or graphical animations

Definitions

  • This invention relates in general to multi-modal usage of a mobile terminal, and more particularly, to providing alternative usages of mobile terminals for video conferencing and video enhancement operations.
  • Mobile terminals of the present day have evolved according to the needs and desires of the mobile terminal market.
  • mobile terminals were significantly costlier, larger, heavier, and less energy efficient than they are today.
  • the early mobile terminals provided a significantly smaller set of value added features and performance.
  • the early phases of the mobile terminal market were driven by business customers, where the key competitive mobile terminal parameters consisted of size and talk time, which was largely a function of battery charge duration.
  • the mobile terminals began to segment themselves according to price differential and performance.
  • the higher end mobile terminals, for example, were targeted for business customers relying on enhanced performance, where price was a secondary issue.
  • middle and low end mobile terminals were offered having a reduced function set, an affordable cost, and were targeted for private, cost-sensitive customers.
  • terminals optimized for imaging must excel in the capturing, handling, sending, and storing of images.
  • the image enabled mobile terminals must, therefore, provide a large color display for imaging content presentation and an internal camera for imaging content capture.
  • the image enabled mobile terminal must also support the Multimedia Messaging Service (MMS), Short Message Service (SMS), Personal Information Management (PIM), games, voice, etc. in order to become the mobile terminal companion of choice for the consumer.
  • Image enabled terminals that provide mobility have created a strong emotional attachment among users because the mobile imaging terminals provide a mechanism whereby users are able to share life's experiences through rich communication.
  • Person to person multimedia communications, with self-created content, represents a natural progression of today's communication paradigm and comes one step closer to virtual presence.
  • the mobile imaging terminal cannot present the same amount or variety of information as can be provided to a user of a Personal Computer (PC).
  • the screen size of the mobile terminal creates a challenge to content providers because the displays have less graphical area available to them for display of their content.
  • By increasing the resolution of the display, more information may be projected onto the display, but the readability of the content decreases.
  • Usage of page sizes that are larger than the display size of the mobile terminal is also an option, but navigation around the oversized page requires increased scrolling interaction by the user.
  • Video conferencing is an activity that has typically been served through fixed media devices.
  • the fixed media devices include audio and video signal processing terminals along with corresponding PC based conference software to enable a virtual meeting between parties that are physically separated.
  • Each party to the video conference must provide his own set of conference supporting media, such as a PC mounted camera or Integrated Services Digital Network (ISDN) video conferencing equipment, in order to project his own personal image and any other audio/video content required to support the video conference.
  • the particular PC in use by one of the parties to the video conference is not equipped with a camera or Web Cam, and that party is thus deprived of the opportunity to provide video input to the video conference. Accordingly, there is a need in the communications industry for a method and apparatus that exploits the capabilities of mobile terminals to increase the number of value added services that may be facilitated through their use. In particular, the capabilities of image enabled mobile terminals need to be exploited, in order to couple those capabilities to existing network infrastructure via proximity connections to enable the value added services.
  • the coupling mechanism used to exploit the capabilities of the mobile terminals needs to be adapted to enhance the services provided to a user of the mobile terminal.
  • the user's total experience in using the particular application may be enhanced through the use of other terminals/applications in proximity to the mobile terminal.
  • the present invention discloses a system and method for providing exploitation of the capabilities of mobile terminals to support and enhance their video functions.
  • a method is provided of exploiting video facilities of a mobile terminal.
  • the method comprises generating video content using the mobile terminal, establishing a proximity connection between the mobile terminal and a video processing platform, and transferring the video content from the mobile terminal to the video processing platform using the proximity connection.
  • an image processing system is provided.
  • the image processing system comprises an image enabled mobile terminal arranged to generate content and an image processing platform arranged to receive the content.
  • the image enabled mobile terminal is coupled to the image processing platform via a proximity connection to transfer the content.
  • a mobile terminal wirelessly coupled to a network which includes a network element capable of processing content from the mobile terminal is provided.
  • the mobile terminal comprises a memory capable of storing at least one of a protocol module and an imaging module, an image component configured by the imaging module to generate digital images, a processor coupled to the memory and configured by the protocol module to enable digital image exchange with the network element, and a transceiver configured to facilitate the digital image exchange with the network element.
  • a computer-readable medium is provided having instructions stored thereon which are executable by a mobile terminal for exchanging video content with a video processing platform.
  • the instructions perform steps comprising generating digital images using imaging equipment internal to the mobile terminal, establishing a proximity connection with the video processing platform, and transmitting the digital images to the video processing platform with the established proximity connection.
  • a digital image processor proximately coupled to an image enabled mobile terminal comprises various arrangements for establishing the proximate connection to the image enabled mobile terminal, receiving video content from the image enabled mobile terminal via the proximate connection, and utilizing the received video content.
  • a computer-readable medium is provided having instructions stored thereon which are executable by a video processing platform.
  • the instructions perform steps comprising establishing a proximate connection with an image capable mobile terminal, receiving images from the image capable mobile terminal via the proximate connection, and utilizing the received images.
  • FIG. 1 illustrates a block diagram according to the principles of the present invention
  • FIG. 2 illustrates a block diagram of key components of content capture
  • FIG. 3 illustrates a Bluetooth stack hierarchy
  • FIG. 4 illustrates a generic communication architecture according to the present invention
  • FIG. 5 illustrates an exemplary video conferencing scenario enabled by the principles of the present invention
  • FIG. 6 illustrates a flow diagram of an exemplary setup required for a proximity connection transfer according to the present invention
  • FIG. 7 illustrates a representative mobile computing arrangement suitable for providing imaging data in accordance with the present invention.
  • FIG. 8 is a representative computing system capable of carrying out image processing functions according to the present invention.
  • the present invention is directed to a method and apparatus to facilitate a proximity connection between a mobile terminal and a PC, or other image processing platform, in order to exploit the functional capabilities of the mobile terminal.
  • Various applications and embodiments in accordance with the present invention are presented that extract outbound video data, for example, from an image enabled mobile terminal to facilitate video conferencing.
  • a proximity connection between the mobile terminal and an auxiliary device, such as a PC or other equivalent terminal is established to enable the extraction of the video content from the mobile terminal.
  • the video content may be either self-generated by the mobile terminal itself, e.g., generated by a camera built into the mobile terminal, or may be generated by an external device, e.g., pre-recorded video content from a Digital Versatile Disk (DVD) or equivalent storage device, and routed through the mobile terminal for proximity transport to the auxiliary device.
  • the present invention also supports the enhancement of in-bound video data intended to be rendered on the mobile terminal's display.
  • the video data may be enhanced by, for example, allowing any video content received by the mobile terminal to be displayed by an auxiliary video display device, thus bypassing the normal video display of the mobile terminal.
  • the user of any terminal, e.g., land, mobile or otherwise, that incorporates features in accordance with the present invention may enhance his viewing pleasure by facilitating an enhanced rendition of the received video content, e.g., larger size, greater resolution, increased color palette, through the use of an auxiliary video display device.
  • FIG. 1 is a high level block diagram illustrating the principles of the present invention.
  • mobile terminal 102 is arranged to transfer data to hardware platform 106 via path 118 and is arranged to receive acknowledgment of the received data via path 120.
  • the nature of the data transfer may be of any type and rate that is supported by proximity connection 104, mobile terminal 102 and hardware platform 106.
  • the data may be synchronization data that is transferred by mobile terminal 102 to hardware platform 106, e.g. a Personal Computer (PC), in order to obtain a common data store between the two devices via a data synchronization standard such as SyncML.
  • the synchronization data may support such activities as calendar synchronization, contact synchronization, to-do lists, etc.
  • SyncML may also support data types such as images, files and database objects.
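The SyncML exchange described above can be sketched in a few lines. The following is an illustrative Python sketch of a minimal SyncML-style package that a mobile terminal might send to synchronize a contact record with a PC; the element layout loosely follows the SyncML (OMA Data Sync) conventions, but the URIs and payload are hypothetical and this is not a conformant implementation.

```python
# Illustrative sketch: a minimal SyncML-style synchronization package.
# Element names loosely follow the SyncML spec; values are hypothetical.
import xml.etree.ElementTree as ET

def build_sync_package(session_id, source_uri, target_uri, vcard_data):
    root = ET.Element("SyncML")
    hdr = ET.SubElement(root, "SyncHdr")          # session metadata
    ET.SubElement(hdr, "SessionID").text = session_id
    ET.SubElement(hdr, "Source").text = source_uri
    ET.SubElement(hdr, "Target").text = target_uri
    body = ET.SubElement(root, "SyncBody")        # the sync commands
    sync = ET.SubElement(body, "Sync")
    replace = ET.SubElement(sync, "Replace")      # replace one record
    item = ET.SubElement(replace, "Item")
    ET.SubElement(item, "Data").text = vcard_data
    return ET.tostring(root, encoding="unicode")

pkg = build_sync_package("1", "IMEI:004999010640000", "http://pc.local/sync",
                         "BEGIN:VCARD\nFN:Example Contact\nEND:VCARD")
```

A real implementation would additionally negotiate device capabilities and exchange status codes in both directions over the proximity connection.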
  • data transferred from hardware platform 106 may also be received by mobile terminal 102. In such an instance, the data flow path between hardware platform 106 and mobile terminal 102 is facilitated through path 120, while acknowledgment of the data receipt is provided by path 118.
  • block diagram 100 is discussed in terms of a content transport mechanism between mobile terminal 102 and hardware platform 106, whereby proximity connection 104 is utilized as the communication conduit between the two devices.
  • Proximity connection 104 may represent a wired and/or a wireless connection.
  • Wired implementations of proximity connection 104 may include single ended data transmission formats such as those specified by the RS232 or RS423 standards, or may include differential data transmission formats such as those specified by the RS422 or RS485 standards. Other wired implementations for higher bandwidth considerations may use the Universal Serial Bus (USB) or FireWire specifications, for example.
  • Wireless implementations of proximity connection 104 may include Wireless Local Area Network (WLAN), Bluetooth, Infrared, etc. as required by the particular application.
  • mobile terminal 102 may be an image enabled device having content capture/receipt capability 108.
  • Content capture/receipt 108 may provide both audio and video data, whereby the images may be presented in still and/or video mode.
  • In still mode, only a single image is transferred via path 110 to First-In First-Out (FIFO) buffer 114, and acknowledgement of the content receipt is generated via path 112.
  • FIFO buffer 114 buffers the content blocks, while content delivery/receipt 116 prepares for their subsequent transfer to hardware platform 106 via path 118 through proximity connection 104.
  • Path 120 is used by content receipt/delivery 122 to acknowledge receipt of the images from content delivery 116 via proximity connection 104.
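The capture-to-delivery flow of FIG. 1 can be sketched as a simple acknowledged FIFO: captured content blocks are queued, and the receiving side acknowledges each block, mirroring the roles of paths 118 (data) and 120 (acknowledgment). The class and method names below are illustrative, not taken from the patent.

```python
# Sketch of the capture -> FIFO -> delivery flow with acknowledgments.
from collections import deque

class ContentFifo:
    def __init__(self):
        self._queue = deque()

    def push(self, block):
        """Queue a captured content block; return an acknowledgment."""
        self._queue.append(block)
        return "ACK"

    def pop(self):
        """Deliver the oldest queued block, or None if the queue is empty."""
        return self._queue.popleft() if self._queue else None

fifo = ContentFifo()
acks = [fifo.push(frame) for frame in ("frame0", "frame1", "frame2")]
delivered = [fifo.pop() for _ in range(3)]   # first-in, first-out order
```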
  • Buffer and synchronization block 124 is used to provide the proper frame alignment and playback speed as required by presentation 126.
  • Presentation 126 represents any Application Programming Interface (API) that is executing on hardware platform 106 including image processing software in support of video conferencing, photo identification card generation, photo identification security, etc.
  • the images transferred via proximity path 104 may be formatted in any one of a number of video formats to include Moving Pictures Expert Group (MPEG), MPEG version 4 (MPEG-4), Joint Photographic Experts Group (JPEG), to name only a few.
  • vector graphic files may be transmitted where creation of digital images is facilitated through a sequence of commands or mathematical statements that place lines and shapes in a given two-dimensional or three-dimensional space.
  • With vector graphics, the file that results from a graphic artist's work is created and saved as a sequence of vector statements. For example, instead of containing a bit in the file for each bit of a line drawing, a vector graphic file describes a series of points to be connected.
  • the vector graphics file may be converted to a raster graphics image by content delivery/receipt 116 prior to transmission, so as to increase portability between systems.
  • Exemplary vector graphic files may be created, for example, by using Adobe Illustrator and CorelDraw.
  • animation images are also usually created as vector files, using content creation products such as Shockwave's Flash, which allows creation of 2-D and 3-D animations that may be sent to content receipt/delivery 122 as a vector file and then rasterized "on the fly" by presentation 126 as they arrive.
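The vector-to-raster conversion described above can be illustrated with a toy rasterizer: a vector file stores only the endpoints of a line, and the rasterizer fills in the pixels between them. This simple integer interpolation is a sketch, not a production line-drawing algorithm.

```python
# Toy rasterizer: turn a vector line (two endpoints) into a pixel grid.
def rasterize_line(x0, y0, x1, y1, width, height):
    grid = [[0] * width for _ in range(height)]
    steps = max(abs(x1 - x0), abs(y1 - y0), 1)
    for i in range(steps + 1):
        # interpolate between the endpoints and set the nearest pixel
        x = round(x0 + (x1 - x0) * i / steps)
        y = round(y0 + (y1 - y0) * i / steps)
        grid[y][x] = 1
    return grid

grid = rasterize_line(0, 0, 7, 3, 8, 4)   # an 8x4 raster of one line
```

The vector form ("connect (0,0) to (7,3)") is far more compact than the raster grid it expands to, which is why conversion may be deferred until after transmission.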
  • Content capture/receipt 108 may produce a video sequence consisting of a series of still images for ultimate buffering into FIFO buffer 114. Content capture/receipt 108 may also provide audio capture capability, depending upon the capabilities/feature selection represented by mobile terminal 102. Additionally, audio and video data streams may be received from FIFO buffer 114 by content capture/receipt 108 for subsequent display by mobile terminal 102. The audio and video data streams may have previously been received from an external device (not shown), from hardware platform 106, or from an attachment contained within an MMS message, for example.
  • FIG. 2 illustrates an exemplary block diagram of some of the key elements provided by content capture 108 as they relate to the principles of the present invention.
  • Video block 202 may provide a single image or a series of images to video encoder 206.
  • video encoder 206 implements video compression methods that exploit redundant and perceptually irrelevant parts of the video series.
  • the redundancy can be categorized into spatial, temporal, and spectral components; where spatial redundancy relates to correlation between neighboring pixels; temporal redundancy relates to objects likely to appear in present frames that were there in past frames; and spectral redundancy addresses the correlation between the different color components of the same image.
  • Video encoder 206 achieves video compression by generating motion compensation data, which describes the motion between the current and previous image.
  • Video encoder 206 may seek to establish a constant bit rate for data stream 220, in which case video encoder 206 controls the frame rate as well as the quality of images.
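The temporal-redundancy idea above can be illustrated with a toy encoder: the first frame is sent in full, and each subsequent frame is sent as per-pixel deltas against its predecessor, so unchanged regions encode as zeros. Real codecs such as H.263 use block-based motion compensation rather than raw pixel differences; this is only a sketch of the principle.

```python
# Toy delta encoder exploiting temporal redundancy between frames.
def encode(frames):
    deltas = [frames[0]]                      # key frame sent as-is
    for prev, cur in zip(frames, frames[1:]):
        deltas.append([c - p for p, c in zip(prev, cur)])
    return deltas

def decode(deltas):
    frames = [deltas[0]]
    for delta in deltas[1:]:
        frames.append([p + d for p, d in zip(frames[-1], delta)])
    return frames

# Three tiny 3-pixel "frames" that change only slightly over time.
frames = [[10, 10, 10], [10, 12, 10], [10, 12, 11]]
restored = decode(encode(frames))
```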
  • Video encoder 206 may implement a video COder/DECoder (CODEC) algorithm defined by ITU-T H.263, which is an established CODEC scheme used in various multimedia services.
  • H.263 provides a wide toolbox of various encoding tools and coding complexities for different purposes.
  • The tools to be used and the allowed complexity of the mode are defined in CODEC profiles and levels; for example, Profile 0, Level 10, also known as the H.263 baseline, has been defined as a mandatory video CODEC.
  • Video encoder 206 may also support decoding of video bit-stream content conforming to MPEG-4 Visual Simple Profile, Level 0.
  • Other proprietary video coding formats such as RealVideo 7 and RealVideo 8, may be used that are recognized by the RealOne Player utility.
  • Audio block 204 represents an audio generation function of mobile terminal 102, such as a microphone, that provides a series of amplitude waveforms over a period of time to audio encoder 208.
  • the amplitude waveforms are digitized prior to delivery to audio encoder 208, where the sampling rate imposed is dependent upon the nature of the sound to be digitized.
  • Music, for example, requires a 44.1 kilohertz (kHz) sampling rate in order to provide high quality. Speech, on the other hand, may be adequately sampled at an 8 kHz sampling rate.
  • Audio encoder 208 compresses the digitized data received from audio block 204 using a number of different compression algorithms.
  • One simple coding method uses an adaptive step size to quantize audio samples. Such a technique is used by the Interactive Multimedia Association (IMA) Adaptive Differential Pulse Code Modulation (ADPCM) audio coding standard, which reserves 4 bits per sample. Consequently, if the sampling rate is set to 8 kHz, IMA ADPCM coded audio requires a 32 kbit/s bit stream to be transferred by path 222.
  • Other simple speech coding methods include the A-law and μ-law coding algorithms, which use a logarithmic quantization step size and reserve 8 bits per sample.
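The arithmetic behind these two schemes can be sketched directly: IMA ADPCM at 4 bits per sample and an 8 kHz sampling rate yields a 32 kbit/s stream, while μ-law companding quantizes on a logarithmic scale. The continuous μ-law formula is shown here for illustration; the G.711 standard actually uses a piecewise-linear approximation of it.

```python
# Bit-rate arithmetic for IMA ADPCM plus the continuous mu-law curve.
import math

def adpcm_bitrate(sample_rate_hz, bits_per_sample=4):
    """Bits per second for a fixed bits-per-sample coder."""
    return sample_rate_hz * bits_per_sample

def mu_law_compress(x, mu=255):
    """Compand x in [-1, 1] logarithmically (continuous mu-law formula)."""
    sign = 1 if x >= 0 else -1
    return sign * math.log(1 + mu * abs(x)) / math.log(1 + mu)

rate = adpcm_bitrate(8000)    # 8 kHz * 4 bits/sample = 32000 bit/s
```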
  • The Adaptive Multi-Rate (AMR) speech codec may also be supported.
  • File composer 210 receives video encoded data stream 220 and audio encoded data stream 222 and composes data file 212 from the respective data streams.
  • the audio portion of content capture 200 is optional, depending upon the capabilities/feature selection currently implemented by mobile terminal 102. In the event that a single image is to be transmitted via MMS, for example, the audio portion of the content captured by content capture 108 of FIG. 1 may be omitted by a feature that is selected locally within mobile terminal 102.
  • File composer 210 groups video data stream 220 and optionally, audio data stream 222, into file format 212. Once formatted, file format 212 may be processed locally within mobile terminal 102, streamed over transport channel 118 via proximity connection 104 to hardware platform 106, or dispatched in any number of other formats and protocols as permitted by the capabilities of mobile terminal 102. Some exemplary file formats provided by file composer 210 may include Microsoft Audio-Video Interleaved (AVI), Apple QuickTime file format (.mov), MPEG-1 file format (.mpg), 3GPP file format (.3gp), and MP4 file format (.mp4), etc. Header 214 provides specific information about the file format such as video coding type, audio coding type, length of file, file identifier, etc. Video bit stream 216 and audio bit stream 218 contain the respective video bit stream 220 and audio bit stream 222 as received and formatted by file composer 210.
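The header-plus-streams layout that file composer 210 produces can be sketched as follows. The byte layout below is invented for illustration and does not match any of the real container formats listed above; it merely shows a header describing the coding types and stream lengths, followed by the video and audio bit streams.

```python
# Hypothetical sketch of a composed file: small header + two bit streams.
import struct

HEADER_FMT = "<BBII"   # video codec id, audio codec id, stream lengths

def compose(video_bits: bytes, audio_bits: bytes,
            video_codec=1, audio_codec=2) -> bytes:
    header = struct.pack(HEADER_FMT, video_codec, audio_codec,
                         len(video_bits), len(audio_bits))
    return header + video_bits + audio_bits

def parse(blob: bytes):
    vcodec, acodec, vlen, alen = struct.unpack_from(HEADER_FMT, blob)
    offset = struct.calcsize(HEADER_FMT)
    return blob[offset:offset + vlen], blob[offset + vlen:offset + vlen + alen]

blob = compose(b"VIDEO", b"AUDIO")
video, audio = parse(blob)     # round-trips back to the original streams
```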
  • Proximity connection 104 of FIG. 1 provides the conduit for data transfer between mobile terminal 102 and hardware platform 106.
  • proximity connection 104 is described in terms of the Bluetooth standard for localized data transfer.
  • Bluetooth technology is an industry standard for short-range wireless voice and data communications, allowing a single air interface to support local communications for distances of up to 10-20 meters.
  • Mobile terminal 102 of FIG. 1 may be implemented using a Series 60 Platform, for example, that is built upon the Symbian Operating System (OS) General Technology (GT).
  • Symbian GT provides a fully object-oriented design, preemptive multitasking, and full support for client-server architecture.
  • Symbian GT also provides the common core for API and technology, which is shared between all Symbian reference designs.
  • Some of the major components supported by Symbian GT include a multimedia server for audio recording, playback, and image-related functionality, as well as a Personal Area Network (PAN) communication stack including infrared, Bluetooth and serial communications support.
  • Symbian GT allows the use of Bluetooth technology, allowing proximity wireless operations to utilize local service accessories.
  • the number and type of local service accessories provided by the Bluetooth connection are virtually unlimited and include, for example, bar code readers, digital pens, health monitoring devices, Global Positioning System (GPS) receivers, enhanced video feeds, and video conferencing facilitation.
  • Bluetooth is composed of a hierarchy of components that is exemplified in Bluetooth stack hierarchy 300 shown in FIG. 3.
  • the Bluetooth communication stack may be broken into two main components.
  • the first component, Bluetooth Host Controller (BTHC) 312 provides the lower level of the stack.
  • BTHC 312 is generally implemented in hardware and allows the upper level stack, Bluetooth Host (BTH) 302, to send or receive data over a Bluetooth link and to configure the Bluetooth link.
  • Configuration and data transfer between BTHC 312 and BTH 302 takes place via path 322, which connects Host Controller Interface (HCI) driver 310 with HCI firmware module 312.
  • Bluetooth operates in the 2.4 gigahertz (GHz) Industrial, Scientific, and Medical (ISM) band. It uses a fast frequency hopping scheme with 79 frequency channels, each being 1 MHz wide.
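The hopping behavior can be illustrated with a toy sequence generator over Bluetooth's 79 ISM-band channels. This is NOT the real Bluetooth hop-selection kernel, which derives the sequence deterministically from the master device's address and clock; the sketch only shows hops staying inside the valid channel range at the nominal 1600 hops-per-second rate.

```python
# Toy frequency-hop sequence over 79 channels (illustrative only).
import random

def hop_sequence(seed, hops, channels=79):
    """Generate a reproducible pseudo-random hop sequence."""
    rng = random.Random(seed)
    return [rng.randrange(channels) for _ in range(hops)]

seq = hop_sequence(seed=0xBEEF, hops=1600)   # one second at 1600 hops/s
```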
  • Bluetooth Radio (BTR) 320 is designed to provide a low-cost, 64 kbps, full-duplex connection that exhibits low power consumption. Current consumption on the order of 10-30 milliamps (mA) is typical, with even lower consumption during idle periods.
  • Baseband link controller (LC) 318 defines different packet types to be used for both synchronous and asynchronous transmission. Packet types supporting different error handling techniques, e.g., error correction/detection, and encryption, are also defined within LC 318.
  • LC 318 also mitigates any Direct Current (DC) offsets provided by BTR 320 due to special payload characteristics.
  • Link Manager Protocol (LMP) 316 is responsible for controlling the connections of a device, such as connection establishment, link detachment, security management (e.g., authentication and encryption), and power management of various low power modes.
  • BTH 302 illustrates the upper level of a Bluetooth stack and is comprised primarily of software applications 304-310, and 326.
  • HCI driver 310 packages the high level components that communicate with the lower level hardware components found in BTHC 312.
  • Logical Link Control and Adaptation Protocol (L2CAP) 308 allows finer grain control of the radio link. For example, L2CAP 308 controls how multiple users of the link are multiplexed together, controls packet segmentation and reassembly, and conveys quality of service information.
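The segmentation-and-reassembly role attributed to L2CAP above can be sketched as follows. This is a deliberately minimal model, not the actual L2CAP packet format: an outgoing payload is split into fragments no larger than the link MTU, and the receiver concatenates them back in order. (The 672-byte MTU used here is a commonly cited L2CAP default, taken as an assumption.)

```python
# Minimal sketch of L2CAP-style segmentation and reassembly.
MTU = 672  # assumed default L2CAP MTU, bytes

def segment(payload, mtu=MTU):
    """Split a payload into MTU-sized fragments."""
    return [payload[i:i + mtu] for i in range(0, len(payload), mtu)]

def reassemble(fragments):
    """Concatenate fragments back into the original payload."""
    return b"".join(fragments)

data = bytes(2000)
frags = segment(data)
print(len(frags))  # 3 fragments for 2000 bytes at a 672-byte MTU
assert reassemble(frags) == data
```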
  • Service Discovery Protocol (SDP) 304 and Radio Frequency Communication (RFCOMM) protocol 306 represent middleware protocols of the Bluetooth stack.
  • RFCOMM protocol 306 allows applications communicating with Bluetooth stack 300 to treat a Bluetooth enabled device as if it were a serial communications device, in order to support legacy protocols.
  • RFCOMM protocol 306 defines a virtual set of serial port applications, which allows RFCOMM protocol 306 to replace cable enabled communications.
  • the definition of RFCOMM protocol 306 incorporates major parts of the European Telecommunication Standards Institute (ETSI) TS 07.10 standard, which defines multiplexed serial communication over a single serial link.
  • SDP 304 is used to locate and describe services provided by or available through another Bluetooth device.
  • SDP 304 plays an important role in managing Bluetooth devices in a Bluetooth environment by allowing discovery and service description of services offered within the environment.
  • Audio block 326 represents another middleware component of stack 300 that allows Bluetooth to offer audio and telephony support.
  • the audio portion of Bluetooth data may be transferred directly from LC 318 to audio block 326 via path 324, thereby bypassing the LMP 316, HCI 310 and 314, and the L2CAP 308 layers.
  • the Bluetooth communication stack of FIG. 3 represents the lower communication layers that support any number of higher level application embodiments according to the present invention.
  • mobile terminal 102 and hardware platform 106 may each employ Bluetooth communication stack 300, in order to facilitate image and voice data transfer, whereby presentation software and camera APIs are implemented as necessary for image generation and display.
  • FIG. 4 represents generic communication architecture 400 according to the principles of the present invention, where the BTHC layers, e.g., 412 and 422, and the BTH layers, e.g., 414 and 424, represent the Bluetooth communication stack illustrated in FIG. 3.
  • Mobile terminal 404 represents an image enabled mobile terminal that is capable of generating data streams such as those described in relation to FIG. 2.
  • Camera HW 416 and camera API 418 combine to generate video bit stream 216 of data stream 212, while required terminal software 420 and related hardware (not shown) establishes the associated audio bit stream 218 of data stream 212.
  • Data path 426 represents, for example, proximity connection 104 of FIG. 1 that exists between mobile terminal 404 and, for example, PC 402.
  • Data path 426 transfers the video/audio data streams generated by mobile terminal 404 and provides them to their corresponding peer entity within PC 402.
  • camera API 408 and camera API 418 are peer entities, where camera API 418 ultimately communicates through data path 426 to its corresponding camera API entity 408, so that images captured by camera HW 416 may ultimately be displayed by presentation block 406.
  • Required PC software 410 receives the captured images from BTHC 412 and provides a synchronized data stream as required to camera API 408 for proper delivery to presentation block 406. Communications between Bluetooth stacks 412-414 and 422-424 are facilitated through the use of sockets, which are similar to those used by a Transmission Control Protocol/Internet Protocol (TCP/IP) connection.
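Required PC software 410 is described above as synchronizing the incoming video and audio streams before handing them to the camera API. One hypothetical way to model that synchronization step (the names and timestamps here are illustrative assumptions, not the patent's mechanism): tag each video frame and audio chunk with a capture timestamp and pair every frame with the audio chunk closest in time.

```python
# Toy model of audio/video synchronization by nearest capture timestamp.
def synchronize(video, audio):
    """Pair each (ts, frame) with the (ts, chunk) nearest in time.

    video, audio: lists of (timestamp_ms, payload), sorted by timestamp.
    Returns a list of (frame, chunk) pairs.
    """
    pairs = []
    for vts, frame in video:
        nearest = min(audio, key=lambda a: abs(a[0] - vts))
        pairs.append((frame, nearest[1]))
    return pairs

video = [(0, "f0"), (40, "f1"), (80, "f2")]                  # 25 fps frames
audio = [(0, "a0"), (20, "a1"), (60, "a2"), (85, "a3")]      # audio chunks
print(synchronize(video, audio))
# [('f0', 'a0'), ('f1', 'a1'), ('f2', 'a3')]
```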
  • Bluetooth sockets are used to discover other Bluetooth devices, and to read and write data over a Bluetooth radio interface.
  • the Bluetooth sockets API supports communication over both the L2CAP 308 and RFCOMM 306 layers of Bluetooth stack 300. Not only does the socket API allow a client to make a connection to a remote device, the socket API also allows the remote device to contact the client for data transfer.
  • the Bluetooth socket API has five key concepts: socket address, remote device inquiry, RFCOMM commands, L2CAP commands, and HCI commands.
  • SDP 304 performs this task by performing two main functions: discovery of devices and services within the local area, and the advertisement of services from the local device. If, for example, a Bluetooth enabled device can provide locally generated image data streams, then that service is made visible through SDP 304 to other Bluetooth enabled devices that may be interested in that functionality.
  • in order for a Bluetooth service to be advertised, it must first be represented by a service record and kept within an SDP database for access by other applications.
  • the SDP database is implemented as a server within the Symbian OS, and as such, other applications wishing to discover the services offered must first establish a connection to the server and open a session on the server.
  • the RSdp class within the Symbian OS API represents the SDP database server and allows an application to connect to it.
  • a service record in Symbian OS is created through the SDP database by managing a collection of service handles and their associated attributes that make up the service record.
  • Each service record is identified by a Universally Unique Identifier (UUID).
  • each service record contains a service class and associated profile that are used to help generalize the types of service provided by the device.
  • service class numbers may represent a Bluetooth enabled mobile terminal and a more specific entry to define that the Bluetooth enabled mobile terminal also has image capability that may support either still frame or streamed video applications.
  • the service record contains a collection of attributes that are identified by an identification number that is of the TSdpAttributeID data type defined within the <btsdp.h> header file.
  • Each service handle and the associated attributes are used by the SDP database to identify attributes and their values within the database.
  • the Symbian OS API provides SDP 304 with service search patterns and attribute search patterns that are used to facilitate the device and service discovery process.
  • the service search pattern allows SDP 304 to discover and create a list of all available services within the local area, where all services discovered in the local area are services that are advertised by their own SDP agent and identified by their respective service record UUIDs.
  • the attribute search pattern allows the creation of a list of attribute IDs from a remote SDP database. Additionally, the attribute search pattern allows the searching device to create an attribute range that defines a list of attributes that are of interest to the searching device. Accordingly, attribute queries result in only those attributes of the remote Bluetooth enabled devices that fall within the attribute range specified in attribute search pattern.
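The SDP concepts above — service records keyed by UUID, attribute ID/value pairs, a service search that lists advertised services, and an attribute search constrained to an ID range — can be modelled in a few lines. This is a loose, in-memory illustration of the idea, not Symbian's actual SDP database server; the class name and attribute values are hypothetical.

```python
# Illustrative in-memory model of an SDP database: service records,
# service search, and range-constrained attribute search.
import uuid

class SdpDatabase:
    def __init__(self):
        self.records = {}  # service UUID -> {attribute_id: value}

    def register(self, attributes):
        """Create a service record and return its UUID."""
        service_uuid = uuid.uuid4()
        self.records[service_uuid] = dict(attributes)
        return service_uuid

    def service_search(self):
        """List the UUIDs of all advertised services (service search pattern)."""
        return list(self.records)

    def attribute_search(self, service_uuid, id_range):
        """Return only the attributes whose IDs fall in the inclusive range."""
        lo, hi = id_range
        record = self.records[service_uuid]
        return {aid: v for aid, v in record.items() if lo <= aid <= hi}

db = SdpDatabase()
svc = db.register({0x0001: "video-streaming",
                   0x0100: "image-capable terminal",
                   0x0300: "streamed or still-frame video"})
# Only attribute 0x0001 falls within the queried range:
print(db.attribute_search(svc, (0x0000, 0x00FF)))
```

A querying device would thus first discover `svc` via the service search, then narrow its attribute query to the IDs it cares about.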
  • a client application generally queries the service for more information, which may include requesting the available attributes of the identified service.
  • a client application may search for devices.
  • a client application may manually issue a query to all devices within a range and handle each response in turn.
  • the client may use the Bluetooth Device Selection User Interface (DSUI), which automatically issues queries, handles the responses, and prompts the user with a dialog box.
  • the dialog box enables the user to select the device that he wishes to use.
  • the UI operates in conjunction with the RNotifier class supplied within the Symbian OS API.
  • FIG. 5 represents video conferencing scenario 500, whereby the parties of meeting group 502 wish to participate in the presentation offered by presenter 514.
  • Meeting group 502 and presenter 514 are spatially removed from one another, such as may be the case when a corporation has a number of production and engineering facilities that are geographically located across the globe from one another.
  • meeting group 502 may represent a group of lower level production management personnel located within the United States, who have assembled to receive and discuss the ideas presented by senior production manager 514 located at the corporation's headquarters in Finland.
  • meeting group 502 and presenter 514 are not equipped with standard video conferencing equipment, but are equipped with imaging capable mobile terminals 504 and 512.
  • image processing capable PCs 506 and 510 are provided locally to meeting group 502 and presenter 514, respectively.
  • PCs 506 and 510, and mobile terminals 504 and 512 are also equipped with proximity connection capability to facilitate communication via links 516 and 522.
  • proximity links 516 and 522 represent Bluetooth communication links, where the communication architecture 400 of FIG. 4 is utilized.
  • PC 506 and 510 may be represented by architecture 402 and mobile terminals 504 and 512 may be represented by architecture 404, where Bluetooth communications stacks 412-414 and 422-424 are arranged to support Bluetooth device and service discovery and subsequent usage of the Bluetooth devices and services.
  • PCs 506 and 510 are interconnected through internet 508 via, for example, Local Area Network (LAN) or Wide Area Network (WAN) connections 518 and 520.
  • Each of PCs 506 and 510 is equipped with, for example, conferencing software such as NetMeeting or Timbuktu Pro that allows audio/video data to be exchanged between them in order to create a virtual meeting between meeting group 502 and presenter 514. Since PCs 506 and 510 are not equipped with their own video capturing devices, imaging enabled mobile terminals 504 and 512 are used instead.
  • in order to ultimately create the virtual meeting, mobile terminals 504 and 512 must first discover any video conferencing support services that may be offered in the local area.
  • a user of mobile terminal 504, for example, may invoke service discovery through the Bluetooth DSUI executing within mobile terminal 504.
  • the DSUI is provided by the RNotifier class of the Symbian OS, which is defined by the <e32std.h> header file.
  • the RNotifier class is designed to be used with most types of client applications, where a background thread provides the user selection dialog box that is presented to the user via the display of mobile terminal 504.
  • the Bluetooth Device Selection dialog box is represented by the KDeviceSelectionNotifierUid constant value defined in the <btextnotifiers.h> header file.
  • prior to presenting the user with a device selection dialog, the user has the option of limiting the number of devices that respond to the service discovery query by using the SetDeviceClass function. In so doing, one of the users of meeting group 502 may have defined a specific class of video conference support devices to be used that includes PC 506. Once discovered, the DSUI of mobile terminal 504 allows the user to select PC 506 as the device to be used for video conferencing support. Similarly, the user of mobile terminal 512 may select PC 510 to support video conferencing on his end of the virtual meeting link.
  • Actual data transfer between mobile terminal 504 and PC 506 and mobile terminal 512 and PC 510 may be implemented through the use of either the RFCOMM 306 or L2CAP 308 protocol layers as illustrated in FIG. 3, where access to the RFCOMM transmission protocol is provided by Symbian OS socket architecture.
  • Each of mobile terminals 504 and 512, as well as PCs 506 and 510, may create a set of transmit and receive sockets. Both types of sockets are required by the scenario depicted in FIG. 5 because meeting group 502 and presenter 514 each require a view of the other on their respective PC terminals.
  • the flow diagram of FIG. 6 illustrates, for example, the steps required for RFCOMM communication enabled through the Bluetooth stack.
  • the StartL function of the CMessageServer class performs the work of starting up the RFCOMM service. Initially, StartL initiates the Bluetooth sockets protocol and initializes the server socket.
  • opening the server socket involves first connecting to the socket server and then opening a connection to the server using an RSocket object, where the RFCOMM protocol is specified to be the requested transport type.
  • the next task is to query the protocol for an available channel as in step 604. Once the available channel has been returned, a port is assigned to the available channel and the RSocket object is then bound to the port as in step 606. In the case that the RSocket object is to be a listening object, it is set up to listen for any available data present on the port to which it has been bound. Any security features that may be required, such as authentication, authorization, and encryption, are set up in step 608 after the listening socket has been established.
  • step 604 for a transmission socket setup, assumes that the Bluetooth discovery process has identified the address of the device, e.g., PC 506, offering the service required by the Bluetooth client, e.g., mobile terminal 504.
  • the discovery process has already identified the channel on the receiving device, e.g., PC 506, to which the transmitting device, e.g. mobile terminal 504, should connect.
  • Receive event step 612 follows the YES path, for example, when data is ready to be read from the listening RSocket. Receive event step 612 takes the NO path, for example, when data is ready for transmission to the transmitting RSocket.
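The open/bind/listen/connect steps of FIG. 6 map naturally onto ordinary socket programming. As a runnable analogue (using a TCP loopback connection in place of an RFCOMM channel, since the Bluetooth transport requires real hardware), the listening side below plays the role of PC 506 and the connecting side that of mobile terminal 504; the port chosen by the OS stands in for the discovered RFCOMM channel.

```python
# TCP loopback analogue of the FIG. 6 RFCOMM socket flow.
import socket
import threading

def serve_once(listener, result):
    """Accept one connection and echo back what arrives (the receive event)."""
    conn, _ = listener.accept()
    with conn:
        data = conn.recv(1024)
        result.append(data)
        conn.sendall(b"ack:" + data)

# "Open the server socket" and bind it to an available channel/port.
listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.bind(("127.0.0.1", 0))  # port 0: let the OS pick a free "channel"
listener.listen(1)               # set up the listening object
port = listener.getsockname()[1]

received = []
t = threading.Thread(target=serve_once, args=(listener, received))
t.start()

# The "transmission socket": connect to the discovered device/channel and send.
with socket.create_connection(("127.0.0.1", port)) as client:
    client.sendall(b"video-frame-0")
    reply = client.recv(1024)

t.join()
listener.close()
print(reply)  # b'ack:video-frame-0'
```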
  • the Bluetooth communication pair, PC 510 and mobile terminal 512, set up their respective Bluetooth communication link in a similar manner. Once the point-to-point connection is established between PC 506, link 518, Internet 508, link 520, and PC 510, then the virtual meeting is ready to begin and the respective opposing images are available to each other.
  • at meeting group 502, for example, mobile terminal 504 is transitioned into its imaging mode and positioned such that the video image presented on the display of mobile terminal 504 captures meeting group 502.
  • Video data stream 216 and optional audio data stream 218 as depicted in data stream 212 of FIG. 2 is then captured by mobile terminal 504 and transmitted to PC 506 via Bluetooth link 516.
  • the data subsequently arrives at PC 510 via internet 508 for display to presenter 514 via PC 510.
  • mobile terminal 512 is transitioned into its imaging mode and positioned such that the video image presented on the display of mobile terminal 512 captures presenter 514.
  • Video data stream 216 and optional audio data stream 218 as depicted in data stream 212 of FIG. 2 is then captured by mobile terminal 512 and transmitted to PC 510 via Bluetooth link 522.
  • the data subsequently arrives at PC 506 via internet 508 for display to meeting group 502 via PC 506.
  • although virtual meeting scenario 500 is discussed in terms of a Bluetooth API enabled through a Symbian OS, the implementation may also exploit native coding methods and APIs of legacy devices.
  • data synchronization APIs that currently allow mobile terminal backup/synchronization services to proximity devices such as PCs, may also be exploited to implement similar functionality.
  • Future terminals employing Mobile Information Device Profile (MIDP) 2.0 and Mobile Media API may offer similar features that may be exploited to provide equivalent features.
  • the present invention may be adapted to any number of applications involving audio/video feeds from mobile terminals via proximity connections and the subsequent use of the audio/video feeds.
  • a user of a mobile terminal may wish to engage in Internet browsing activities using the mobile terminal, but wishes to display the received content on an enhanced display consisting of a PC, overhead projector, television, or other similar video device.
  • the proximity connection may be, for example, a wired or a wireless connection.
  • the user may enhance his viewing pleasure, by forwarding the received content to the auxiliary video device via the proximity connection for improved video presentation.
  • the mobile terminal may receive audio/video content from an external Compact Disk (CD) or DVD device. The mobile terminal may then subsequently transfer the received content to the video device via a proximity connection for enhanced presentation of the content received from the CD or DVD device.
  • an image capable mobile terminal may take the place of a digital camera in those instances where a digital camera may be utilized.
  • Any licensing that requires a video image of the license holder to be placed onto the license card may utilize the present invention.
  • state licensing stations may be placed in public kiosks having Bluetooth functionality. Any user having access to an image enabled mobile terminal may access the kiosk through a Bluetooth proximity connection and provide the kiosk with a digital image of himself. The digital image being generated at the mobile terminal may be transferred to the kiosk via the Bluetooth connection.
  • the kiosk after having verified that the user has met other licensing requirements, may then render a license to the user that contains the user's digital image previously generated and transferred to the kiosk via the Bluetooth connection.
  • a security application is comprehended, whereby access is controlled through digital verification of a user's facial features.
  • a user having an image enabled mobile terminal may establish a Bluetooth connection between his mobile terminal and, for example, a security access control point at the entrance of a secured building.
  • the user may then capture an image of his facial features using his mobile terminal and then transfer a digital image of his facial features to the access control point via the previously established Bluetooth connection.
  • the security access control point may then compare the transferred digital image to a digital image database of all users having security access to the building. Once a match is found between the transferred digital image and an image contained within the digital image database, the security access control point may then facilitate entry by the user into the building. Otherwise, the security access point may deny access to the building and may provide a message indicating the denial of access to the user via the established Bluetooth connection.
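The comparison step described above can be sketched as follows. Real facial matching requires fuzzy biometric comparison; as a deliberately simplified stand-in (all class and method names here are hypothetical), this sketch compares a cryptographic digest of the transferred image against digests stored for authorized users and returns either a grant or a denial message, mirroring the flow described for the access control point.

```python
# Toy model of the access-control image comparison (exact-match digest
# stands in for real biometric matching).
import hashlib

def digest(image_bytes):
    return hashlib.sha256(image_bytes).hexdigest()

class AccessControlPoint:
    def __init__(self):
        self.authorized = {}  # image digest -> user name

    def enroll(self, user, image_bytes):
        self.authorized[digest(image_bytes)] = user

    def request_entry(self, image_bytes):
        """Grant entry on a database match, otherwise return a denial."""
        user = self.authorized.get(digest(image_bytes))
        return f"access granted: {user}" if user else "access denied"

acp = AccessControlPoint()
acp.enroll("alice", b"alice-face-image")
print(acp.request_entry(b"alice-face-image"))  # access granted: alice
print(acp.request_entry(b"mallory-face"))      # access denied
```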
  • an enhanced gaming operation is enabled through the use of multiple image enabled mobile terminals.
  • each user of a mobile terminal is engaged in a video game, whereby each user is networked to each other via known networking infrastructure.
  • the video content of each gaming participant's mobile terminal may then be transferred to a video device via a proximity connection, whereby the video device multiplexes the separate video feeds into a single gaming video stream that is subsequently displayed onto the video device.
  • a proximity connection whereby the video device multiplexes the separate video feeds into a single gaming video stream that is subsequently displayed onto the video device.
  • such an arrangement also supports gaming activity of a single player, whereby a plethora of games previously limited by the display geometry of the mobile terminal are now enabled by the present invention.
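The multiplexing step described above — the video device combining one frame sequence per participant, received over its proximity connections, into a single display stream — can be sketched as a round-robin interleave. The frame and stream representations here are illustrative assumptions, not a specified format.

```python
# Sketch of multiplexing per-player video feeds into one tagged stream.
from itertools import zip_longest

def multiplex(feeds):
    """Interleave {player: [frames]} into one tagged stream, round-robin."""
    stream = []
    for row in zip_longest(*feeds.values()):
        for player, frame in zip(feeds, row):
            if frame is not None:  # a feed may run out of frames early
                stream.append((player, frame))
    return stream

feeds = {"p1": ["f0", "f1"], "p2": ["g0", "g1", "g2"]}
print(multiplex(feeds))
# [('p1', 'f0'), ('p2', 'g0'), ('p1', 'f1'), ('p2', 'g1'), ('p2', 'g2')]
```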
  • the invention is a modular invention, whereby processing functions within either a mobile terminal or a hardware platform may be utilized to implement the present invention.
  • the mobile terminals may be any type of wireless device, such as wireless/cellular telephones, personal digital assistants (PDAs), or other wireless handsets, as well as portable computing devices capable of wireless communication.
  • These landline and mobile devices utilize computing circuitry and software to control and manage the conventional device activity as well as the functionality provided by the present invention.
  • Hardware, firmware, software or a combination thereof may be used to perform the various imaging transfer functions described herein.
  • An example of a representative mobile terminal computing system capable of carrying out operations in accordance with the invention is illustrated in FIG. 7.
  • the exemplary mobile computing environment 700 is merely representative of general functions that may be associated with such mobile devices, and also that landline computing systems similarly include computing circuitry to perform such operations.
  • the exemplary mobile computing arrangement 700 suitable for image capture/image data transfer functions in accordance with the present invention may be associated with a number of different types of wireless devices.
  • the representative mobile computing arrangement 700 includes a processing/control unit 702, such as a microprocessor, reduced instruction set computer (RISC), or other central processing module.
  • the processing unit 702 need not be a single device, and may include one or more processors.
  • the processing unit may include a master processor and associated slave processors coupled to communicate with the master processor.
  • the processing unit 702 controls the basic functions of the mobile terminal, and also those functions associated with the present invention as dictated by camera hardware 730, imaging software module 726 and Bluetooth stack 728 available in the program storage/memory 704. Thus, the processing unit 702 is capable of initiating image capture and proximity connection functions associated with the present invention, whereby images captured by camera hardware 730 may be transferred to imaging software module 726 for subsequent transmission via Bluetooth stack 728.
  • the program storage/memory 704 may also include an operating system and program modules for carrying out functions and applications on the mobile terminal.
  • the program storage may include one or more of read-only memory (ROM), flash ROM, programmable and/or erasable ROM, random access memory (RAM), subscriber interface module (SIM), wireless interface module (WIM), smart card, or other removable memory device, etc.
  • the program modules associated with the storage/memory 704 are stored in non-volatile electrically-erasable, programmable read-only memory (EEPROM), flash read-only memory (flash ROM), etc.
  • the relevant software for carrying out conventional mobile terminal operations and operations in accordance with the present invention may also be transmitted to the mobile computing arrangement 700 via data signals, such as being downloaded electronically via one or more networks, such as the Internet and an intermediate wireless network(s).
  • the processor 702 is also coupled to user-interface 706 elements associated with the mobile terminal.
  • the user-interface 706 of the mobile terminal may include, for example, a display 708 such as a liquid crystal display, a keypad 710, speaker 712, camera hardware 730, and microphone 714. These and other user-interface components are coupled to the processor 702 as is known in the art.
  • Other user-interface mechanisms may be employed, such as voice commands, switches, touch pad/screen, graphical user interface using a pointing device, trackball, joystick, or any other user interface mechanism.
  • the mobile computing arrangement 700 also includes conventional circuitry for performing wireless transmissions.
  • a digital signal processor (DSP) 716 may be employed to perform a variety of functions, including analog-to-digital (A/D) conversion, digital-to-analog (D/A) conversion, speech coding/decoding, encryption/decryption, error detection and correction, bit stream translation, filtering, etc.
  • the transceiver 718, generally coupled to an antenna 720, transmits the outgoing radio signals 722 and receives the incoming radio signals 724 associated with the wireless device.
  • the mobile computing arrangement 700 of FIG. 7 is provided as a representative example of a computing environment in which the principles of the present invention may be applied. From the description provided herein, those skilled in the art will appreciate that the present invention is equally applicable in a variety of other currently known and future mobile and landline computing environments.
  • desktop computing devices similarly include a processor, memory, a user interface, and data communication circuitry.
  • the present invention is applicable in any known computing structure where data may be communicated via a network.
  • the invention may be implemented as a machine, process, or article of manufacture by using standard programming and/or engineering techniques to produce programming software, firmware, hardware or any combination thereof.
  • Any resulting program(s), having computer-readable program code may be embodied on one or more computer-usable media, such as disks, optical disks, removable memory devices, semiconductor memories such as RAM, ROM, PROMS, etc.
  • Articles of manufacture encompassing code to carry out functions associated with the present invention are intended to encompass a computer program that exists permanently or temporarily on any computer-usable medium or in any transmitting medium which transmits such a program.
  • Transmitting mediums include, but are not limited to, transmissions via wireless/radio wave communication networks, the Internet, intranets, telephone/modem-based network communication, hard-wired/cabled communication networks, satellite communication, and other stationary or mobile network systems/communication links. From the description provided herein, those skilled in the art will be readily able to combine software created as described with appropriate general purpose or special purpose computer hardware to create an image processing system and method in accordance with the present invention.
  • the image processing platforms or other systems for providing image processing functions in connection with the present invention may be any type of computing device capable of processing and communicating digital information.
  • the image processing platforms utilize computing systems to control and manage the image processing activity.
  • An example of a representative computing system capable of carrying out operations in accordance with the invention is illustrated in FIG. 8.
  • the computing structure 800 of FIG. 8 is an example computing structure that can be used in connection with such an image processing platform.
  • the example computing arrangement 800 suitable for performing the image processing activity in accordance with the present invention includes image processing platform 801, which includes a central processor (CPU) 802 coupled to random access memory (RAM) 804 and read-only memory (ROM) 806.
  • the ROM 806 may also be other types of storage media to store programs, such as programmable ROM (PROM), erasable PROM (EPROM), etc.
  • the processor 802 may communicate with other internal and external components through I/O circuitry 808 and bussing 810 to provide control signals and the like.
  • image data received from proximity I/O connections 808 or Internet connection 828 may be processed in accordance with the present invention.
  • External data storage devices, such as DNS or location servers, may be coupled to I/O circuitry 808 to facilitate imaging functions according to the present invention.
  • databases may be locally stored in the storage/memory of image processing platform 801, or otherwise accessible via a local network or networks having a more extensive reach such as the Internet 828.
  • the processor 802 carries out a variety of functions as is known in the art, as dictated by software and/or firmware instructions.
  • Image processing platform 801 may also include one or more data storage devices, including hard and floppy disk drives 812, CD-ROM drives 814, and other hardware capable of reading and/or storing information such as DVD, etc.
  • software for carrying out the image processing and image transfer operations in accordance with the present invention may be stored and distributed on a CD-ROM 816, diskette 818 or other form of media capable of portably storing information. These storage media may be inserted into, and read by, devices such as the CD-ROM drive 814, the disk drive 812, etc.
  • the software may also be transmitted to image processing platform 801 via data signals, such as being downloaded electronically via a network, such as the Internet.
  • Image processing platform 801 is coupled to a display 820, which may be any type of known display or presentation screen, such as LCD displays, plasma display, cathode ray tubes (CRT), etc.
  • a user input interface 822 is provided, including one or more user interface mechanisms such as a mouse, keyboard, microphone, touch pad, touch screen, voice-recognition system, etc.
  • the image processing platform 801 may be coupled to other computing devices, such as the landline and/or wireless terminals via a network.
  • the server may be part of a larger network configuration as in a global area network (GAN) such as the Internet 828, which allows ultimate connection to the various landline and/or mobile client/watcher devices.

Abstract

A method and apparatus allows proximity connections to an image enabled mobile terminal to exploit image capabilities of the mobile terminal. Mobile terminals (504 and 512) interact with Personal Computers (PC) (506 and 510) via Bluetooth connections (516 and 522) to transfer audio/video data as required to support specified activities. Video conferencing scenario 500 is enabled by exploiting imaging capabilities of the mobile terminals in lieu of a PC mounted camera. Other applications involve security, licensing, video enhancement, and gaming features that may be implemented with the present invention.

Description

METHOD AND APPARATUS FOR EXPLOITING VIDEO STREAMING
SERVICES OF MOBILE TERMINALS VIA PROXIMITY CONNECTIONS
FIELD OF THE INVENTION
This invention relates in general to multi-modal usage of a mobile terminal, and more particularly, to providing alternative usages of mobile terminals for video conferencing and video enhancement operations.
BACKGROUND OF THE INVENTION
Mobile terminals of the present day have evolved according to the needs and desires of the mobile terminal market. In the early phases of the market, mobile terminals were significantly costlier, larger, heavier, and less energy efficient than they are today. Furthermore, the early mobile terminals provided a significantly smaller set of value added features and performance. The early phases of the mobile terminal market were driven by business customers, where the key competitive mobile terminal parameters consisted of size and talk time, which was largely a function of battery charge duration. As the business market began to grow, the mobile terminals began to segment themselves according to price differential and performance. The higher end mobile terminals, for example, were targeted for business customers relying on enhanced performance where price was a secondary issue. Alternatively, middle and low end mobile terminals were offered having a reduced function set, an affordable cost, and were targeted for private, cost-sensitive customers.
As the private market began to grow, lifestyle of the private consumer began to provide an input to the segmentation of the mobile terminal portfolio. As such, a much wider range of products with a strong differentiation in design and styling began to emerge to match the highly diverse consumer segment needs. Generally speaking, the thrust of the mobile terminal market was to bring mobility to voice communications.
The evolution towards Third Generation (3G) mobile communications and digital convergence allows the opportunity to bring mobility to functions other than voice. As endless opportunities present themselves, it is important to identify the most attractive new functionalities and create mobile terminals which are optimized for use of those functionalities. It remains equally important to concurrently provide a full set of mobile applications to ensure the usability of the mobile terminal within the communications network to which the consumer has already grown accustomed.
For example, terminals optimized for imaging must excel in the capturing, handling, sending, and storing of images. The image enabled mobile terminals must, therefore, provide a large color display for imaging content presentation and an internal camera for imaging content capture. The image enabled mobile terminal must also support the Multimedia Messaging Service (MMS), Short Message Service (SMS), Personal Information Management (PIM), games, voice, etc. in order to become the mobile terminal companion of choice for the consumer.
Image enabled terminals that provide mobility to the user have created a strong emotional attachment among users because the mobile imaging terminals provide a mechanism whereby users are able to share life's experiences through rich communication. Person to person multimedia communications, with self-created content, represents a natural progression of today's communication paradigm and comes one step closer to virtual presence.
As the progression to virtual presence continues, the number of value added applications provided by the mobile imaging terminals grows. Challenges presented by the implementation of those value added applications, however, also continues to grow. For example, although the image size and quality presented by the mobile imaging terminal is enhanced, the mobile imaging terminal can not present the same amount or variety of information as can be provided to a user of a Personal Computer (PC). As such, the screen size of the mobile terminal creates a challenge to content providers because the displays have less graphical area available to them for display of their content. By increasing the resolution of the display, more information may be projected onto the display, but the readability of the content decreases. Usage of page sizes that are larger than the display size of the mobile terminal is also an option, but navigation around the oversized page requires increased scrolling interaction by the user.
Other applications may take the mobile terminal out of its traditional role by exploiting the data processing capabilities of the mobile terminal to enhance the performance of terminals in proximity to the mobile terminal. Video conferencing, for example, is an activity that has typically been served through fixed media devices. The fixed media devices include audio and video signal processing terminals along with corresponding PC based conference software to enable a virtual meeting between parties that are physically separated. Each party to the video conference, however, must provide his own set of conference supporting media, such as a PC mounted camera or Integrated Services Digital Network (ISDN) video conferencing equipment, in order to project his own personal image and any other audio/video content required to support the video conference. In some cases, however, the particular PC in use by one of the parties to the video conference is not equipped with a camera or Web Cam and is thus deprived of the opportunity to provide video input to the video conference.
Accordingly, there is a need in the communications industry for a method and apparatus that exploits the capabilities of mobile terminals to increase the number of value added services that may be facilitated through their use. In particular, the capabilities of image enabled mobile terminals need to be exploited in order to couple those capabilities to existing network infrastructure via proximity connections to enable the value added services.
Additionally, the coupling mechanism used to exploit the capabilities of the mobile terminals needs to be adapted to enhance the services provided to a user of the mobile terminal. In such an adaptation, the user's total experience in using the particular application may be enhanced through the use of other terminals/applications in proximity to the mobile terminal.
SUMMARY OF THE INVENTION
To overcome limitations in the prior art, and to overcome other limitations that will become apparent upon reading and understanding the present specification, the present invention discloses a system and method for providing exploitation of the capabilities of mobile terminals to support and enhance their video functions.
In accordance with one embodiment of the invention, a method is provided of exploiting video facilities of a mobile terminal. The method comprises generating video content using the mobile terminal, establishing a proximity connection between the mobile terminal and a video processing platform, and transferring the video content from the mobile terminal to the video processing platform using the proximity connection. In accordance with another embodiment of the invention, an image processing system is provided. The image processing system comprises an image enabled mobile terminal arranged to generate content and an image processing platform arranged to receive the content. The image enabled mobile terminal is coupled to the image processing platform via a proximity connection to transfer the content.
In accordance with another embodiment of the invention, a mobile terminal wirelessly coupled to a network which includes a network element capable of processing content from the mobile terminal is provided. The mobile terminal comprises a memory capable of storing at least one of a protocol module and an imaging module, an image component configured by the imaging module to generate digital images, a processor coupled to the memory and configured by the protocol module to enable digital image exchange with the network element, and a transceiver configured to facilitate the digital image exchange with the network element.
In accordance with another embodiment of the invention, a computer-readable medium having instructions stored thereon which are executable by a mobile terminal for exchanging video content with a video processing platform is provided. The instructions perform steps comprising generating digital images using imaging equipment internal to the mobile terminal, establishing a proximity connection with the video processing platform, and transmitting the digital images to the video processing platform with the established proximity connection.
In accordance with another embodiment of the invention, a digital image processor proximately coupled to an image enabled mobile terminal is provided. The digital image processor comprises various arrangements for establishing the proximate connection to the image enabled mobile terminal, receiving video content from the image enabled mobile terminal via the proximate connection, and utilizing the received video content.
In accordance with another embodiment of the invention, a computer-readable medium having instructions stored thereon which are executable by a video processing platform is provided. The instructions perform steps comprising establishing a proximate connection with an image capable mobile terminal, receiving images from the image capable mobile terminal via the proximate connection, and utilizing the received images. These and various other advantages and features of novelty which characterize the invention are pointed out with greater particularity in the claims annexed hereto and form a part hereof. However, for a better understanding of the invention, its advantages, and the objects obtained by its use, reference should be made to the drawings which form a further part hereof, and to accompanying descriptive matter, in which there are illustrated and described specific examples of a system and method in accordance with the invention.
BRIEF DESCRIPTION OF THE DRAWINGS
The invention is described in connection with the embodiments illustrated in the following diagrams.
FIG. 1 illustrates a block diagram according to the principles of the present invention;
FIG. 2 illustrates a block diagram of key components of content capture;
FIG. 3 illustrates a Bluetooth stack hierarchy;
FIG. 4 illustrates a generic communication architecture according to the present invention;
FIG. 5 illustrates an exemplary video conferencing scenario enabled by the principles of the present invention;
FIG. 6 illustrates a flow diagram of an exemplary setup required for a proximity connection transfer according to the present invention;
FIG. 7 illustrates a representative mobile computing arrangement suitable for providing imaging data in accordance with the present invention; and
FIG. 8 is a representative computing system capable of carrying out image processing functions according to the present invention.
DETAILED DESCRIPTION OF THE INVENTION
In the following description of the exemplary embodiment, reference is made to the accompanying drawings which form a part hereof, and in which is shown by way of illustration various embodiments in which the invention may be practiced. It is to be understood that other embodiments may be utilized, as structural and operational changes may be made without departing from the scope of the present invention.
Generally, the present invention is directed to a method and apparatus to facilitate a proximity connection between a mobile terminal and a PC, or other image processing platform, in order to exploit the functional capabilities of the mobile terminal. Various applications and embodiments in accordance with the present invention are presented that extract outbound video data, for example, from an image enabled mobile terminal to facilitate video conferencing. In such an instance, a proximity connection between the mobile terminal and an auxiliary device, such as a PC or other equivalent terminal, is established to enable the extraction of the video content from the mobile terminal. The video content may be either self-generated by the mobile terminal itself, e.g., generated by a camera built into the mobile terminal, or may be generated by an external device, e.g., pre-recorded video content from a Digital Versatile Disk (DVD) or equivalent storage device, and routed through the mobile terminal for proximity transport to the auxiliary device.
The present invention also supports the enhancement of in-bound video data intended to be rendered on the mobile terminal's display. The video data may be enhanced by, for example, allowing any video content received by the mobile terminal to be displayed by an auxiliary video display device, thus bypassing the normal video display of the mobile terminal. In such an instance, the user of any terminal, e.g., land, mobile or otherwise, that incorporates features in accordance with the present invention may enhance his viewing pleasure by facilitating an enhanced rendition of the received video content, e.g., larger size, greater resolution, increased color palette, through the use of an auxiliary video display device.
FIG. 1 is a high level block diagram illustrating the principles of the present invention. In general, mobile terminal 102 is arranged to transfer data to hardware platform 106 via path 118 and is arranged to receive acknowledgment of the received data via path 120. The nature of the data transfer may be of any type and rate that is supported by proximity connection 104, mobile terminal 102 and hardware platform 106. One of ordinary skill in the art will recognize that any data type may be supported by such an arrangement. The data, for example, may be synchronization data that is transferred by mobile terminal 102 to hardware platform 106, e.g., a Personal Computer (PC), in order to obtain a common data store between the two devices via a data synchronization standard such as SyncML. The synchronization data may support such activities as calendar synchronization, contact synchronization, to-do lists, etc. as required between the mobile terminal 102 and PC 106 to provide such a common data store. SyncML may also support data types such as images, files and database objects. It should also be noted that data transferred from hardware platform 106 may also be received by mobile terminal 102.
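The common data store described above can be sketched with a simple newest-wins merge of two record stores. This is only an illustration of the reconciliation idea, not the SyncML protocol itself; the record keys and timestamps are invented for the example.

```python
# Illustrative sketch (not the SyncML protocol): reconcile two record
# stores into a common data store by keeping the most recently modified
# copy of each record, as a two-way terminal/PC sync might.

def sync(store_a, store_b):
    """Merge two {key: (timestamp, value)} stores; the newest entry wins."""
    merged = {}
    for store in (store_a, store_b):
        for key, (ts, value) in store.items():
            if key not in merged or ts > merged[key][0]:
                merged[key] = (ts, value)
    return merged

# Hypothetical calendar/contact records on each side of the link.
terminal = {"contact:1": (100, "Alice"), "todo:1": (250, "buy milk")}
pc       = {"contact:1": (200, "Alice B."), "cal:1": (150, "meeting")}

common = sync(terminal, pc)
```

After the merge, both devices would hold the same `common` store, with the later edit of each record retained.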
In such an instance, data flow path between hardware platform 106 and mobile terminal 102 is facilitated through path 120, while acknowledgment of the data receipt is provided by path 118. For purposes of exemplifying the present invention, block diagram 100 is discussed in terms of a content transport mechanism between mobile terminal 102 and hardware platform 106, whereby proximity connection 104 is utilized as the communication conduit between the two devices. Proximity connection 104 may represent a wired and/or a wireless connection. Wired implementations of proximity connection 104 may include single ended data transmission formats such as those specified by the RS232 or RS432 standards, or may include differential data transmission formats such as those specified by the RS422 or RS485 standards. Other wired implementations for higher bandwidth considerations may use the Universal Serial Bus (USB), or Fire Wire, specifications for example. Wireless implementations of proximity connection 104 may include Wireless Local Area Network (WLAN), Bluetooth, Infrared, etc. as required by the particular application.
In operation, mobile terminal 102 may be an image enabled device having content capture/receipt capability 108. Content capture/receipt 108 may provide both audio and video data, whereby the images may be presented in still and/or video mode. In still mode, only a single image is transferred via path 110 to First-In First-Out (FIFO) buffer 114, where acknowledgement of the content receipt is generated via path 112. In video mode, multiple images arranged in back to back frame sequence are transferred to FIFO buffer 114 at a rate of, for example, 10-20 Frames per Second (FPS). FIFO buffer 114 buffers the content blocks, while content delivery/receipt 116 prepares for their subsequent transfer to hardware platform 106 via path 118 through proximity connection 104. Path 120 is used by content receipt/delivery 122 to acknowledge receipt of the images from content delivery 116 via proximity connection 104. Buffer and synchronization block 124 is used to provide the proper frame alignment and playback speed as required by presentation 126. Presentation 126 represents any Application Programming Interface (API) that is executing on hardware platform 106 including image processing software in support of video conferencing, photo identification card generation, photo identification security, etc.
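The FIFO buffering between content capture and content delivery can be sketched as a small producer/consumer queue: frames are pushed at the capture rate and drained for transfer, with acknowledgment on receipt. The class name, the drop-oldest overrun policy, and the capacity are illustrative assumptions, not the claimed implementation.

```python
# Minimal sketch of FIFO buffer 114: captured frames are queued on one
# side (path 110) and drained by content delivery on the other (path 118).
from collections import deque

class FrameFIFO:
    def __init__(self, capacity=10):
        self.frames = deque()
        self.capacity = capacity

    def push(self, frame):
        """Buffer one captured frame; drop the oldest if the FIFO is full."""
        if len(self.frames) == self.capacity:
            self.frames.popleft()          # overwrite oldest on overrun
        self.frames.append(frame)
        return True                        # acknowledge receipt (cf. path 112)

    def pop(self):
        """Hand the oldest buffered frame to content delivery, if any."""
        return self.frames.popleft() if self.frames else None

fifo = FrameFIFO(capacity=3)
for n in range(5):                         # capture 5 frames into a 3-slot FIFO
    fifo.push(f"frame-{n}")
```

With a 3-slot FIFO and 5 captured frames, the two oldest frames are overwritten and frames 2 through 4 remain for delivery.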
The images transferred via proximity path 104 may be formatted in any one of a number of video formats to include Moving Pictures Expert Group (MPEG), MPEG version 4 (MPEG-4), Joint Photographic Experts Group (JPEG), to name only a few. In addition, vector graphic files may be transmitted where creation of digital images is facilitated through a sequence of commands or mathematical statements that place lines and shapes in a given two-dimensional or three-dimensional space. In vector graphics, the file that results from a graphic artist's work is created and saved as a sequence of vector statements. For example, instead of containing a bit in the file for each bit of a line drawing, a vector graphic file describes a series of points to be connected. Alternatively, the vector graphics file may be converted to a raster graphics image by content delivery/receipt 116 prior to transmission, so as to increase portability between systems. Exemplary vector graphic files may be created, for example, by using Adobe Illustrator and CorelDraw. Additionally, animation images are also usually created as vector files, using content creation products such as Shockwave's Flash that allows creation of 2-D and 3-D animations that may be sent to content receipt/delivery 122 as a vector file and then rasterized "on the fly" as they arrive by presentation 126.
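The contrast drawn above between per-bit raster files and command-based vector files can be shown with a toy rasterizer: a short list of drawing statements is replayed onto a pixel grid. The command format is invented for illustration and is not any real vector graphics format.

```python
# Hedged sketch of vector-to-raster conversion: a vector file is a list
# of drawing commands rather than a bit per pixel; rasterizing replays
# the commands onto a pixel grid.

def rasterize(commands, width, height):
    grid = [[0] * width for _ in range(height)]
    for cmd in commands:
        if cmd[0] == "hline":              # ("hline", y, x0, x1)
            _, y, x0, x1 = cmd
            for x in range(x0, x1 + 1):
                grid[y][x] = 1
    return grid

# Two vector statements describe a shape that would need width*height
# bits in raw raster form.
image = rasterize([("hline", 0, 0, 3), ("hline", 2, 1, 2)], width=4, height=3)
```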
Content capture/receipt 108 may produce a video sequence consisting of a series of still images for ultimate buffering into FIFO buffer 114. Content capture/receipt 108 may also provide audio capture capability, depending upon the capabilities/feature selection represented by mobile terminal 102. Additionally, audio and video data streams may be received from FIFO buffer 114 by content capture/receipt 108 for subsequent display by mobile terminal 102. The audio and video data streams may have previously been received from an external device (not shown), from hardware platform 106, or from an attachment contained within an MMS message, for example. FIG. 2 illustrates an exemplary block diagram of some of the key elements provided by content capture 108 as they relate to the principles of the present invention. Video block 202 may provide a single image or a series of images to video encoder 206. In the case that a series of images are provided, video encoder 206 implements video compression methods that exploit redundant and perceptually irrelevant parts of the video series. The redundancy can be categorized into spatial, temporal, and spectral components; where spatial redundancy relates to correlation between neighboring pixels; temporal redundancy relates to objects likely to appear in present frames that were there in past frames; and spectral redundancy addresses the correlation between the different color components of the same image. Video encoder 206 achieves video compression by generating motion compensation data, which describes the motion between the current and previous image. Video encoder 206 may seek to establish a constant bit rate for data stream 220, in which case video encoder 206 controls the frame rate as well as the quality of images.
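The constant-bit-rate control mentioned above, where the encoder trades off frame rate and image quality against the target rate, can be sketched as a simple feedback rule. The step sizes, bounds, and ordering (quality first, then frame rate) are arbitrary assumptions for illustration.

```python
# Illustrative sketch of constant-bit-rate control in video encoder 206:
# over budget -> coarser quantization (lower quality), then lower frame
# rate; under budget -> recover quality.

def adjust(quality, fps, produced_bps, target_bps):
    """Return (quality, fps) nudged toward the target bit rate."""
    if produced_bps > target_bps:
        if quality > 1:
            quality -= 1                   # coarser quantization first
        elif fps > 10:
            fps -= 1                       # then drop the frame rate
    elif produced_bps < target_bps and quality < 10:
        quality += 1
    return quality, fps

q, fps = 8, 20
q, fps = adjust(q, fps, produced_bps=80_000, target_bps=64_000)  # over budget
```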
Video encoder 206 may implement a video COder/DECoder (CODEC) algorithm defined by ITU-T H.263, which is an established CODEC scheme used in various multimedia services. H.263 provides a wide toolbox of various encoding tools and coding complexities for different purposes. The tools to be used and the allowed complexity of the mode are defined in CODEC profiles and levels; Profile 0, Level 10, also known as the H.263 baseline, has been defined as a mandatory video CODEC. Video encoder 206 may also support decoding of video bit-stream content conforming to MPEG-4 Visual Simple Profile, Level 0. Other proprietary video coding formats, such as RealVideo 7 and RealVideo 8, may be used that are recognized by the RealOne Player utility.
Audio block 204 represents an audio generation function of mobile terminal 102, such as a microphone, that provides a series of amplitude waveforms over a period of time to audio encoder 208. The amplitude waveforms are digitized prior to delivery to audio encoder, where the sampling rate imposed is dependent upon the nature of the sound to be digitized. Music, for example, requires a 44.1 Kilohertz (KHz) sampling rate in order to provide high quality. Speech, on the other hand, may be adequately sampled at an 8 KHz sampling rate.
Audio encoder 208 compresses the digitized data received from audio block 204 using a number of different compression algorithms. One simple coding method uses an adaptive step size to quantize audio samples. Such a technique is used by the Interactive Multimedia Association (IMA) Adaptive Differential Pulse Code Modulation (ADPCM) audio coding standard that reserves 4 bits per sample. Consequently, if the sampling rate is set to 8 KHz, IMA ADPCM coded audio requires a 32 kbps bit stream to be transferred by path 222. Other simple speech coding methods include the A-Law and μ-Law coding algorithms, which use a logarithmic quantization step size and reserve 8 bits per sample. One of the most advanced speech coding standards used by audio encoder 208 is the Adaptive Multi-Rate (AMR) speech codec, which includes eight speech coding modes, whose bit rates range from 4.75 to 12.2 kilobits per second (kbps). AMR implements a limited audio bandwidth of 3.5 KHz and has been proven to work satisfactorily with news and sports coverage as well as light, popular music.
File composer 210 receives video encoded data stream 220 and audio encoded data stream 222 and composes data file 212 from the respective data streams. It should be noted that the audio portion of content capture 200 is optional, depending upon the capabilities/feature selection currently implemented by mobile terminal 102. In the event that a single image is to be transmitted via MMS, for example, the audio portion of the content captured by content capture 108 of FIG. 1 may be omitted by a feature that is selected locally within mobile terminal 102.
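The bit-rate figures above follow directly from sampling rate times bits per sample. The short sketch below just reproduces that arithmetic for the coding schemes named in the text (IMA ADPCM at 4 bits/sample, A-Law/μ-Law at 8 bits/sample); the 16-bit music example is an added illustration, not a figure from the text.

```python
# Bit rate = sampling rate (Hz) x bits per sample, per channel.

def bitrate_bps(sample_rate_hz, bits_per_sample):
    return sample_rate_hz * bits_per_sample

ima_adpcm = bitrate_bps(8_000, 4)     # 32 kbps, as stated for IMA ADPCM
a_law     = bitrate_bps(8_000, 8)     # 64 kbps for logarithmic 8-bit coding
music_pcm = bitrate_bps(44_100, 16)   # 16-bit sampling at the music rate
```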
File composer 210 groups video data stream 220 and, optionally, audio data stream 222, into file format 212. Once formatted, file format 212 may be processed locally within mobile terminal 102, streamed over transport channel 118 via proximity connection 104 to hardware platform 106, or dispatched in any number of other formats and protocols as permitted by the capabilities of mobile terminal 102. Some exemplary file formats provided by file composer 210 may include Microsoft Audio-Video Interleaved (AVI), Apple QuickTime file format (.mov), MPEG-1 file format (.mpg), 3GPP file format (.3gp), and MP4 file format (.mp4), etc. Header 214 provides specific information about the file format such as video coding type, audio coding type, length of file, file identifier, etc. Video bit stream 216 and audio bit stream 218 contain the respective video bit stream 220 and audio bit stream 222 as received and formatted by file composer 210.
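The composer/header arrangement above can be sketched as a toy container: a header records the coding types and stream lengths, followed by the video and audio bit streams. The field layout is invented for illustration and is not the AVI, QuickTime, 3GPP, or MP4 format.

```python
# Toy sketch of file composer 210: header 214 carries coding types and
# stream lengths; the video and audio bit streams follow.

def compose(video_bytes, audio_bytes, video_codec="H263", audio_codec="AMR"):
    header = "|".join([
        video_codec,
        audio_codec,
        str(len(video_bytes)),
        str(len(audio_bytes)),
    ]).encode()
    return header + b"\n" + video_bytes + audio_bytes

f = compose(b"VVVV", b"AA")
header, payload = f.split(b"\n", 1)
```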
Proximity connection 104 of FIG. 1 provides the conduit for data transfer between mobile terminal 102 and hardware platform 106. For reasons of exemplification, proximity connection 104 is described in terms of the Bluetooth standard for localized data transfer. Bluetooth technology is an industry standard for short-range wireless voice and data communications, allowing a single air interface to support local communications for distances of up to 10-20 meters.
Mobile terminal 102 of FIG. 1 may be implemented using a Series 60 Platform, for example, that is built upon the Symbian Operating System (OS) General Technology (GT). Symbian GT provides a fully object-oriented design, preemptive multitasking, and full support for client-server architecture. Symbian GT also provides the common core for API and technology, which is shared between all Symbian reference designs. Some of the major components supported by Symbian GT include a multimedia server for audio recording, playback, and image-related functionality, as well as a Personal Area Network (PAN) communication stack including infrared, Bluetooth and serial communications support. As such, Symbian GT allows the use of Bluetooth technology for proximity, wireless operations that utilize local service accessories. The number and type of local service accessories provided by the Bluetooth connection are virtually unlimited and include, for example, bar code readers, digital pens, health monitoring devices, Global Positioning System (GPS) receivers, enhanced video feeds, and video conferencing facilitation.
Like many other communication technologies, Bluetooth is composed of a hierarchy of components that is exemplified in Bluetooth stack hierarchy 300 shown in FIG. 3. The Bluetooth communication stack may be broken into two main components. The first component, Bluetooth Host Controller (BTHC) 312, provides the lower level of the stack. BTHC 312 is generally implemented in hardware and allows the upper level stack, Bluetooth Host (BTH) 302, to send or receive data over a Bluetooth link and to configure the Bluetooth link. Configuration and data transfer between BTHC 312 and BTH 302 takes place via path 322, which connects Host Controller Interface (HCI) driver 310 with HCI firmware module 314.
Bluetooth operates in the 2.4 gigahertz (GHz) Industrial, Scientific, and Medical (ISM) band. It uses a fast frequency hopping scheme with 79 frequency channels, each being 1 MHz wide. Bluetooth Radio (BTR) 320 is designed to provide a low-cost, 64 kbps, full-duplex connection that exhibits low power consumption. Power consumption on the order of 10-30 milliamps (mA) is typical, where even lower power consumption exists during idle periods. Baseband link controller (LC) 318 defines different packet types to be used for both synchronous and asynchronous transmission. Packet types supporting different error handling techniques, e.g., error correction/detection, and encryption, are also defined within LC 318. LC 318 also mitigates any Direct Current (DC) offsets provided by BTR 320 due to special payload characteristics. Link Manager Protocol (LMP) 316 is responsible for controlling the connections of a device, like connection establishment, link detachment, security management, e.g., authentication, encryption, and power management of various low power modes.
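The 79-channel arrangement above maps channel k to 2402 + k MHz for k = 0..78 in the 2.4 GHz ISM band. The sketch below reproduces that channel arithmetic; the hop-selection function is only a stand-in placeholder, not the actual Bluetooth hop-sequence algorithm.

```python
# Channel arithmetic for the 79 x 1 MHz Bluetooth channels in the
# 2.4 GHz ISM band: channel k occupies 2402 + k MHz, k = 0..78.

BASE_MHZ, CHANNELS = 2402, 79

def channel_freq_mhz(k):
    assert 0 <= k < CHANNELS
    return BASE_MHZ + k

def next_channel(k, clock):
    # Illustrative placeholder only; the real hop sequence is derived
    # from the master's address and clock.
    return (k + clock) % CHANNELS

lowest  = channel_freq_mhz(0)
highest = channel_freq_mhz(78)
```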
BTH 302 illustrates the upper level of a Bluetooth stack and is comprised primarily of software applications 304-310, and 326. HCI driver 310 packages the high level components that communicate with the lower level hardware components found in BTHC 312. Logical Link Control and Adaptation Protocol (L2CAP) 308 allows finer grain control of the radio link. For example, L2CAP 308 controls how multiple users of the link are multiplexed together, controls packet segmentation and reassembly, and conveys quality of service information.
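The packet segmentation and reassembly handled by L2CAP 308 can be sketched as splitting an upper-layer packet into baseband-sized segments and stitching them back together on the far side. The 27-byte segment size is an illustrative assumption, not a figure from the text.

```python
# Sketch of L2CAP-style segmentation and reassembly: an upper-layer
# payload is cut into fixed-size segments for the baseband and rejoined
# at the receiver.

def segment(payload, max_seg=27):
    return [payload[i:i + max_seg] for i in range(0, len(payload), max_seg)]

def reassemble(segments):
    return b"".join(segments)

packet = bytes(range(70))
segs = segment(packet)
```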
Service Discovery Protocol (SDP) 304 and Radio Frequency Communication (RFCOMM) protocol 306 represent middleware protocols of the Bluetooth stack. RFCOMM protocol 306 allows applications communicating with Bluetooth stack 300 to treat a Bluetooth enabled device as if it were a serial communications device, in order to support legacy protocols. RFCOMM protocol 306 defines a virtual set of serial port applications, which allows RFCOMM protocol 306 to replace cable enabled communications. The definition of RFCOMM protocol 306 incorporates major parts of the European Telecommunication Standards Institute (ETSI) TS 07.10 standard, which defines multiplexed serial communication over a single serial link. Service Discovery Protocol (SDP) 304 is used to locate and describe services provided by or available through another Bluetooth device. SDP 304 plays an important role in managing Bluetooth devices in a Bluetooth environment by allowing discovery and service description of services offered within the environment. Audio block 326 represents another middleware component of stack 300 that allows Bluetooth to offer audio and telephony support. The audio portion of Bluetooth data may be transferred directly from LC 318 to audio block 326 via path 324, thereby bypassing the LMP 316, HCI 310 and 314, and the L2CAP 308 layers. The Bluetooth communication stack of FIG. 3 represents the lower communication layers that support any number of higher level application embodiments according to the present invention. Returning to FIG. 1, for example, mobile terminal 102 and hardware platform 106 may each employ Bluetooth communication stack 300, in order to facilitate image and voice data transfer, whereby presentation software and camera APIs are implemented as necessary for image generation and display.
FIG. 4 represents generic communication architecture 400 according to the principles of the present invention, where the BTHC layers, e.g., 412 and 422, and the BTH layers, e.g., 414 and 424, represent the Bluetooth communication stack illustrated in FIG. 3. Mobile terminal 404 represents an image enabled mobile terminal that is capable of generating data streams such as those described in relation to FIG. 2. Camera HW 416 and camera API 418 combine to generate video bit stream 216 of data stream 212, while required terminal software 420 and related hardware (not shown) establishes the associated audio bit stream 218 of data stream 212. Data path 426 represents, for example, proximity connection 104 of FIG. 1 that exists between mobile terminal 404 and, for example, PC 402. Data path 426 transfers the video/audio data streams generated by mobile terminal 404 and provides them to their corresponding peer entity within PC 402. For example, camera API 408 and camera API 418 are peer entities, where camera API 418 ultimately communicates through data path 426 to its corresponding camera API entity 408, so that images captured by camera HW 416 may ultimately be displayed by presentation block 406. Required PC software 410 receives the captured images from BTHC 412 and provides a synchronized data stream as required to camera API 408 for proper delivery to presentation block 406. Communication between Bluetooth stacks 412-414 and 422-424 is facilitated through the use of sockets, which are similar to those used by a Transmission Control Protocol/Internet Protocol (TCP/IP) connection. In Symbian OS, Bluetooth sockets are used to discover other Bluetooth devices, and to read and write data over a Bluetooth radio interface. The Bluetooth sockets API supports communication over both the L2CAP 308 and RFCOMM 306 layers of Bluetooth stack 300.
Not only does the socket API allow a client to make a connection to a remote device, the socket API also allows the remote device to contact the client for data transfer. The Bluetooth socket API has five key concepts: socket address, remote device inquiry, RFCOMM commands, L2CAP commands, and HCI commands. Structures and APIs provided by the Symbian OS Bluetooth implementation are defined in the Series 60 Software Development Kit (SDK) under the following header files: <bt_sock.h>, <btdevice.h>, <btextnotifiers.h>, <btsdp.h>, and <bttypes.h>. Prior to socket connection, however, service discovery must be performed in order to identify potential Bluetooth enabled devices for subsequent connection. SDP 304 performs this task through two main functions: discovery of devices and services within the local area, and the advertisement of services from the local device. If, for example, a Bluetooth enabled device can provide locally generated image data streams, then that service is made visible through SDP 304 to other Bluetooth enabled devices that may be interested in that functionality.
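The discovery step described above can be sketched without the Symbian SDP API: each nearby device advertises service records keyed by UUID, and a client scans for one offering the wanted service before connecting. The device names and the 0xF00D image-stream UUID are hypothetical; 0x1101 (Serial Port) and 0x1108 (Headset) are standard Bluetooth service class UUIDs.

```python
# Hedged sketch of SDP-style discovery: scan advertised service records
# of nearby devices for a wanted service UUID.

def discover(devices, wanted_uuid):
    """Return (device_name, record) for the first device advertising the
    wanted service UUID, or None if no device offers it."""
    for name, records in devices.items():
        if wanted_uuid in records:
            return name, records[wanted_uuid]
    return None

nearby = {
    "headset-01": {"0x1108": {"name": "Headset"}},
    "phone-7650": {"0x1101": {"name": "Serial Port"},
                   "0xF00D": {"name": "Image Stream",          # hypothetical
                              "modes": ["still", "video"]}},
}

match = discover(nearby, "0xF00D")
```

Here the client would proceed to open a socket to `phone-7650`, the device advertising the image-streaming service.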
In order for a Bluetooth service to be advertised, it must first be represented by a service record and kept within an SDP database for access by other applications. The SDP database is implemented as a server within the Symbian OS and, as such, other applications wishing to discover the services offered must first establish a connection to the server and open a session on the server. The RSdp class within the Symbian OS API represents the SDP database server and allows an application to connect to it.
A service record in Symbian OS is created through the SDP database by managing a collection of service handles and their associated attributes that make up the service record. Each service record is identified by a Universally Unique Identifier
(UUID), which is defined within the <bttypes.h> header file. Within each service record exists a service class and associated profile that are used to help generalize the types of service provided by the device. There are, for example, predefined service class numbers that may represent a Bluetooth enabled mobile terminal and a more specific entry to define that the Bluetooth enabled mobile terminal also has image capability that may support either still frame or streamed video applications. In general, therefore, the service record contains a collection of attributes that are identified by an identification number of the TSdpAttributeID data type defined within the <btsdp.h> header file. Each service handle and the associated attributes are used by the SDP database to identify attributes and their values within the database.
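A service record of this shape — a UUID plus attributes keyed by numeric IDs — can be sketched as follows. This is illustrative Python; the attribute ID values follow common SDP conventions but, like the class and service names, are assumptions rather than values given in the specification.

```python
# Sketch of a service record keyed by numeric attribute IDs, in the
# spirit of TSdpAttributeID from <btsdp.h>.  ID values and names below
# are illustrative assumptions.

ATTR_SERVICE_CLASS = 0x0001
ATTR_PROTOCOL_DESC = 0x0004
ATTR_SERVICE_NAME  = 0x0100

def make_service_record(uuid, service_class, protocol, name):
    """Build a service record: a UUID plus an attribute-ID -> value map."""
    return {
        "uuid": uuid,
        "attributes": {
            ATTR_SERVICE_CLASS: service_class,
            ATTR_PROTOCOL_DESC: protocol,
            ATTR_SERVICE_NAME:  name,
        },
    }

record = make_service_record(
    uuid=0x1234,                                # hypothetical record UUID
    service_class="imaging-mobile-terminal",    # hypothetical class entry
    protocol="RFCOMM",
    name="Streamed video service",
)
print(record["attributes"][ATTR_SERVICE_NAME])  # -> Streamed video service
```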
The Symbian OS API provides SDP 304 with service search patterns and attribute search patterns that are used to facilitate the device and service discovery process. The service search pattern, for example, allows SDP 304 to discover and create a list of all available services within the local area, where all services discovered in the local area are services that are advertised by their own SDP agent and identified by their respective service record UUIDs. The attribute search pattern allows the creation of a list of attribute IDs from a remote SDP database. Additionally, the attribute search pattern allows the searching device to create an attribute range that defines a list of attributes that are of interest to the searching device. Accordingly, attribute queries return only those attributes of the remote Bluetooth enabled devices that fall within the attribute range specified in the attribute search pattern. Once a device with a suitable service has been identified, a client application generally queries the service for more information, which may include requesting the available attributes of the identified service. There are generally two ways that a client may search for devices. First, a client application may manually issue a query to all devices within a range and handle each response in turn. Alternatively, the client may use the Bluetooth Device Selection User Interface (DSUI), which automatically issues queries, handles the responses, and prompts the user with a dialog box. The dialog box enables the user to select the device that he wishes to use. The UI operates in conjunction with the RNotifier class supplied within the Symbian OS API.
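The attribute-range behavior described above — only attributes whose IDs fall inside the requested range are returned — can be sketched in a few lines. This is illustrative Python, not Symbian C++, and the attribute IDs and values are hypothetical.

```python
# Sketch of an attribute search pattern with an attribute range: only
# attribute IDs lying inside the requested range are returned by the
# remote SDP query, as described above.

def attribute_query(record_attributes, low_id, high_id):
    """Return only the attributes whose IDs lie in [low_id, high_id]."""
    return {attr_id: value
            for attr_id, value in record_attributes.items()
            if low_id <= attr_id <= high_id}

# Hypothetical attributes held by a remote device's SDP database:
remote_attrs = {0x0001: "service-class", 0x0004: "RFCOMM", 0x0100: "VideoFeed"}

# Ask only for IDs 0x0000-0x00FF; the name attribute (0x0100) is excluded.
subset = attribute_query(remote_attrs, 0x0000, 0x00FF)
print(sorted(subset))     # -> [1, 4]
```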
In order to exemplify a process by which a user of a Bluetooth enabled mobile device may discover and use available devices and services, a user scenario is discussed in relation to FIG. 5 that is in accordance with the principles of the present invention. FIG. 5 represents video conferencing scenario 500, whereby the parties of meeting group 502 wish to participate in the presentation offered by presenter 514. Meeting group 502 and presenter 514 are spatially removed from one another, such as may be the case when a corporation has a number of production and engineering facilities that are geographically located across the globe from one another. In a particular case, for example, meeting group 502 may represent a group of lower level production management personnel located within the United States, who have assembled to receive and discuss the ideas presented by senior production manager 514 located at the corporation's headquarters in Finland.
In such an example, meeting group 502 and presenter 514 are not equipped with standard video conferencing equipment, but are equipped with imaging capable mobile terminals 504 and 512. In addition, image processing capable PCs 506 and 510 are provided locally to meeting group 502 and presenter 514, respectively. PCs 506 and 510, and mobile terminals 504 and 512, are also equipped with proximity connection capability to facilitate communication via links 516 and 522. In one embodiment of the present invention, proximity links 516 and 522 represent Bluetooth communication links, where the communication architecture 400 of FIG. 4 is utilized. Specifically, PCs 506 and 510 may be represented by architecture 402 and mobile terminals 504 and 512 may be represented by architecture 404, where Bluetooth communications stacks 412-414 and 422-424 are arranged to support Bluetooth device and service discovery and subsequent usage of the Bluetooth devices and services.
PCs 506 and 510 are interconnected through the Internet 508 via, for example, Local Area Network (LAN) or Wide Area Network (WAN) connections 518 and 520. Each of PCs 506 and 510 is equipped with, for example, conferencing software such as NetMeeting or Timbuktu Pro that allows audio/video data to be exchanged between them in order to create a virtual meeting between meeting group 502 and presenter 514. Since PCs 506 and 510 are not equipped with their own video capturing devices, imaging enabled mobile terminals 504 and 512 are used instead.
In order to ultimately create the virtual meeting, mobile terminals 504 and 512 must first discover any video conferencing support services that may be offered in the local area. A user of mobile terminal 504, for example, may invoke service discovery through the Bluetooth DSUI executing within mobile terminal 504. The DSUI is provided by the RNotifier class of the Symbian OS, which is defined by the <e32std.h> header file. The RNotifier class is designed to be used with most types of client applications, where a background thread provides the user selection dialog box that is presented to the user via the display of mobile terminal 504. The Bluetooth Device Selection dialog box is represented by the KDeviceSelectionNotifierUid constant value defined in the <btextnotifiers.h> header file.
Prior to presenting the user with a device selection dialog, the user has the option of limiting the number of devices that respond to the service discovery query by using the SetDeviceClass function. In so doing, one of the users of meeting group 502 may have defined a specific class of video conference support devices to be used that includes PC 506. Once discovered, the DSUI of mobile terminal 504 allows the user to select PC 506 as the device to be used for video conferencing support. Similarly, the user of mobile terminal 512 may select PC 510 to support video conferencing on his end of the virtual meeting link.
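The device-class filtering described above — narrowing the set of responding devices before the user is prompted — can be sketched as a simple predicate over discovery results. This is illustrative Python, analogous in spirit to the SetDeviceClass filter; the device entries and class names are hypothetical.

```python
# Illustrative sketch of narrowing discovery responses by device class
# before presenting the selection dialog.  Device entries and class
# names are hypothetical stand-ins.

devices = [
    {"name": "PC 506",          "device_class": "video-conference-support"},
    {"name": "Printer",         "device_class": "imaging-printer"},
    {"name": "Mobile terminal", "device_class": "phone"},
]

def filter_by_class(found, wanted_class):
    """Keep only the devices matching the class set before the inquiry."""
    return [d for d in found if d["device_class"] == wanted_class]

candidates = filter_by_class(devices, "video-conference-support")
print([d["name"] for d in candidates])   # -> ['PC 506']
```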
Actual data transfer between mobile terminal 504 and PC 506 and mobile terminal 512 and PC 510 may be implemented through the use of either the RFCOMM 306 or L2CAP 308 protocol layers as illustrated in FIG. 3, where access to the RFCOMM transmission protocol is provided by Symbian OS socket architecture. Each of mobile terminals 504 and 512, as well as PCs 506 and 510, may create a set of transmit and receive sockets. Both types of sockets are required by the scenario depicted in FIG. 5 because meeting group 502 and presenter 514 each require a view of the other on their respective PC terminals.
The flow diagram of FIG. 6 illustrates, for example, the steps required for RFCOMM communication enabled through the Bluetooth stack. In step 602, the StartL function of the CMessageServer class performs the work of starting up the RFCOMM service. Initially, StartL initiates the Bluetooth sockets protocol and initializes the
RFCOMM port. Next, opening the server socket involves first connecting to the socket server and then opening a connection to the server using an RSocket object, where the RFCOMM protocol is specified to be the requested transport type.
The next task is to query the protocol for an available channel as in step 604. Once the available channel has been returned, a port is assigned to the available channel and the RSocket object is then bound to the port as in step 606. In the case that the RSocket object is to be a listening object, it is set up to listen for any available data present on the port to which it has been bound. Any security features that may be required, such as authentication, authorization, and encryption, are set up in step 608 after the listening socket has been set up.
Since a transmit socket is still required, the YES path is taken from step 610 to generate a transmit socket in almost exactly the same way. In step 604, however, the query for an available channel involves the discovery process discussed above. In particular, step 604 for a transmission socket setup assumes that the Bluetooth discovery process has identified the address of the device, e.g., PC 506, offering the service required by the Bluetooth client, e.g., mobile terminal 504. In addition, the discovery process has already identified the channel on the receiving device, e.g., PC 506, to which the transmitting device, e.g., mobile terminal 504, should connect. Once mobile terminal 504 and PC 506 have completed their RSocket object setup procedures, they may engage in full duplex communication. Receive event step 612 follows the YES path, for example, when data is ready to be read from the listening RSocket. Receive event step 612 takes the NO path, for example, when data is ready for transmission to the transmitting RSocket. The Bluetooth communication pair, PC 510 and mobile terminal 512, set up their respective Bluetooth communication link in a similar manner. Once the point-to-point connection is established between PC 506, link 518, Internet 508, link 520, and PC 510, the virtual meeting is ready to begin and the respective opposing images are available to each other. At meeting group 502, for example, mobile terminal 504 is transitioned into its imaging mode and positioned such that the video image presented on the display of mobile terminal 504 captures meeting group 502. Video data stream 216 and optional audio data stream 218, as depicted in data stream 212 of FIG. 2, are then captured by mobile terminal 504 and transmitted to PC 506 via Bluetooth link 516. The data subsequently arrives at PC 510 via internet 508 for display to presenter 514 via PC 510.
Likewise, mobile terminal 512 is transitioned into its imaging mode and positioned such that the video image presented on the display of mobile terminal 512 captures presenter 514. Video data stream 216 and optional audio data stream 218, as depicted in data stream 212 of FIG. 2, are then captured by mobile terminal 512 and transmitted to PC 510 via Bluetooth link 522. The data subsequently arrives at PC 506 via internet 508 for display to meeting group 502 via PC 506.
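The listen/bind/connect flow of FIG. 6 can be sketched with ordinary TCP sockets standing in for RFCOMM; real Symbian code would use RSocket with the RFCOMM transport rather than Python's socket module, and the message contents here are placeholders.

```python
# The FIG. 6 flow sketched with TCP sockets as an RFCOMM stand-in.
import socket
import threading

def serve_once(server):
    conn, _ = server.accept()          # step 612, receive event (YES path)
    data = conn.recv(1024)
    conn.sendall(b"ack:" + data)       # reply on the same full-duplex link
    conn.close()

# Steps 602-608: open a listening socket bound to an available channel.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))          # port 0: let the OS pick a free port
server.listen(1)
t = threading.Thread(target=serve_once, args=(server,))
t.start()

# Step 610, YES path: the transmit socket connects to the address and
# channel identified during discovery (here, the ephemeral port above).
client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client.connect(server.getsockname())
client.sendall(b"frame-0")
reply = client.recv(1024)
print(reply)                           # -> b'ack:frame-0'
t.join()
client.close()
server.close()
```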
It should be noted that although the implementation of virtual meeting scenario 500 is discussed in terms of a Bluetooth API enabled through a Symbian OS, it may also be realized by exploiting native coding methods and APIs of legacy devices. For example, data synchronization APIs that currently allow mobile terminal backup/synchronization services to proximity devices such as PCs may also be exploited to implement similar functionality. Future terminals employing Mobile Information Device Profile (MIDP) 2.0 and the Mobile Media API may offer similar features that may be exploited to provide equivalent features. It will become apparent to one of ordinary skill in the art that the present invention may be adapted to any number of applications involving audio/video feeds from mobile terminals via proximity connections and the subsequent use of the audio/video feeds. In one embodiment according to the principles of the present invention, for example, a user of a mobile terminal may wish to engage in Internet browsing activities using the mobile terminal, but wishes to display the received content on an enhanced display consisting of a PC, overhead projector, television, or other similar video device. Once the proximity connection, e.g., wired or wireless connection, is established between the mobile terminal and the auxiliary video device, then according to the principles of the present invention, the user may enhance his viewing pleasure by forwarding the received content to the auxiliary video device via the proximity connection for improved video presentation. Alternatively, the mobile terminal may receive audio/video content from an external Compact Disk (CD) or DVD device. The mobile terminal may then subsequently transfer the received content to the video device via a proximity connection for enhanced presentation of the content received from the CD or DVD device.
In another embodiment according to the present invention, an image capable mobile terminal may take the place of a digital camera in those instances where a digital camera may be utilized. Any licensing that requires a video image of the license holder to be placed onto the license card, for example, may utilize the present invention. For example, state licensing stations may be placed in public kiosks having Bluetooth functionality. Any user having access to an image enabled mobile terminal may access the kiosk through a Bluetooth proximity connection and provide the kiosk with a digital image of himself. The digital image being generated at the mobile terminal may be transferred to the kiosk via the Bluetooth connection. The kiosk, after having verified that the user has met other licensing requirements, may then render a license to the user that contains the user's digital image previously generated and transferred to the kiosk via the Bluetooth connection. In another embodiment according to the present invention, a security application is comprehended, whereby access is controlled through digital verification of a user's facial features. In such an instance, a user having an image enabled mobile terminal may establish a Bluetooth connection between his mobile terminal and, for example, a security access control point at the entrance of a secured building. The user may then capture an image of his facial features using his mobile terminal and then transfer a digital image of his facial features to the access control point via the previously established Bluetooth connection. The security access control point may then compare the transferred digital image to a digital image database of all users having security access to the building. Once a match is found between the transferred digital image and an image contained within the digital image database, the security access control point may then facilitate entry by the user into the building.
Otherwise, the security access point may deny access to the building and may provide a message indicating the denial of access to the user via the established Bluetooth connection.
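The grant-or-deny comparison at the access control point can be sketched in outline. This is illustrative Python under a deliberate simplification: real systems use feature-based face matching, whereas the sketch below matches byte-for-byte digests; the enrolment data is hypothetical.

```python
# Sketch of the access-control comparison: the transferred image is
# matched against a stored database.  Byte-for-byte hashing is a
# deliberate simplification standing in for real face matching.
import hashlib

def image_digest(image_bytes):
    """Digest used as a stand-in identity for an enrolled image."""
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical enrolment database of users with building access:
authorized = {image_digest(b"user-42-face")}

def grant_access(transferred_image):
    """True if the transferred image matches an enrolled image."""
    return image_digest(transferred_image) in authorized

print(grant_access(b"user-42-face"))   # -> True  (entry facilitated)
print(grant_access(b"stranger"))       # -> False (access denied)
```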
In another embodiment according to the present invention, an enhanced gaming operation is enabled through the use of multiple image enabled mobile terminals. In such an instance, each user of a mobile terminal is engaged in a video game, whereby each user is networked to the others via known networking infrastructure. The video content of each gaming participant's mobile terminal may then be transferred to a video device via a proximity connection, whereby the video device multiplexes the separate video feeds into a single gaming video stream that is subsequently displayed on the video device. Alternatively, such an arrangement also supports gaming activity of a single player, whereby a plethora of games previously limited by the display geometry of the mobile terminal are now enabled by the present invention.
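The multiplexing of separate per-player feeds into a single gaming stream can be sketched as round-robin interleaving of frames. This is an illustrative Python model; the frame strings are placeholders for real video data, and interleaving is one assumed multiplexing policy among many.

```python
# Sketch of multiplexing per-player video feeds into one gaming stream
# by round-robin interleaving; frame contents are placeholders.
from itertools import chain, zip_longest

def multiplex(*feeds):
    """Interleave frames from each feed into a single stream,
    dropping the gaps left by feeds that end early."""
    interleaved = zip_longest(*feeds)           # one frame per player per tick
    return [f for f in chain.from_iterable(interleaved) if f is not None]

player_a = ["a0", "a1", "a2"]
player_b = ["b0", "b1"]
print(multiplex(player_a, player_b))   # -> ['a0', 'b0', 'a1', 'b1', 'a2']
```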
The invention is a modular invention, whereby processing functions within either a mobile terminal or a hardware platform may be utilized to implement the present invention. The mobile terminals may be any type of wireless device, such as wireless/cellular telephones, personal digital assistants (PDAs), or other wireless handsets, as well as portable computing devices capable of wireless communication. These landline and mobile devices utilize computing circuitry and software to control and manage the conventional device activity as well as the functionality provided by the present invention. Hardware, firmware, software or a combination thereof may be used to perform the various imaging transfer functions described herein. An example of a representative mobile terminal computing system capable of carrying out operations in accordance with the invention is illustrated in FIG. 7. Those skilled in the art will appreciate that the exemplary mobile computing environment 700 is merely representative of general functions that may be associated with such mobile devices, and also that landline computing systems similarly include computing circuitry to perform such operations. The exemplary mobile computing arrangement 700 suitable for image capture/image data transfer functions in accordance with the present invention may be associated with a number of different types of wireless devices. The representative mobile computing arrangement 700 includes a processing/control unit 702, such as a microprocessor, reduced instruction set computer (RISC), or other central processing module. The processing unit 702 need not be a single device, and may include one or more processors. For example, the processing unit may include a master processor and associated slave processors coupled to communicate with the master processor.
The processing unit 702 controls the basic functions of the mobile terminal, and also those functions associated with the present invention as dictated by camera hardware 730, imaging software module 726 and Bluetooth stack 728 available in the program storage/memory 704. Thus, the processing unit 702 is capable of initiating image capture and proximity connection functions associated with the present invention, whereby images captured by camera hardware 730 may be transferred to imaging software module 726 for subsequent transmission via Bluetooth stack 728. The program storage/memory 704 may also include an operating system and program modules for carrying out functions and applications on the mobile terminal. For example, the program storage may include one or more of read-only memory (ROM), flash ROM, programmable and/or erasable ROM, random access memory (RAM), subscriber interface module (SIM), wireless interface module (WIM), smart card, or other removable memory device, etc.
In one embodiment of the invention, the program modules associated with the storage/memory 704 are stored in non-volatile electrically-erasable, programmable
ROM (EEPROM), flash ROM, etc. so that the information is not lost upon power down of the mobile terminal. The relevant software for carrying out conventional mobile terminal operations and operations in accordance with the present invention may also be transmitted to the mobile computing arrangement 700 via data signals, such as being downloaded electronically via one or more networks, such as the Internet and an intermediate wireless network(s).
The processor 702 is also coupled to user-interface 706 elements associated with the mobile terminal. The user-interface 706 of the mobile terminal may include, for example, a display 708 such as a liquid crystal display, a keypad 710, speaker 712, camera hardware 730, and microphone 714. These and other user-interface components are coupled to the processor 702 as is known in the art. Other user-interface mechanisms may be employed, such as voice commands, switches, touch pad/screen, graphical user interface using a pointing device, trackball, joystick, or any other user interface mechanism.
The mobile computing arrangement 700 also includes conventional circuitry for performing wireless transmissions. A digital signal processor (DSP) 716 may be employed to perform a variety of functions, including analog-to-digital (A/D) conversion, digital-to-analog (D/A) conversion, speech coding/decoding, encryption/decryption, error detection and correction, bit stream translation, filtering, etc. The transceiver 718, generally coupled to an antenna 720, transmits the outgoing radio signals 722 and receives the incoming radio signals 724 associated with the wireless device.
The mobile computing arrangement 700 of FIG. 7 is provided as a representative example of a computing environment in which the principles of the present invention may be applied. From the description provided herein, those skilled in the art will appreciate that the present invention is equally applicable in a variety of other currently known and future mobile and landline computing environments. For example, desktop computing devices similarly include a processor, memory, a user interface, and data communication circuitry. Thus, the present invention is applicable in any known computing structure where data may be communicated via a network.
Using the description provided herein, the invention may be implemented as a machine, process, or article of manufacture by using standard programming and/or engineering techniques to produce programming software, firmware, hardware or any combination thereof. Any resulting program(s), having computer-readable program code, may be embodied on one or more computer-usable media, such as disks, optical disks, removable memory devices, semiconductor memories such as RAM, ROM, PROMS, etc. Articles of manufacture encompassing code to carry out functions associated with the present invention are intended to encompass a computer program that exists permanently or temporarily on any computer-usable medium or in any transmitting medium which transmits such a program. Transmitting mediums include, but are not limited to, transmissions via wireless/radio wave communication networks, the Internet, intranets, telephone/modem-based network communication, hard-wired/cabled communication network, satellite communication, and other stationary or mobile network systems/communication links. From the description provided herein, those skilled in the art will be readily able to combine software created as described with appropriate general purpose or special purpose computer hardware to create an image processing system and method in accordance with the present invention.
The image processing platforms or other systems for providing image processing functions in connection with the present invention may be any type of computing device capable of processing and communicating digital information. The image processing platforms utilize computing systems to control and manage the image processing activity. An example of a representative computing system capable of carrying out operations in accordance with the invention is illustrated in FIG. 8. Hardware, firmware, software or a combination thereof may be used to perform the various image processing functions and operations described herein. The computing structure 800 of FIG. 8 is an example computing structure that can be used in connection with such an image processing platform.
The example computing arrangement 800 suitable for performing the image processing activity in accordance with the present invention includes image processing platform 801, which includes a central processor (CPU) 802 coupled to random access memory (RAM) 804 and read-only memory (ROM) 806. The ROM 806 may also be other types of storage media to store programs, such as programmable ROM (PROM), erasable PROM (EPROM), etc. The processor 802 may communicate with other internal and external components through input/output (I/O) circuitry 808 and bussing 810, to provide control signals and the like. For example, image data received from proximity I/O connections 808 or Internet connection 828 may be processed in accordance with the present invention. External data storage devices, such as DNS or location servers, may be coupled to I/O circuitry 808 to facilitate imaging functions according to the present invention. Alternatively, such databases may be locally stored in the storage/memory of image processing platform 801, or otherwise accessible via a local network or networks having a more extensive reach such as the Internet 828. The processor 802 carries out a variety of functions as is known in the art, as dictated by software and/or firmware instructions.
Image processing platform 801 may also include one or more data storage devices, including hard and floppy disk drives 812, CD-ROM drives 814, and other hardware capable of reading and/or storing information such as DVD, etc. In one embodiment, software for carrying out the image processing and image transfer operations in accordance with the present invention may be stored and distributed on a CD-ROM 816, diskette 818 or other form of media capable of portably storing information. These storage media may be inserted into, and read by, devices such as the CD-ROM drive 814, the disk drive 812, etc. The software may also be transmitted to image processing platform 801 via data signals, such as being downloaded electronically via a network, such as the Internet. Image processing platform 801 is coupled to a display 820, which may be any type of known display or presentation screen, such as LCD displays, plasma display, cathode ray tubes (CRT), etc. A user input interface 822 is provided, including one or more user interface mechanisms such as a mouse, keyboard, microphone, touch pad, touch screen, voice-recognition system, etc.
The image processing platform 801 may be coupled to other computing devices, such as the landline and/or wireless terminals via a network. The server may be part of a larger network configuration as in a global area network (GAN) such as the Internet 828, which allows ultimate connection to the various landline and/or mobile client/watcher devices.
The foregoing description of the various embodiments of the invention has been presented for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form disclosed. Many modifications and variations are possible in light of the above teaching. Thus, it is intended that the scope of the invention be limited not with this detailed description, but rather determined from the claims appended hereto.

Claims

WHAT IS CLAIMED IS: 1. A method of exploiting video facilities of a mobile terminal, comprising: generating video content using the mobile terminal; establishing a proximity connection between the mobile terminal and a video processing platform; and transferring the video content from the mobile terminal to the video processing platform using the proximity connection.
2. The method according to Claim 1, further comprising generating audio content using the mobile terminal.
3. The method according to Claim 2, wherein the video and audio content are combined to form a bit stream.
4. The method according to Claim 1, wherein establishing a proximity connection comprises coupling the mobile terminal to the video processing platform via a wired connection.
5. The method according to Claim 1, wherein establishing a proximity connection comprises coupling the mobile terminal to the video processing platform via a wireless connection.
6. The method according to Claim 5, wherein the wireless connection includes any one of a Bluetooth, infrared, or Wireless Local Area Network (WLAN) connection.
7. The method according to Claim 6, wherein transferring the video content includes initiating an RFCOMM communication session using the Bluetooth connection.
8. The method according to Claim 7, wherein the wireless connection facilitates a video conferencing scenario.
9. The method according to Claim 7, wherein the wireless connection facilitates a gaming scenario.
10. The method according to Claim 7, wherein the wireless connection facilitates security access verification.
11. An image processing system, comprising: an image enabled mobile terminal arranged to generate content; and an image processing platform arranged to receive the content, wherein the image enabled mobile terminal is coupled to the image processing platform via a proximity connection to facilitate content transfer.
12. The image processing system according to Claim 11, wherein the content is comprised of video data.
13. The image processing system according to Claim 11, wherein the content is comprised of video and audio data.
14. The image processing system according to Claim 12, wherein the proximity connection comprises a wired connection.
15. The image processing system according to Claim 12, wherein the proximity connection comprises a wireless connection.
16. The image processing system according to Claim 15, wherein the wireless connection includes any one of a Bluetooth, infrared, or Wireless Local Area Network (WLAN) connection.
17. The image processing system according to Claim 15, wherein RFCOMM socket communication facilitates the content transfer.
18. A mobile terminal wirelessly coupled to a network which includes a network element capable of processing content from the mobile terminal, the mobile terminal comprising: a memory capable of storing at least one of a protocol module and an imaging module; an image component configured by the imaging module to generate digital images; a processor coupled to the memory and configured by the protocol module to enable digital image exchange with the network element; and a transceiver configured to facilitate the digital image exchange with the network element.
19. The mobile terminal according to Claim 18, wherein the protocol module includes a Bluetooth communication stack.
20. The mobile terminal according to Claim 19, wherein the Bluetooth protocol stack allows discovery of the network element.
21. A computer-readable medium having instructions stored thereon which are executable by a mobile terminal for exchanging video content with a video processing platform by performing steps comprising: generating digital images using imaging equipment internal to the mobile terminal; establishing a proximity connection with the video processing platform; and transmitting the digital images to the video processing platform with the established proximity connection.
22. A digital image processor proximately coupled to an image enabled mobile terminal, comprising: means for establishing the proximate connection to the image enabled mobile terminal; means for receiving video content from the image enabled mobile terminal via the proximate connection; and means for utilizing the received video content.
23. The digital image processor according to Claim 22, wherein the means for establishing the proximate connection comprises a Bluetooth service discovery mechanism.
24. The digital image processor according to Claim 22, wherein the means for receiving the video content comprises a socket connection established between RFCOMM layers of a Bluetooth connection between the digital image processor and the image enabled mobile terminal.
25. A computer-readable medium having instructions stored thereon which are executable by a video processing platform by performing steps comprising: establishing a proximate connection with an image capable mobile terminal; receiving images from the image capable mobile terminal via the proximate connection; and utilizing the received images.
PCT/IB2004/001258 2003-04-15 2004-04-07 Method and apparatus for exploiting video streaming services of mobile terminals via proximity connections WO2004092863A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US10/414,453 US20040207719A1 (en) 2003-04-15 2003-04-15 Method and apparatus for exploiting video streaming services of mobile terminals via proximity connections
US10/414,453 2003-04-15

Publications (2)

Publication Number Publication Date
WO2004092863A2 true WO2004092863A2 (en) 2004-10-28
WO2004092863A3 WO2004092863A3 (en) 2005-01-27

Family

ID=33158702

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2004/001258 WO2004092863A2 (en) 2003-04-15 2004-04-07 Method and apparatus for exploiting video streaming services of mobile terminals via proximity connections

Country Status (2)

Country Link
US (1) US20040207719A1 (en)
WO (1) WO2004092863A2 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008071848A1 (en) * 2006-12-13 2008-06-19 Teliasonera Ab Communication system

Families Citing this family (78)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6578203B1 (en) 1999-03-08 2003-06-10 Tazwell L. Anderson, Jr. Audio/video signal distribution system for head mounted displays
US20020057364A1 (en) 1999-05-28 2002-05-16 Anderson Tazwell L. Electronic handheld audio/video receiver and listening/viewing device
US7210160B2 (en) 1999-05-28 2007-04-24 Immersion Entertainment, L.L.C. Audio/video programming and charging system and method
TW556849U (en) * 2002-09-13 2003-10-01 Shih-Pin Huang Projector to receive RF signal
US7725073B2 (en) * 2002-10-07 2010-05-25 Immersion Entertainment, Llc System and method for providing event spectators with audio/video signals pertaining to remote events
US8154581B2 (en) * 2002-10-15 2012-04-10 Revolutionary Concepts, Inc. Audio-video communication system for receiving person at entrance
US20040158648A1 (en) * 2003-02-10 2004-08-12 Mark Tung Network card device for LAN and WLAN systems
US7593687B2 (en) 2003-10-07 2009-09-22 Immersion Entertainment, Llc System and method for providing event spectators with audio/video signals pertaining to remote events
US7472057B2 (en) * 2003-10-17 2008-12-30 Broadcom Corporation Detector for use in voice communications systems
US9439048B2 (en) * 2003-10-31 2016-09-06 Alcatel Lucent Method and apparatus for providing mobile-to-mobile video capability to a network
JP4389560B2 (en) * 2003-11-28 2009-12-24 沖電気工業株式会社 Real-time communication system and media end terminal
US7899492B2 (en) 2004-07-16 2011-03-01 Sellerbid, Inc. Methods, systems and apparatus for displaying the multimedia information from wireless communication networks
US20140071818A1 (en) 2004-07-16 2014-03-13 Virginia Innovation Sciences, Inc. Method and system for efficient communication
US7957733B2 (en) 2004-07-16 2011-06-07 Sellerbid, Inc. Method and apparatus for multimedia communications with different user terminals
TWI245211B (en) * 2004-09-15 2005-12-11 High Tech Comp Corp Portable electronic apparatus and video conference system integrated with the same
TWI247545B (en) * 2004-11-12 2006-01-11 Quanta Comp Inc Video conferencing system utilizing a mobile phone and the method thereof
US20060121964A1 (en) * 2004-12-07 2006-06-08 Rhonda Gilligan Network marketing method
KR100931428B1 (en) * 2004-12-24 2009-12-11 가부시키가이샤 나비타이무쟈판 Recording route with leading route guidance system, portable route guidance system and program
DE102006001607B4 (en) * 2005-01-14 2013-02-28 Mediatek Inc. Methods and systems for the transmission of sound and image data
US20060161349A1 (en) * 2005-01-18 2006-07-20 John Cross GPS device and method for displaying raster images
JP2006238328A (en) * 2005-02-28 2006-09-07 Sony Corp Conference system, conference terminal and portable terminal
US7522181B2 (en) * 2005-03-09 2009-04-21 Polycom, Inc. Method and apparatus for videoconference interaction with bluetooth-enabled cellular telephone
EP1743681A1 (en) * 2005-07-13 2007-01-17 In Fusio (S.A.) Method for promoting an entertainment-based mobile application
EP1978480A3 (en) 2005-07-22 2011-09-07 Kangaroo Media, Inc. System and methods for enhancing the experience of spectators atttending a live sporting event
US9794762B2 (en) * 2005-10-06 2017-10-17 Nokia Technologies Oy System, methods, software, and devices employing messaging
US20070099658A1 (en) * 2005-11-03 2007-05-03 Blue Label Interactive Systems and methods for developing, delivering and using video applications for a plurality of mobile platforms
US7797740B2 (en) * 2006-01-06 2010-09-14 Nokia Corporation System and method for managing captured content
US8510666B2 (en) * 2006-03-14 2013-08-13 Siemens Enterprise Communications Gmbh & Co. Kg Systems for development and/or use of telephone user interface
US7737915B1 (en) * 2006-08-10 2010-06-15 Emc Corporation Techniques for displaying information through a computer display
GB2444994A (en) * 2006-12-21 2008-06-25 Symbian Software Ltd Interdevice transmission of data
US20080235600A1 (en) * 2007-03-23 2008-09-25 Microsoft Corporation Interaction with a Display System
US8253770B2 (en) * 2007-05-31 2012-08-28 Eastman Kodak Company Residential video communication system
US8185815B1 (en) * 2007-06-29 2012-05-22 Ambrosia Software, Inc. Live preview
US9538011B1 (en) * 2007-07-26 2017-01-03 Kenneth Nathaniel Sherman Mobile microphone system portal app for meetings
US20090047991A1 (en) * 2007-08-13 2009-02-19 Sony Ericsson Mobile Communications Ab Automatically enabling and disabling wireless networks
US8977710B2 (en) * 2008-06-18 2015-03-10 Qualcomm, Incorporated Remote selection and authorization of collected media transmission
US20090323802A1 (en) * 2008-06-27 2009-12-31 Walters Clifford A Compact camera-mountable video encoder, studio rack-mountable video encoder, configuration device, and broadcasting network utilizing the same
GB2463124B (en) * 2008-09-05 2012-06-20 Skype Ltd A peripheral device for communication over a communications sytem
US10567823B2 (en) 2008-11-26 2020-02-18 Free Stream Media Corp. Relevant advertisement generation based on a user operating a client device communicatively coupled with a networked media device
US10977693B2 (en) 2008-11-26 2021-04-13 Free Stream Media Corp. Association of content identifier of audio-visual data with additional data through capture infrastructure
US9519772B2 (en) 2008-11-26 2016-12-13 Free Stream Media Corp. Relevancy improvement through targeting of information based on data gathered from a networked device associated with a security sandbox of a client device
US9154942B2 (en) 2008-11-26 2015-10-06 Free Stream Media Corp. Zero configuration communication between a browser and a networked media device
US9961388B2 (en) 2008-11-26 2018-05-01 David Harrison Exposure of public internet protocol addresses in an advertising exchange server to improve relevancy of advertisements
US10419541B2 (en) 2008-11-26 2019-09-17 Free Stream Media Corp. Remotely control devices over a network without authentication or registration
US10880340B2 (en) 2008-11-26 2020-12-29 Free Stream Media Corp. Relevancy improvement through targeting of information based on data gathered from a networked device associated with a security sandbox of a client device
US9986279B2 (en) 2008-11-26 2018-05-29 Free Stream Media Corp. Discovery, access control, and communication with networked services
US10631068B2 (en) 2008-11-26 2020-04-21 Free Stream Media Corp. Content exposure attribution based on renderings of related content across multiple devices
US10334324B2 (en) 2008-11-26 2019-06-25 Free Stream Media Corp. Relevant advertisement generation based on a user operating a client device communicatively coupled with a networked media device
US8180891B1 (en) 2008-11-26 2012-05-15 Free Stream Media Corp. Discovery, access control, and communication with networked services from within a security sandbox
US8487975B2 (en) * 2009-01-27 2013-07-16 Lifesize Communications, Inc. Conferencing system utilizing a mobile communication device as an interface
TW201034430A (en) * 2009-03-11 2010-09-16 Inventec Appliances Corp Method for changing the video background of multimedia cell phone
US8687046B2 (en) * 2009-11-06 2014-04-01 Sony Corporation Three-dimensional (3D) video for two-dimensional (2D) video messenger applications
US8570358B2 (en) * 2009-11-06 2013-10-29 Sony Corporation Automated wireless three-dimensional (3D) video conferencing via a tunerless television device
JP2011248769A (en) 2010-05-28 2011-12-08 Sony Corp Information processor, information processing system and program
JP5494242B2 (en) * 2010-05-28 2014-05-14 ソニー株式会社 Information processing apparatus, information processing system, and program
US20110306325A1 (en) * 2010-06-10 2011-12-15 Rajesh Gutta Streaming video/audio from mobile phone to any device
US8555332B2 (en) 2010-08-20 2013-10-08 At&T Intellectual Property I, L.P. System for establishing communications with a mobile device server
US8504449B2 (en) 2010-10-01 2013-08-06 At&T Intellectual Property I, L.P. Apparatus and method for managing software applications of a mobile device server
US8516039B2 (en) * 2010-10-01 2013-08-20 At&T Intellectual Property I, L.P. Apparatus and method for managing mobile device servers
US8989055B2 (en) 2011-07-17 2015-03-24 At&T Intellectual Property I, L.P. Processing messages with a device server operating in a telephone
US8416281B2 (en) * 2010-11-24 2013-04-09 International Business Machines Corporation Multipoint conference scalability for co-located participants
US9066123B2 (en) 2010-11-30 2015-06-23 At&T Intellectual Property I, L.P. System for monetizing resources accessible to a mobile device server
JP5887756B2 (en) * 2010-11-30 2016-03-16 株式会社リコー External input device, communication terminal, display data sharing system, program
CN102098511A (en) * 2010-12-15 2011-06-15 中兴通讯股份有限公司 Mobile terminal and video playing realization method thereof
US20120190403A1 (en) * 2011-01-26 2012-07-26 Research In Motion Limited Apparatus and method for synchronizing media capture in a wireless device
JP2012231457A (en) * 2011-04-14 2012-11-22 Panasonic Corp Recording control device, information apparatus, information recording system, and program
US9348430B2 (en) 2012-02-06 2016-05-24 Steelseries Aps Method and apparatus for transitioning in-process applications to remote devices
US9398261B1 (en) 2012-07-20 2016-07-19 Time Warner Cable Enterprises Llc Transitioning video call between devices
US8892079B1 (en) * 2012-09-14 2014-11-18 Google Inc. Ad hoc endpoint device association for multimedia conferencing
EP2909971B1 (en) 2012-10-18 2020-09-02 Dolby Laboratories Licensing Corporation Systems and methods for initiating conferences using external devices
FR2998994A1 (en) * 2012-12-03 2014-06-06 France Telecom METHOD FOR RETRIEVING AUDIO AND / OR VIDEO CONTENT
US20140352896A1 (en) * 2013-05-30 2014-12-04 Gyeong-Hae Han Network roller shutters
KR102220825B1 (en) * 2013-09-05 2021-03-02 삼성전자주식회사 Electronic apparatus and method for outputting a content
WO2016003454A1 (en) 2014-07-02 2016-01-07 Hewlett-Packard Development Company, L.P. Managing port connections
WO2016036378A1 (en) * 2014-09-05 2016-03-10 Hewlett Packard Enterprise Development Lp Data storage over fibre channel
KR20160034737A (en) * 2014-09-22 2016-03-30 에스케이텔레콤 주식회사 Apparatus and method for multi-terminal communication service
KR101565347B1 (en) * 2014-11-20 2015-11-03 현대자동차주식회사 Vehicle supporting efficient bluetooth connection and method for controlling thereof
US9774824B1 (en) 2016-07-18 2017-09-26 Cisco Technology, Inc. System, method, and logic for managing virtual conferences involving multiple endpoints

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20010055373A1 (en) * 2000-06-14 2001-12-27 Kabushiki Kaisha Toshiba Information processing system, information device and information processing device
US20020065868A1 (en) * 2000-11-30 2002-05-30 Lunsford E. Michael Method and system for implementing wireless data transfers between a selected group of mobile computing devices
US20020174073A1 (en) * 2001-05-21 2002-11-21 Ian Nordman Method and apparatus for managing and enforcing user privacy
US6489986B1 (en) * 2000-09-29 2002-12-03 Digeo, Inc. Remote control device for video and audio capture and communication
US6670982B2 (en) * 2002-01-04 2003-12-30 Hewlett-Packard Development Company, L.P. Wireless digital camera media
US6714233B2 (en) * 2000-06-21 2004-03-30 Seiko Epson Corporation Mobile video telephone system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003125365A (en) * 2001-10-10 2003-04-25 Minolta Co Ltd Controlling device, program, and recording medium



Also Published As

Publication number Publication date
WO2004092863A3 (en) 2005-01-27
US20040207719A1 (en) 2004-10-21

Similar Documents

Publication Publication Date Title
US20040207719A1 (en) Method and apparatus for exploiting video streaming services of mobile terminals via proximity connections
US7352997B2 (en) Method, apparatus and system for hosting a group of terminals
CN104115466B (en) Radio display with multi-screen service
JP4855408B2 (en) Portable wireless communication apparatus displaying information on a plurality of display screens, operating method of the portable wireless communication apparatus, and computer program for operating the portable wireless communication apparatus
US7398316B2 (en) Method and apparatus for keyhole video frame transmission during a communication session
US20100201780A1 (en) Utilizing image sequences to perform video streaming during video conferencing
US20060083194A1 (en) System and method rendering audio/image data on remote devices
CN104365088A (en) Multiple channel communication using multiple cameras
US20080288576A1 (en) Method and System for Sharing One or More Graphics Images Between Devices Using Profiles
US7260108B2 (en) Multimedia information providing method and apparatus
EP1561346A1 (en) Media communications method and apparatus
JP2001245268A (en) Contents transmitting system and content processor
US8970651B2 (en) Integrating audio and video conferencing capabilities
US20040001091A1 (en) Method and apparatus for video conferencing system with 360 degree view
KR100628322B1 (en) System for mediating convergence services of communication and broadcasting using non-communicative appliance
US20040204060A1 (en) Communication terminal device capable of transmitting visage information
CN114610253A (en) Screen projection method and equipment
JP5553782B2 (en) Video communication system and operating method thereof
WO2006104040A1 (en) Push-to-talk communication system and push-to-talk communication method
JP2010239641A (en) Communication device, communication system, control program of communication device, and recording media-recording control program of communication device
JP2003530743A (en) Video and graphics distribution system for mobile users
US20190089754A1 (en) System and method for providing audio conference between heterogenious networks
CN110177345A (en) A kind of file transmission, chat system and method for no cellular network signals region
JP4680034B2 (en) COMMUNICATION DEVICE, COMMUNICATION SYSTEM, COMMUNICATION DEVICE CONTROL PROGRAM, AND RECORDING MEDIUM CONTAINING COMMUNICATION DEVICE CONTROL PROGRAM
JP2014149644A (en) Electronic meeting system

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): BW GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DPEN Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed from 20040101)
122 Ep: pct application non-entry in european phase