Publication number: US 20140208237 A1
Publication type: Application
Application number: US 14/224,079
Publication date: 24 Jul 2014
Filing date: 25 Mar 2014
Priority date: 30 Jun 2009
Also published as: US 8718715, US 20100331022
Inventors: Ari Petri Happonen
Original Assignee: Core Wireless Licensing S.A.R.L.
Sharing functionality
Abstract
A user interface comprises a controller which is configured to display image data, receive input indicating a selection area comprising content corresponding to at least a portion of said image data, receive input indicating a movement of the selection area, identify a recipient and send the content to the recipient in response thereto.
Claims(1)
What is claimed is:
1. An apparatus comprising:
a controller, wherein said controller is operable to:
display image data;
receive input indicating a selection area comprising at least a portion of said image data;
receive input indicating a movement of the selection area;
identify a recipient based upon the movement of the selection area; and
send the at least a portion of said image data in the selection area to the recipient.
Description
    CROSS REFERENCE TO RELATED APPLICATIONS
  • [0001]
    This application is a continuation of U.S. application Ser. No. 12/494,828 filed 30 Jun. 2009, which is incorporated herein in its entirety.
  • FIELD
  • [0002]
    The present application relates to a user interface, an apparatus and a method for sharing image data, and in particular to a user interface, an apparatus and a method for sharing a selection of image data with at least one recipient.
  • BACKGROUND
  • [0003]
    More and more electronic devices, such as mobile phones, media players, Personal Digital Assistants (PDAs) and computers (both laptops and desktops), are being used to display various image data, such as media files (video files, slide shows and artwork for music files), internet content, image data representing maps, documents or other files, and other image data.
  • [0004]
    A common problem is that sharing images is slow and cumbersome and requires many steps to be taken by the user, which makes it difficult to share images quickly and easily, especially if only a selection of an image is to be shared.
  • [0005]
    In contemporary devices a user has to take the following steps to share a selection of an image. First the selection has to be made and saved as a new file. The new file must then be specified to be sent to a recipient using a right-click action, which many users regard as unintuitive because they are used to the left-click action. Then the recipient has to be specified. This requires the user to perform unassociated actions, each initiated from a different menu, and requires an in-depth understanding of the system being used.
  • [0006]
    An apparatus that offers sharing functionality that is easy to use and to learn would thus be useful in modern-day society.
  • SUMMARY
  • [0007]
    Against this background, it would be advantageous to provide a user interface, an apparatus and a method that overcome, or at least reduce, the drawbacks indicated above by providing an apparatus according to the claims.
  • [0008]
    Further objects, features, advantages and properties of the device, method and computer readable medium according to the present application will become apparent from the detailed description.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • [0009]
    In the following detailed portion of the present description, the teachings of the present application will be explained in more detail with reference to the example embodiments shown in the drawings, in which:
  • [0010]
    FIG. 1 is an overview of a telecommunications system in which a device according to the present application is used according to an embodiment;
  • [0011]
    FIG. 2 is a view of an apparatus according to an embodiment;
  • [0012]
    FIG. 3 is a block diagram illustrating the general architecture of an apparatus of FIG. 2 in accordance with the present application;
  • [0013]
    FIGS. 4a-4d are screen shot views of an apparatus or views of an application window according to an embodiment; and
  • [0014]
    FIG. 5 is a flow chart describing a method according to an embodiment of the application.
  • DETAILED DESCRIPTION
  • [0015]
    In the following detailed description, the user interface, the apparatus, the method and the software product according to the teachings of this application will be described by embodiments in the form of a cellular/mobile phone. It should be noted that although only a mobile phone is described, the teachings of this application can also be used in any electronic device, such as portable electronic devices including laptops, PDAs, mobile communication terminals, electronic books and notepads, and other electronic devices offering access to information.
  • [0016]
    FIG. 1 illustrates an example of a cellular telecommunications system in which the teachings of the present application may be applied. In the telecommunication system of FIG. 1, various telecommunications services such as cellular voice calls, www or Wireless Application Protocol (WAP) browsing, cellular video calls, data calls, facsimile transmissions, music transmissions, still image transmissions, video transmissions, electronic message transmissions and electronic commerce may be performed between a mobile terminal 100 according to the teachings of the present application and other devices, such as another mobile terminal 106 or a stationary telephone 132. It is to be noted that for different embodiments of the mobile terminal 100 and in different situations, different ones of the telecommunications services referred to above may or may not be available; the teachings of the present application are not limited to any particular set of services in this respect.
  • [0017]
    The mobile terminals 100, 106 are connected to a mobile telecommunications network 110 through Radio Frequency (RF) links 102, 108 via base stations 104, 109. The mobile telecommunications network 110 may be in compliance with any commercially available mobile telecommunications standard, such as Group Speciale Mobile (GSM), Universal Mobile Telecommunications System (UMTS), Digital Advanced Mobile Phone System (D-AMPS), the Code Division Multiple Access standards (CDMA and CDMA2000), Freedom Of Mobile Access (FOMA), and Time Division-Synchronous Code Division Multiple Access (TD-SCDMA).
  • [0018]
    The mobile telecommunications network 110 is operatively connected to a wide area network 120, which may be Internet or a part thereof. An Internet server 122 has a data storage 124 and is connected to the wide area network 120, as is an Internet client computer 126. The server 122 may host a www/wap server capable of serving www/wap content to the mobile terminal 100.
  • [0019]
    A public switched telephone network (PSTN) 130 is connected to the mobile telecommunications network 110 as is commonly known by a skilled person. Various telephone terminals, including the stationary telephone 132, are connected to the PSTN 130.
  • [0020]
    The mobile terminal 100 is also capable of communicating locally via a local link 101 to one or more local devices 103. The local link can be any type of link with a limited range, such as Bluetooth, a Universal Serial Bus (USB) link, a Wireless Universal Serial Bus (WUSB) link, an IEEE 802.11 wireless local area network link, a Radio Standard link for example an RS-232 serial link, etc. The local devices 103 can for example be various sensors that can communicate measurement values to the mobile terminal 100 over the local link 101.
  • [0021]
    A computer such as a laptop or desktop can also be connected to the network via a radio link such as a WiFi link, WiFi being the popular term for a radio frequency connection using the WLAN (Wireless Local Area Network) standard IEEE 802.11.
  • [0022]
    It should be noted that the teachings of this application are also capable of being utilized in an internet network of which the telecommunications network described above may be a part.
  • [0023]
    It should be noted that even though the teachings herein are described solely with reference to wireless networks, they are in no respect limited to wireless networks as such, but are to be understood to be usable in the Internet or similar networks.
  • [0024]
    It should thus be understood that an apparatus according to the teachings herein may be a mobile communications terminal, such as a mobile telephone, a media player, a music player, a video player, an electronic book, a personal digital assistant, a laptop as well as a stationary device such as a desktop computer or a server.
  • [0025]
    An embodiment 200 of the mobile terminal 100 is illustrated in more detail in FIG. 2. The mobile terminal 200 comprises a speaker or earphone 202, a microphone 206, a main or first display 203 and a set of keys 204, which may include keys such as soft keys 204b, 204c and a joystick 205 or other type of navigational input device. In this embodiment the display 203 is a touch-sensitive display, also called a touch display, which displays various virtual keys 204a.
  • [0026]
    The internal component, software and protocol structure of the mobile terminal 200 will now be described with reference to FIG. 3. The mobile terminal has a controller 300 which is responsible for the overall operation of the mobile terminal and may be implemented by any commercially available CPU (“Central Processing Unit”), DSP (“Digital Signal Processor”) or any other electronic programmable logic device. The controller 300 has associated electronic memory 302 such as Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), flash memory, or any combination thereof. The memory 302 is used for various purposes by the controller 300, one of them being for storing data used by and program instructions for various software in the mobile terminal. The software includes a real-time operating system 320, drivers for a man-machine interface (MMI) 334, an application handler 332 as well as various applications. The applications can include a media file player 350, a notepad application 360, as well as various other applications 370, such as applications for voice calling, video calling, sending and receiving messages such as Short Message Service (SMS), Multimedia Message Service (MMS) or email, web browsing, an instant messaging application, a phone book application, a calendar application, a control panel application, a camera application, one or more video games, etc. It should be noted that two or more of the applications listed above may be executed as the same application.
  • [0027]
    The MMI 334 also includes one or more hardware controllers, which together with the MMI drivers cooperate with the first display 336/203, and the keypad 338/204 as well as various other Input/Output devices such as microphone, speaker, vibrator, ringtone generator, LED indicator, etc.
  • [0028]
    The software also includes various modules, protocol stacks, drivers, etc., which are commonly designated as 330 and which provide communication services (such as transport, network and connectivity) for an RF interface 306, and optionally a Bluetooth interface 308 and/or an IrDA interface 310 for local connectivity. The RF interface 306 comprises an internal or external antenna as well as appropriate radio circuitry for establishing and maintaining a wireless link to a base station (e.g. the link 102 and base station 104 in FIG. 1). As is well known to a person skilled in the art, the radio circuitry comprises a series of analogue and digital electronic components, together forming a radio receiver and transmitter. These components include band pass filters, amplifiers, mixers, local oscillators, low pass filters, Analog to Digital and Digital to Analog (AD/DA) converters, etc.
  • [0029]
    The mobile terminal also has a Subscriber Identity Module (SIM) card 304 and an associated reader. As is commonly known, the SIM card 304 comprises a processor as well as local work and data memory.
  • [0030]
    In the following description it will be assumed that the display is a touch display and that a tap is performed with a stylus, finger or other touching means tapping on a position on the display. It should be noted that a tap may also be input by use of other pointing means, such as a mouse- or touchpad-controlled cursor, which is positioned at a specific position after which a clicking action is performed. This analogy is commonly known in the field and will be clear to a skilled person. In the description it will be assumed that a tap input comprises a clicking action at an indicated position.
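    The tap/click analogy described above lends itself to a small normalization layer in which both a touch tap and a cursor click are reduced to a single "tap at position" event. The sketch below is illustrative only; the event shapes and field names are assumptions made for this example, not part of the disclosed apparatus.

```python
# Illustrative sketch: normalizing a stylus/finger tap and a mouse or touchpad
# click into one "tap at position" event, per the analogy described above.
# The input event shapes are assumptions made for this example.
def normalize_tap(event):
    """Return (x, y) of a tap, whether it came from a touch or a click."""
    if event["kind"] == "touch":            # stylus or finger on the display
        return event["x"], event["y"]
    if event["kind"] == "click":            # cursor positioned, then clicked
        return event["cursor_x"], event["cursor_y"]
    raise ValueError(f"not a tap-like event: {event['kind']!r}")

print(normalize_tap({"kind": "touch", "x": 120, "y": 80}))                # (120, 80)
print(normalize_tap({"kind": "click", "cursor_x": 45, "cursor_y": 200}))  # (45, 200)
```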
  • [0031]
    FIGS. 4a-4d show a series of screen shot views of an apparatus 400 according to the teachings herein. It should be noted that such an apparatus is not limited to a mobile phone, but can be any apparatus capable of displaying image data.
  • [0032]
    Examples of such apparatuses are computers, media players, mobile phones, personal digital assistants (PDAs), digital cameras, navigation devices such as GPS (Global Positioning System) devices, game consoles, Digital Video Disc players, television sets, photo and video cameras, electronic books and electronic dictionaries.
  • [0033]
    The apparatus 400 has a display 403, which in this embodiment is a touch display.
  • [0034]
    A controller is configured to display image data or content 410, see FIG. 4a. This image data may represent an image, a video, a document, a map, downloaded internet content, other downloaded content, etc. The different kinds of image data that may be displayed on an electronic device are well known.
  • [0035]
    It should be noted that the image data may be stored on said apparatus or remotely at another position or in another apparatus. Image data may also be downloaded while it is being displayed, so-called streaming.
  • [0036]
    By realizing that the sequence of steps to be taken, the location of the corresponding commands and the need to access different menus using different access means are difficult for many users, and that the problem can easily be solved using a touch-based interface taking advantage of objects' mobility in such systems, a solution as taught herein can be achieved, offering a user a quick and easy way of sharing images.
  • [0037]
    The teachings herein find particular use when a large file collection is being browsed and some of the files are to be shared, as the multiple steps would otherwise have to be repeated for each file.
  • [0038]
    A controller is configured to receive input indicating a selection area 411 of the image 410 on the display 403.
  • [0039]
    In one embodiment a controller is configured to receive an input indicating the selection area 411 by input marking the edges of the selection area 411.
  • [0040]
    In one embodiment a controller is configured to receive an input indicating the selection area 411 by input marking the center point of the selection and setting an area surrounding the center point as the selection. In one embodiment the area is rectangular. In one embodiment the area is square. In one embodiment the area is circular. In one embodiment the area is oval.
  • [0041]
    In one embodiment the controller is configured to analyze the image data surrounding the center point and include data that is of interest according to selection criteria to be part of the selection area 411. Examples of such criteria are faces, vehicles, structures or other forms.
  • [0042]
    In one embodiment a controller is configured to receive an input indicating the whole image 410 as the selection area 411 by a tap input somewhere inside the image.
  • [0043]
    In one embodiment the controller is configured to mark the selection area 411, as is the case in FIG. 4b where a selection area 411 has been marked by a stylus 412.
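    The paragraphs above describe several alternative ways the selection area can be derived: from input marking its edges, from a center point with a surrounding shape, or from a tap selecting the whole image. The following is a minimal sketch of those alternatives for a rectangular case; the dictionary layout, function names and default dimensions are assumptions made for this example, not a description of the actual implementation.

```python
# Illustrative sketch only: how a controller might derive a rectangular
# selection area from the kinds of input described above. The dict layout
# and default dimensions are assumptions made for this example.
def selection_from_edges(x1, y1, x2, y2):
    """Selection area defined by input marking two opposite corners."""
    return {"x": min(x1, x2), "y": min(y1, y2),
            "w": abs(x2 - x1), "h": abs(y2 - y1)}

def selection_from_center(cx, cy, w=100, h=100):
    """Rectangular selection area surrounding a marked center point."""
    return {"x": cx - w // 2, "y": cy - h // 2, "w": w, "h": h}

def selection_whole_image(image_w, image_h):
    """A tap anywhere inside the image selects the whole image."""
    return {"x": 0, "y": 0, "w": image_w, "h": image_h}

print(selection_from_center(160, 120, w=80, h=60))
# {'x': 120, 'y': 90, 'w': 80, 'h': 60}
```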
  • [0044]
    In one embodiment the controller is configured to receive input representing a movement of the selection area 411.
  • [0045]
    In one embodiment this movement input is received through manipulation of a navigation key 405.
  • [0046]
    In one embodiment this movement input is received through manipulation of a navigation input means such as a computer mouse or a trackball.
  • [0047]
    In one embodiment this movement input is received through a sliding gesture on the touch sensitive display 403.
  • [0048]
    In one embodiment this movement input is received through a sliding gesture on a touchpad.
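    The movement of the selection area can thus arrive from several input sources: a navigation key, a mouse or trackball, a sliding gesture on the touch display, or a touchpad. The sketch below assumes all of these are normalized into a relative movement delta applied to the selection rectangle; the event format is a hypothetical simplification.

```python
# Illustrative sketch: normalizing movement input from any of the sources
# above (navigation key, mouse, trackball, sliding gesture) into a common
# delta applied to the selection area. The event shape is an assumption.
def apply_movement(selection, event):
    """Offset the selection rectangle by the delta carried in an input event.

    `selection` is a dict with keys x, y, w, h; `event` is assumed to be a
    dict such as {"dx": 12, "dy": -4} produced by the active input device.
    """
    selection["x"] += event.get("dx", 0)
    selection["y"] += event.get("dy", 0)
    return selection

sel = {"x": 40, "y": 60, "w": 120, "h": 80}
apply_movement(sel, {"dx": 15, "dy": -5})   # e.g. a short sliding gesture
print(sel)  # {'x': 55, 'y': 55, 'w': 120, 'h': 80}
```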
  • [0049]
    In one embodiment a controller is configured to associate a portion 413 of at least one edge of the display 403 with at least one recipient. In FIG. 4b two portions 413a and 413b are shown, both arranged adjacent to and associated with the right edge of the display 403. In this embodiment the portions are marked using a dotted rectangle. In one embodiment the portions are not marked.
  • [0050]
    It should be apparent that different arrangements of the portions are all within the teachings of this invention. Some examples are one portion on each edge, one portion for two edges, one portion for all edges, one portion on one edge and two or multiple portions on another edge.
  • [0051]
    In one embodiment a controller is configured to receive input indicating a portion of a display or application area, to receive input identifying a recipient, and to associate the portion with the recipient.
  • [0052]
    In one example a user marks a portion along a side of the display and drags a contact to that portion. The contact is then associated with that portion for future use.
  • [0053]
    In one embodiment the recipient being associated with a portion is application specific and may vary between applications. In one embodiment the portions are application specific.
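    One way to read the paragraphs above is as a registry that maps display-edge portions to recipients, populated when the user drags a contact onto a portion and optionally keyed per application. The sketch below assumes such a registry; the class, method and key names are illustrative, not taken from the disclosure.

```python
# Illustrative sketch: associating portions of the display edge with
# recipients, optionally per application. Structure and names are
# assumptions made for this example.
class PortionRegistry:
    def __init__(self):
        # keyed by (application, portion_id) -> recipient
        self._associations = {}

    def associate(self, application, portion_id, recipient):
        """Called when the user drags a contact onto a marked portion."""
        self._associations[(application, portion_id)] = recipient

    def recipient_for(self, application, portion_id):
        """Return the recipient associated with a portion, if any."""
        return self._associations.get((application, portion_id))

registry = PortionRegistry()
registry.associate("gallery", "right-edge-top", "alice@example.com")
print(registry.recipient_for("gallery", "right-edge-top"))  # alice@example.com
```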
  • [0054]
    In one embodiment a small icon representing the recipient is displayed in the associated portion.
  • [0055]
    In one embodiment a small icon representing the recipient is displayed adjacent the associated portion. In one embodiment the small icon is displayed adjacent the display 403. In one such embodiment the icon is printed on a cover of the terminal 400.
  • [0056]
    In one embodiment the recipient is a contact stored in a contact data base.
  • [0057]
    In one embodiment the recipient is a group of contacts.
  • [0058]
    In one embodiment the recipient is a service application. In one embodiment the service application is an internet service. In one embodiment the service application is one for sharing images or videos.
  • [0059]
    In one embodiment the recipient is a combination of at least one contact and at least one service application.
  • [0060]
    In one embodiment the controller is configured to identify a recipient according to the movement input.
  • [0061]
    In one embodiment the recipient is the recipient associated with the portion 413 in which the movement input stops.
  • [0062]
    In one embodiment the recipient is the recipient associated with the portion 413 to which the movement input brings the selection area 411. In such an embodiment a user will identify a recipient by dragging the selection area 411 to the associated portion 413.
  • [0063]
    In one embodiment the recipient is the recipient associated with the portion 413 through which the movement input brings the selection area 411 out of the display 403. In such an embodiment a user will identify a recipient by dragging the selection area 411 out of the display 403 through the associated portion 413.
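    The alternatives above identify the recipient from where the movement ends: the portion in which the drag stops, to which the selection is brought, or through which it leaves the display. A minimal sketch of resolving a recipient from the drop point follows; the data shapes and sample values are assumptions made for this example.

```python
# Illustrative sketch: resolving a recipient from the end point of a drag,
# per the alternatives above. Portions are rectangles on the display edge
# paired with a recipient; the geometry representation is an assumption.
def identify_recipient(drop_x, drop_y, portions):
    """Return the recipient whose edge portion contains the drop point.

    `portions` is assumed to be a list of (rect, recipient) pairs, where
    `rect` is a dict with keys x, y, w, h.
    """
    for rect, recipient in portions:
        if (rect["x"] <= drop_x < rect["x"] + rect["w"] and
                rect["y"] <= drop_y < rect["y"] + rect["h"]):
            return recipient
    return None  # drag ended outside any associated portion

portions = [({"x": 280, "y": 0, "w": 40, "h": 120}, "alice@example.com"),
            ({"x": 280, "y": 120, "w": 40, "h": 120}, "photo-sharing-service")]
print(identify_recipient(300, 60, portions))  # -> alice@example.com
```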
  • [0064]
    In one embodiment a controller is configured to send the image data represented by the selection area 411 to the identified recipient.
  • [0065]
    In FIG. 4c a user has dragged the selection area 411 (and its contents) to the upper right portion 413a of the display 403. The content of the selection area 411 is packaged in an appropriate file format and sent to the associated recipient in response thereto.
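    A minimal sketch of this packaging-and-sending step is given below: the selected region is cropped out of the displayed image and handed to a send function together with the identified recipient. The cropping on pixel rows and the send_message callable are placeholders chosen for the example, not the disclosed implementation.

```python
# Illustrative sketch: cropping the selected region and sending it to the
# identified recipient. send_message is a placeholder for whatever transport
# the terminal uses (MMS, email, etc.).
def crop(image_pixels, sel):
    """Crop a 2D list of pixel rows to the selection rectangle."""
    return [row[sel["x"]:sel["x"] + sel["w"]]
            for row in image_pixels[sel["y"]:sel["y"] + sel["h"]]]

def send_selection(image_pixels, selection, recipient, send_message):
    """Package the selection and send it to the recipient in one step."""
    cropped = crop(image_pixels, selection)
    payload = {"recipient": recipient, "format": "image", "data": cropped}
    send_message(payload)
```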
  • [0066]
    A user is thus able to share a selection of an image by making only two simple gestures: first marking a selection and then pulling the selection to a portion of the display associated with a recipient. The function of sharing a selection is thus effected by the two main steps of making the selection and indicating the recipient.
  • [0067]
    In one embodiment the data of the selection area is sent as a Multimedia Message Service (MMS) message.
  • [0068]
    In one embodiment the data of the selection area is sent as an electronic mail (email).
  • [0069]
    In one embodiment the data of the selection area is sent using a dedicated communication channel.
  • [0070]
    In one embodiment the data of the selection area is sent using a special protocol communication channel.
  • [0071]
    In one embodiment the data of the selection area is sent through a file transfer protocol communication.
  • [0072]
    In one embodiment a controller is configured to display the content of the selection area 411 as the image 410 after it has been sent to the recipient, see FIG. 4d. In one embodiment the controller is configured to adjust the data of the selection area 411 to better fit the available display space. In FIG. 4d the content of the selection area 411 has been enlarged and displayed as the main image 410.
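    The adjustment to the available display space can be as simple as a rescale of the cropped region. The sketch below uses nearest-neighbour resampling as an assumed placeholder so the example stays self-contained; the patent does not specify how the adjustment is performed.

```python
# Illustrative sketch: scaling the selection's content to fit the display
# after sending. Nearest-neighbour resampling is an assumption made to keep
# the example self-contained.
def fit_to_display(cropped, display_w, display_h):
    """Enlarge (or shrink) cropped pixel rows to the display dimensions."""
    src_h = len(cropped)
    src_w = len(cropped[0]) if src_h else 0
    if not src_h or not src_w:
        return []
    return [[cropped[y * src_h // display_h][x * src_w // display_w]
             for x in range(display_w)]
            for y in range(display_h)]

tiny = [[1, 2], [3, 4]]
print(fit_to_display(tiny, 4, 2))  # [[1, 1, 2, 2], [3, 3, 4, 4]]
```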
  • [0073]
    In one embodiment a controller is configured to prompt for whether the content of the selection area 411 should be saved as a new file and perform the saving operation in response thereto.
  • [0074]
    In one embodiment a controller is configured to automatically save the content of the selection area 411 as a new file.
  • [0075]
    In one embodiment a controller is configured to prompt for whether the content of the selection area 411 should be saved as the original file and perform the saving operation in response thereto.
  • [0076]
    In one embodiment a controller is configured to automatically save the content of the selection area 411 as the original file.
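    The saving alternatives above (new file or original file, automatic or after a prompt) can be condensed into one decision, as in the compact sketch below. The prompt and write functions, and the ".selection" filename suffix, are stand-ins assumed for this example.

```python
# Illustrative sketch of the save alternatives above. The confirm and write
# callables are stand-ins for whatever the terminal provides; the filename
# suffix is an assumption made for the example.
def save_selection(content, original_path, *, automatic, as_new_file,
                   confirm=lambda question: True, write=lambda path, data: None):
    """Save selection content as a new file or over the original.

    With automatic=False the user is asked first via `confirm`.
    """
    if not automatic and not confirm("Save the selection?"):
        return None
    path = original_path + ".selection" if as_new_file else original_path
    write(path, content)
    return path

print(save_selection("<pixels>", "photo.jpg", automatic=True, as_new_file=True))
# photo.jpg.selection
```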
  • [0077]
    A user is thus offered the possibility of easily sharing a selection or cut-out of an image with a friend or service without having to mark the selection, prompt the device to cut it out, prompt the device again to send it to a contact and later specify the contact, which in contemporary devices is done by accessing different menus and using different control means.
  • [0078]
    In one embodiment the selection is further saved automatically without further user interaction.
  • [0079]
    FIG. 5 is a flowchart describing a general method according to the teachings herein.
  • [0000]
    First the controller displays an image (510). The controller then receives an input marking a selection area (520) and an input indicating a movement of the selection area (530) to a portion which is associated with a recipient, thereby identifying a recipient (540). Content represented by the selection area is then packaged and sent to the identified recipient (550).
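    Read together, steps 510-550 suggest the control flow sketched below, reusing the identify_recipient and crop helpers from the earlier sketches. Every other helper shown is a placeholder named for this example; only the ordering follows the flow chart of FIG. 5, and the patent does not specify the implementation.

```python
# Illustrative end-to-end sketch of the method of FIG. 5 (steps 510-550).
# All helpers are placeholders; only the ordering follows the flow chart.
def share_selection(display, input_source, portions, send_message):
    image = display.show_image()                              # 510: display an image
    selection = input_source.read_selection()                 # 520: mark selection area
    drop_x, drop_y = input_source.track_movement(selection)   # 530: move the selection
    recipient = identify_recipient(drop_x, drop_y, portions)  # 540: identify recipient
    if recipient is not None:
        content = crop(image, selection)                      # package the content
        send_message(recipient, content)                      # 550: send to recipient
```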
  • [0080]
    The various aspects of what is described above can be used alone or in various combinations. The teaching of this application may be implemented by a combination of hardware and software, but can also be implemented in hardware or software. The teaching of this application can also be embodied as computer readable code on a computer readable medium. Such medium can be any of a Random Access Memory, a Read-Only Memory, a hard drive (magnetic or optical), a Digital Video Disc, a Compact Disc or other storage medium. It should be noted that the teaching of this application is not limited to use in mobile communication terminals such as mobile phones, but can be equally well applied in Personal Digital Assistants (PDAs), game consoles, media players, personal organizers, computers or any other device designed for displaying image data.
  • [0081]
    The teaching of the present application has numerous advantages. Different embodiments or implementations may yield one or more of the following advantages. It should be noted that this is not an exhaustive list and there may be other advantages which are not described herein. For example, one advantage of the teaching of this application is that a user will be able to quickly and efficiently share selections of images or other files with friends and family.
  • [0082]
    Although the teaching of the present application has been described in detail for purpose of illustration, it is understood that such detail is solely for that purpose, and variations can be made therein by those skilled in the art without departing from the scope of the teaching of this application.
  • [0083]
    For example, although the teaching of the present application has been described in terms of a mobile phone and a desktop computer, it should be appreciated that the teachings of the present application may also be applied to other types of electronic devices, such as media players, video players, photo and video cameras, palmtop, laptop and desktop computers and the like. It should also be noted that there are many alternative ways of implementing the methods and apparatuses of the teachings of the present application.
  • [0084]
    Features described in the preceding description may be used in combinations other than the combinations explicitly described.
  • [0085]
    Whilst endeavoring in the foregoing specification to draw attention to those features of the invention believed to be of particular importance it should be understood that the Applicant claims protection in respect of any patentable feature or combination of features hereinbefore referred to and/or shown in the drawings whether or not particular emphasis has been placed thereon.
  • [0086]
    The term “comprising” as used in the claims does not exclude other elements or steps. The term “a” or “an” as used in the claims does not exclude a plurality. A unit or other means may fulfill the functions of several units or means recited in the claims.
Classifications
U.S. Classification: 715/753
International Classification: H04L29/06
Cooperative Classification: H04L65/403, H04M1/72522, G06F3/0486, H04M1/00, G06F3/0488
Legal Events
Date: 11 Sep 2017
Code: AS
Event: Assignment
Owner name: CONVERSANT WIRELESS LICENSING S.A R.L., LUXEMBOURG
Free format text: CHANGE OF NAME; ASSIGNOR: CORE WIRELESS LICENSING S.A.R.L.; REEL/FRAME: 043814/0274
Effective date: 20170720