US20100245364A1 - Method of rendering text on an output device - Google Patents

Method of rendering text on an output device

Info

Publication number
US20100245364A1
US20100245364A1 (application US12/815,475)
Authority
US
United States
Prior art keywords
character
image
computer device
rendering
image file
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/815,475
Inventor
Gerhard D. Klassen
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
ECKE RANCH BV
Malikie Innovations Ltd
Original Assignee
Research in Motion Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Research in Motion Ltd
Priority to US12/815,475
Assigned to RESEARCH IN MOTION LIMITED: ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: KLASSEN, GERHARD D.
Publication of US20100245364A1
Assigned to ECKE RANCH B.V.: ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: SCHABER, MARGARET
Assigned to BLACKBERRY LIMITED: CHANGE OF NAME (SEE DOCUMENT FOR DETAILS). Assignors: RESEARCH IN MOTION LIMITED
Assigned to MALIKIE INNOVATIONS LIMITED: ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: BLACKBERRY LIMITED

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00: 2D [Two Dimensional] image generation
    • G06T11/20: Drawing from basic elements, e.g. lines or circles
    • G06T11/203: Drawing of straight lines or curves
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00: Handling natural language data
    • G06F40/10: Text processing
    • G06F40/103: Formatting, i.e. changing of presentation of documents

Definitions

  • the present application relates to rendering text on an output device and, in particular, to the rendering of text in a custom font on an output device.
  • Computing devices use a variety of fonts to render text characters or glyphs on a display, a printer, or other output device.
  • a set of fonts are available to application programs or routines.
  • When an application program outputs text characters in a certain font to a display, a graphics subsystem receives an instruction from the application program and renders the text characters on the display.
  • There are a number of prior art font types.
  • One type is a bitmapped font wherein each character of the character set is defined with an array of bits.
  • Another type is a vector-based font, wherein the shapes of the characters are defined mathematically. These fonts are more easily scaled than the bitmapped font.
  • a drawback of using custom fonts is that there is significant overhead in creating this font data. Large and complex code is required to render a complex custom font. This creates a difficulty for user devices that have limited processing and memory capacity but wish to display a custom font.
  • FIG. 1 shows a block diagram of a user device to which the present application is applied in an example embodiment
  • FIGS. 2 through 4 show a front view, side view, and back view, respectively, of an embodiment of a user device
  • FIG. 5 shows a front view of a further embodiment of a user device
  • FIGS. 6(a) and (b) show sample images of custom character sets, according to the present application.
  • FIG. 7 shows, in flowchart form, a method of rendering text on an output device.
  • the present application provides a method of rendering text in a custom font that uses an image of the custom font in a standard image format for which the user device already has rendering code.
  • the character set for the custom font is stored in an image file and small portions of the image corresponding to individual text characters are rendered to output a text string.
  • the present application provides a method of rendering a text string on a display screen of a computer device.
  • the text string is comprised of a plurality of characters.
  • the computer device includes a memory storing an image file.
  • the image file defines an image containing a plurality of glyphs.
  • the memory has stored thereon associated character information.
  • the method comprises: for each character in the text string: defining a portion of the image containing a glyph corresponding to the character, including determining the location of the glyph corresponding to the character based upon the associated character information; and rendering the portion on the display screen.
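Concretely, the per-character loop described above might be sketched as follows. This is an illustrative Python sketch only: the fixed-width, single-row atlas layout and the render callback are assumptions, not details from the application.

```python
# A minimal sketch of the per-character loop, assuming a fixed-width,
# single-row glyph atlas and a caller-supplied render callback. These
# layout details are illustrative, not taken from the application.

def render_text(text, glyph_width, glyph_height, char_index, render_portion):
    """For each character: locate its glyph via the associated character
    information (here, a char -> index map), define the portion of the
    image containing it, and hand that portion to the renderer."""
    x_cursor = 0
    for ch in text:
        index = char_index[ch]             # where the glyph sits in the atlas
        src_x = index * glyph_width        # left edge of the glyph's portion
        portion = (src_x, 0, glyph_width, glyph_height)
        render_portion(portion, x_cursor)  # graphics subsystem draws this portion
        x_cursor += glyph_width            # advance before the next character
    return x_cursor

# Usage: record which portions a dummy renderer is asked to draw.
drawn = []
total = render_text("AB", 8, 12, {"A": 0, "B": 1},
                    lambda portion, x: drawn.append((portion, x)))
```

Each character thus resolves to a small source rectangle within the one shared image; only those rectangles are rendered, never the whole atlas.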
  • the present application provides a computer device.
  • the computer device includes a display screen and a graphics subsystem for rendering a text string upon the display screen.
  • the text string comprises a plurality of characters.
  • the computer device also includes a memory.
  • the memory has stored thereon an image file defining an image containing a plurality of glyphs.
  • the memory also has stored thereon associated character information.
  • the computer device further includes a processor configured to, for each character in the text string, define a portion of the image containing a glyph corresponding to the character, including determining the location of the glyph corresponding to the character based upon the associated character information; and render the portion on the display screen.
  • the present application provides, in a user device, a method of rendering text on an output device, the user device including an image file defining an image of a custom character set, the user device having stored thereon associated character information, the associated character information including at least one character width for the custom character set.
  • the method includes the steps of locating a selected character from the custom character set within the image based upon the associated character information; defining a portion of the image containing the selected character; and rendering the portion on the output device.
  • the present application provides a user device, including an output device; a graphics subsystem for rendering graphics upon the output device; memory, the memory having stored thereon an image file defining an image of a custom character set and associated character information, the associated character information including at least one character width for the custom character set; and a custom font module for locating a selected character from the custom character set within the image file based upon the associated character information, and defining a portion of the image containing the selected character, wherein the graphics subsystem renders the portion on the output device.
  • the present application provides a computer program product having a computer-readable medium tangibly embodying computer executable instructions for rendering text on an output device in a user device, the user device including an image file defining an image of a custom character set, the user device having stored thereon associated character information, the associated character information including at least one character width for the custom character set, the user device having a graphics subsystem for rendering images on the output device.
  • the computer executable instructions include computer executable instructions for locating a selected character from the custom character set within the image based upon the associated character information; and computer executable instructions for defining a portion of the image containing the selected character, wherein the graphics subsystem renders the portion on the output device.
  • the present application provides a mobile device.
  • the mobile device includes a display screen; a graphics subsystem coupled to the display screen for rendering graphics upon the display screen; a memory, the memory containing an image file defining an image, the image including a custom character set, the memory further containing associated character information, the associated character information including character order information and at least one character width for the custom character set; a custom font module for locating a portion of the image containing a selected character from the custom character set within the image file based upon the associated character information, and producing a definition defining the portion of the image containing the selected character, wherein the graphics subsystem receives the definition and renders the portion on the display screen.
  • the present application provides a method of rendering text on a display screen of a computer device.
  • the computer device includes a memory storing an image file, and the image file defines an image containing a plurality of glyphs.
  • the method includes steps of defining a portion of the image containing one of the glyphs and excluding at least one other of the glyphs, and rendering the portion on the display screen.
  • the present application provides a computer device.
  • the computer device includes a display screen, a graphics subsystem for rendering text upon the display screen, a memory storing an image file defining an image containing a plurality of glyphs, and a processor.
  • the device also includes a custom font module executable by the processor and configured to define a portion of the image containing one of the glyphs and excluding at least one other of the glyphs.
  • the graphics subsystem is configured to render the portion on the display screen.
  • the present application provides a computer device.
  • the computer device includes a display screen and a memory storing an image file defining an image containing a plurality of glyphs.
  • the computer device further includes means for defining a portion of the image containing one of the glyphs and excluding at least one other of the glyphs, and means for rendering the portion on the display screen.
  • the user device includes an output device. Most typically, the output device includes a display screen.
  • the display screen may include a plasma display, liquid crystal display, light emitting diode display, cathode ray tube, or other type of visual display device.
  • the output device may also or alternatively include a printer or other output device for rendering graphics or text for viewing by a user.
  • FIG. 1 is a block diagram of a user device to which the present application is applied in an example embodiment.
  • the user device is a two-way mobile communication device 10 having data and possibly also voice communication capabilities.
  • the device 10 has the capability to communicate with other computer systems on the Internet.
  • the device may be a data communication device, a multiple-mode communication device configured for both data and voice communication, a mobile telephone, a PDA enabled for wireless communication, or a computer system with a wireless modem, among other things.
  • the present application may also be applied to handheld computing devices, such as PDAs and digital cameras, that are not enabled for communications.
  • Where the device 10 is enabled for communications, the device 10 includes a communication subsystem 11, including a receiver 12, a transmitter 14, and associated components such as one or more, preferably embedded or internal, antenna elements 16 and 18, and a processing module such as a digital signal processor (DSP) 20.
  • the communication subsystem includes local oscillator(s) (LO) 13 , and in some embodiments the communication subsystem 11 and a microprocessor 38 share an oscillator.
  • LO: local oscillator
  • the particular design of the communication subsystem 11 will be dependent upon the communication network in which the device 10 is intended to operate.
  • Signals received by the antenna 16 through a wireless communication network 50 are input to the receiver 12 , which may perform such common receiver functions as signal amplification, frequency down conversion, filtering, channel selection and the like, and in some embodiments, analog to digital conversion.
  • signals to be transmitted are processed, including modulation and encoding for example, by the DSP 20 and input to the transmitter 14 for digital to analog conversion, frequency up conversion, filtering, amplification and transmission over the communications network 50 via the antenna 18 .
  • the device 10 includes the microprocessor 38 that controls the overall operation of the device.
  • the microprocessor 38 interacts with communications subsystem 11 and also interacts with further device subsystems such as the graphics subsystem 44 , flash memory 24 , random access memory (RAM) 26 , auxiliary input/output (I/O) subsystems 28 , serial port 30 , keyboard or keypad 32 , speaker 34 , microphone 36 , a short-range communications subsystem 40 , and any other device subsystems generally designated as 42 .
  • the graphics subsystem 44 interacts with the display 22 and renders graphics or text upon the display 22 .
  • Operating system software 54 and various software applications 58 used by the microprocessor 38 are, in one example embodiment, stored in a persistent store such as flash memory 24 or similar storage element. Those skilled in the art will appreciate that the operating system 54 , software applications 58 , or parts thereof, may be temporarily loaded into a volatile store such as RAM 26 . It is contemplated that received communication signals may also be stored to RAM 26 .
  • the microprocessor 38 in addition to its operating system functions, preferably enables execution of software applications 58 on the device.
  • a predetermined set of software applications 58 which control basic device operations, including at least data and voice communication applications for example, will normally be installed on the device 10 during manufacture. Further software applications 58 may also be loaded onto the device 10 through the network 50 , an auxiliary I/O subsystem 28 , serial port 30 , short-range communications subsystem 40 or any other suitable subsystem 42 , and installed by a user in the RAM 26 or a non-volatile store for execution by the microprocessor 38 .
  • Such flexibility in application installation increases the functionality of the device and may provide enhanced on-device functions, communication-related functions, or both.
  • secure communication applications may enable electronic commerce functions and other such financial transactions to be performed using the device 10 .
  • a received signal such as a text message or web page download will be processed by the communication subsystem 11 and input to the microprocessor 38 , which will preferably further process the received signal for output to the display 22 through the graphics subsystem 44 , or alternatively to an auxiliary I/O device 28 .
  • the auxiliary I/O device includes an image rendering subsystem like the graphics subsystem 44 for rendering graphics and text upon the auxiliary I/O device 28 .
  • a printer includes an image rendering subsystem for receiving and rendering image data.
  • a user of device 10 may also compose data items within a software application 58 , such as email messages for example, using the keyboard 32 in conjunction with the display 22 and possibly an auxiliary I/O device 28 . Such composed items may then be transmitted over a communication network through the communication subsystem 11 .
  • the serial port 30 in FIG. 1 would normally be implemented in a personal digital assistant (PDA)-type communication device for which synchronization with a user's desktop computer (not shown) may be desirable, but is an optional device component.
  • PDA: personal digital assistant
  • Such a port 30 would enable a user to set preferences through an external device or software application and would extend the capabilities of the device by providing for information or software downloads to the device 10 other than through a wireless communication network.
  • a short-range communications subsystem 40 is a further component which may provide for communication between the device 10 and different systems or devices, which need not necessarily be similar devices.
  • the subsystem 40 may include an infrared device and associated circuits and components or a Bluetooth™ communication module to provide for communication with similarly enabled systems and devices.
  • the device 10 may be a handheld device.
  • Wireless mobile network 50 is, in an example embodiment, a wireless packet data network (e.g., Mobitex™ or DataTAC™), which provides radio coverage to mobile devices 10.
  • Wireless mobile network 50 may also be a voice and data network such as GSM (Global System for Mobile Communication) and GPRS (General Packet Radio System), CDMA (Code Division Multiple Access), or various other third generation networks such as EDGE (Enhanced Data rates for GSM Evolution) or UMTS (Universal Mobile Telecommunications Systems).
  • GSM: Global System for Mobile Communication
  • GPRS: General Packet Radio System
  • CDMA: Code Division Multiple Access
  • EDGE: Enhanced Data rates for GSM Evolution
  • UMTS: Universal Mobile Telecommunications Systems
  • the components and subsystems of mobile device 10 are housed within a hard plastic main body case 70 that is configured to be held with one or two hands while the device 10 is in use.
  • the main body case 70 may be a single piece or may include two or more portions coupled together.
  • the device comprises a “flip-open” device 100, meaning that the main body case 70 includes two portions hinged together such that the two portions may be brought into closed contact with one another when the device 100 is not in use, as with the embodiment shown in FIG. 5.
  • the various components of the device 100 need not be located in the same portion of the main body case 70 .
  • the case 70 may include a hook (not shown) so that it can be secured to a user's belt or pant's top, or it may be used in conjunction with a soft case (not shown) that can be mounted to the user's belt or pant's top and into which the mobile device 10 can be inserted for carrying.
  • Mobile device 10 will typically be small enough to fit inside a standard purse or suit jacket pocket.
  • the display 22 is visible from the front of the device, as is keypad or keyboard 32 .
  • the keyboard 32 includes buttons or keys 90 , 92 positioned to be actuated by the thumbs or fingers of the user. In the illustrated embodiment of FIG. 2 , the keyboard has relatively few keys, however in some embodiments, the keyboard includes 26 or more alphanumeric and control keys.
  • the display 22 is capable of outputting text 82 and graphics 80 rendered by the graphics subsystem 44 ( FIG. 1 ).
  • the device 10 includes a character image file 60 .
  • Character image file 60 is a file in a standard image format, such as a bit-mapped (raster) format like GIF or PNG, or in a vector font format.
  • the image defined in character image file 60 is an image of a custom character set.
  • the custom character set is a set of characters (also called glyphs) making up a particular font.
  • the custom characters of the font are developed off-line using sophisticated development tools to create a custom font having the look and attributes desired. For example, it may be desirable for the characters to be shown in outline, have shadows, or other complex characteristics, such as gradient fills, variable widths, filter effects, or variable line widths.
  • FIG. 6( a ) shows an example embodiment of an image 150 defined in a character image file 60 ( FIG. 1) .
  • the image 150 includes a number of alphanumeric and symbolic characters or glyphs developed off-line using an image development tool and saved in a bitmapped image format.
  • FIG. 6( b ) shows another example embodiment of an image 152 defined in a character image file 60 .
  • the image 152 reflects a reduced set of characters intended for displaying the time on an output device, such as a display 22 (FIG. 1). By only including those glyphs or characters needed to display time information, the costs associated with the font are reduced.
  • the device 10 further includes associated character information 62 .
  • the associated character information 62 may be stored separately from the character image file 60 or may be incorporated as a part of the character image file 60 , such as within the header.
  • the associated character information 62 is stored as an XML file, which during run-time is converted into a run-time memory object by the Java Virtual Machine (JVM).
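The application does not specify a schema for this XML file, but a hypothetical associated-character-information file might look like the following; every element and attribute name here is illustrative.

```xml
<!-- Hypothetical layout; the application does not specify a schema. -->
<custom-font image="clock_digits.png">
  <ascii-order start="48"/>      <!-- glyphs begin at '0', in ASCII order -->
  <default-width pixels="10"/>   <!-- single width when glyphs are uniform -->
  <char code="58" width="4"/>    <!-- ':' deviates from the default width -->
  <space width="6"/>             <!-- image contains no space glyph -->
</custom-font>
```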
  • JVM Java Virtual Machine
  • the associated character information 62 facilitates the location of specific characters within the character image file 60 .
  • the associated character information 62 includes a value for the character width. This value may indicate the width of each character in pixels.
  • the character set in the character image file 60 includes all the standard characters in the ASCII character set in the order defined by the ASCII standard.
  • the associated character information 62 may include a flag indicating that the font is in ASCII format, the starting position of the first character in the image file 60 , and the width of the characters. If all the characters are the same width, then only a single width value need be stored in the associated character information 62 .
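Under those assumptions (ASCII order, uniform width), locating a glyph reduces to simple arithmetic. A Python sketch with illustrative parameter names:

```python
# With an ASCII-ordered, fixed-width glyph strip, a glyph's position is pure
# arithmetic. The parameter names are illustrative assumptions.

def glyph_offset(ch, start_x, char_width, first_code=32):
    """Left edge, in pixels, of ch's glyph within the image strip, assuming
    the strip begins with the character whose ASCII code is first_code."""
    return start_x + (ord(ch) - first_code) * char_width

# A strip starting at x=0 whose first glyph is the space character (code 32):
offset_a = glyph_offset("A", start_x=0, char_width=10)  # (65 - 32) * 10
```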
  • the width of characters may vary from character to character, such as in a TrueType font, in which case the associated character information 62 may specify the width of each character; or, alternatively, a standard width and the width of any character that deviates from the standard width.
  • the character image file 60 may not contain a full set of ASCII characters in the predefined order. In such a case, the associated character information 62 may identify the characters within the character image file 60 and the order in which they are placed.
  • the relative advance may be specified in the case of an italics-style font. Relative advance information accounts for situations where the width of the character differs from the distance that the output module should move before placing the next character.
  • the character image file 60 may not contain a “space” character, in which case the associated character information 62 may include data defining the width of the space character.
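The variable-width, custom-order, and space-width cases above can be combined by building a lookup table that sums the widths of all preceding glyphs. A Python sketch; the function name and sample data are assumptions:

```python
# Sketch combining the variable-width, custom-order, and space-width cases:
# build a lookup table of (x_offset, width) per character by summing the
# widths of the glyphs that precede it. Names and sample data are assumptions.

def build_offsets(order, widths, default_width):
    """order: characters in the sequence they appear in the image;
    widths: per-character deviations from default_width."""
    table, x = {}, 0
    for ch in order:
        w = widths.get(ch, default_width)
        table[ch] = (x, w)
        x += w
    return table

# A reduced clock-style set, as in FIG. 6(b): digits plus a narrow colon.
offsets = build_offsets("0123456789:", {":": 4}, default_width=10)
space_width = 6  # no space glyph in the image; its width is stored separately
```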
  • the function of the associated character information 62 is to allow for the identification and location of a specific character within the character image file 60 .
  • a portion of the image file that contains only the specific character may be defined.
  • the defined portion is a small rectangle within the image containing the specific character, although the defined portion need not be rectangular.
  • the character image file 60 contains an image containing each of the characters (or glyphs) in the complex custom font developed off-line.
  • the associated character information 62 provides the specifics necessary to locate and define a portion of the image for each character.
  • When the device 10 ( FIG. 1 ) outputs text in the custom font to the display 22 or other output device, the text is created by rendering those portions of the image corresponding to the individual characters in the text string.
  • the device 10 avoids having to incorporate the sizable complex code required to create the customized font. Instead, the pre-rendered bitmapped image of the font is used and the graphics subsystem 44 ( FIG. 1 ) renders the small portions associated with the characters of the desired text string.
  • the device 10 includes a custom font module 56 .
  • the custom font module 56 performs the function of selecting or defining the portion of the image stored in the character image file 60 for a particular text character.
  • the custom font module 56 bases its selection of the portion upon the associated character information 62 which defines the relative positioning of the characters in the character image file 60 .
  • a software application 58 or other routine in operation upon the device 10 includes code to invoke the custom font module 56 when the application 58 or routine intends to output text in the custom font to the display 22 .
  • the custom font module 56 receives the text that the software application 58 or routine intends to display and it selects the portions of the image defined in the character image file 60 corresponding to the characters in the text.
  • the graphics subsystem 44 then renders these portions for output on the display.
  • the software application 58 or other routine is unaware of the special nature of the custom font.
  • When the application 58 or routine intends to output a text string to the display 22, it instructs the graphics subsystem 44 to “draw text”.
  • the graphics subsystem 44 recognizes that the custom font requires handling by the custom font module 56. Accordingly, the graphics subsystem 44 calls (or invokes) the custom font module 56, which then defines the portions of the character image file 60 for rendering upon the display 22 by the graphics subsystem 44.
  • the custom font module 56 may be incorporated within the graphics subsystem 44 and is shown separately in FIG. 1 for ease of illustration.
  • the custom font module 56 clips the portion of the image corresponding to a selected character and passes the clipped portion of the image to the graphics subsystem 44 .
  • the custom font module 56 may create an object containing the clipped image information and may pass this object to the graphics subsystem 44 .
  • the clipped image information may alternatively be stored in a small image file or data structure. Other mechanisms for extracting the data corresponding to the portion of the image and passing the data to the graphics subsystem 44 for rendering will be understood by those of ordinary skill in the art.
  • the overhead associated with actually clipping the portion and passing it to the graphics subsystem 44 as a separate file or object may be avoided by simply passing the graphics subsystem 44 a definition of the portion.
  • the custom font module 56 defines, but does not go so far as to clip, the portion of the image corresponding to a selected character. This definition is used by the graphics subsystem 44 to understand what portion of the overall image it is to render on the display 22 .
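That clip-free arrangement might be sketched as passing a plain source-rectangle record, rather than copied pixel data, to the renderer. Python for illustration; all names are assumptions.

```python
# Sketch of the clip-free approach: the custom font module passes only a
# *definition* of the portion (a source rectangle into the shared atlas
# image), never a copied sub-image. All names are illustrative assumptions.
from dataclasses import dataclass

@dataclass(frozen=True)
class PortionDefinition:
    src_x: int   # left edge of the glyph inside the atlas image
    src_y: int   # top edge (0 for a single-row atlas)
    width: int
    height: int

def define_portion(ch, offsets, glyph_height):
    """Return a definition the graphics subsystem can use to render the
    glyph directly from the atlas; no pixels are copied here."""
    x, w = offsets[ch]
    return PortionDefinition(src_x=x, src_y=0, width=w, height=glyph_height)

definition = define_portion("7", {"7": (70, 10)}, glyph_height=14)
```

Because only four integers cross the interface, the overhead of allocating and transferring a clipped sub-image for every character is avoided.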
  • FIG. 7 shows a method 200 of rendering text on an output device, according to the present application.
  • the method 200 begins in step 202 with the creation and storage of the character image file 60 ( FIG. 1 ) defining the image of the custom character set.
  • Step 202 also includes the creation and storage of the associated character information 62 ( FIG. 1 ) on the device 10 ( FIG. 1 ).
  • the character image file 60 and the associated character information 62 may be uploaded to the device 10 through the serial port 30 ( FIG. 1 ), through the short-range communications subsystem 40 ( FIG. 1 ), or through the communication subsystem 11 ( FIG. 1 ) from the wireless network 50 ( FIG. 1 ). They may also be uploaded to the flash memory 24 prior to complete assembly of the device 10 .
  • the character image file 60 and associated character information 62 may be updated or replaced with new custom font information over time.
  • the device manufacturer may distribute an updated character image file and updated associated character information to deployed devices through the wireless network 50 .
  • the graphics subsystem 44 receives an instruction to output a text string to the display 22 .
  • the instruction may come from a software application 58 such as a word processing program, an e-mail program, or other program.
  • the instruction may also come from a routine, such as an operating system routine.
  • an operating system routine may be designed to put the date and time in the lower right hand corner of the display 22 for the device 10 .
  • the custom font may have been developed specifically for this purpose and the use of this font may be specified by the operating system routine when it instructs the graphics subsystem 44 to output the current date and time.
  • the graphics subsystem 44 recognizes that the custom font is requested, so in step 206 it invokes the custom font module 56 .
  • the custom font module 56 then, in step 208 , identifies the first text character in the text string and locates the corresponding character image in the image defined by the character image file 60 using the associated character information 62 .
  • the custom font module 56 then defines the portion of the image corresponding to the first text character in step 210 .
  • Based upon the definition of the portion of the image, the graphics subsystem 44 renders the portion of the image on the display 22, thereby outputting an image of the first text character to the display 22 in step 212.
  • In step 214, the custom font module 56 determines whether it has reached the end of the string of text. If not, then it continues to the next character in the string. If so, then the method 200 ends.
  • the custom font module 56 locates each character in the image and creates a definition for the character. When asked to render a text string, the custom font module 56 provides the graphics subsystem 44 with the predetermined definitions corresponding to the characters of the text string.
  • in other embodiments, the devices may be user terminals, such as desktop or laptop computers, or other devices.

Abstract

A method and computer device for rendering a text string on a display screen of a computer device are provided. The text string is comprised of a plurality of characters. The computer device includes a memory storing an image file. The image file defines an image containing a plurality of glyphs. The memory has stored thereon associated character information. The method comprises: for each character in the text string: defining a portion of the image containing a glyph corresponding to the character, including determining the location of the glyph corresponding to the character based upon the associated character information; and rendering the portion on the display screen.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • The present application is a continuation of U.S. patent application Ser. No. 11/558,120, filed Nov. 9, 2006, which is a continuation of U.S. patent application Ser. No. 10/786,029, filed Feb. 26, 2004, and owned in common herewith. Each patent application identified above is incorporated herein by reference in its entirety.
  • FIELD
  • The present application relates to rendering text on an output device and, in particular, to the rendering of text in a custom font on an output device.
  • BACKGROUND
  • Computing devices use a variety of fonts to render text characters or glyphs on a display, a printer, or other output device. In typical computing devices, a set of fonts are available to application programs or routines. When an application program outputs text characters in a certain font to a display, a graphics subsystem receives an instruction from the application program and renders the text characters on the display.
  • There are a number of prior art font types. One type is a bitmapped font wherein each character of the character set is defined with an array of bits. Another type is a vector-based font, wherein the shapes of the characters are defined mathematically. These fonts are more easily scaled than the bitmapped font.
  • There are a number of available application programs that allow a user to create custom text images, including font characters, having a variety of visual enhancements. Some of the customizations that can be incorporated into a font include gradient fills, variable widths, outlines, shadows, and other artistic embellishments. These customizations can enhance the appearance of the text rendered using the custom font.
  • A drawback of using custom fonts is that there is significant overhead in creating this font data. Large and complex code is required to render a complex custom font. This creates a difficulty for user devices that have limited processing and memory capacity but wish to display a custom font.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Reference will now be made, by way of example, to the accompanying drawings which show an embodiment of the present application, and in which:
  • FIG. 1 shows a block diagram of a user device to which the present application is applied in an example embodiment;
  • FIGS. 2 through 4 show a front view, side view, and back view, respectively, of an embodiment of a user device;
  • FIG. 5 shows a front view of a further embodiment of a user device;
  • FIGS. 6(a) and 6(b) show sample images of custom character sets, according to the present application; and
  • FIG. 7 shows, in flowchart form, a method of rendering text on an output device.
  • Similar reference numerals are used in different figures to denote similar components.
  • DESCRIPTION OF SPECIFIC EMBODIMENTS
  • The present application provides a method of rendering text in a custom font that uses an image of the custom font in a standard image format for which the user device already has rendering code. The character set for the custom font is stored in an image file and small portions of the image corresponding to individual text characters are rendered to output a text string.
  • In one aspect, the present application provides a method of rendering a text string on a display screen of a computer device. The text string is comprised of a plurality of characters. The computer device includes a memory storing an image file. The image file defines an image containing a plurality of glyphs. The memory has stored thereon associated character information. The method comprises: for each character in the text string: defining a portion of the image containing a glyph corresponding to the character, including determining the location of the glyph corresponding to the character based upon the associated character information; and rendering the portion on the display screen.
  • In another aspect, the present application provides a computer device. The computer device includes a display screen and a graphics subsystem for rendering a text string upon the display screen. The text string comprises a plurality of characters. The computer device also includes a memory. The memory has stored thereon an image file defining an image containing a plurality of glyphs. The memory also has stored thereon associated character information. The computer device further includes a processor configured to, for each character in the text string, define a portion of the image containing a glyph corresponding to the character, including determining the location of the glyph corresponding to the character based upon the associated character information; and render the portion on the display screen.
  • In one aspect, the present application provides, in a user device, a method of rendering text on an output device, the user device including an image file defining an image of a custom character set, the user device having stored thereon associated character information, the associated character information including at least one character width for the custom character set. The method includes the steps of locating a selected character from the custom character set within the image based upon the associated character information; defining a portion of the image containing the selected character; and rendering the portion on the output device.
  • In another aspect, the present application provides a user device, including an output device; a graphics subsystem for rendering graphics upon the output device; memory, the memory having stored thereon an image file defining an image of a custom character set and associated character information, the associated character information including at least one character width for the custom character set; and a custom font module for locating a selected character from the custom character set within the image file based upon the associated character information, and defining a portion of the image containing the selected character, wherein the graphics subsystem renders the portion on the output device.
  • In yet a further aspect, the present application provides a computer program product having a computer-readable medium tangibly embodying computer executable instructions for rendering text on an output device in a user device, the user device including an image file defining an image of a custom character set, the user device having stored thereon associated character information, the associated character information including at least one character width for the custom character set, the user device having a graphics subsystem for rendering images on the output device. The computer executable instructions include computer executable instructions for locating a selected character from the custom character set within the image based upon the associated character information; and computer executable instructions for defining a portion of the image containing the selected character, wherein the graphics subsystem renders the portion on the output device.
  • In yet another aspect, the present application provides a mobile device. The mobile device includes a display screen; a graphics subsystem coupled to the display screen for rendering graphics upon the display screen; a memory, the memory containing an image file defining an image, the image including a custom character set, the memory further containing associated character information, the associated character information including character order information and at least one character width for the custom character set; a custom font module for locating a portion of the image containing a selected character from the custom character set within the image file based upon the associated character information, and producing a definition defining the portion of the image containing the selected character, wherein the graphics subsystem receives the definition and renders the portion on the display screen.
  • In another aspect, the present application provides a method of rendering text on a display screen of a computer device. The computer device includes a memory storing an image file, and the image file defines an image containing a plurality of glyphs. The method includes steps of defining a portion of the image containing one of the glyphs and excluding at least one other of the glyphs, and rendering the portion on the display screen.
  • In yet another aspect, the present application provides a computer device. The computer device includes a display screen, a graphics subsystem for rendering text upon the display screen, a memory storing an image file defining an image containing a plurality of glyphs, and a processor. The device also includes a custom font module executable by the processor and configured to define a portion of the image containing one of the glyphs and excluding at least one other of the glyphs. The graphics subsystem is configured to render the portion on the display screen.
  • In yet a further aspect, the present application provides a computer device. The computer device includes a display screen and a memory storing an image file defining an image containing a plurality of glyphs. The computer device further includes means for defining a portion of the image containing one of the glyphs and excluding at least one other of the glyphs, and means for rendering the portion on the display screen.
  • Other aspects and features of the present application will be apparent to those of ordinary skill in the art from a review of the following detailed description when considered in conjunction with the drawings.
  • The following description of one or more specific embodiments of the application does not limit the implementation of the application to any particular computer programming language or system architecture. The present application is not limited to any particular operating system, mobile device architecture, or computer programming language. Moreover, although some of the embodiments described below include mobile devices, the present application is not limited to mobile devices, nor to wireless communication systems; rather, it may be embodied within a variety of user devices or terminals, including handheld devices, mobile telephones, personal digital assistants (PDAs), personal computers, audio-visual terminals, televisions, and other devices. In the embodiments described below, the user device includes an output device. Most typically, the output device includes a display screen. The display screen may include a plasma display, liquid crystal display, light emitting diode display, cathode ray tube, or other type of visual display device. The output device may also or alternatively include a printer or other output device for rendering graphics or text for viewing by a user.
  • Referring now to the drawings, FIG. 1 is a block diagram of a user device to which the present application is applied in an example embodiment. In the example embodiment, the user device is a two-way mobile communication device 10 having data and possibly also voice communication capabilities. In an example embodiment, the device 10 has the capability to communicate with other computer systems on the Internet. Depending on the functionality provided by the device 10, in various embodiments the device may be a data communication device, a multiple-mode communication device configured for both data and voice communication, a mobile telephone, a PDA enabled for wireless communication, or a computer system with a wireless modem, among other things. In various embodiments, the present application may also be applied to handheld computing devices, such as PDAs and digital cameras, that are not enabled for communications.
  • In this embodiment, in which the device 10 is enabled for communications, the device 10 includes a communication subsystem 11, including a receiver 12, a transmitter 14, and associated components such as one or more, preferably embedded or internal, antenna elements 16 and 18, and a processing module such as a digital signal processor (DSP) 20. In some embodiments, the communication subsystem includes local oscillator(s) (LO) 13, and in some embodiments the communication subsystem 11 and a microprocessor 38 share an oscillator. As will be apparent to those skilled in the field of communications, the particular design of the communication subsystem 11 will be dependent upon the communication network in which the device 10 is intended to operate.
  • Signals received by the antenna 16 through a wireless communication network 50 are input to the receiver 12, which may perform such common receiver functions as signal amplification, frequency down conversion, filtering, channel selection and the like, and in some embodiments, analog to digital conversion. In a similar manner, signals to be transmitted are processed, including modulation and encoding for example, by the DSP 20 and input to the transmitter 14 for digital to analog conversion, frequency up conversion, filtering, amplification and transmission over the communications network 50 via the antenna 18.
  • The device 10 includes the microprocessor 38 that controls the overall operation of the device. The microprocessor 38 interacts with communications subsystem 11 and also interacts with further device subsystems such as the graphics subsystem 44, flash memory 24, random access memory (RAM) 26, auxiliary input/output (I/O) subsystems 28, serial port 30, keyboard or keypad 32, speaker 34, microphone 36, a short-range communications subsystem 40, and any other device subsystems generally designated as 42. The graphics subsystem 44 interacts with the display 22 and renders graphics or text upon the display 22.
  • Operating system software 54 and various software applications 58 used by the microprocessor 38 are, in one example embodiment, stored in a persistent store such as flash memory 24 or similar storage element. Those skilled in the art will appreciate that the operating system 54, software applications 58, or parts thereof, may be temporarily loaded into a volatile store such as RAM 26. It is contemplated that received communication signals may also be stored to RAM 26.
  • The microprocessor 38, in addition to its operating system functions, preferably enables execution of software applications 58 on the device. A predetermined set of software applications 58 which control basic device operations, including at least data and voice communication applications for example, will normally be installed on the device 10 during manufacture. Further software applications 58 may also be loaded onto the device 10 through the network 50, an auxiliary I/O subsystem 28, serial port 30, short-range communications subsystem 40 or any other suitable subsystem 42, and installed by a user in the RAM 26 or a non-volatile store for execution by the microprocessor 38. Such flexibility in application installation increases the functionality of the device and may provide enhanced on-device functions, communication-related functions, or both. For example, secure communication applications may enable electronic commerce functions and other such financial transactions to be performed using the device 10.
  • In a data communication mode, a received signal such as a text message or web page download will be processed by the communication subsystem 11 and input to the microprocessor 38, which will preferably further process the received signal for output to the display 22 through the graphics subsystem 44, or alternatively to an auxiliary I/O device 28. It is contemplated that the auxiliary I/O device includes an image rendering subsystem like the graphics subsystem 44 for rendering graphics and text upon the auxiliary I/O device 28. For example, a printer includes an image rendering subsystem for receiving and rendering image data. A user of device 10 may also compose data items within a software application 58, such as email messages for example, using the keyboard 32 in conjunction with the display 22 and possibly an auxiliary I/O device 28. Such composed items may then be transmitted over a communication network through the communication subsystem 11.
  • The serial port 30 in FIG. 1 would normally be implemented in a personal digital assistant (PDA)-type communication device for which synchronization with a user's desktop computer (not shown) may be desirable, but is an optional device component. Such a port 30 would enable a user to set preferences through an external device or software application and would extend the capabilities of the device by providing for information or software downloads to the device 10 other than through a wireless communication network.
  • A short-range communications subsystem 40 is a further component which may provide for communication between the device 10 and different systems or devices, which need not necessarily be similar devices. For example, the subsystem 40 may include an infrared device and associated circuits and components or a Bluetooth™ communication module to provide for communication with similarly enabled systems and devices. The device 10 may be a handheld device.
  • Wireless mobile network 50 is, in an example embodiment, a wireless packet data network (e.g. Mobitex™ or DataTAC™), which provides radio coverage to mobile devices 10. Wireless mobile network 50 may also be a voice and data network such as GSM (Global System for Mobile Communication) and GPRS (General Packet Radio Service), CDMA (Code Division Multiple Access), or various other third generation networks such as EDGE (Enhanced Data rates for GSM Evolution) or UMTS (Universal Mobile Telecommunications System).
  • With reference to FIGS. 2 to 4, in an example embodiment the components and subsystems of mobile device 10 are housed within a hard plastic main body case 70 that is configured to be held with one or two hands while the device 10 is in use. The main body case 70 may be a single piece or may include two or more portions coupled together. For example, in one embodiment, the device comprises a “flip-open” device 100, meaning that the main body case 70 includes two portions hinged together such that the two portions may be brought into closed contact with one another when the device 100 is not in use, as with the embodiment shown in FIG. 5. The various components of the device 100 need not be located in the same portion of the main body case 70.
  • The case 70 may include a hook (not shown) so that it can be secured to a user's belt or pants top, or it may be used in conjunction with a soft case (not shown) that can be mounted to the user's belt or pants top and into which the mobile device 10 can be inserted for carrying. Mobile device 10 will typically be small enough to fit inside a standard purse or suit jacket pocket. The display 22 is visible from the front of the device, as is keypad or keyboard 32. The keyboard 32 includes buttons or keys 90, 92 positioned to be actuated by the thumbs or fingers of the user. In the illustrated embodiment of FIG. 2, the keyboard has relatively few keys; however, in some embodiments, the keyboard includes 26 or more alphanumeric and control keys. The display 22 is capable of outputting text 82 and graphics 80 rendered by the graphics subsystem 44 (FIG. 1).
  • Referring again to FIG. 1, the device 10 includes a character image file 60. Character image file 60 is a file in a standard image format, such as a bit-mapped (raster) format like GIF or PNG, or in a vector font format. The image defined in character image file 60 is an image of a custom character set. The custom character set is a set of characters (also called glyphs) making up a particular font. The custom characters of the font are developed off-line using sophisticated development tools to create a custom font having the look and attributes desired. For example, it may be desirable for the characters to be shown in outline, have shadows, or other complex characteristics, such as gradient fills, variable widths, filter effects, or variable line widths. Those of ordinary skill in the art will be familiar with the types of development tools that are available to create custom fonts, such as the Adobe Photoshop™ software produced by Adobe Systems Incorporated, or the CorelDRAW™ software produced by Corel Corporation. Those of ordinary skill in the art will also appreciate the wide variety of alterations and effects that can be incorporated into a font.
  • Reference is now made to FIG. 6(a), which shows an example embodiment of an image 150 defined in a character image file 60 (FIG. 1). The image 150 includes a number of alphanumeric and symbolic characters or glyphs developed off-line using an image development tool and saved in a bitmapped image format. FIG. 6(b) shows another example embodiment of an image 152 defined in a character image file 60. The image 152 reflects a reduced set of characters intended for displaying the time on an output device, such as a display 22 (FIG. 1). By including only those glyphs or characters needed to display time information, the size of the image file, and thus the memory cost of the font, is reduced.
  • Referring again to FIG. 1, the device 10 further includes associated character information 62. The associated character information 62 may be stored separately from the character image file 60 or may be incorporated as a part of the character image file 60, such as within the header. In one embodiment, the associated character information 62 is stored as an XML file, which during run-time is converted into a run-time memory object by the Java Virtual Machine (JVM). Other methods of storing the associated character information 62 will be apparent to those of ordinary skill in the art.
  • The associated character information 62 facilitates the location of specific characters within the character image file 60. For example, the associated character information 62 includes a value for the character width. This value may indicate the width of each character in pixels. In one embodiment, the character set in the character image file 60 includes all the standard characters in the ASCII character set in the order defined by the ASCII standard. In this case, the associated character information 62 may include a flag indicating that the font is in ASCII format, the starting position of the first character in the image file 60, and the width of the characters. If all the characters are the same width, then only a single width value need be stored in the associated character information 62. In other embodiments, the width of characters may vary from character to character, such as in a true-type font, in which case the associated character information 62 may specify the width of each character; or, alternatively, a standard width and the width of any character that deviates from the standard width. In other embodiments, the character image file 60 may not contain a full set of ASCII characters in the predefined order. In such a case, the associated character information 62 may identify the characters within the character image file 60 and the order in which they are placed.
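  • As an illustrative sketch of the lookup described above (the field names and pixel values here are hypothetical, not drawn from any actual file format), locating a glyph in a fixed-width, ASCII-ordered character strip reduces to simple arithmetic on the character code:

```python
# Hypothetical associated character information for a fixed-width,
# ASCII-ordered character strip image (illustrative names and values).
CHAR_INFO = {
    "ascii_order": True,   # glyphs appear in ASCII order
    "first_char": " ",     # first glyph in the image is the space character
    "start_x": 0,          # x position of the first glyph, in pixels
    "char_width": 12,      # every glyph is 12 pixels wide
}

def glyph_x_offset(ch, info):
    """Return the x offset (in pixels) of ch's glyph within the strip image."""
    index = ord(ch) - ord(info["first_char"])
    if index < 0:
        raise ValueError("character not in custom character set: %r" % ch)
    return info["start_x"] + index * info["char_width"]
```

With variable-width glyphs, the single `char_width` value would be replaced by a per-character width table and the offsets accumulated instead of multiplied.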
  • Other information may also be included in the associated character information 62. For example, the relative advance may be specified in the case of an italics-style font. Relative advance information accounts for situations where the width of the character differs from the distance that the output module should move before placing the next character. Additionally, the character image file 60 may not contain a “space” character, in which case the associated character information 62 may include data defining the width of the space character.
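  • The advance logic just described, including the stored space width, might be sketched as follows (the dictionary keys are hypothetical, chosen only to mirror the concepts above):

```python
def advance_for(ch, info):
    """Distance (in pixels) to move before placing the next character.

    A sketch only: "space_width", "relative_advance" and "char_width"
    are illustrative key names, not part of any real file format.
    """
    if ch == " " and "space_width" in info:
        # The image may contain no space glyph; a stored width is used instead.
        return info["space_width"]
    # Relative advance (e.g. for an italic font whose glyphs overlap)
    # may differ from the glyph's drawn width.
    return info.get("relative_advance", {}).get(ch, info["char_width"])
```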
  • The function of the associated character information 62 is to allow for the identification and location of a specific character within the character image file 60. By defining the width of the character, a portion of the image file that contains only the specific character may be defined. In one embodiment, the defined portion is a small rectangle within the image containing the specific character, although the defined portion need not be rectangular.
  • Accordingly, the character image file 60 contains an image containing each of the characters (or glyphs) in the complex custom font developed off-line. The associated character information 62 provides the specifics necessary to locate and define a portion of the image for each character. In this manner, when the device 10 (FIG. 1) outputs text in the custom font to the display 22 or other output device, the text is created by rendering those portions of the image corresponding to the individual characters in the text string. By having the complex font “pre-rendered” in the image, the device 10 avoids having to incorporate the sizable complex code required to create the customized font. Instead, the pre-rendered bitmapped image of the font is used and the graphics subsystem 44 (FIG. 1) renders the small portions associated with the characters of the desired text string.
  • Referring again to FIG. 1, the device 10 includes a custom font module 56. The custom font module 56 performs the function of selecting or defining the portion of the image stored in the character image file 60 for a particular text character. The custom font module 56 bases its selection of the portion upon the associated character information 62 which defines the relative positioning of the characters in the character image file 60.
  • In a first embodiment, a software application 58 or other routine in operation upon the device 10 includes code to invoke the custom font module 56 when the application 58 or routine intends to output text in the custom font to the display 22. The custom font module 56 receives the text that the software application 58 or routine intends to display and it selects the portions of the image defined in the character image file 60 corresponding to the characters in the text. The graphics subsystem 44 then renders these portions for output on the display.
  • In another embodiment, the software application 58 or other routine is unaware of the special nature of the custom font. When the application 58 or routine intends to output a text string to the display 22, it instructs the graphics subsystem 44 to “draw text”. The graphics subsystem 44 recognizes that the custom font requires handling by the custom font module 56. Accordingly, the graphics subsystem 44 calls (or invokes) the custom font module 56, which then defines the portions of the character image file 60 for rendering upon the display 22 by the graphics subsystem 44. It will be appreciated that in some embodiments the custom font module 56 may be incorporated within the graphics subsystem 44 and is shown separately in FIG. 1 for ease of illustration.
  • In some embodiments, the custom font module 56 clips the portion of the image corresponding to a selected character and passes the clipped portion of the image to the graphics subsystem 44. The custom font module 56 may create an object containing the clipped image information and may pass this object to the graphics subsystem 44. The clipped image information may alternatively be stored in a small image file or data structure. Other mechanisms for extracting the data corresponding to the portion of the image and passing the data to the graphics subsystem 44 for rendering will be understood by those of ordinary skill in the art.
  • In another embodiment, the overhead associated with actually clipping the portion and passing it to the graphics subsystem 44 as a separate file or object may be avoided by simply passing the graphics subsystem 44 a definition of the portion. In such an embodiment, the custom font module 56 defines, but does not go so far as to clip, the portion of the image corresponding to a selected character. This definition is used by the graphics subsystem 44 to understand what portion of the overall image it is to render on the display 22.
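  • The two hand-off strategies above can be contrasted in a small sketch, with a tiny image modeled as a list of pixel rows (all names are hypothetical):

```python
def clip_portion(image, x, width):
    """Strategy 1: copy the pixel data for the portion into a new sub-image,
    which is then passed to the graphics subsystem as a separate object."""
    return [row[x:x + width] for row in image]

def define_portion(x, width, height):
    """Strategy 2: describe the portion without copying any pixels; the
    graphics subsystem reads this rectangle straight out of the full image."""
    return {"x": x, "y": 0, "width": width, "height": height}
```

The second strategy avoids allocating and copying a sub-image per character, at the cost of requiring the renderer to understand source rectangles.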
  • Reference is now made to FIG. 7, which shows a method 200 of rendering text on an output device, according to the present application. The method 200 begins in step 202 with the creation and storage of the character image file 60 (FIG. 1) defining the image of the custom character set. Step 202 also includes the creation and storage of the associated character information 62 (FIG. 1) on the device 10 (FIG. 1). The character image file 60 and the associated character information 62 may be uploaded to the device 10 through the serial port 30 (FIG. 1), through the short-range communications subsystem 40 (FIG. 1), or through the communication subsystem 11 (FIG. 1) from the wireless network 50 (FIG. 1). They may also be uploaded to the flash memory 24 prior to complete assembly of the device 10. It will be appreciated that in some embodiments the character image file 60 and associated character information 62 may be updated or replaced with new custom font information over time. In one embodiment, the device manufacturer may distribute an updated character image file and updated associated character information to deployed devices through the wireless network 50.
  • In step 204, the graphics subsystem 44 (FIG. 1) receives an instruction to output a text string to the display 22. The instruction may come from a software application 58 such as a word processing program, an e-mail program, or other program. The instruction may also come from a routine, such as an operating system routine. For example, an operating system routine may be designed to put the date and time in the lower right hand corner of the display 22 for the device 10. The custom font may have been developed specifically for this purpose and the use of this font may be specified by the operating system routine when it instructs the graphics subsystem 44 to output the current date and time.
  • The graphics subsystem 44 recognizes that the custom font is requested, so in step 206 it invokes the custom font module 56. The custom font module 56 then, in step 208, identifies the first text character in the text string and locates the corresponding character image in the image defined by the character image file 60 using the associated character information 62. The custom font module 56 then defines the portion of the image corresponding to the first text character in step 210.
  • Based upon the definition of the portion of the image, the graphics subsystem 44 renders the portion of the image on the display 22, thereby outputting an image of the first text character to the display 22 in step 212.
  • In step 214, the custom font module 56 determines whether it has reached the end of the string of text. If not, then it continues to the next character in the string. If so, then the method 200 ends.
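  • Steps 208 through 214 can be sketched as a loop over the text string (a sketch under assumed fixed-width, ordered glyphs; the names, and the render_portion callback standing in for the graphics subsystem 44, are hypothetical):

```python
def render_text(text, info, render_portion):
    """Render text by locating and drawing each character's glyph.

    info holds hypothetical associated character information;
    render_portion(portion, dest_x) stands in for the graphics subsystem.
    Returns the final pen position.
    """
    pen_x = 0
    for ch in text:                                # step 214: loop to end of string
        index = ord(ch) - ord(info["first_char"])  # step 208: locate the glyph
        src_x = info["start_x"] + index * info["char_width"]
        # step 210: define the portion as a source rectangle (x, y, w, h)
        portion = (src_x, 0, info["char_width"], info["height"])
        render_portion(portion, pen_x)             # step 212: render the portion
        pen_x += info["char_width"]                # advance for the next character
    return pen_x
```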
  • It will be appreciated that some of the steps of the above method 200 may be performed in a different order and some may be combined with others. For example, in one embodiment the custom font module 56 locates each character in the image and creates a definition for the character. When asked to render a text string, the custom font module 56 provides the graphics subsystem 44 with the predetermined definitions corresponding to the characters of the text string.
  • As noted above, although the above embodiments describe the present application in the context of mobile devices using a wireless network, those of ordinary skill in the art will appreciate that it is not so limited. In some embodiments, the mobile devices may be user terminals, such as desktop or laptop computers, or other devices.
  • The present application may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. Certain adaptations and modifications of the application will be obvious to those skilled in the art. Therefore, the above discussed embodiments are considered to be illustrative and not restrictive, the scope of the application being indicated by the appended claims rather than the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein.

Claims (22)

1. A method of rendering a text string on a display screen of a computer device, the text string being comprised of a plurality of characters, the computer device including a memory storing an image file, the image file defining an image containing a plurality of glyphs, the memory having stored thereon associated character information, the method comprising:
for each character in the text string:
defining a portion of the image containing a glyph corresponding to the character, including determining the location of the glyph corresponding to the character based upon the associated character information; and
rendering the portion on the display screen.
2. The method claimed in claim 1, wherein defining a portion of the image includes clipping the portion of the image to obtain a subimage and passing the subimage to a graphics subsystem for performing the rendering.
3. The method claimed in claim 1, wherein the plurality of glyphs comprise a reduced character set consisting of a subset of the font, the subset including all glyphs necessary for a specific predetermined purpose.
4. The method of claim 3, wherein the subset excludes at least one predetermined alphanumeric character.
5. The method of claim 3, wherein the subset includes only glyphs necessary for rendering a time.
6. The method of claim 3, wherein the subset includes only glyphs necessary for implementing a clock.
7. The method of claim 1, further comprising:
receiving an updated image file through a network; and
storing the updated image file on the memory.
8. The method of claim 7, wherein the network is a wireless network.
9. The method of claim 1, wherein the associated character information includes relative advance information specifying a distance that an output module should move before placing a next character.
10. The method of claim 1, wherein the image file does not contain a glyph for a space character.
11. The method of claim 1, further comprising:
if a character in the text string is a space character, retrieving data defining the width of the space character and rendering the space character in accordance with the data defining the width.
12. A computer device, comprising:
a display screen;
a graphics subsystem for rendering a text string upon the display screen, the text string comprising a plurality of characters;
a memory, the memory having stored thereon an image file, the image file defining an image containing a plurality of glyphs, the memory having stored thereon associated character information;
a processor configured to:
for each character in the text string:
define a portion of the image containing a glyph corresponding to the character, including determining the location of the glyph corresponding to the character based upon the associated character information; and
render the portion on the display screen.
13. The computer device of claim 12, wherein defining a portion of the image includes clipping the portion of the image to obtain a subimage and passing the subimage to a graphics subsystem for performing the rendering.
14. The computer device of claim 12, wherein the plurality of glyphs comprise a reduced character set consisting of a subset of the font, the subset including all glyphs necessary for a specific predetermined purpose.
15. The computer device of claim 14, wherein the subset excludes at least one predetermined alphanumeric character.
16. The computer device of claim 14, wherein the subset includes only glyphs necessary for rendering a time.
17. The computer device of claim 14, wherein the subset includes only glyphs necessary for implementing a clock.
18. The computer device of claim 12, wherein the processor is further configured to:
receive an updated image file through a network; and
store the updated image file on the memory.
19. The computer device of claim 18, wherein the network is a wireless network.
20. The computer device of claim 12, wherein the associated character information includes relative advance information specifying a distance that an output module should move before placing a next character.
21. The computer device of claim 12, wherein the image file does not contain a glyph for a space character.
22. The computer device of claim 12, wherein the processor is further configured to:
if a character in the text string is a space character, retrieve data defining the width of the space character and render the space character in accordance with the data defining the width.
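The advance and space handling recited in claims 9 through 11 can likewise be sketched: the image file carries no glyph for the space character, so its width is stored separately, and each rendered character advances the pen by a per-character relative advance before the next character is placed. The names and numbers below are illustrative assumptions, not from the claims.

```python
# Sketch of relative-advance layout with a glyph-less space character.
# SPACE_WIDTH stands in for the separately stored space-width data; the
# character information maps each character to (x-offset, advance).

SPACE_WIDTH = 4                      # stored width of the space character
CHAR_INFO = {"1": (0, 3), "2": (5, 6)}

def layout(text):
    """Return (character, pen_x) placements; spaces consume width only,
    since the image file contains no glyph for them."""
    placements, pen = [], 0
    for ch in text:
        if ch == " ":
            pen += SPACE_WIDTH       # no portion of the image to clip
            continue
        placements.append((ch, pen))
        pen += CHAR_INFO[ch][1]      # relative advance before next character
    return placements, pen

print(layout("1 2"))
```

Omitting the space glyph from the image file is what lets the reduced character set of claims 3 through 6 stay minimal while still supporting strings that contain spaces.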
US12/815,475 2004-02-26 2010-06-15 Method of rendering text on an output device Abandoned US20100245364A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/815,475 US20100245364A1 (en) 2004-02-26 2010-06-15 Method of rendering text on an output device

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US10/786,029 US7161598B2 (en) 2004-02-26 2004-02-26 Method of rendering text on an output device
US11/558,120 US7768513B2 (en) 2004-02-26 2006-11-09 Method of rendering text on an output device
US12/815,475 US20100245364A1 (en) 2004-02-26 2010-06-15 Method of rendering text on an output device

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US11/558,120 Continuation US7768513B2 (en) 2004-02-26 2006-11-09 Method of rendering text on an output device

Publications (1)

Publication Number Publication Date
US20100245364A1 true US20100245364A1 (en) 2010-09-30

Family

ID=34886664

Family Applications (3)

Application Number Title Priority Date Filing Date
US10/786,029 Active 2024-08-22 US7161598B2 (en) 2004-02-26 2004-02-26 Method of rendering text on an output device
US11/558,120 Expired - Lifetime US7768513B2 (en) 2004-02-26 2006-11-09 Method of rendering text on an output device
US12/815,475 Abandoned US20100245364A1 (en) 2004-02-26 2010-06-15 Method of rendering text on an output device

Family Applications Before (2)

Application Number Title Priority Date Filing Date
US10/786,029 Active 2024-08-22 US7161598B2 (en) 2004-02-26 2004-02-26 Method of rendering text on an output device
US11/558,120 Expired - Lifetime US7768513B2 (en) 2004-02-26 2006-11-09 Method of rendering text on an output device

Country Status (1)

Country Link
US (3) US7161598B2 (en)


Families Citing this family (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7161598B2 (en) * 2004-02-26 2007-01-09 Research In Motion Limited Method of rendering text on an output device
US7359902B2 (en) 2004-04-30 2008-04-15 Microsoft Corporation Method and apparatus for maintaining relationships between parts in a package
US7383500B2 (en) 2004-04-30 2008-06-03 Microsoft Corporation Methods and systems for building packages that contain pre-paginated documents
US8661332B2 (en) 2004-04-30 2014-02-25 Microsoft Corporation Method and apparatus for document processing
US8243317B2 (en) 2004-05-03 2012-08-14 Microsoft Corporation Hierarchical arrangement for spooling job data
US7755786B2 (en) * 2004-05-03 2010-07-13 Microsoft Corporation Systems and methods for support of various processing capabilities
US8363232B2 (en) 2004-05-03 2013-01-29 Microsoft Corporation Strategies for simultaneous peripheral operations on-line using hierarchically structured job information
US7519899B2 (en) 2004-05-03 2009-04-14 Microsoft Corporation Planar mapping of graphical elements
US7580948B2 (en) * 2004-05-03 2009-08-25 Microsoft Corporation Spooling strategies using structured job information
US8253742B2 (en) 2004-05-28 2012-08-28 Microsoft Corporation Rendering stroke pairs for graphical objects
US7256786B2 (en) * 2004-05-28 2007-08-14 Microsoft Corporation Appropriately rendering a graphical object when a corresponding outline has exact or inexact control points
US7292249B2 (en) * 2004-05-28 2007-11-06 Microsoft Corporation Appropriately rendering a graphical object when a corresponding outline has excessive control points
US8487936B2 (en) * 2007-05-30 2013-07-16 Kyocera Corporation Portable electronic device and character display method for the same
WO2010084206A1 (en) 2009-01-26 2010-07-29 Fontself Sa A system and method for creating, managing, sharing and displaying personalized fonts on a client-server architecture
WO2010084207A1 (en) 2009-01-26 2010-07-29 Fontself Sa A system and method for displaying a text with a font
WO2010084205A1 (en) 2009-01-26 2010-07-29 Fontself Sa A system and method for creating and sharing personalized fonts on a client-server architecture
US9319444B2 (en) * 2009-06-22 2016-04-19 Monotype Imaging Inc. Font data streaming
US8769405B2 (en) * 2009-10-16 2014-07-01 Celartem, Inc. Reduced glyph font files
US20110202384A1 (en) * 2010-02-17 2011-08-18 Rabstejnek Wayne S Enterprise Rendering Platform
US8615709B2 (en) 2010-04-29 2013-12-24 Monotype Imaging Inc. Initiating font subsets
US20120079374A1 (en) * 2010-09-29 2012-03-29 Apple Inc. Rendering web page text in a non-native font
CA2753508C (en) 2011-09-23 2013-07-30 Guy Le Henaff Tracing a document in an electronic publication
US20130215126A1 (en) * 2012-02-17 2013-08-22 Monotype Imaging Inc. Managing Font Distribution
US9817615B2 (en) 2012-12-03 2017-11-14 Monotype Imaging Inc. Network based font management for imaging devices
US9569865B2 (en) 2012-12-21 2017-02-14 Monotype Imaging Inc. Supporting color fonts
EP2943894A2 (en) 2013-01-09 2015-11-18 Monotype Imaging Inc. Advanced text editor
EP2784771A1 (en) * 2013-03-25 2014-10-01 Samsung Electronics Co., Ltd. Display apparatus and method of outputting text thereof
US9317777B2 (en) 2013-10-04 2016-04-19 Monotype Imaging Inc. Analyzing font similarity for presentation
US9691169B2 (en) 2014-05-29 2017-06-27 Monotype Imaging Inc. Compact font hinting
US10115215B2 (en) 2015-04-17 2018-10-30 Monotype Imaging Inc. Pairing fonts for presentation
US11537262B1 (en) 2015-07-21 2022-12-27 Monotype Imaging Inc. Using attributes for font recommendations
US10437615B2 (en) 2015-11-02 2019-10-08 Microsoft Technology Licensing, Llc Emotionally connected responses from a digital assistant
US11334750B2 (en) 2017-09-07 2022-05-17 Monotype Imaging Inc. Using attributes for predicting imagery performance
US10909429B2 (en) 2017-09-27 2021-02-02 Monotype Imaging Inc. Using attributes for identifying imagery for selection
US11657602B2 (en) 2017-10-30 2023-05-23 Monotype Imaging Inc. Font identification from imagery
CN115859910A (en) * 2022-12-20 2023-03-28 北京百度网讯科技有限公司 Text rendering method and device, electronic equipment and storage medium


Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5175811A (en) * 1987-05-20 1992-12-29 Hitachi, Ltd. Font data processor using addresses calculated on the basis of access parameters
US5133076A (en) * 1989-06-12 1992-07-21 Grid Systems Corporation Hand held computer
US5590247A (en) * 1993-08-31 1996-12-31 Casio Computer Co., Ltd. Character string outputting method and apparatus capable of varying sizes of characters
US5607356A (en) * 1995-05-10 1997-03-04 Atari Corporation Interactive game film
US7315979B1 (en) * 1998-11-09 2008-01-01 Tesseron Ltd. Method and system for dynamic flowing data to an arbitrary path defined by a page description language
WO2002009413A2 (en) 2000-07-21 2002-01-31 Telefonaktiebolaget L M Ericsson (Publ) Method and apparatus for producing pictures within a text display of a mobile device
US20040225773A1 (en) * 2001-02-16 2004-11-11 Wang Derek X. Apparatus and method for transmitting arbitrary font data to an output device
US6967689B1 (en) * 2001-05-08 2005-11-22 Pixelworks, Inc. System and method for providing a variable character size in an on-screen display application
US7046848B1 (en) * 2001-08-22 2006-05-16 Olcott Peter L Method and system for recognizing machine generated character glyphs and icons in graphic images
US8369601B2 (en) * 2002-12-10 2013-02-05 Ncr Corporation Method of processing a check in an image-based check processing system and an apparatus therefor
JP4202857B2 (en) * 2003-01-30 2008-12-24 富士通株式会社 Program, character input editing method, apparatus, and recording medium

Patent Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5072214A (en) * 1989-05-11 1991-12-10 North American Philips Corporation On-screen display controller
US6525749B1 (en) * 1993-12-30 2003-02-25 Xerox Corporation Apparatus and method for supporting the implicit structure of freeform lists, outlines, text, tables and diagrams in a gesture-based input system and editing system
US5771371A (en) * 1993-12-30 1998-06-23 International Business Machines Corporation Method and apparatus for optimizing the display of forms in a data processing system
US5724067A (en) * 1995-08-08 1998-03-03 Gilbarco, Inc. System for processing individual pixels to produce proportionately spaced characters and method of operation
US6057858A (en) * 1996-08-07 2000-05-02 Desrosiers; John J. Multiple media fonts
US6038575A (en) * 1996-09-11 2000-03-14 Intel Corporation Method of sharing glyphs between computers having graphical user interfaces
US6043826A (en) * 1997-09-02 2000-03-28 Microsoft Corporation Transferring outline fonts to devices requiring raster fonts
US6870535B2 (en) * 1997-09-15 2005-03-22 Canon Kabushiki Kaisha Font architecture and creation tool for producing richer text
US20010005207A1 (en) * 1999-12-24 2001-06-28 Masahiro Muikaichi Apparatus and method for drawing character sequence using font data with any data structure
US6754710B1 (en) * 2000-05-08 2004-06-22 Nortel Networks Limited Remote control of computer network activity
US20020085006A1 (en) * 2000-09-25 2002-07-04 Shade Marilyn E. Composite font editing device and computer program
US7199805B1 (en) * 2002-05-28 2007-04-03 Apple Computer, Inc. Method and apparatus for titling
US7245945B2 (en) * 2002-11-05 2007-07-17 Intel Corporation Portable computing device adapted to update display information while in a low power mode
US7161598B2 (en) * 2004-02-26 2007-01-09 Research In Motion Limited Method of rendering text on an output device
US7768513B2 (en) * 2004-02-26 2010-08-03 Research In Motion Limited Method of rendering text on an output device

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100275161A1 (en) * 2009-04-22 2010-10-28 Dicamillo Adrienne T Font Selector And Method For The Same
US8707208B2 (en) * 2009-04-22 2014-04-22 Confetti & Frolic Font selector and method for the same
WO2013066610A1 (en) * 2011-11-04 2013-05-10 Facebook, Inc. Rendering texts on electronic devices
US9082339B2 (en) 2011-11-04 2015-07-14 Facebook, Inc. Rendering texts on electronic devices

Also Published As

Publication number Publication date
US7768513B2 (en) 2010-08-03
US20050190186A1 (en) 2005-09-01
US20070153002A1 (en) 2007-07-05
US7161598B2 (en) 2007-01-09

Similar Documents

Publication Publication Date Title
US7768513B2 (en) Method of rendering text on an output device
US7827495B2 (en) Method and data structure for user interface customization
US7865215B2 (en) Magnification of currently selected menu item
US20050289458A1 (en) Enhancing browsing in electronic device
US9619446B2 (en) Generating customized graphical user interfaces for mobile processing devices
US8041558B2 (en) Text creating and editing device and computer-readable storage medium with dynamic data loading
US20120131446A1 (en) Method for displaying web page in a portable terminal
US20060026527A1 (en) Method for customizing the visual attributes of a user interface
US20140164950A1 (en) Extended user interface for email composition
CN110619879A (en) Voice recognition method and device
US20040038705A1 (en) Portable terminal equipment and method for previewing e-mail
CA2498391C (en) Method of rendering text on an output device
CA2524011C (en) Extended user interface for email composition
KR20130056636A (en) Apparatus and method for displaying a logo image in a portable terminal
US20060291463A1 (en) Communication apparatus, control method therefor, computer readable information recording medium and communication destination apparatus type registration data
US20030087630A1 (en) Method of searching for electronic mail in portable cellular phone and electronic mail searching program for portable cellular phone
US7913165B2 (en) Inserting objects using a text editor that supports scalable fonts
EP1679582B1 (en) Magnification of currently selected menu item
CN114780896B (en) Webpage content generation method and device and computer readable storage medium
US20070043456A1 (en) Electronic device and method of representing an application or data using a shell application
KR100385523B1 (en) Apparatus and method for materializing transparent
CN113822011A (en) Font switching method and electronic equipment
CN117075992A (en) AppList component library design method and device

Legal Events

Date Code Title Description
AS Assignment

Owner name: RESEARCH IN MOTION LIMITED, CANADA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:KLASSEN, GERHARD D.;REEL/FRAME:024535/0111

Effective date: 20040225

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: ECKE RANCH B.V., NETHERLANDS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SCHABER, MARGARET;REEL/FRAME:030101/0130

Effective date: 20121106

AS Assignment

Owner name: BLACKBERRY LIMITED, ONTARIO

Free format text: CHANGE OF NAME;ASSIGNOR:RESEARCH IN MOTION LIMITED;REEL/FRAME:037845/0441

Effective date: 20130709

AS Assignment

Owner name: MALIKIE INNOVATIONS LIMITED, IRELAND

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:BLACKBERRY LIMITED;REEL/FRAME:064104/0103

Effective date: 20230511