
Publication number: US 8108144 B2
Publication type: Grant
Application number: US 12/164,866
Publication date: 31 Jan 2012
Filing date: 30 Jun 2008
Priority date: 28 Jun 2007
Also published as: US 8548735, US 20090003659, US 20120131048, US 20120131510
Inventors: Scott Forstall, Gregory N. Christie, Robert E. Borchers, Imran A. Chaudhri
Original Assignee: Apple Inc.
Location based tracking
US 8108144 B2
Abstract
Location information is used to build a database of locations having associated audio, video, image or text data. In some implementations, a device includes a touch-sensitive display and collects data associated with a geographic location of interest. The geographic location of interest can be displayed on a map using an indicator. A touch selection of the indicator provides access to the data through an interface displayed on the touch-sensitive display. One or more locations of interest can be displayed and grouped together by an attribute.
Claims (42)
1. A method comprising:
presenting a map of a geographic region on a mobile device display;
determining a geographic location of the mobile device;
receiving data in response to an input selecting the data;
associating the data with the geographic location to produce first geographically tagged data; and
storing the first geographically tagged data on the mobile device.
2. The method of claim 1, further comprising:
positioning a first indicator on the map display in accordance with the geographic location.
3. The method of claim 1, wherein determining the geographic location of the mobile device further comprises:
receiving the geographic location from a source external to the mobile device, the source being communicatively connected to the mobile device.
4. The method of claim 1, wherein the geographic location is Global Positioning System (GPS) coordinate data.
5. The method of claim 2, further comprising:
receiving input selecting a second geographic location;
receiving data in response to an input selecting the data;
associating the data with the second geographic location to produce second geographically tagged data; and
storing the second geographically tagged data on the mobile device.
6. The method of claim 5, further comprising:
positioning a second indicator on the map display in accordance with the second geographic location.
7. The method of claim 6, further comprising:
presenting the indicator and the second indicator on the map display;
receiving a selection of one of the first indicator or the second indicator; and
displaying, on the mobile device, the first geographically tagged data or the second geographically tagged data in accordance with the selection.
8. The method of claim 1, further comprising:
receiving an input selecting the geographic location;
receiving an input selecting a type of data to be displayed associated with the geographic location; and
displaying an indication of the geographic location and the selected type of data in a user interface provided by the mobile device.
9. The method of claim 1, further comprising:
associating the data with the geographic location using geotags.
10. The method of claim 1, further comprising:
communicating the first geographically tagged data to a device external of the mobile device.
11. The method of claim 10, further comprising:
displaying the first geographically tagged data on the device external of the mobile device.
12. The method of claim 1, wherein the data, captured by the mobile device, comprises one of text, images, audio or video associated with the geographic location.
13. The method of claim 1, further comprising:
compiling the data into a multimedia presentation.
14. The method of claim 1, further comprising:
receiving the input from a touch-sensitive display provided by the mobile device.
15. A system comprising:
a processor;
a non-transitory computer-readable medium coupled to the processor and having instructions stored thereon, which, when executed by the processor, causes the processor to perform operations comprising:
presenting a map of a geographic region on a mobile device display;
determining a geographic location of the mobile device;
receiving data in response to an input selecting the data;
associating the data with the geographic location to produce first geographically tagged data; and
storing the first geographically tagged data on the mobile device.
16. The system of claim 15, wherein the geographic location is Global Positioning System (GPS) coordinate data.
17. The system of claim 15, the operations further comprising:
receiving the geographic location from a source external to the mobile device, the source being communicatively connected to the mobile device.
18. The system of claim 15, wherein determining the geographic location of the mobile device further comprises:
positioning a first indicator on the map display in accordance with the geographic location.
19. The system of claim 18, the operations further comprising:
receiving input selecting a second geographic location;
receiving data in response to an input selecting the data;
associating the data with the second geographic location to produce second geographically tagged data; and
storing the second geographically tagged data on the mobile device.
20. The system of claim 19, the operations further comprising:
positioning a second indicator on the map display in accordance with the second geographic location.
21. The system of claim 20, the operations further comprising:
presenting the first indicator and the second indicator on the map display;
receiving a selection of one of the first indicator or the second indicator; and
displaying, on the mobile device, the first geographically tagged data or the second geographically tagged data in accordance with the selection.
22. The system of claim 15, the operations further comprising:
receiving an input selecting the geographic location;
receiving an input selecting a type of data to be displayed associated with the geographic location; and
displaying an indication of the geographic location and the selected type of data in a user interface provided by the mobile device.
23. The system of claim 15, the operations further comprising:
associating the data with the geographic location using geotags.
24. The system of claim 15, the operations further comprising:
communicating the first geographically tagged data to a device external of the mobile device.
25. The system of claim 24, the operations further comprising:
displaying the first geographically tagged data on the device external of the mobile device.
26. The system of claim 15, wherein the data, captured by the mobile device, comprises one of text, images, audio or video associated with the geographic location.
27. The system of claim 15, the operations further comprising:
compiling the data into a multimedia presentation.
28. The system of claim 15, the operations further comprising:
receiving the input from a touch-sensitive display provided by the mobile device.
29. A computer program product, encoded on a non-transitory computer-readable medium, operable to cause data processing apparatus to perform operations comprising:
presenting a map of a geographic region on a mobile device display;
determining a geographic location of the mobile device;
receiving data in response to an input selecting the data;
associating the data with the geographic location to produce first geographically tagged data; and
storing the first geographically tagged data on the mobile device.
30. The computer program product of claim 29, wherein the geographic location is Global Positioning System (GPS) coordinate data.
31. The computer program product of claim 29, wherein determining the geographic location of the mobile device further comprises:
receiving the geographic location from a source external to the mobile device, the source being communicatively connected to the mobile device.
32. The computer program product of claim 29, the operations further comprising:
positioning a first indicator on the map display in accordance with the geographic location.
33. The computer program product of claim 32, the operations further comprising:
receiving input selecting a second geographic location;
receiving data in response to an input selecting the data;
associating the data with the second geographic location to produce second geographically tagged data; and
storing the second geographically tagged data on the mobile device.
34. The computer program product of claim 33, the operations further comprising:
positioning a second indicator on the map display in accordance with the second geographic location.
35. The computer program product of claim 34, the operations further comprising:
presenting the first indicator and the second indicator on the map display;
receiving a selection of one of the first indicator or the second indicator; and
displaying, on the mobile device, the first geographically tagged data or the second geographically tagged data in accordance with the selection.
36. The computer program product of claim 29, the operations further comprising:
receiving an input selecting the geographic location;
receiving an input selecting a type of data to be displayed associated with the geographic location; and
displaying an indication of the geographic location and the selected type of data in a user interface provided by the mobile device.
37. The computer program product of claim 29, the operations further comprising:
associating the data with the geographic location using geotags.
38. The computer program product of claim 29, the operations further comprising:
communicating the first geographically tagged data to a device external of the mobile device.
39. The computer program product of claim 38, the operations further comprising:
displaying the first geographically tagged data on the device external of the mobile device.
40. The computer program product of claim 29, wherein the data, captured by the mobile device, comprises one of text, images, audio or video associated with the geographic location.
41. The computer program product of claim 29, the operations further comprising:
compiling the data into a multimedia presentation.
42. The computer program product of claim 29, the operations further comprising:
receiving the input from a touch-sensitive display provided by the mobile device.
Description
CROSS-REFERENCE TO RELATED APPLICATION

This application claims priority to U.S. Provisional Patent Application Ser. No. 60/946,813 filed Jun. 28, 2007, and entitled “LOCATION BASED TRACKING,” the contents of which are incorporated herein by reference.

TECHNICAL FIELD

The subject matter of this patent application is generally related to location based services.

BACKGROUND

Mobile devices have grown more powerful and feature-rich and now include such features as personal digital assistant (PDA) capabilities, cameras to capture video and still images, Internet access, etc. Location-based services have been developed for determining and tracking the locations of the users of mobile devices. Location-based services provide location-specific information to mobile devices, including, for example, global positioning system (GPS) data to locate the mobile device on a map of a geographic region.

A number of applications are available for aiding users in navigation and route planning. Some of these applications use mobile devices containing global positioning systems to define the location of the mobile device and plan a route to a desired destination. Currently, however, these route planning systems do not provide a way to document items of interest to a user while a route is traveled. In conventional systems, the information the route planning systems provide is limited to what is pre-programmed. This information can become obsolete over time and may be of little or no interest to the user.

SUMMARY

Location information is used to build a database of locations having associated audio, video, image or text data.

In some implementations, a method includes: presenting a map of a geographic region on a touch-sensitive display; receiving touch input selecting a geographic location; determining geographic positioning information of the geographic location; receiving data in response to an input received by a touch-sensitive display; associating the data with the geographic positioning information of the geographic location to produce geographically tagged data; and storing the geographically-tagged data.

In some implementations a method includes: presenting indications of a predetermined group of geographic locations on a touch-sensitive display; receiving a selection of a geographic location from the group of geographic locations displayed on the touch-sensitive display; and presenting geographically tagged data associated with the geographic location in a user interface on the touch-sensitive display.

In some implementations, a user interface includes a touch-sensitive display area for displaying indications of a predetermined group of geographic locations associated by an attribute, wherein each indication represents geographically coded data associated with a geographic position, and wherein a name of the attribute is displayed in the user interface.

Other implementations are disclosed, including implementations directed to systems, methods, apparatuses, computer-readable mediums and user interfaces.

DESCRIPTION OF DRAWINGS

FIG. 1 is a block diagram of an example mobile device.

FIG. 2 is a block diagram of an example network operating environment for the mobile device of FIG. 1.

FIG. 3 is a block diagram of an example implementation of the mobile device of FIG. 1.

FIGS. 4-10 are exemplary interfaces to input, review and display data associated with geographic locations of interest.

FIG. 11 is a flow diagram of an example process for indicating geographic locations of interest.

FIG. 12 is a flow diagram of an example process for reviewing and editing data associated with geographic locations of interest.

FIG. 13 is a flow diagram of an example process for interactively displaying data associated with geographic locations of interest.

FIG. 14 is a flow diagram of an example process for playback of data associated with geographic locations of interest.

DETAILED DESCRIPTION

FIG. 1 is a block diagram of an example mobile device 100. The mobile device 100 can be, for example, a handheld computer, a personal digital assistant, a cellular telephone, a network appliance, a camera, a smart phone, an enhanced general packet radio service (EGPRS) mobile phone, a network base station, a media player, a navigation device, an email device, a game console, or other electronic device, a combination of any two or more of these data processing devices or other data processing devices.

Mobile Device Overview

In some implementations, the mobile device 100 includes a touch-sensitive display 102. The touch-sensitive display 102 can implement liquid crystal display (LCD) technology, light emitting polymer display (LPD) technology, or some other display technology. The touch-sensitive display 102 can be sensitive to haptic and/or tactile contact with a user.

In some implementations, the touch-sensitive display 102 can comprise a multi-touch-sensitive display 102. A multi-touch-sensitive display 102 can, for example, process multiple simultaneous touch points, including processing data related to the pressure, degree and/or position of each touch point. Such processing facilitates gestures and interactions with multiple fingers, chording, and other interactions. Other touch-sensitive display technologies can also be used, e.g., a display in which contact is made using a stylus or other pointing device. Some examples of multi-touch-sensitive display technology are described in U.S. Pat. Nos. 6,323,846, 6,570,557, 6,677,932, and U.S. Patent Publication 2002/0015024A1, each of which is incorporated by reference herein in its entirety.

In some implementations, the mobile device 100 can display one or more graphical user interfaces on the touch-sensitive display 102 for providing the user access to various system objects and for conveying information to the user. In some implementations, the graphical user interface can include one or more display objects 104, 106. In the example shown, the display objects 104, 106, are graphic representations of system objects. Some examples of system objects include device functions, applications, windows, files, alerts, events, or other identifiable system objects.

Exemplary Mobile Device Functionality

In some implementations, the mobile device 100 can implement multiple device functionalities, such as a telephony device, as indicated by a phone object 110; an e-mail device, as indicated by the e-mail object 112; a network data communication device, as indicated by the Web object 114; a Wi-Fi base station device (not shown); and a media processing device, as indicated by the media player object 116. In some implementations, particular display objects 104, e.g., the phone object 110, the e-mail object 112, the Web object 114, and the media player object 116, can be displayed in a menu bar 118. In some implementations, device functionalities can be accessed from a top-level graphical user interface, such as the graphical user interface illustrated in FIG. 1. Touching one of the objects 110, 112, 114 or 116 can, for example, invoke corresponding functionality.

In some implementations, the mobile device 100 can implement network distribution functionality. For example, the functionality can enable the user to take the mobile device 100 and its associated network while traveling. In particular, the mobile device 100 can extend Internet access (e.g., Wi-Fi) to other wireless devices in the vicinity. For example, mobile device 100 can be configured as a base station for one or more devices. As such, mobile device 100 can grant or deny network access to other wireless devices.

In some implementations, upon invocation of device functionality, the graphical user interface of the mobile device 100 changes, or is augmented or replaced with another user interface or user interface elements, to facilitate user access to particular functions associated with the corresponding device functionality. For example, in response to a user touching the phone object 110, the graphical user interface of the touch-sensitive display 102 may present display objects related to various phone functions; likewise, touching of the email object 112 may cause the graphical user interface to present display objects related to various e-mail functions; touching the Web object 114 may cause the graphical user interface to present display objects related to various Web-surfing functions; and touching the media player object 116 may cause the graphical user interface to present display objects related to various media processing functions.

In some implementations, the top-level graphical user interface environment or state of FIG. 1 can be restored by pressing a button 120 located near the bottom of the mobile device 100. In some implementations, each corresponding device functionality may have corresponding “home” display objects displayed on the touch-sensitive display 102, and the graphical user interface environment of FIG. 1 can be restored by pressing the “home” display object.

In some implementations, the top-level graphical user interface can include additional display objects 106, such as a short messaging service (SMS) object 130, a calendar object 132, a photos object 134, a camera object 136, a calculator object 138, a stocks object 140, a weather object 142, a maps object 144, a notes object 146, a clock object 148, an address book object 150, and a settings object 152. Touching the SMS display object 130 can, for example, invoke an SMS messaging environment and supporting functionality; likewise, each selection of a display object 132, 134, 136, 138, 140, 142, 144, 146, 148, 150 and 152 can invoke a corresponding object environment and functionality.

Additional and/or different display objects can also be displayed in the graphical user interface of FIG. 1. For example, if the device 100 is functioning as a base station for other devices, one or more “connection” objects may appear in the graphical user interface to indicate the connection. In some implementations, the display objects 106 can be configured by a user, e.g., a user may specify which display objects 106 are displayed, and/or may download additional applications or other software that provides other functionalities and corresponding display objects.

In some implementations, the mobile device 100 can include one or more input/output (I/O) devices and/or sensor devices. For example, a speaker 160 and a microphone 162 can be included to facilitate voice-enabled functionalities, such as phone and voice mail functions. In some implementations, a loud speaker 164 can be included to facilitate hands-free voice functionalities, such as speaker phone functions. An audio jack 166 can also be included for use of headphones and/or a microphone.

In some implementations, a proximity sensor 168 can be included to facilitate the detection of the user positioning the mobile device 100 proximate to the user's ear and, in response, to disengage the touch-sensitive display 102 to prevent accidental function invocations. In some implementations, the touch-sensitive display 102 can be turned off to conserve additional power when the mobile device 100 is proximate to the user's ear.

Other sensors can also be used. For example, in some implementations, an ambient light sensor 170 can be utilized to facilitate adjusting the brightness of the touch-sensitive display 102. In some implementations, an accelerometer 172 can be utilized to detect movement of the mobile device 100, as indicated by the directional arrow 174. Accordingly, display objects and/or media can be presented according to a detected orientation, e.g., portrait or landscape. In some implementations, the mobile device 100 may include circuitry and sensors for supporting a location determining capability, such as that provided by the global positioning system (GPS) or other positioning systems (e.g., systems using Wi-Fi access points, television signals, cellular grids, Uniform Resource Locators (URLs)). In some implementations, a positioning system (e.g., a GPS receiver) can be integrated into the mobile device 100 or provided as a separate device that can be coupled to the mobile device 100 through an interface (e.g., port device 190) to provide access to location-based services.

The mobile device 100 can also include a camera lens and sensor 180. In some implementations, the camera lens and sensor 180 can be located on the back surface of the mobile device 100. The camera can capture still images and/or video.

The mobile device 100 can also include one or more wireless communication subsystems, such as an 802.11b/g communication device 186, and/or a Bluetooth™ communication device 188. Other communication protocols can also be supported, including other 802.x communication protocols (e.g., WiMax, Wi-Fi, 3G), code division multiple access (CDMA), global system for mobile communications (GSM), Enhanced Data GSM Environment (EDGE), etc.

In some implementations, a port device 190, e.g., a Universal Serial Bus (USB) port, or a docking port, or some other wired port connection, can be included. The port device 190 can, for example, be utilized to establish a wired connection to other computing devices, such as other communication devices 100, network access devices, a personal computer, a printer, or other processing devices capable of receiving and/or transmitting data. In some implementations, the port device 190 allows the mobile device 100 to synchronize with a host device using one or more protocols, such as, for example, TCP/IP, HTTP, UDP, or any other known protocol. In some implementations, a TCP/IP over USB protocol can be used.

Network Operating Environment

FIG. 2 is a block diagram of an example network operating environment 200 for the mobile device 100 of FIG. 1. The mobile device 100 of FIG. 1 can, for example, communicate over one or more wired and/or wireless networks 210 in data communication. For example, a wireless network 212, e.g., a cellular network, can communicate with a wide area network (WAN) 214, such as the Internet, by use of a gateway 216. Likewise, an access point 218, such as an 802.11g wireless access point, can provide communication access to the wide area network 214. In some implementations, both voice and data communications can be established over the wireless network 212 and the access point 218. For example, the mobile device 100a can place and receive phone calls (e.g., using VoIP protocols), send and receive e-mail messages (e.g., using POP3 protocol), and retrieve electronic documents and/or streams, such as web pages, photographs, and videos, over the wireless network 212, gateway 216, and wide area network 214 (e.g., using TCP/IP or UDP protocols). Likewise, the mobile device 100b can place and receive phone calls, send and receive e-mail messages, and retrieve electronic documents over the access point 218 and the wide area network 214. In some implementations, the mobile device 100 can be physically connected to the access point 218 using one or more cables and the access point 218 can be a personal computer. In this configuration, the mobile device 100 can be referred to as a “tethered” device.

The mobile devices 100a and 100b can also establish communications by other means. For example, the wireless device 100a can communicate with other wireless devices, e.g., other wireless devices 100, cell phones, etc., over the wireless network 212. Likewise, the mobile devices 100a and 100b can establish peer-to-peer communications 220, e.g., a personal area network, by use of one or more communication subsystems, such as the Bluetooth™ communication device 188 shown in FIG. 1. Other communication protocols and topologies can also be implemented.

The mobile device 100 can, for example, communicate with one or more services 230, 240, 250, and 260 and/or one or more content publishers 270 over the one or more wired and/or wireless networks 210. For example, a navigation service 230 can provide navigation information, e.g., map information, location information, route information, and other information, to the mobile device 100. In the example shown, a user of the mobile device 100b has invoked a map functionality, e.g., by pressing the maps object 144 on the top-level graphical user interface shown in FIG. 1, and has requested and received a map for the location “1 Infinite Loop, Cupertino, Calif.”

User devices 280 can, for example, communicate with the one or more services 230, 240, 250 and 260 and/or one or more content publishers 270 over the one or more wired and/or wireless networks 210 to access content and services, as well as communicate with the mobile device 100. The user devices 280 can be, for example, a personal computer, a set top box, a gaming device, a digital video recorder, a portable audio or video player, an in-vehicle navigation system, etc.

A messaging service 240 can, for example, provide e-mail and/or other messaging services. A media service 250 can, for example, provide access to media files, such as song files, movie files, video clips, and other media data. One or more other services 260 can also be utilized by the mobile device 100.

The mobile device 100 can also access other data and content over the one or more wired and/or wireless networks 210. For example, content publishers, e.g., content publisher(s) 270, such as news sites, RSS feeds, web sites, blogs, social networking sites, developer networks, etc., can be accessed by the mobile device 100. Such access can be provided by invocation of a web browsing function or application (e.g., a browser) in response to a user touching the Web object 114.

Exemplary Mobile Device Architecture

FIG. 3 is a block diagram 300 of an example implementation of the mobile device 100 of FIG. 1. The mobile device 100 can include a memory interface 302, one or more data processors, image processors and/or central processing units 304, and a peripherals interface 306. The memory interface 302, the one or more processors 304 and/or the peripherals interface 306 can be separate components or can be integrated in one or more integrated circuits. The various components in the mobile device 100 can be coupled by one or more communication buses or signal lines.

Sensors, devices and subsystems can be coupled to the peripherals interface 306 to facilitate multiple functionalities. For example, a motion sensor 310, a light sensor 312, and a proximity sensor 314 can be coupled to the peripherals interface 306 to facilitate the orientation, lighting and proximity functions described with respect to FIG. 1. Other sensors 316 can also be connected to the peripherals interface 306, such as a positioning system (e.g., GPS receiver), a temperature sensor, a biometric sensor, or other sensing device, to facilitate related functionalities.

A camera subsystem 320 and an optical sensor 322, e.g., a charged coupled device (CCD) or a complementary metal-oxide semiconductor (CMOS) optical sensor, can be utilized to facilitate camera functions, such as recording photographs and video clips.

Communication functions can be facilitated through one or more wireless communication subsystems 324, which can include radio frequency receivers and transmitters and/or optical (e.g., infrared) receivers and transmitters. The specific design and implementation of the communication subsystem 324 can depend on the communication network(s) over which the mobile device 100 is intended to operate. For example, a mobile device 100 may include communication subsystems 324 designed to operate over a GSM network, a GPRS network, an EDGE network, a Wi-Fi or WiMax network, and a Bluetooth™ network. In particular, the wireless communication subsystems 324 may include hosting protocols such that the device 100 may be configured as a base station for other wireless devices.

An audio subsystem 326 can be coupled to a speaker 328 and a microphone 330 to facilitate voice-enabled functions, such as voice recognition, voice replication, digital recording, and telephony functions.

The I/O subsystem 340 can include a touch screen controller 342 and/or other input controller(s) 344. The touch-screen controller 342 can be coupled to a touch screen 346. The touch screen 346 and touch screen controller 342 can, for example, detect contact and movement or break thereof using any of a plurality of touch sensitivity technologies, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with the touch screen 346.

The other input controller(s) 344 can be coupled to other input/control devices 348, such as one or more buttons, rocker switches, thumb-wheel, infrared port, USB port, and/or a pointer device such as a stylus. The one or more buttons (not shown) can include an up/down button for volume control of the speaker 328 and/or the microphone 330.

In one implementation, a pressing of the button for a first duration may disengage a lock of the touch screen 346; and a pressing of the button for a second duration that is longer than the first duration may turn power to the mobile device 100 on or off. The user may be able to customize a functionality of one or more of the buttons. The touch screen 346 can, for example, also be used to implement virtual or soft buttons and/or a keyboard.

In some implementations, the mobile device 100 can present recorded audio and/or video files, such as MP3, AAC, and MPEG files. In some implementations, the mobile device 100 can include the functionality of an MP3 player, such as an iPod™. The mobile device 100 may, therefore, include a 36-pin connector that is compatible with the iPod. Other input/output and control devices can also be used.

The memory interface 302 can be coupled to memory 350. The memory 350 can include high-speed random access memory and/or non-volatile memory, such as one or more magnetic disk storage devices, one or more optical storage devices, and/or flash memory (e.g., NAND, NOR). The memory 350 can store an operating system 352, such as Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks. The operating system 352 may include instructions for handling basic system services and for performing hardware dependent tasks. In some implementations, the operating system 352 can be a kernel (e.g., UNIX kernel).

The memory 350 may also store communication instructions 354 to facilitate communicating with one or more additional devices, one or more computers and/or one or more servers. The memory 350 may include graphical user interface instructions 356 to facilitate graphic user interface processing; sensor processing instructions 358 to facilitate sensor-related processing and functions; phone instructions 360 to facilitate phone-related processes and functions; electronic messaging instructions 362 to facilitate electronic-messaging related processes and functions; web browsing instructions 364 to facilitate web browsing-related processes and functions; media processing instructions 366 to facilitate media processing-related processes and functions; GPS/Navigation instructions 368 to facilitate GPS and navigation-related processes and functions; camera instructions 370 to facilitate camera-related processes and functions; and/or other software instructions 372 to facilitate other processes and functions.

Each of the above identified instructions and applications can correspond to a set of instructions for performing one or more functions described above. These instructions need not be implemented as separate software programs, procedures or modules. The memory 350 can include additional instructions or fewer instructions. Furthermore, various functions of the mobile device 100 may be implemented in hardware and/or in software, including in one or more signal processing and/or application specific integrated circuits.

FIG. 4 is an example user interface that is presented on the mobile device 100 in response to a user selection of, e.g., the maps object 144. The user interface includes an information display area 400 and a map display area 402. A position of the mobile device is indicated by an indicator 404. The information display area 400 receives user input from the touch-sensitive display 102. In some implementations, upon an input of a location, e.g., California, the user is presented with an object 406 providing an option to save data associated with the input location.

In some implementations, the user provides a “friendly name” to identify the location or group of locations of interest. If, for example, a user would like to save data related to a trip to California, the user can identify the data by entering, e.g., “My Trip to California” into the display area 400. A user can save the data in accordance with any attribute.
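
The grouping described above can be pictured as a collection keyed by the user-supplied attribute. The following Swift sketch is illustrative only; the type names (LocationOfInterest, NamedCollection) are assumptions and are not taken from the patent.

struct LocationOfInterest: Codable {
    var name: String          // e.g., "Cupertino"
    var latitude: Double
    var longitude: Double
}

struct NamedCollection: Codable {
    var friendlyName: String                  // the attribute entered into display area 400
    var locations: [LocationOfInterest] = []

    mutating func add(_ location: LocationOfInterest) {
        locations.append(location)
    }
}

var trip = NamedCollection(friendlyName: "My Trip to California")
trip.add(LocationOfInterest(name: "Cupertino", latitude: 37.331837, longitude: -122.030799))
print("\(trip.friendlyName): \(trip.locations.count) location(s) saved")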

In some implementations, an indicator 406 can be placed on the map display area 402 to indicate a particular geographic location of interest. For example, if the location of interest is in Cupertino, user input can be received from the touch-sensitive display 102 to place the indicator 406 on the map display area 402 at either the current location of the mobile device 100 (shown as reference numeral 404) or a user-specified location.

Where the current location of the mobile device 100 is used to specify the geographic location of interest, according to some implementations, geographic position information can be provided to the mobile device 100 from, for example, Global Positioning System (GPS) coordinate data. The GPS coordinate data can be processed by the GPS/Navigation instructions 368 and can be provided from an external or internal GPS navigation system. Triangulation and external GPS information can be provided to the mobile device 100 through the wireless communication subsystems 324 or port device 190.
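
As a rough illustration of the alternatives described above (position data from an integrated positioning system versus position data supplied externally through the wireless communication subsystems 324 or port device 190), the following Swift sketch uses assumed protocol and type names; it is not an actual platform API.

import Foundation

struct GeoPosition {
    var latitude: Double
    var longitude: Double
    var timestamp: Date
}

protocol PositionSource {
    func currentPosition() -> GeoPosition?
}

struct InternalGPSReceiver: PositionSource {
    func currentPosition() -> GeoPosition? {
        // Would read from an integrated GPS receiver; a fixed value is returned for illustration.
        return GeoPosition(latitude: 37.331837, longitude: -122.030799, timestamp: Date())
    }
}

struct ExternalPositionSource: PositionSource {
    func currentPosition() -> GeoPosition? {
        // Would parse a fix delivered over the wireless subsystems 324 or port device 190.
        return nil // no external fix available in this sketch
    }
}

// Prefer the external source when it has a fix; otherwise fall back to the internal receiver.
func resolvePosition(primary: PositionSource, fallback: PositionSource) -> GeoPosition? {
    return primary.currentPosition() ?? fallback.currentPosition()
}

print(String(describing: resolvePosition(primary: ExternalPositionSource(), fallback: InternalGPSReceiver())))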

In some implementations, the geographic information regarding the geographic location of interest is manually input by the user. The user can input a street address, a latitude/longitude pair, or other identifying geographic information to specify the geographic location of interest.

After a geographic location of interest has been indicated, in some implementations, an example user interface 500 shown in FIG. 5 is presented on the mobile device 100 in response to invoking the camera object 136. A next object 502 and back object 504 are provided to navigate within the user interface 500. A save object 506 and delete object 508 are provided to operate on pictures captured by the mobile device 100. In some implementations, audio data is captured as the picture data is captured by the mobile device 100.

In some implementations, pictures captured by the mobile device are geographically associated with the geographic location of interest when the save object 506 is selected to save the currently displayed picture. In some implementations, the geographic association is automatically performed using the determined position of the mobile device. In some implementations, the geographic association is manually input by a user when touching the save object 506 on the touch-sensitive display 102.

In some implementations, the association of geographic information with data is performed by geo-tagging the data. For example, geographic position information (e.g., latitude and longitude), geographic place names, or geographical regions are associated with the data. In the example of FIG. 5, the geographic information could be, for example, latitude 37.331837, longitude −122.030799; or 1 Infinite Loop, Cupertino, Calif., USA. In some implementations, the geographic information can be included as meta tags in a document.
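
The geo-tagging described above can be pictured as attaching a small metadata record to the captured data. The Swift sketch below is a minimal illustration using the example values from the text; the type names and meta-tag keys are assumptions, not a defined format.

import Foundation

struct GeoTag: Codable {
    var latitude: Double?
    var longitude: Double?
    var placeName: String?
    var region: String?
}

struct TaggedItem: Codable {
    var payload: Data               // the captured picture, audio, video or note bytes
    var metaTags: [String: String]  // geographic information carried as meta tags
}

// Example values taken from the text (1 Infinite Loop, Cupertino, Calif., USA).
let tag = GeoTag(latitude: 37.331837, longitude: -122.030799,
                 placeName: "1 Infinite Loop, Cupertino, Calif., USA", region: "California")

let item = TaggedItem(
    payload: Data(),                // stands in for the saved picture data
    metaTags: [
        "geo.lat": String(tag.latitude ?? 0),
        "geo.lon": String(tag.longitude ?? 0),
        "geo.place": tag.placeName ?? ""
    ]
)
print(item.metaTags)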

In some implementations, the user interface 500 can be used to capture video when the camera object 136 is invoked by a user. The video data is saved on the mobile device 100 with associated geographic information as described above with regard to pictures data.

Referring to FIG. 6, notes (e.g., text information or audio information) about the geographic location of interest can be recorded using the notes object 146. The next object 502 and back object 504 are provided to navigate within the user interface 600. The save object 506 and delete object 508 are provided to operate on the notes entered on the mobile device 100. In the manner described above, notes received by the mobile device 100 are geographically associated with the geographic location of interest when the save object 506 is selected to save the current notes on the mobile device 100.

In some implementations, touching the indicator 406 presents a menu item to invoke a reviewing user interface 700 such as shown in FIG. 7. Objects such as view notes 702, view pictures 704 and view videos 706 can be displayed on the user interface 700. If, for example, the view notes object 702 is selected, the user interface of FIG. 6 can be displayed. If, for example, the view pictures object 704 or the view videos object 706 is selected, the user interface of FIG. 5 can be displayed. A navigation object 708 is provided, for example, to return to the previous display (e.g., map display area 402 shown in FIG. 4).

As shown in FIG. 8, in some implementations, multiple indicators 406, 800, 802, 804 and 806 can be placed on the map display area 402 to indicate multiple locations of interest. In some implementations, for each geographic location of interest, the user can capture data such as pictures, notes, audio and video and save it to the mobile device 100 with an association to the geographic location of interest as described above with regard to indicator 406. In the example interface of FIG. 8, data associated with Palo Alto, Calif. (indicator 800) and San Francisco, Calif. (indicators 802, 804 and 806) is saved on the mobile device 100.

In some implementations, the data associated with the geographic locations identified by indicators 800, 802, 804 and/or 806 can be reviewed in the reviewing user interface 700 of FIG. 7 in response to a selection of one of those indicators. For example, if indicator 800 is selected, the user interface 700 is presented; however, the information display area 400 would indicate “Stanford University.” Likewise, if indicator 802, 804 or 806 is selected, the information display area 400 would indicate “San Francisco,” or, if a higher level of detail is desired, “Downtown San Francisco,” “Golden Gate Bridge,” or “Alcatraz Island,” respectively.

In some implementations, the data on the mobile device 100 associated with locations of interest can be uploaded to a remote storage location at one of the service providers 230, 240, 250 and 260 and/or content publishers 270, or directly to an end-user device 280.

In some implementations, the data associated with locations of interest can be played back for later viewing as a multimedia presentation. For example, in response to a selection of the maps object 144, the data saved to “My Trip to California” is retrieved and displayed in a user interface such as FIG. 9.

In some implementations, the multimedia presentation begins by displaying the indicator 406 on the map display area 402 as shown in FIG. 9. The presentation continues by showing selected items, a predetermined portion, or all of the pictures, notes, audio and/or video associated with the geographic location specified by the indicator 406. For example, the user interfaces of FIGS. 4 and 5 can be displayed in response to a selection of the indicator 406 such that users can step through the pictures, notes and/or videos using the navigation objects 502 and 504.

In some implementations, as shown in FIG. 9, the multimedia presentation includes an indicator 900 illustrating a traveled route associated with the saved “My Trip to California.” The traveled route can be indicated, for example, by a line or an arrow that moves from indicator 406 to indicator 800 to illustrate a direction of travel over time. Notes, pictures, audio and/or videos associated with the location specified by indicator 800 (e.g., Stanford University) are accessible to the user as discussed above.
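
One way to derive such a traveled-route indicator is to order the saved locations by the time their data was first captured and connect consecutive stops. The Swift sketch below illustrates that ordering under assumed type names; the coordinates shown are approximate and only illustrative.

import Foundation

struct SavedLocation {
    var name: String
    var latitude: Double
    var longitude: Double
    var firstCaptured: Date   // when data was first saved for this location
}

// Order the stops by capture time and pair each one with the next to get the route segments.
func travelRoute(from locations: [SavedLocation]) -> [(String, String)] {
    let ordered = locations.sorted { $0.firstCaptured < $1.firstCaptured }
    return zip(ordered, ordered.dropFirst()).map { pair in (pair.0.name, pair.1.name) }
}

let stops = [
    SavedLocation(name: "Cupertino", latitude: 37.3323, longitude: -122.0312,
                  firstCaptured: Date(timeIntervalSince1970: 0)),
    SavedLocation(name: "Stanford University", latitude: 37.4275, longitude: -122.1697,
                  firstCaptured: Date(timeIntervalSince1970: 3_600)),
    SavedLocation(name: "San Francisco", latitude: 37.7749, longitude: -122.4194,
                  firstCaptured: Date(timeIntervalSince1970: 7_200)),
]
print(travelRoute(from: stops))
// [("Cupertino", "Stanford University"), ("Stanford University", "San Francisco")]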

As shown in FIG. 10, the multimedia presentation illustrates the traveled route indicator 900 moving to its end in San Francisco, Calif., where indicators 802, 804 and 806 are located. Data associated with indicators 802, 804 and 806 can be displayed as indicated above with regard to indicator 406. For example, notes, pictures, audio and/or video associated with downtown San Francisco (e.g., indicator 802), the Golden Gate Bridge (e.g., indicator 804), and/or Alcatraz Island (e.g., indicator 806) can be displayed.

In some implementations, all of the data saved to “My Trip to California” is available at once rather than conveying a notion of time as described above. The user interface of FIG. 10 is used for the presentation of the pictures, notes, audio and/or video associated with all (or a predetermined subset) of the locations of interest indicated by indicators 406, 800, 802, 804 and 806. The user interfaces of FIGS. 4 and 5 can be displayed in response to a selection of one of the indicators 406, 800, 802, 804 or 806 such that users can step through the pictures, notes and/or videos using the navigation objects 502 and 504.

In some implementations, the pictures, notes and/or videos are compiled into a movie using an authoring application that converts and aggregates the pictures, notes, audio and/or video into a multimedia video data file, such as an MPEG-2, MPEG-4, AVI, QuickTime, Windows Media, RealVideo, or DivX movie file. The movie can be compiled on a mobile device 100 or remotely by one of the services 230, 240, 250 or 260, or content publishers 270. For example, in some implementations, the movie begins by displaying a map of the first geographic location of interest (e.g., Cupertino) and then displaying associated pictures, notes and videos taken by the mobile device 100 in succession. The movie changes scenes to a second geographic location of interest (e.g., Stanford University) to display a map and associated pictures, notes, audio and videos. Finally, the movie continues until the pictures, notes, audio and videos for a final geographic location of interest (e.g., Alcatraz Island) are displayed.
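
The scene ordering performed by such an authoring step can be sketched as grouping captured items by location of interest and sorting each group by capture time. The Swift sketch below is illustrative only, with assumed type names; it does not represent the actual authoring application or any video-encoding step.

import Foundation

enum MediaKind { case picture, note, audio, video }

struct CapturedItem {
    var kind: MediaKind
    var locationName: String
    var capturedAt: Date
}

// For each stop on the route (in travel order), emit that stop's items in capture order,
// giving the scene-by-scene ordering used for the presentation.
func presentationOrder(for items: [CapturedItem], route: [String]) -> [CapturedItem] {
    return route.flatMap { stop in
        items.filter { $0.locationName == stop }
             .sorted { $0.capturedAt < $1.capturedAt }
    }
}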

In some implementations, the data associated with geographic locations can be requested by the end-user devices 280 for display. A suitable application running on an end-user device 280 makes a request over the wide area network 214 to, e.g., the media service 250, the content publisher 270, or the mobile device 100 for the data to be downloaded or for the compiled movie to be downloaded.

FIG. 11 is a flow diagram of an example process 1100 for indicating geographic locations of interest and acquiring data associated with the geographic locations of interest. At stage 1102, an indication is received of a geographic location of interest. For example, a location input in the information display area 400 shown in the user interface of FIGS. 4 and 5 is confirmed as a geographic location of interest by a selection of the save object 406.

At stage 1104, the geographic position information of the geographic location of interest is ascertained. For example, this information can be manually input or obtained from GPS coordinate data. At stage 1106, data associated with the location is received. For example, notes, pictures, audio and/or video associated with the geographic location of interest is input to the mobile device 100 by a selection of the camera object 136 or the notes object 146.

At stage 1108, data is stored with the geographic position information. For example, the notes, pictures, audio and/or video received at stage 1106 are saved with the geographic position information in the mobile device 100. The geographic position information can be automatically appended to the notes, pictures, audio and/or video, or manually input by the user during the save operation.

At stage 1110, it is determined if more data is to be associated with the geographic location of interest. If so, the process flows to stage 1106. If no more data is to be associated with the geographic location of interest, the process returns to stage 1102.
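
A compact way to express the control flow of process 1100 is shown in the Swift sketch below; the helper closures (nextLocation, positionFor, nextData, store) are hypothetical stand-ins for the stages described above, not part of the patent.

import Foundation

// nextLocation: stage 1102, positionFor: stage 1104, nextData: stages 1106/1110, store: stage 1108.
func runCaptureProcess(nextLocation: () -> String?,
                       positionFor: (String) -> (latitude: Double, longitude: Double),
                       nextData: () -> Data?,
                       store: (Data, (latitude: Double, longitude: Double)) -> Void) {
    while let location = nextLocation() {            // stage 1102: indication of a location of interest
        let position = positionFor(location)         // stage 1104: ascertain geographic position
        while let data = nextData() {                // stages 1106/1110: more data for this location?
            store(data, position)                    // stage 1108: save data with position information
        }
    }
}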

FIG. 12 is a flow diagram of an example process 1200 for reviewing and editing data associated with geographic locations of interest. At stage 1202, an indication is received. For example, a user selects the indicator 406 and an option to edit and/or review data associated with the geographic area identified by indicator 406. At stage 1204, a user interface is provided (e.g., launched). For example, in accordance with the type of data to be displayed, one of user interfaces 500 or 600 is launched to view and/or edit pictures, videos, and/or notes associated with the geographic location of interest identified by indicator 406.

At stage 1206, an indication of an action is received. For example, a user input from one of objects 502, 504, 506 and/or 508 is received by the mobile device 100. At stage 1208, the received action is performed. For example, a next picture is displayed if the next object 502 is selected, or a previous picture is displayed if the back object 504 is selected. A displayed picture is saved if the save object 506 is selected, or deleted if the delete object 508 is selected by the user.
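
The action handling at stages 1206 and 1208 amounts to a dispatch on the selected object, as in the Swift sketch below; the enum and handler names are assumptions, not taken from the patent.

enum ReviewAction { case next, back, save, delete }

func perform(_ action: ReviewAction,
             showNext: () -> Void, showPrevious: () -> Void,
             saveCurrent: () -> Void, deleteCurrent: () -> Void) {
    switch action {
    case .next:   showNext()        // next object 502
    case .back:   showPrevious()    // back object 504
    case .save:   saveCurrent()     // save object 506
    case .delete: deleteCurrent()   // delete object 508
    }
}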

FIG. 13 is a flow diagram of an example process 1300 for interactively displaying data associated with geographic locations of interest. At stage 1302, an indication is received. For example, after invoking the maps object 144, the user selects data associated with “My Trip to California.” At stage 1304, data is retrieved. For example, the data associated with the saved locations of interest identified by “My Trip to California” is retrieved from the memory 350 in the mobile device 100.

At stage 1306, a user interface is displayed. For example, the user interface of FIG. 9 is displayed on the mobile device 100. At stage 1308, an indication of location is received. For example, the user selects indicator 406 on the touch-sensitive display 102. At stage 1310, data is presented. For example, the interface of FIG. 7 is displayed, from which the user can select to view notes, pictures, audio and/or video. In accordance with the selection made by the user, the user interface 500 or 600 is presented to view the data requested.

After the data is presented, the flow returns to stage 1306. For example, when the user selects the back object 708, the user interface of FIG. 9 (or FIG. 10) is displayed.

FIG. 14 is a flow diagram of an example process 1400 for playback of data associated with one or more geographic locations of interest. At stage 1402, an indication is received. For example, a location input in the information display area 400 shown in the user interface of FIGS. 4 and 5 is confirmed as a geographic location of interest, or a selection of an object on the touch-sensitive display 102 is received.

At stage 1404, an application is launched. For example, a playback application (e.g., media player) executing on the mobile device 100 or end-user device 280 is launched. At stage 1406, data is retrieved. For example, data associated with the geographic location of interest is retrieved from the memory 350 or from a remote location and communicated over the wide area network and/or wireless network to the mobile device 100 or end-user device 280.

At stage 1408, a user interface is presented. For example, the user interface associated with the media player is displayed on the mobile device 100 or end-user device 280.

At stage 1410, the data associated with the geographic location of interest is presented in the user interface. In accordance with a playback mode, the notes, pictures, and/or video associated with the geographic locations of interest are played back in sequence without any user interaction.
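
A minimal sketch of such a non-interactive playback loop, in Swift with assumed type names, might simply advance through the items on a fixed per-item duration:

import Foundation

struct PlaybackItem {
    var title: String
    var duration: TimeInterval   // how long the item stays on screen
}

// Present each item in turn, holding it for its duration, with no user interaction.
func playBack(_ items: [PlaybackItem], present: (PlaybackItem) -> Void) {
    for item in items {
        present(item)                                  // hand the item to the media player UI
        Thread.sleep(forTimeInterval: item.duration)   // then advance to the next item
    }
}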

The disclosed embodiments can be implemented in a computing system that includes a back-end component, e.g., as a data server, or that includes a middleware component, e.g., an application server, or that includes a front-end component, e.g., a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation of what is disclosed here, or any combination of one or more such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication, e.g., a communication network. Examples of communication networks include a local area network (“LAN”) and a wide area network (“WAN”), e.g., the Internet.

The computing system can include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.

While this specification contains many specifics, these should not be construed as limitations on the scope of what is being claimed or of what may be claimed, but rather as descriptions of features specific to particular embodiments. Certain features that are described in this specification in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable sub-combination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a sub-combination or variation of a sub-combination.

Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of various system components in the embodiments described above should not be understood as requiring such separation in all embodiments, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products.

Various modifications may be made to the disclosed implementations and still be within the scope of the following claims.

US567783718 Oct 199514 Oct 1997Trimble Navigation, Ltd.Dial a destination system
US56848591 May 19954 Nov 1997Motorola, Inc.Method and apparatus for downloading location specific information to selective call receivers
US56892527 Mar 199618 Nov 1997Lucent Technologies Inc.For identifying a desired map route
US568927012 Mar 199618 Nov 1997Pinterra CorporationNavigation and positioning system and method using uncoordinated beacon signals
US568943118 Apr 199518 Nov 1997Leading Edge Technologies, Inc.Golf course yardage and information system
US570847826 Jun 199613 Jan 1998Sun Microsystems, Inc.Computer system for enabling radio listeners/television watchers to obtain advertising information
US571739213 May 199610 Feb 1998Eldridge; MartyPosition-responsive, hierarchically-selectable information presentation system and control program
US573207416 Jan 199624 Mar 1998Cellport Labs, Inc.Mobile portable wireless communication system
US57426665 Oct 199421 Apr 1998Tele Digital Development, Inc.Emergency mobile telephone
US574586529 Dec 199528 Apr 1998Lsi Logic CorporationFor a geographical area
US574810923 Dec 19945 May 1998Nissan Motor Co., Ltd.Apparatus and method for navigating vehicle to destination using display unit
US57521867 Jun 199512 May 1998Jeman Technologies, Inc.Access free wireless telephony fulfillment service system
US575443017 Mar 199519 May 1998Honda Giken Kogyo Kabushiki KaishaCar navigation system
US57580491 May 199226 May 1998International Business Machines CorporationMethod of and apparatus for providing automatic detection and processing of an empty multimedia data object
US57607736 Jan 19952 Jun 1998Microsoft CorporationMethods and apparatus for interacting with data objects using action handles
US57677953 Jul 199616 Jun 1998Delta Information Systems, Inc.GPS-based information system for vehicles
US577482424 Aug 199530 Jun 1998The Penn State Research FoundationMap-matching navigation system
US577482912 Dec 199530 Jun 1998Pinterra CorporationNavigation and positioning system and method using uncoordinated beacon signals in conjunction with an absolute positioning system
US579363014 Jun 199611 Aug 1998Xerox CorporationSystem for transferring digital information
US579636523 Dec 199618 Aug 1998Lewis; Peter T.Method and apparatus for tracking a moving object
US57966131 Aug 199518 Aug 1998Aisin Aw Co., Ltd.Navigation system for vehicles including present position calculating means
US58060182 Jun 19948 Sep 1998Intellectual Property Development Associates Of Connecticut, IncorporatedMethods and apparatus for updating navigation information in a motorized vehicle
US582530622 Aug 199620 Oct 1998Aisin Aw Co., Ltd.Navigation system for vehicles
US58258841 Jul 199620 Oct 1998Thomson Consumer ElectronicsMethod and apparatus for operating a transactional server in a proprietary database environment
US583155217 Apr 19973 Nov 1998Mitsubishi Denki Kabushiki KaishaTraffic information display unit
US58350616 Jun 199510 Nov 1998Wayport, Inc.Method and apparatus for geographic-based communications service
US583908617 Jul 199517 Nov 1998Sumitomo Electric Industries, Ltd.On-board route display receiving information from external device
US58452279 Feb 19961 Dec 1998Peterson; Thomas D.Method and apparatus for providing shortest elapsed time route and tracking information to users
US584837318 Jul 19978 Dec 1998Delorme Publishing CompanyComputer aided map location system
US586224413 Jul 199519 Jan 1999Motorola, Inc.Satellite traffic reporting system and methods
US58671109 Aug 19962 Feb 1999Hitachi, Ltd.Information reporting system
US587068621 Aug 19979 Feb 1999Ag-Chem Equipment Co., Inc.Intelligent Mobile product application control system
US587252623 May 199616 Feb 1999Sun Microsystems, Inc.GPS collision avoidance system
US587306814 Jun 199416 Feb 1999New North Media Inc.Display based marketing message control system and method
US588358024 Mar 199716 Mar 1999Motorola, Inc.Geographic-temporal significant messaging
US588726916 Jan 199623 Mar 1999Delco Elecronics CorporationData product authorization control for GPS navigation system
US589245421 Oct 19966 Apr 1999Trimble Navigation Ltd.Hybrid monitoring of location of a site confinee
US589389830 Jul 199613 Apr 1999Alpine Electronics, Inc.Navigation system having intersection routing using a road segment based database
US58986805 Nov 199627 Apr 1999Worldspace, Inc.System for providing location-specific data to a user
US58999549 Apr 19964 May 1999Xanavi Informatics CorporationCurrent position calculating system having a function for correcting a distance factor
US590545124 Apr 199718 May 1999Denso CorporationVehicular navigation system
US590846527 Sep 19961 Jun 1999Aisin Aw Co., Ltd.Navigation system for displaying a structure-shape map
US59107999 Apr 19968 Jun 1999International Business Machines CorporationFor a portable data processor
US7421422 *3 Apr 20032 Sep 2008Wsi CorporationMethod for graphical interaction with geographic databases for interactive broadcast presentation
US20080284642 *17 May 200720 Nov 2008International Business Machines CorporationOptimizing bandwidth of a global positioning system
Non-Patent Citations
Reference
1"27 Countries in your pocket"; [online] [Retrieved on Sep. 29, 2005] Retrieved from the Internet <URL: http://www.mio-tech.be/en/printview/press-releases-2005-09-29.htm; 1 page.
2"Animated Transition"; [online] [Retrieved on Oct. 16, 2006] Retrieved from the Internet <URL: http://designinterfaces.com/Animated-Transition; 2 pages.
3"Cyberguide: a mobile context-aware tour guide", Wireless Networks Archive (Special Issue: Mobile computing and networking; selecting papers from MobiCom '96), 3(5):421-433, 1997.
4"DaimlerCrysler Guide5 Usecases Overview Map", 1 page (no reference date).
5"Frontiers in electronic media", Interactions Archive 4(4):32-64, 1997.
6"International Roaming Guide-Personal Experience(s) from Customer and Community Member"; [online] [Retrieved Jun. 26, 2006] Retrieved from the Internet <URL: http://forums.cingular.com/cng/board/message?board.id=1185; 6 pages.
7"iPhone Software/Hardware Hack: LocoGPS-GPS Add-on for the iPhone"; [online] [Retrieved on Dec. 25, 2007] Retrieved from the Internet <URL: http://www.iphonehacks.com/iphone-applications/index.html; 41 pages.
8"Location-aware mobile applications based on directory services", International Conference on Mobile Computing and Networking Archive, Proceedings on the 3rd Annual ACM/IEEE International Conference on Mobile Computing and Networking, Budapest, Hungary, pp. 23-33, 1997.
9"Mio 269+ Users Manula"; 2005; 44 pages.
10"New program for mobile blogging for PocketPC released: MY BLOG"; [online] [Retrieved on Apr. 5, 2006]; Retrieved from the Internet, URL: http://msmobiles.com/news.php/4067.html.
11"Numbering and Dialing Plan within the United States", Alliance for Telecommunications Industry Solutions; 2005; 17 pages.
12"Travel Time Data Collection Handbook-Chapter 5: Its Probe Vehicle Techniques", FHWA-PL-98-035 Report, Department of Transport, University of Texas, Mar. 1998; [online] [Retrieved from the Internet at http://www.fhwa.dot.gov/ohim/handbook/chap5.pdf.
13"User-centered design of mobile solutions", NAMAHN, 2006, 18 pages.
14"User's Manual MioMap 2.0"; Aug. 2005; 60 pages.
15"Windows Live Search for Mobile Goes Final, Still Great"; [online] [Retrieved on Mar. 11, 2007]; Retrieved from the Internet, URL: http://gizmodo.com/gadgets/software/windows-live-search-for-mobile-goes-final-still-great-236002.php; 3 pages.
16"Windows Mobile 6 Professional Video Tour"; [online] [Retrieved on Mar. 11, 2007]; Retrieved from the Internet, URL: http://gizmodo.com/gadgets/cellphones/windows-mobile-6-professional-video-tour-237039.php; 4 pages.
17"Windows Mobile"; Microsoft; 2007, 2 pages.
18"3rd Generation Partnership Project (3GPP); Technical Specification Group (TSG) RAN; Working Group 2 (WG2); Report on Location Services (LCS)," 3G TR 25.923 v.1.0.0, Apr. 1999, 45 pages.
19"3rd Generation Partnership Project (3GPP); Technical Specification Group (TSG) RAN; Working Group 2 (WG2); Report on Location Services," TS RAN R2.03 V0.1.0, Apr. 1999, 43 pages.
20"3rd Generation Partnership Project; Technical Specification Group Radio Access Network; Stage 2 Functional Specification of Location Services in UTRAN," 3G TS 25.305 v.3.1.0, Mar. 2000, 45 pages.
21"3rd Generation Partnership Project; Technical Specification Group Services and System Aspects; Functional stage 2 description of location services in UMTS," 3G TS 23.171 v.1.1.0, Nov. 1999, 42 pages.
22"Animated Transition"; [online] [Retrieved on Oct. 16, 2006] Retrieved from the Internet <URL: http://designinterfaces.com/Animated—Transition; 2 pages.
23"Enabling UMTS / Third Generation Services and Applications," No. 11 Report from the UMTS Forum, Oct. 2000, 72 pages.
24"Error: could not find a contact with this e-mail address." Outlookbanter.com. Dec. 2006, 12 pages.
25"Estonian operator to launch world's first Network-based location services," Ericsson Press Release, Oct. 11, 1999, 2 pages.
26"International Roaming Guide—Personal Experience(s) from Customer and Community Member"; [online] [Retrieved Jun. 26, 2006] Retrieved from the Internet <URL: http://forums.cingular.com/cng/board/message?board.id=1185; 6 pages.
27"iPhone Software/Hardware Hack: LocoGPS—GPS Add-on for the iPhone"; [online] [Retrieved on Dec. 25, 2007] Retrieved from the Internet <URL: http://www.iphonehacks.com/iphone—applications/index.html; 41 pages.
28"LaBarge in joint venture on bus system," Internet: URL: http://www.bizjournals.com/stlouis/stories/1998/08/10/focus2.html?t-printable, Aug. 7, 1998, 1 page.
29"New Handsets Strut Their Stuff At Wireless '99," Internet: URL: http://findarticles.com/p/articles/mi—m0BMD/is—1999—Feb—11/ai—n27547656/ downloaded from Internet on Feb. 11, 1999, 3 pages.
30"Report on Location Service feature (LCS) 25.923 v1.0.0," TSG-RAN Working Group 2 (Radio layer 2 and Radio layer 3), Berlin, May 25-28, 1999, 45 pages.
31"Revised CR to 09/31 on work item LCS," ETSI SMG3 Plenary Meeting #6, Nice, France, Dec. 13-15, 1999. 18 pages.
32"School Buses to Carry Noticom's First Application," Internet: URL: http://findarticles.com/p/articles/mi—m0BMD/is—1999—Feb—17/ai—n27547754/ downloaded from the Internet on Feb. 17, 1999, 2 pages.
33"Travel Time Data Collection Handbook—Chapter 5: Its Probe Vehicle Techniques", FHWA-PL-98-035 Report, Department of Transport, University of Texas, Mar. 1998; [online] [Retrieved from the Internet at http://www.fhwa.dot.gov/ohim/handbook/chap5.pdf.
34Abowd et al., "Context-awareness in wearable and ubiquitous computing," 1st International Symposium on Wearable Computers, Oct. 13-14, 1997, Cambridge, MA, 9 pages.
35Akerblom, "Tracking Mobile Phones in Urban Areas," Goteborg University Thesis, Sep. 2000, 67 pages.
36Anand et al., "Quantitative Analysis of Power Consumption for Location-Aware Applications on Smart Phones", IEEE International Symposium on Industrial Electronics, 2007.
37Ayatsuka et al., "UbiquitousLinks: Hypermedia Links Embedded in the Real World, Technical Report of Information Processing Society, 96-HI-67," Information Processing Society of Japan, Jul. 11, 1996, 96(62):23-30.
38Balliet, "Transportation Information Distribution System", IBM Technical Disclosure Bulletin, [online] [Retrieved Nov. 7, 2008] Retrieved from the Internet, URL: https://www.delphion.com/tdbs/tdb?order=86A+61395; Jun. 1986; 2 pages.
39Balsiger et al., "MOGID: Mobile Geo-depended Information on Demand," Workshop on Position Dependent Information Services (W3C-WAP), 2000, 8 pages.
40Beard et al., "Estimating Positions and Paths of Moving Objects", IEEE 2000, pp. 1-8.
41Bederson, B.B., Audio Augmented Reality: A Prototype Automated Tour Guide [online] [retrieved on Aug. 30, 2002] [retrieved from http://www.cs.umd.edu/~bederson/papers/chi-95-aar/] pp. 1-4.
43Benefon ESC! GSM+GPS Personal Navigation Phone, benefon.com, Copyright 2001, 4 pages.
44Berman et al., "The Role of Dead Reckoning and Inertial Sensors in Future General Aviation Navigation", IEEE, 1998, pp. 510-517.
45Bevly et al., "Cascaded Kalman Filters for Accurate Estimation of Multiple Biases, Dead-Reckoning Navigation, and Full State Feedback Control of Ground Vehicles", IEEE Transactions on Control Systems in Technology, vol. 15, No. 2, Mar. 2007, pp. 199-208.
46Binzhuo et al., "Mobile Phone GIS Based on Mobile SVG", IEEE 2005.
47Blumenberg et al., U.S. Appl. No. 12/119,316, filed May 12, 2008.
48Bokharouss et al., "A Location-Aware Mobile Call Handling Assistant", International Conference on Advanced Information Networking and Applications Workshops, 2007.
49Bonsignore, "A Comparative Evaluation of the Benefits of Advanced Traveler Information System (ATIS) Operational Tests," MIT Masters Thesis, Feb. 1994, 140 pages.
50Boonsrimuang et al., "Mobile Internet Navigation System", IEEE, 2002, pp. 325-328.
51Borsodi, "Super Resolution of Discrete Arrivals in a Cellular Geolocation System," University of Calgary Thesis, Apr. 2000, 164 pages.
52Brown, "The stick-e document: a framework for creating context-aware applications," Electronic Publishing, 1995, 8:259-272.
53Brown, "Triggering Information by Context," Personal Technologies, 1998, 2:18-27.
54Budka et al., "A Bayesian method to Improve Mobile Geolocation Accuracy", IEEE, 2002, pp. 1021-1025.
55Burnett, "Usable Vehicle Navigation Systems: Are We There Yet?" Vehicle Electronic Systems 2000, Jun. 29-30, 2000, 3.1.1-3.1.12.
56Camp et al., "A computer-based method for predicting transit time systems", Decision Sciences, vol. 5, pp. 339-346, 1974.
57Carew; "Phones that tell you where to drive, meet, eat"; [online] [Retrieved May 26, 2007]; Retrieved from the Internet <URL httlp://news.yahoo.com/s/nm/20070525/wr-nm/column-pluggedin-dc-2&printer=1;-ylt=Ahqaftn7xmlS2r0FZFeu9G4ht.cA; 2 pages.
59Challe, "CARMINAT-An Integrated information and guidance system," Vehicle Navigation and Information Systems Conference, Oct. 20-23, 1991, Renault-Direction de la Recherche, Rueil-Malmaison, France.
61Change Request for "U.S. specific Emergency Services requirements included as an informative annex," Nov. 29, 1999, 2 pages.
62Charny, "AT&T puts 411 to the text"; [online] [Retrieved Mar. 4, 2009]; Retrieved from the Internet <URL http://news.cnet.com/ATT-puts-411-to-the-text/2100-1039-3-1000669.html; May 8, 2003; 2 pages.
64Cheverst et al., "Architectural Ideas for the Support of Adaptive Context-Aware Applications," Proceedings of Workshop on Infrastructure for Smart Devices—How to Make Ubiquity an Actuality, HUC'00, Bristol, Sep. 2000, 3 pages.
65Cheverst et al., "Design of an Object Model for a Context Sensitive Tourist Guide," Computers and Graphics, 1999, 23(6):883-891.
66Cheverst et al., "Developing Interfaces for Collaborative Mobile Systems," 1999, 15 pages.
67Cheverst et al., "Experiences of Developing and Deploying a Context-Aware Tourist Guide: The GUIDE Project," 2000, pp. 20-31.
68Cheverst et al., "Exploiting Context to Support Social Awareness and Social Navigation," SIGGROUP Bulleting Dec. 2000, 21(3):43-48.
69Cheverst et al., "Services to Support Consistency in Mobile Collaborative Applications," Proc. 3rd International Workshop on Services in Distributed Networked Environments, 1996, 8 pages.
70Cheverst et al., "Sharing (Location) Context to Facilitate Collaboration Between City Visitors," 2000, 8 pages.
71Cheverst et al., "Supporting Collaboration in Mobile-aware Groupware," Workshop on Handheld CSCW, 1998, 6 pages.
72Cheverst et al., "The Role of Connectivity in Supporting Context-Sensitive Applications," HUC'99, LNCS 1707, 1999, pp. 193-209.
73Cheverst et al., "The Support of Mobile-Awareness in Collaborative Groupware," Personal Technologies, 1999, 3:33-42.
74Cho et al., A Traveler Information Service Structure in Hybrid T-DMB and Cellular Communication Network, Broadcast Systems Research Group, IEEE, 2006, pp. 747-750.
75Christie et al., "Development and Deployment of GPS wireless devices for E911 and Location based services", IEEE 2002.
76Chua et al., "Intelligent Portal for Event-triggered SMS Alerts", 2nd International Conference on Mobile Technology, Applications and Systems, 2005.
77Civilis et al., "Efficient Tracking of Moving Objects with Precision Guarantees", IEEE, Proceedings of the First Annual International Conference on Mobile and Ubiquitous Systems: Networking and Services, 2004, 10 pages.
78Clarke et al., "Development of Human Factors Guidelines for Advanced Traveler Information Systems (ATIS) and Commercial Vehicle Operations (CVO): Comparable Systems Analysis," U.S. Department of Transportation Federal Highway Administration, Publication No. FHWA-RD-95-197, Dec. 1996, 212 pages.
79Costa et al., "Experiments with Reflective Middleware," Proceedings of the ECOOP'98 Workshop on Reflective Object-Oriented Programming and Systems, ECOOP'98 Workshop Reader, 1998, 13 pages.
80Dalrymple, "Google Maps adds locator, but not for iPhone," [online] [Retrieved Nov. 30, 2007]; Retrieved from the Internet URL: http://news.yahoo.com/s/macworld/20071130/tc-macworld/googlemaps20071130-0&printer=1;-ylt=Auvf3s6LQK-pOaJ1b954T-DQn6gB; 1 page.
82Davies et al., "‘Caches in the Air’: Disseminating Tourist Information in the Guide System," Second IEEE Workshop on Mobile Computer Systems and Applications, Feb. 25-26, 1999, 9 pages.
83Davies et al., "L2imbo: A distributed systems platform for mobile computing," Mobile Networks and Applications, 1998, 3:143-156.
84Dey et al., "CyberDesk: a framework for providing self-integrating context-aware services," Knowledge-Based Systems, 1998, 11:3-13.
85Dey, "Context-Aware Computing: The CyberDesk Project," [online] Retrieved from the Internet: URL: http://www.cc.gatech.edu/fce/cyberdesk/pubs/AAAI98/AAAI98.html; AAAI '98 Spring Symposium, Stanford University, Mar. 23-25, 1998, downloaded from the Internet on Aug. 6, 2010, 8 pages.
86Dibdin, Peter, "Where are mobile location based services?", Dec. 14, 2001, pp. 1-8.
87Digital cellular telecommunications system (Phase 2+); Location Services (LCS); Service description, Stage 1 (GSM 02.71) ETSI, Apr. 1999, 22 pages.
88Dix et al., "Exploiting Space and Location as a Design Framework for Interactive Mobile Systems," ACM Transactions on Computer-Human Interaction (TOCHI)—Special issue on human-computer interaction with mobile systems, 2000, 7(3):285-321.
89Drane and Rizos, "Role of Positioning Systems in ITS," Positioning Systems in Intelligent Transportation Systems, Dec. 1997, pp. 312, 346-349.
90Drane et al., "The accurate location of mobile telephones", Third Annual World Congress on Intelligent Transport Systems, Orlando, Florida, Oct. 1996.
91Drane et al., "Positioning GSM Telephones," IEEE Communications Magazine, Apr. 1998, pp. 46-59.
92Dunn et al., "Wireless Emergency Call System", IBM TDB, Sep. 1994.
93Ebine, "Dual Frequency resonant base station antennas for PDC systems in Japan", IEEE, pp. 564-567, 1999.
94Efstratiou and Cheverst, "Reflection: A Solution for Highly Adaptive Mobile Systems," 2000 Workshop on Reflective Middleware, 2000, 2 pages.
95Efstratiou et al., "Architectural Requirements for the Effective Support of Adaptive Mobile Applications," 2000, 12 pages.
96Evans, "In-Vehicle Man-Machine Interaction the Socrates Approach", Vehicle Navigation & Information System Conference Proceedings, Aug. 31-Sep. 2, 1994, pp. 473-477.
97Feddema et al., "Cooperative Sentry Vehicles and Differential GPS Leapfrog," 2000, United States Department of Energy, pp. 1-12.
98Fischer et al., "System Performance Evaluation of Mobile Positioning Methods," IEEE, Aug. 2002, pp. 1962-1966.
99Flinn and Satyanarayanan, "PowerScope: A Tool for Profiling the Energy Usage of Mobile Applications," Proc. WMCSA '99 Second IEEE Workshop on Mobile Computing Systems and Applications, Feb. 25-26, 1999, 9 pages.
100FM 3-25.26 Map Reading and Land Navigation Field Manual No. 3-25.26, Headquarters Department of the Army, Washington, DC [online] [retrieved on Apr. 9, 2004] [retrieved from http://155.217.58.58/cgi-bin/atdl.dll/fm/3-25.26/toc.htm] Jul. 20, 2001, pp. 1-7 and J-1 to J-3.
101French and Driscoll, "Location Technologies for ITS Emergency Notification and E911," Proc. 1996 National Technical Meeting of the Institute of Navigation, Jan. 22-24, 1996, pp. 355-359.
102Freundschuh, "Does ‘Anybody’ Really Want (Or Need) Vehicle Navigation Aids?" First Vehicle Navigation and Information System Conference, Sep. 11-13, 1989, Toronto, Canada, 5 pages.
103Friday et al., "Developing Adaptive Applications: The MOST Experience," J. Integrated Computer-Aided Engineering, 1999, pp. 143-157.
104Gould, "The Provision of Usable Navigation Assistance: Considering Individual Cognitive Ability," First Vehicle Navigation and Information System Conference, Sep. 11-13, 1989, Toronto, Canada, 7 pages.
105GPS 12 Personal Navigator Owner's Manual & Reference, Garmin Corporation, Jan. 1999, pp. 1-60.
106Green et al., "Suggested Human Factors Design Guidelines for Driver Information Systems," Technical Report UMTRI-93-21, Nov. 1993, 119 pages.
107Gunnarsson et al., "Location Trial System for Mobile Phones," IEEE, 1998, pp. 2211-2216.
108Guo et al., "An Intelligent Query System based on Chinese Short Message Service for Restaurant Recommendation", IEEE 2007, 1 page.
109Hameed et al., "An Intelligent Agent-Based Medication and Emergency System", IEEE 2006.
110Helal et al., "Drishti: An Integrated Navigation System for Visually Impaired and Disabled", Fifth International Symposium on Wearable Computers (ISWC'01), IEEE, 2001, pp. 149-156.
111Herz, U.S. Appl. No. 12/270,814, filed Nov. 13, 2008.
112Hodes and Katz, "Composable ad hoc location-based services for heterogeneous mobile clients," Wireless Networks, 1999, 5:411-427.
113Hohman et al., "GPS Roadside Integrated Precision Positioning System", Position Location and Navigation Symposium (IEEE 2000), pp. 221-230.
114Hoogenraad, "Location Dependent Services," 3rd AGILE Conference on Geographic Information Science, Helsinki/Espoo, Finland, May 25-27, 2000, pp. 74-77.
115International Numbering and SMS-Type of Numbering, TON, Numbering Plan Indicator, NPI, [online] [Retrieved Jan. 5, 2007] Retrieved from the Internet <URL: http://www.activeexperts.com/support/activsms/tonnpi/.
117International Search Report and Written Opinion, dated Jun. 9, 2008, issued in International Application No. PCT/US2007/088880, filed Dec. 27, 2007.
118International Search Report and Written Opinion, dated Oct. 1, 2009, issued in PCT/US2009/041298.
119Jain, R., Potential Networking Applications of Global Positioning Systems (GPS) [online] [retrieved on Nov. 18, 2008] [retrieved from http://arxiv.org/ftp/cs/papers/9809/9809079.pdf] OSU Technical Report TR-24, Apr. 1996, pp. 1-40.
120Jirawimut et al., "A Method for Dead Reckoning Parameter Correction in Pedestrian Navigation System", IEEE Transactions on Instrumentation and Measurement, vol. 52, No. 1, Feb. 2003, pp. 209-215.
121Johnson, U.S. Appl. No. 11/827,065, filed Jul. 10, 2007.
122Johnson, U.S. Appl. No. 12/044,363, filed Mar. 7, 2008.
123Jose and Davies, "Scalable and Flexible Location-Based Services for Ubiquitous Information Access," HUC'99, LNCS 1707, 1999, pp. 52-66.
124Ju et al., "RFID Data Collection and Integration based on Mobile Agent", IEEE, 2006.
125Kbar et al., "Mobile Station Location based on Hybrid of Signal Strength and Time of Arrival", IEEE, 2005.
126Khattak et al., "Bay Area ATIS Testbed Plan," Research Reports, California Partners for Advanced Transit and Highways (PATH), Institute of Transportation Studies, UC Berkeley, Jan. 1, 1992, 83 pages.
127Klinec and Nolz, "Nexus-Positioning and Communication Environment for Spatially Aware Applications," IAPRS, Amsterdam, 2000, 7 pages.
128Koide et al., "3-D Human Navigation System with Consideration of Neighboring Space Information", IEEE International Conference on Systems, Man and Cybernetics, 2006 (SMC '06), vol. 2, (Oct. 8-11, 2006), pp. 1693-1698.
129Kovacs et al., "Adaptive Mobile Access to Context-aware Services," Proc. ASAMA '99 Proc. First International Symposium on Agent Systems and Applications Third International Symposium on Mobile Agents, IEEE Computer Society Washington, DC, 1999, 12 pages.
130Kreller et al., "A Mobile-Aware City Guide Application," ACTS Mobile Communication Summit, 1998, Rhodes, Greece, 7 pages.
131Kreller et al., "UMTS: A Middleware Architecture and Mobile API/Approach," IEEE Personal Communications, Apr. 1998, pp. 32-38.
132Kugler and Lechner, "Combined Use of GPS and LORAN-C in Integrated Navigation Systems," Fifth International Conference on Satellite Systems for Mobile Communications and Navigation, London, UK, May 13-15, 1996, pp. 199-207.
133Kyriazakos et al., "Optimization of the Handover Algorithm based on the Position of the Mobile Terminals," Communications and Vehicular Technology, Oct. 2000, pp. 155-159.
134Leonhardt and Magee, "Multi-Sensor Location Tracking," MOBICOM 98, Dallas, TX, pp. 203-214.
135Leonhardt and Magee, "Towards a general location service for mobile environments," Proc. Third International Workshop on Services in Distributed and Networked Environments, Jun. 3-4, 1996, 8 pages.
136Lloyd et al., "Cellular phone base stations installation violate the Electromagnetic Compatibility regulations", IEEE, 2004.
137Long et al., "Rapid Prototyping of Mobile Context-Aware Applications: The Cyberguide Case Study," MobiCom '96, 1996, 11 pages.
138Low et al., U.S. Appl. No. 12/233,358, filed Sep. 18, 2008.
139Lusky et al., "Mapping the Present," ColoradoBiz, Nov. 1999, 26(11):16-17.
140Mahmassani et al., "Providing Advanced and Real-Time Travel/Traffic Information to Tourists," Center for Transportation Research, Bureau of Engineering Research, The University of Texas at Austin, Oct. 1998, 15 pages.
141Manabe et al., "On the M-CubITS Pedestrian Navigation System", IEEE, 2006, pp. 793-798.
142Mark, "A Conceptual Model for Vehicle Navigation Systems," First Vehicle Navigation and Information System Conference, Sep. 11-13, 1989, Toronto, Canada, 11 pages.
143Maxwell et al., "Alfred: The Robot Waiter Who Remembers You," AAAI Technical Report WS-99-15, 1999, 12 pages.
144McCarthy and Meidel, "ACTIVEMAP: A Visualization Tool for Location Awareness to Support Informal Interactions," HUC '99, LNCS 1707, 1999, pp. 158-170.
145Meier et al., "Location-Aware Event-Based Middleware: A Paradigm for Collaborative Mobile Applications?", Sep. 2003.
146Microsoft Outlook 2003 User's Guide, http://opan.admin.ufl.edu/user—guides/outlook2003.htm. Aug. 2004, 17 pages.
147Miller et al., "Synchronization of Mobile XML Databases by Utilizing Deferred Views", IEEE 2004.
148Miller et al., "Integrating Hierarchical Navigation and Querying: A User Customizable Solution," ACM Multimedia Workshop on Effective Abstractions in Multimedia Layout, Presentation, and Interaction, San Francisco, CA, Nov. 1995, 8 pages.
149Muraskin, "Two-Minute Warnings for School Bus Riders," Internet: URL: http://www.callcentermagazine.com/shared/printableArticle.jhtml:jsessionid=PQH1SZXW... Jul. 1, 1999, 3 pages.
150Nagao et al., Walk Navi: A Location-Aware Interactive Navigation/Guideline System and Software III, First edition, pp. 9-48, published by Kindai-Kagaku-Sya Co. Ltd., Dec. 10, 1995.
151Nardi et al., "Integrating Communication and Information through Contact Map", Communications of the ACM, vol. 45, No. 4, Apr. 2002.
152Ni and Deakin, "On-Board Advanced Traveler Information Systems," Dec. 1, 2002, 10 pages.
153Noonan and Shearer, "Intelligent Transportation Systems Field Operational Test Cross-Cutting Study Advance Traveler Information systems," Intelligent Transportation Systems Field Operational Test Cross-Cutting Study, Sep. 1998, 26 pages.
154Northard, "Docking Station Communication Link", IBM TDB, Feb. 1994.
155O'Grady et al., "A Tourist-Centric Mechanism for Interacting with the Environment," Proceedings of the First International Workshop on Managing Interactions in Smart Environments (MANSE '99), Dublin, Ireland, Dec. 1999, pp. 56-67.
156Oh et al., "Spatial Applications Using 4S Technology for Mobile Environment", IEEE 2002.
157Paksoy et al., "The Global Position System-Navigation Tool of the Future", Journal of Electrical & Electronics, 2002, vol. 2, No. 1, pp. 467-476.
159Parikh, "Tele Locate", IBM Technical Disclosure Bulletin, [online] [Retrieved Nov. 7, 2008] Retrieved from the Internet, URL: https://www.delphion.com/tdbs/tdb?order=92A+62775; Sep. 1992; 1 page.
160Partial International Search Report, dated Jul. 29, 2008, issued in corresponding PCT/US2008/050295.
161Pascoe et al., "Developing Personal Technology for the Field," Personal Technologies, 1998, 2:28-36.
162Pfoser et al., "Dynamic Travel Time Maps-Enabling Efficient Navigation", Proceedings of the 18th International Conference on Scientific and Statistical Database Management (SSDBM'06), IEEE, 10 pages.
164Popescu-Zeletin et al., "Applying Location-Aware Computing for Electronic Commerce: Mobile Guide," Proc. 5th Conference on Computer Communications, AFRICOM-CCDC'98,Oct. 20-22, 1998, 14 pages.
165Portfolio 2007; [online] [Retrieved on Jun. 14, 2007]; Retrieved from the Internet, URL: http://eric.wahlforss.com/folio; 3 pages.
166Pungel, "Traffic control—beat the jam electronically," Funkschau, 1988, 18:43-45.
167RD 409052, Research Disclosure Alerting Abstract, "Location dependent information for satellite based vehicle communication-required application of Global Position System (GPS) to automatically extract relevant portions of data package as vehicle changes position," May 10, 1998, 1 page.
168Rekimoto, J., Augment-able Reality: Situated Communication through Physical and Digital Spaces, iswc, pp. 68, Second International Symposium on Wearable computers (ISWC'98), 1998, pp. 1-8.
169Review Guide-Google Maps for mobile (beta); Google; 2006; 7 pages.
171Rillings and Betsold, "Advanced driver information systems," Vehicular Technology, IEEE Vehicular Technology Society, 1991, 40:31-40.
172Rogers et al., "Adaptive User Interfaces for Automotive Environments", IEEE Intelligent Vehicles Symposium 2000, Oct. 3-5, 2000, pp. 662-667.
173Rozier, J., Hear & There: An Augmented Reality System of Linked Audio, Proceedings of the International Conference on Auditory Display, Atlanta, GA, Apr. 2000, pp. 1-6.
174Samadani et al., "PathMaker: Systems for Capturing Trips", IEEE (2004) International Conference on Multimedia and Expo., Publication Date: Jun. 27-30, 2004, vol. 3, pp. 2123-2126, 2004.
175Sazegari et al., U.S. Appl. No. 12/122,339, filed May 16, 2008.
176Schreiner, "Where We At? Mobile Phones Brings GPS to the Masses", IEEE Computers Society, May/Jun. 2007, pp. 6-11.
177Serafin et al., "Functions and Features of Future Driver Information Systems," Technical Report UMTRI-91-16, May 1991, 104 pages.
178Sharpe et al., U.S. Appl. No. 12/434,582, filed May 1, 2009.
179Sharpe et al., U.S. Appl. No. 12/434,586, filed May 1, 2009.
180Shekhar and Liu, "Genesis and Advanced Traveler Information Systems (ATIS): Killer Applications for Mobile Computing?" NSF Mobidata Workshop on Mobile and Wireless Information Systems, Nov. 1994, 20 pages.
181Shibata et al., "Development and Integration of Generic Components for a Teachable Vision-Based Mobile Robot," IEEE/ASME Transactions on Mechatronics, 1996, 1(3):230-236.
183Spohrer. "New Paradigms for Using Computers", 1997; retrieved from the Internet, URL: <http://almaden.ibm.com/npuc97/1997/spohrer.htm>.
184Sung et al., "Towards Reliable Peer-to-Peer Data Sharing over Mobile Ad hoc Networks", IEEE, 2005.
185Tarumi et al., "Public Applications of SpaceTag and Their Impacts," Digital Cities, LNCS 1765, 2000, pp. 350-363.
186Tebbutt, "Dial your way out of the woods," The Australian, Feb. 2000, 1 page.
187Tijerina et al., "Driver Workload Assessment of Route Guidance System Destination Entry While Driving: A Test Track Study," Proceedings of the 5th ITS World Congress, Oct. 12-16, 1998, Seoul, Korea, 9 pages.
188Tso et al., "Always On, Always Connected Mobile Computing," Mobile Communications Operation—Mobile Handheld Products Group, 1996, pp. 918-924.
189Tsuzawa and Okamoto, "Advanced Mobile Traffic Information and Communication System," First Vehicle Navigation and Information Systems Conference, Sep. 11-13, 1989, Toronto, Canada, Abstract only.
190US 6,731,928, May 4, 2004, Tanaka (withdrawn).
191Van Os et al., U.S. Appl. No. 12/165,413, filed Jun. 30, 2008.
192Wang and Lin, "Location Aware Information Agent over WAP," Tamkang Journal of Science and Engineering, 2000, 3(2):107-115.
193Wang et al., "A Unified Vehicle Supervising and Traffic Information System", IEEE, 1996, pp. 968-972.
194Weiss et al., "Zone services—An approach for location-based data collection", Proceedings of the 8th International Conference on E-commerce Technology and the 3rd IEEE International Conference on Enterprise Computing, E-Commerce and E-Services, 2006; 8 pages.
198Wheeler et al., "Development of Human Factors Guidelines for Advanced Traveler Information Systems and Commercial Vehicle Operations: Task Analysis of ATIS/CVO Functions," US Dept. Transportation Federal Highway Administration Research and Development, Publication No. FHWA-RD-95-176, Nov. 1996, 124 pages.
199Wong, "GPS: making roads safer and solving traffic tangles," Asia Engineer, 1995, 23(9):31-32.
200Yamamoto et al., "Position Location Technologies Using Signal Strength in Cellular Systems", IEEE, 2001, pp. 2570-2575.
201Yang et al. "Global Snapshots for Distributed Debugging", IEEE, pp. 436-440, 1992.
202Yang et al., "A Multimedia System for Route Sharing and Video-based Navigation", IEEE, 2006, pp. 73-76.
203Yanyan et al., "The model of optimum route selection in vehicle automatic navigation system based on unblocked reliability analyses", IEEE 2003.
204Ygnace et al., "Travel Time Estimation on the San Francisco Bay Area Network Using Cellular Phones as Probes", Working Paper, Institute of Transportation Studies, University of California, Berkeley, 2000.
205Yim et al., "Travinfo Field Operational Test: Work Plan for the Target, Network, and Value Added Reseller (VAR) Customer Studies," Working Papers, California Partners for Advanced Transit and Highways (PATH), Institute of Transportation Studies, UC Berkeley, Apr. 1, 1997, 49 pages.
206Yogesh C. Rathod, Third Party Submission in U.S. Appl. No. 12/233,358 mailed Mar. 30, 2010, 12 pages.
207Yokote, "The Apertos Reflective Operating System: The Concept and Its Implementation," OOPSLA'92, pp. 414-434.
208Zhao, "Mobile Phone Location Determination and Its Impact on Intelligent Transportation Systems," IEEE Transactions on Intelligent Transportation Systems, Mar. 2000, 1(1):55-64.
Referenced by
Citing Patent | Filing date | Publication date | Applicant | Title
US8213958 *28 Dec 20093 Jul 2012Chi Mei Communication Systems, Inc.Electronic device and method for managing call records
US8296055 *2 Jun 200823 Oct 2012Randy Lawrence CanisMethod and system for positional communication
US836417123 Jul 201229 Jan 2013Enhanced Geographic LlcSystems and methods to determine the current popularity of physical business locations
US843777623 Jul 20127 May 2013Enhanced Geographic LlcMethods to determine the effectiveness of a physical advertisement relating to a physical business location
US844733123 Jul 201221 May 2013Enhanced Geographic LlcSystems and methods to deliver digital location-based content to a visitor at a physical business location
US8509123 *1 Mar 201313 Aug 2013Voxer Ip LlcCommunication application for conducting conversations including multiple media types in either a real-time mode or a time-shifted mode
US851545913 Jan 201320 Aug 2013Enhanced Geographic LlcSystems and methods to provide a reminder relating to a physical business location of interest to a user when the user is near the physical business location
US8558847 *13 Jul 200915 Oct 2013Raytheon CompanyDisplaying situational information based on geospatial data
US855997715 Mar 201315 Oct 2013Enhanced Geographic LlcConfirming a venue of user location
US856623612 Nov 201222 Oct 2013Enhanced Geographic LlcSystems and methods to determine the name of a business location visited by a user of a wireless device and process payments
US8571259 *17 Jun 200929 Oct 2013Robert Allan MargolisSystem and method for automatic identification of wildlife
US8589075 *19 Oct 201119 Nov 2013Google Inc.Method, system, and computer program product for visualizing trip progress
US8612144 *23 Oct 200617 Dec 2013Robert Bosch GmbhMethod for operating a navigation device and a corresponding navigation device
US86261944 Dec 20127 Jan 2014Enhanced Geographic LlcSystems and methods to determine the name of a business location visited by a user of a wireless device and provide suggested destinations
US8804990 *13 Mar 201312 Aug 2014Acer IncorporatedPortable apparatus
US20090299618 *23 Oct 20063 Dec 2009Rainer CorneliusMethod for operating a navigation device and a corresponding navigation device
US20090299628 *2 Jun 20083 Dec 2009Randy & Terri CanisMethod and system for positional communication
US20100304723 *28 Dec 20092 Dec 2010Chi Mei Communication Systems, Inc.Electronic device and method for managing call records
US20100305844 *14 Sep 20092 Dec 2010Choi Sung-HaMobile vehicle navigation method and apparatus thereof
US20100322483 *17 Jun 200923 Dec 2010Robert Allan MargolisSystem and method for automatic identification of wildlife
US20110010674 *13 Jul 200913 Jan 2011Raytheon CompanyDisplaying situational information based on geospatial data
US20110064312 *14 Sep 200917 Mar 2011Janky James MImage-based georeferencing
US20110320450 *29 Jun 201029 Dec 2011Alice LiuLocation based grouping of browsing histories
US20120086727 *8 Oct 201012 Apr 2012Nokia CorporationMethod and apparatus for generating augmented reality content
Classifications
U.S. Classification701/426, 340/988, 340/995.1, 701/300, 340/995.24, 701/469, 701/455
International ClassificationG01C21/00
Cooperative ClassificationG01C21/20, H04M2250/22, H04M1/72572, H04M2250/12
European ClassificationH04M1/725F2G, G01C21/20
Legal Events
Date | Code | Event | Description
5 Jun 2012CCCertificate of correction
28 Sep 2009ASAssignment
Owner name: APPLE INC., CALIFORNIA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:FORSTALL, SCOTT;CHRISTIE, GREGORY N.;BORCHERS, ROBERT E.;AND OTHERS;REEL/FRAME:023288/0743;SIGNING DATES FROM 20080310 TO 20080611
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:FORSTALL, SCOTT;CHRISTIE, GREGORY N.;BORCHERS, ROBERT E.;AND OTHERS;SIGNING DATES FROM 20080310 TO 20080611;REEL/FRAME:023288/0743