US20120238254A1 - Video processing system for identifying items in video frames - Google Patents

Video processing system for identifying items in video frames

Info

Publication number
US20120238254A1
US20120238254A1 (US 2012/0238254 A1); application US13/050,721
Authority
US
United States
Prior art keywords
item
video frame
identified
mobile device
incentive
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/050,721
Inventor
Steve Yankovich
Ryan Melcher
Robert Dean Veres
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
eBay Inc
Original Assignee
eBay Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by eBay Inc
Priority to US13/050,721 (US20120238254A1)
Assigned to eBay Inc. Assignors: Melcher, Ryan; Yankovich, Steve; Veres, Robert Dean (assignment of assignors interest; see document for details)
Priority to US13/341,552 (US8213916B1)
Priority to EP16001895.8A (EP3131255B1)
Priority to CN201280013401XA (CN103443816A)
Priority to EP12757819.3A (EP2686820B1)
Priority to AU2012229025A (AU2012229025B2)
Priority to ES12757819.3T (ES2616841T3)
Priority to CA2830412A (CA2830412C)
Priority to CN201910203160.9A (CN110086770A)
Priority to DK12757819.3T (DK2686820T3)
Priority to PCT/US2012/029421 (WO2012125920A2)
Publication of US20120238254A1
Priority to AU2016216533A (AU2016216533A1)
Legal status: Abandoned

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/41 Structure of client; Structure of client peripherals
    • H04N 21/414 Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
    • H04N 21/41407 Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance embedded in a portable device, e.g. video client on a mobile phone, PDA, laptop
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 65/00 Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L 65/60 Network streaming of media packets
    • H04L 65/75 Media network packet handling
    • H04L 65/762 Media network packet handling at the source
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/41 Structure of client; Structure of client peripherals
    • H04N 21/422 Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N 21/4223 Cameras
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N 21/44008 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving operations for analysing video streams, e.g. detecting features or characteristics in the video stream
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/47 End-user applications
    • H04N 21/472 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N 21/4722 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for requesting additional data associated with the content
    • H04N 21/4725 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for requesting additional data associated with the content using interactive regions of the image, e.g. hot spots
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/47 End-user applications
    • H04N 21/478 Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N 21/47815 Electronic shopping
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N 21/81 Monomedia components thereof
    • H04N 21/812 Monomedia components thereof involving advertisement data
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/02 Services making use of location information
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/02 Services making use of location information
    • H04W 4/029 Location-based management or tracking services
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/20 Services signaling; Auxiliary data signalling, i.e. transmitting data via a non-traffic channel
    • H04W 4/21 Services signaling; Auxiliary data signalling, i.e. transmitting data via a non-traffic channel for social networking applications

Definitions

  • Example embodiments of the present application generally relate to image recognition, and more specifically, to a method and system for identifying items in video frames.
  • Mobile devices such as smart phones have become increasingly prevalent.
  • Most smart phones include an optical lens for taking pictures.
  • A user interested in an item, for example at a friend's place or while walking on the street, may use the photo feature on the smart phone to take a picture of the item.
  • Unfortunately, the user of the smart phone has to hold the mobile device steady, and the object being pictured needs to remain static; otherwise, the picture will come out blurry. As such, the user may opt to record a video of a dynamic scene instead of taking pictures.
  • FIG. 1 is a network diagram depicting a network system, according to one embodiment, having a client-server architecture configured for exchanging data over a network;
  • FIG. 2 is a block diagram illustrating an example embodiment of a video processor application
  • FIG. 3 is a block diagram illustrating an example embodiment of a video frame selector module
  • FIG. 4 is a block diagram illustrating an example embodiment of an item identification module
  • FIG. 5 is a block diagram illustrating an example embodiment of a location-based incentive module
  • FIG. 6 is a block diagram illustrating an example embodiment of a location identification module
  • FIG. 7 is a block diagram illustrating an example embodiment of an incentive module
  • FIG. 8 is a table illustrating an example embodiment of a data structure
  • FIG. 9A is a block diagram illustrating an example of a tagged video frame
  • FIG. 9B is a block diagram illustrating another example of a tagged video frame
  • FIG. 10 is a flow diagram of an example method for tagging a video frame with items
  • FIG. 11 is a flow diagram of an example method for selecting a video frame
  • FIG. 12 is a flow diagram of an example method for tagging a video frame
  • FIG. 13A is a flow diagram of an example method for identifying an item in a video frame
  • FIG. 13B is a flow diagram of another example method for identifying an item in a video frame
  • FIG. 14A is a flow diagram of an example method for providing information on an item in a tagged video frame
  • FIG. 14B is a flow diagram of an example method for providing location-based information on an item in a tagged video frame
  • FIG. 15A is a flow diagram of another example method for identifying a location-based incentive
  • FIG. 15B is a flow diagram of another example method for identifying a targeted incentive
  • FIG. 15C is a flow diagram of an example method for expanding a search of local incentives
  • FIG. 16 shows a diagrammatic representation of a machine in the example form of a computer system within which a set of instructions may be executed to cause the machine to perform any one or more of the methodologies discussed herein.
  • In various embodiments, a method and a system generate offers to a user of a mobile device based on items identified in a video frame from the mobile device.
  • A video frame selector module determines a video frame to process from the mobile device.
  • An item identification module identifies an item in the determined video frame using an image recognition algorithm and tags the determined video frame with an identification of the item. Tags identifying the item can also be placed in the video frame adjacent to the identified item.
  • In one embodiment, the offers simply include offers to buy the identified item through one or more merchants.
  • In another embodiment, the offers include an incentive to a user of the mobile device based on a geographic location of the mobile device. Incentives include, but are not limited to, promotions, discounts, sales, rebates, coupons, and so forth.
  • In another embodiment, the incentive may also include item recommendations.
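  • To make that flow concrete, here is a minimal, non-authoritative Python sketch of the pipeline the summary describes (select a frame, identify items, tag the frame, generate offers). The class and function names are illustrative assumptions, not part of the patent:

        def process_video_clip(frames, frame_selector, item_identifier, offer_engine):
            """Yield (tagged_frame, offers) for each frame selected for processing."""
            for prev_frame, frame in zip(frames, frames[1:]):
                # Only frames showing enough change are processed (see FIGS. 3 and 11).
                if not frame_selector.should_process(prev_frame, frame):
                    continue
                items = item_identifier.identify(frame)     # see FIG. 4
                tagged = item_identifier.tag(frame, items)  # enables the "shopping pause"
                offers = {item: offer_engine.offers_for(item) for item in items}
                yield tagged, offers
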
  • FIG. 1 is a network diagram depicting a network system 100 , according to one embodiment, having a client-server architecture configured for exchanging data over a network.
  • the network system 100 may be a publication/publisher system 102 where clients may communicate and exchange data within the network system 100 .
  • the data may pertain to various functions (e.g., online item purchases) and aspects (e.g., managing content and user reputation values) associated with the network system 100 and its users.
  • Although illustrated herein as a client-server architecture as an example, other embodiments may include other network architectures, such as a peer-to-peer or distributed network environment.
  • a data exchange platform in an example form of a network-based publisher 102 , may provide server-side functionality, via a network 104 (e.g., the Internet) to one or more clients.
  • The one or more clients may include users that utilize the network system 100 and, more specifically, the network-based publisher 102, to exchange data over the network 104.
  • These transactions may include transmitting, receiving (communicating) and processing data to, from, and regarding content and users of the network system 100 .
  • the data may include, but are not limited to, content and user data such as feedback data; user reputation values; user profiles; user attributes; product and service reviews; product, service, manufacture, and vendor recommendations and identifiers; product and service listings associated with buyers and sellers; auction bids; and transaction data, among other things.
  • the data exchanges within the network system 100 may be dependent upon user-selected functions available through one or more client or user interfaces (UIs).
  • UIs may be associated with a client machine, such as a client machine 106 using a web client 110 .
  • the web client 110 may be in communication with the network-based publisher 102 via a web server 120 .
  • the UIs may also be associated with a client machine 108 using a programmatic client 112 , such as a client application, or a third party server 114 hosting a third party application 116 .
  • The client machine 106, 108, or third party server 114 may be associated with a buyer, a seller, a third party electronic commerce platform, a payment service provider, or a shipping service provider, each in communication with the network-based publisher 102 and optionally each other.
  • the buyers and sellers may be any one of individuals, merchants, or service providers, among other things.
  • a mobile device 132 may also be in communication with the network-based publisher 102 via a web server 120 .
  • the mobile device 132 may include a portable electronic device providing at least some of the functionalities of the client machines 106 and 108 .
  • The mobile device 132 may include a third party application 116 (or a web client) configured to communicate with the application server 122.
  • the mobile device 132 includes a GPS module 134 and an optical lens 136 .
  • the GPS module 134 is configured to determine a location of the mobile device 132 .
  • the optical lens 136 enables the mobile device 132 to take pictures and videos.
  • an application program interface (API) server 118 and a web server 120 are coupled to, and provide programmatic and web interfaces respectively to, one or more application servers 122 .
  • The application servers 122 host one or more publication application(s) 124.
  • the application servers 122 are, in turn, shown to be coupled to one or more database server(s) 126 that facilitate access to one or more database(s) 128 .
  • the web server 120 and the API server 118 communicate and receive data pertaining to listings, transactions, and feedback, among other things, via various user input tools.
  • the web server 120 may send and receive data to and from a toolbar or webpage on a browser application (e.g., web client 110 ) operating on a client machine (e.g., client machine 106 ).
  • the API server 118 may send and receive data to and from an application (e.g., client application 112 or third party application 116 ) running on another client machine (e.g., client machine 108 or third party server 114 ).
  • a publication application(s) 124 may provide a number of publisher functions and services (e.g., listing, payment, etc.) to users that access the network-based publisher 102 .
  • the publication application(s) 124 may provide a number of services and functions to users for listing goods and/or services for sale, facilitating transactions, and reviewing and providing feedback about transactions and associated users.
  • the publication application(s) 124 may track and store data and metadata relating to listings, transactions, and user interaction with the network-based publisher 102 .
  • FIG. 1 also illustrates a third party application 116 that may execute on a third party server 114 and may have programmatic access to the network-based publisher 102 via the programmatic interface provided by the API server 118 .
  • the third party application 116 may use information retrieved from the network-based publisher 102 to support one or more features or functions on a website hosted by the third party.
  • the third party website may, for example, provide one or more listing, feedback, publisher or payment functions that are supported by the relevant applications of the network-based publisher 102 .
  • the network-based publisher 102 may provide a multitude of feedback, reputation, aggregation, and listing and price-setting mechanisms whereby a user may be a seller or buyer who lists or buys goods and/or services (e.g., for sale) published on the network-based publisher 102 .
  • the publication application(s) 124 are shown to include, among other things, one or more application(s) which support the network-based publisher 102 , and more specifically, the listing of goods and/or services for sale, the receipt of feedback in response to a transaction involving a listing, and the generation of reputation values for users based on transaction data between users.
  • The application server 122 may include a video processor application 130 that communicates with the publication application(s) 124.
  • The video processor application 130 processes video frames sent from the mobile device 132 to identify items contained in a video frame, to provide item listings, and to generate offers or incentives to the mobile device, as further described below. As items are identified in a processed video frame, the video frame is tagged to allow for a "shopping pause," where a user of the mobile device 132 can pause video content and learn more about, or purchase, the identified item being shown in the video frame.
  • FIG. 2 is a block diagram illustrating an example embodiment of the video processor application 130 .
  • the video processor application 130 can include a video frame selector module 202 , an item identification module 204 , a market price module 206 , and a location-based incentive application 208 .
  • Each module (or component or sub-module thereof) may be implemented in hardware, software, firmware, or any combination thereof. In an example embodiment, each of the foregoing modules may be implemented by at least one processor.
  • The video frame selector module 202 determines which video frame (from a video clip) to process from the mobile device 132. An embodiment and operation of the video frame selector module 202 are explained in more detail with respect to FIG. 3.
  • The item identification module 204 identifies an item in the selected video frame and tags the determined video frame with an identification of the item. An embodiment and operation of the item identification module 204 are explained in more detail with respect to FIG. 4.
  • The market price module 206 generates offers for the identified item from at least one merchant to the mobile device (see the sketch following this figure's description). For example, the market price module 206 determines a current market price of the identified item using online databases, online price comparison websites, and/or online retailer prices. In one embodiment, the market price module 206 can provide the latest bidding prices from an online auction website for the identified item. In another embodiment, the market price module 206 can provide the price of the identified item sold at retail stores (nearby or online).
  • The location-based incentive application 208 offers incentives from at least one local merchant based on the identified item and a geographic location of the mobile device 132.
  • An embodiment and operation of the location-based incentive application 208 are explained in more detail with respect to FIG. 5.
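  • As a rough sketch of how a module like the market price module 206 might aggregate quotes (the source callables and their result shapes are assumptions for illustration; the patent does not specify an implementation):

        def current_market_prices(item_id, sources):
            """Collect (merchant, price) quotes for an identified item.

            `sources` is a list of callables, each returning a list of
            (merchant_name, price) tuples; for example, an auction-site
            query, a price-comparison query, or a retail-price query.
            """
            quotes = []
            for source in sources:
                try:
                    quotes.extend(source(item_id))
                except IOError:
                    continue  # skip sources that are unreachable
            return sorted(quotes, key=lambda quote: quote[1])  # cheapest first
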
  • FIG. 3 is a block diagram illustrating an example embodiment of a video frame selector module 202 .
  • the video frame selector module 202 comprises a video frame analyzer module 302 and a video frame tag module 304 .
  • The video processor application 130 only processes video frames whose inter-frame difference exceeds a predetermined amount of motion, thereby indicating a change or movement in the video frame subject matter.
  • the video frame analyzer module 302 determines a difference in a scene between a first video frame and a second video frame in a video clip from the mobile device 132 .
  • For example, the video may include a subject walking down a street. As such, the subject will be moving relative to the street in the video clip.
  • The video frame analyzer module 302 thus analyzes how much the subject matter has moved between a first video frame and a second video frame.
  • The video frame tag module 304 tags the first or second video frame for item identification when the difference exceeds a predetermined amount of motion. As such, not every video frame is processed for item identification, which preserves resources. Video frames that are to be processed for item identification are tagged for identification purposes; for example, a video frame that has been selected to be processed is tagged with a "shopping pause" tag. The tagged video frame is also referred to as the determined or selected video frame. In another embodiment, the video frame tag module first determines whether a video frame contains an item to be identified before tagging the video frame for a shopping pause.
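  • A minimal sketch of the motion test described above, assuming Python with OpenCV and a mean-absolute-difference metric (the patent names neither the library nor the metric; both are illustrative choices):

        import cv2
        import numpy as np

        MOTION_THRESHOLD = 12.0  # assumed stand-in for the "predetermined amount of motion"

        def frame_difference(first_frame, second_frame):
            """Mean absolute pixel difference between two BGR video frames."""
            gray1 = cv2.cvtColor(first_frame, cv2.COLOR_BGR2GRAY)
            gray2 = cv2.cvtColor(second_frame, cv2.COLOR_BGR2GRAY)
            return float(np.mean(cv2.absdiff(gray1, gray2)))

        def should_tag_for_shopping_pause(first_frame, second_frame):
            """Tag a frame for item identification only when motion is large enough."""
            return frame_difference(first_frame, second_frame) > MOTION_THRESHOLD
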
  • FIG. 4 is a block diagram illustrating an example embodiment of the item identification module 204 .
  • the item identification module 204 includes a scene deconstructor module 402 , an image-recognition module 404 , an area selector module 406 , and a user tag module 408 .
  • the scene deconstructor module 402 deconstructs a scene in the determined video frame into several areas. For example, the scene deconstructor module 402 analyzes each area of the video frame for item identification.
  • the video frame may contain an image of a person with a hat, a handbag, and shoes.
  • the scene deconstructor module 402 separately analyzes the hat in one area, the handbag in another area, and shoes in another area.
  • The image recognition module 404 identifies the item based on a comparison of an image of the item from the determined video frame with a library of item images, using an image recognition algorithm, and labels the image of the identified item in the determined video frame. In other embodiments, the image recognition module 404 identifies the item in the corresponding area, or in a user-selected area, of the determined video frame. In one embodiment, the image recognition module 404 determines a name and a price of the identified item, and labels the name and price adjacent to the image of the identified item in the determined video frame.
  • The area selector module 406 receives a user selection of an area in the determined video frame in which to identify the item. For example, a user may select an area of the video frame on which to focus. Using the previous example, the user may tap on the image of the hat in the video frame to identify the item that is of interest to the user. In another example, the user may tap and drag a rectangular area in the video frame for the image recognition module 404 to focus on and analyze items in the selected rectangular area.
  • the user tag module 408 receives a user input tag to help identify the item in the determined video frame. For example, the user may tap on the image of a hat in the video frame and then enter the word “hat” for the image recognition module 404 to focus its search on hats. The word “hat” may be tagged to the identified item.
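  • Sketched below is one plausible shape for the deconstruct-then-match flow of modules 402, 404, and 408, again assuming Python/OpenCV and naive template matching against the item-image library (the patent does not specify the recognition algorithm; the grid split, confidence cutoff, and tag-based narrowing are illustrative):

        import cv2

        def deconstruct_scene(frame, rows=2, cols=2):
            """Split a frame into areas, as the scene deconstructor module 402 might."""
            h, w = frame.shape[:2]
            return [frame[r * h // rows:(r + 1) * h // rows,
                          c * w // cols:(c + 1) * w // cols]
                    for r in range(rows) for c in range(cols)]

        def identify_item(area, item_library, user_tag=None):
            """Match one area against a library of (name, image) pairs.

            A user-supplied tag (e.g., "hat") narrows the library before
            matching, mirroring the user tag module 408.
            """
            candidates = [(name, img) for name, img in item_library
                          if user_tag is None or user_tag in name.lower()]
            best_name, best_score = None, -1.0
            for name, ref in candidates:
                ref = cv2.resize(ref, (area.shape[1], area.shape[0]))
                score = float(cv2.matchTemplate(area, ref, cv2.TM_CCOEFF_NORMED).max())
                if score > best_score:
                    best_name, best_score = name, score
            return best_name if best_score > 0.6 else None  # assumed confidence cutoff
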
  • FIG. 5 is a block diagram illustrating an example embodiment of a location-based incentive module.
  • the location-based incentive application 208 has a location identification module 502 and an incentive module 506 .
  • the location identification module 502 determines a geographic location of the mobile device 132 .
  • the incentive module 506 communicates an incentive from one or more local merchants based on the identified item and the geographic location of the mobile device 132 .
  • the incentive can include a coupon, a discount, or a recommendation.
  • The location-based incentive application 208 receives a communication from the mobile device 132.
  • the communication may include a location of the mobile device 132 .
  • the incentive module 506 consults with the database server 126 and database 128 to determine and communicate incentives from local merchants to the mobile device 132 .
  • the incentive module 506 identifies local merchants in the area of the mobile device that have the identified item in stock for sale.
  • FIG. 6 is a block diagram illustrating an example embodiment of the location identification module 502 .
  • the location of the mobile device 132 can be determined in many ways.
  • The mobile device 132 may be equipped with a Global Positioning System (GPS) module that allows the device to communicate its coordinates or location to a GPS/triangulation module 602 of the location identification module 502.
  • Alternatively, the location of the mobile device 132 may be determined by triangulation using wireless communication towers and/or wireless nodes (e.g., wi-fi hotspots) within wireless signal reach of the mobile device 132.
  • The GPS/triangulation module 602 of the location identification module 502 can determine the geographic location of the mobile device 132 after consulting a mapping database (not shown). Furthermore, the general location of the mobile device 132 can be determined when the user of the mobile device 132 logs onto a local internet connection, for example at a hotel or coffee shop.
  • the location identification module 502 may also include a location input module 606 configured to determine a geographic location of the mobile device 132 by requesting the user to input an address, city, zip code or other location information.
  • the user can select a location from a list of locations or a map on the mobile device 132 .
  • a user on the mobile device 132 inputs the location of the mobile device 132 via an application or a web browser on the mobile device 132 .
  • the location identification module 502 may also include a location-dependent search term module 604 .
  • the location of the mobile device 132 can be inferred when the user of the mobile device 132 requests a search on the mobile device using location-dependent search terms. For example, a user inputs a search on his/her mobile device for “Best Japanese Restaurant San Jose.”
  • The location-dependent search term module 604 consults a database (not shown) that can determine the geographic location of the best Japanese restaurant in San Jose. The location-dependent search term module 604 then infers that the user of the mobile device 132 is at that geographic location.
  • In another embodiment, the location-dependent search term module 604 may infer the location of the user based on the search terms submitted by the user, irrespective of the search results or whether the user actually conducts the search.
  • For example, the location-dependent search term module 604 may parse the search query entered by the user and infer that the user is located in or around San Jose.
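  • A toy sketch of that inference, assuming a simple substring scan against a known-city list (the real module would presumably consult the database mentioned above; the list and matching rule here are illustrative):

        KNOWN_CITIES = {"san jose", "san francisco", "oakland"}  # illustrative list

        def infer_location_from_query(query):
            """Guess a city from location-dependent search terms, per module 604."""
            lowered = query.lower()
            for city in KNOWN_CITIES:
                if city in lowered:
                    return city
            return None

        # infer_location_from_query("Best Japanese Restaurant San Jose") -> "san jose"
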
  • The location identification module 502 may also include a tag module 608 configured to determine the geographic location of the mobile device 132 based on a tag associated with a unique geographic location.
  • The tag may include, for example, a barcode tag, such as a linear barcode, QR barcode, or other two-dimensional (2D) barcode, or a Radio Frequency Identification (RFID) tag, that is associated with a unique geographic location.
  • a user of the mobile device 132 may use his/her mobile device to scan the tag placed at a landmark or store.
  • the tag is uniquely associated with the geographic location of the landmark or store. Such relationship can be stored in a database.
  • the tag module 608 can then determine the geographic location of the mobile device 132 based on the tag after consulting the database.
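  • Taken together, FIG. 6 describes several interchangeable location sources. A minimal dispatch sketch, with each source injected as a callable (the priority order and names are assumptions, not the patent's design):

        def locate_mobile_device(locators):
            """Try each available location source in a plausible priority order.

            `locators` maps a source name to a zero-argument callable that
            returns a (lat, lon) fix or None, standing in for modules 602
            (GPS/triangulation), 608 (tag), 606 (user input), and 604
            (location-dependent search terms).
            """
            for source in ("gps", "triangulation", "tag", "user_input", "search_terms"):
                fn = locators.get(source)
                if fn is not None:
                    fix = fn()
                    if fix is not None:
                        return source, fix
            return None, None

        # Usage: locate_mobile_device({"gps": read_gps, "tag": lookup_scanned_tag}),
        # where read_gps and lookup_scanned_tag are hypothetical device-specific stubs.
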
  • FIG. 7 is a block diagram illustrating an example embodiment of the incentive module 506 that may be used to execute the processes described herein.
  • The incentive module 506 includes a local merchant module 702, an item category module 704, an incentive match module 706, a user preference module 708, an incentive receiver module 710, an incentive code generator module 712, and a communication module 714.
  • the local merchant module 702 identifies at least one local merchant having at least one incentive based on the geographic location of the mobile device 132 as determined by the location identification module 502 .
  • a local merchant is a merchant or retailer that is located within a predefined distance from the geographic location of the mobile device 132 .
  • the local merchant module 702 identifies at least one local merchant with at least one incentive based on an updated search distance preference as specified in the user preference module 708 .
  • the incentive of the local merchant may or may not correspond to the item identified by the user.
  • For example, a local merchant may feature a special sale on shoes while the identified item corresponds to a digital camera.
  • the incentive match module 706 filters all local merchants based on the identified item. In the previous example, the local merchant featuring a sale on shoes may be filtered out from the search result.
  • the item category module 704 determines a category of the item specified by the user and identified by item identification module 204 .
  • the user may specify a particular digital camera.
  • The item category module 704 determines that the item specified by the user falls into the category of electronics, subcategory of cameras.
  • the incentive match module 706 determines whether the identified item specified by the user corresponds to an item identified in at least one incentive of at least one local merchant as determined by local merchant module 702 . For example, a user specifies an item with his/her mobile device 132 . The item is identified as a specific digital camera. Item identification module 204 generates the brand, model number, color, and other attributes of the specified digital camera. Local merchant module 702 identifies merchants with incentives local to the geographic location of the mobile device 132 . Incentive match module 706 matches local merchants with incentives (e.g., sale or discount) on the specific digital camera.
  • the incentive match module 706 determines whether the category of the item identified by the user corresponds to a category of items as determined by item category module 704 and identified in at least one incentive of at least one local merchant. For example, a user specifies an item with his/her mobile device. The item is identified as a specific digital camera. Item identification module 204 generates the brand, model number, color, and other attributes of the specified digital camera. The item category module 704 determines the category of the identified item: electronics. Local merchant module 702 identifies merchants with incentives local to the geographic location of the mobile device. The incentive match module 706 matches local merchants with incentives (e.g., sale or discount) on electronics or categories related to the digital camera.
  • the user preference module 708 provides user-defined preferences used in the process of determining local merchants or brands or category of the items.
  • the user preference module 708 allows a user to update a search distance preference for local merchants. For example, the user may wish to decrease the radius of the distance preference in a downtown area of a city. Conversely, the user may wish to increase the radius of the distance preference in a suburban or rural area of a city.
  • user preference module 708 may also allow the user to specify favorite brands of items or favorite merchants or retailers.
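  • The distance-and-match logic of modules 702, 704, 706, and 708 could be sketched as follows; the haversine radius filter and the dict-shaped incentive records (keyed loosely on the FIG. 8 attributes, plus an assumed merchant location) are illustrative assumptions:

        from math import asin, cos, radians, sin, sqrt

        def distance_miles(a, b):
            """Great-circle distance between two (lat, lon) points, in miles."""
            lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
            h = (sin((lat2 - lat1) / 2) ** 2
                 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
            return 2 * 3959 * asin(sqrt(h))  # Earth radius of roughly 3959 miles

        def matching_incentives(incentives, device_loc, item, radius_miles=1.0,
                                match_category=False):
            """Filter merchant incentives by distance, then by item or category."""
            nearby = [i for i in incentives
                      if distance_miles(device_loc, i["merchant_location"]) <= radius_miles]
            if match_category:  # FIG. 15B-style category match
                return [i for i in nearby if i["category"] == item["category"]]
            return [i for i in nearby if i["item_name"] == item["name"]]  # FIG. 15A
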
  • The incentive code generator module 712 generates a code associated with at least one incentive selected by the user at the mobile device.
  • the code is valid for a predetermined period of time at the corresponding local merchant. For example, a user selects a coupon from a local merchant on his/her mobile device.
  • The incentive code generator module 712 generates a code associated with the coupon.
  • the code is communicated to the mobile device of the user.
  • the user takes the code to the corresponding local merchant to redeem the discount.
  • the code can be redeemed at the local merchant by showing or telling the code to a cashier at the checkout register of the local merchant. The cashier may then enter the code at the checkout register to determine the validity of the code and appropriately apply the discount or promotion.
  • the code can also be redeemed by displaying a machine-readable code such as a bar code on a screen of the mobile device.
  • the user displays the bar code to the cashier at the checkout register who can scan the bar code to determine the validity of the code and appropriately apply the discount or promotion.
  • the code may be valid for a predetermined period of time (e.g., one day, one week).
  • the generated code may be uniquely associated with the user of the mobile device and may expire immediately upon usage.
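  • One way such single-use, time-limited codes might be issued and redeemed, sketched with Python's standard library (the code format, validity window, and in-memory store are assumptions for illustration):

        import hashlib
        import secrets
        import time

        CODE_TTL_SECONDS = 7 * 24 * 3600  # assumed one-week validity window
        ISSUED = {}  # in-memory stand-in for the publisher's code store

        def generate_incentive_code(user_id, incentive_id):
            """Issue a short redemption code tied to a user and an incentive."""
            nonce = secrets.token_hex(4)
            digest = hashlib.sha256(f"{user_id}:{incentive_id}:{nonce}".encode()).hexdigest()
            code = digest[:8].upper()
            ISSUED[code] = {"user": user_id, "incentive": incentive_id,
                            "expires": time.time() + CODE_TTL_SECONDS, "used": False}
            return code

        def redeem(code):
            """Checkout-register validation: known, unexpired, unused; then mark used."""
            record = ISSUED.get(code)
            if record is None or record["used"] or time.time() > record["expires"]:
                return False
            record["used"] = True  # expires immediately upon usage, as the text suggests
            return True
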
  • the communication module 714 communicates one or more incentives of the identified item from at least one local merchant to the mobile device. For example, a list of local merchants within a preset distance radius (e.g., one mile) of the mobile device is displayed. The list of local merchants may include a sale or discount on the item identified by the user of the mobile device. The list may also include a list of recommended merchants (having an incentive on the identified item) that are located beyond the preset distance radius.
  • the communication module 714 communicates one or more incentives of the identified category of the items from at least one local merchant to the mobile device. For example, a list of local merchants within a preset distance radius (e.g., a block) of the mobile device is displayed. The list of local merchants may include merchants having a sale or discount on similar or related items to the identified item specified by the user of the mobile device. The list may also include a list of recommended merchants (having an incentive on similar items to the identified item) that are located beyond the preset distance radius.
  • the incentive receiver module 710 collects attributes of incentives from merchants and stores the attributes of the incentives in an incentive database.
  • An example of a data structure of the incentive database is further described in FIG. 8 .
  • FIG. 8 is a block diagram illustrating attributes of an example of a data structure.
  • the data structure includes attributes of the incentives for an item.
  • the attributes include a name attribute of the merchant 802 , a name attribute of the item 804 , a brand attribute of the item 806 , a model attribute of the item 808 , a category tag of the item 810 , a sub-category tag of the item 812 , a financial promotion attribute of the item 814 , and a financial promotion term attribute of the item 816 .
  • the merchant name attribute 802 includes the name of the local merchant (e.g., Joe's Electronic Shop).
  • the item name attribute 804 includes the name of an item (e.g., digital camera XYZ D001).
  • the brand attribute 806 includes the brand name of the item (e.g., brand XYZ).
  • the model attribute 808 includes the model number of the item (e.g., D001).
  • the category tag 810 includes a category metadata associated with the item (e.g., personal electronics).
  • the sub-category tag 812 includes a sub-category metadata associated with the item (e.g., digital camera).
  • the financial promotion attribute 814 includes the sale or discount associated with the item (e.g., 40% off all digital cameras, or 20% off all brand XYZ digital cameras).
  • The financial promotion term attribute 816 includes the terms of the sale or discount associated with the item (e.g., discount expires on xx/xx/xxxx, discount expires one week from today, or discount valid today only).
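  • For concreteness, the FIG. 8 record could be represented as follows; the figure names the attributes but not field types or naming, so this dataclass is purely illustrative:

        from dataclasses import dataclass

        @dataclass
        class IncentiveRecord:
            """One row of the incentive database sketched in FIG. 8."""
            merchant_name: str     # 802, e.g. "Joe's Electronic Shop"
            item_name: str         # 804, e.g. "digital camera XYZ D001"
            brand: str             # 806, e.g. "XYZ"
            model: str             # 808, e.g. "D001"
            category: str          # 810, e.g. "personal electronics"
            sub_category: str      # 812, e.g. "digital camera"
            promotion: str         # 814, e.g. "20% off all brand XYZ digital cameras"
            promotion_terms: str   # 816, e.g. "discount expires one week from today"
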
  • FIG. 9A is a block diagram illustrating an example of a tagged video frame 900 .
  • the video frame 900 has been selected for processing by video frame selector module 202 .
  • the item identification module 204 has identified two items (e.g., a hat 902 and a handbag 904 ) in the video frame 900 .
  • the video frame tag module 304 generates a call out bubble on the video frame 900 for each identified item.
  • a call out bubble may be placed on the video frame adjacent to the respective identified item.
  • call out bubble 906 labels the hat 902 with a market price for the identified item.
  • call out bubble 908 labels the handbag 904 with a market price for the identified item.
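  • Rendering such call out bubbles might look like the following OpenCV sketch (the patent does not prescribe a drawing method; the box coordinates, colors, and label format are illustrative):

        import cv2

        def draw_callout(frame, box, label):
            """Draw a simple call out bubble adjacent to an identified item.

            `box` is the item's (x, y, w, h) bounding rectangle in pixels.
            """
            x, y, w, h = box
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)    # item outline
            cv2.rectangle(frame, (x, y - 22), (x + w, y), (0, 255, 0), -1)  # bubble background
            cv2.putText(frame, label, (x + 2, y - 6),
                        cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 0, 0), 1)
            return frame

        # e.g. draw_callout(frame, hat_box, "hat 902: $24.99")
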
  • FIG. 9B is a block diagram illustrating another example of a tagged video frame 901 .
  • The user has selected a particular area within the video frame for the item identification module 204 to process. For example, the user may only be interested in the handbag. As such, the user has delineated a region of interest 910 on the video frame 901 to identify the handbag 904.
  • FIG. 10 is a flow diagram of an example method for tagging a video frame with items.
  • A determination is made whether a video frame from the mobile device is to be processed.
  • items in the determined or selected video frame are identified.
  • the video frame may be tagged with an identification of the items in the video frame.
  • FIG. 11 is a flow diagram of an example method for selecting a video frame.
  • a difference between a first video frame and a second video frame is determined.
  • The difference between the first video frame and the second video frame is compared to a predetermined amount of difference. If the difference exceeds the predetermined amount of difference, the first or second video frame is processed and tagged for item identification at 1106.
  • FIG. 12 is a flow diagram of an example method for tagging a video frame.
  • a scene in the determined video frame is deconstructed into a plurality of areas.
  • an item from each area is identified based on a comparison of an image of the item from the determined video frame with a library of item images.
  • the image of the identified item is labeled in the determined video frame.
  • FIG. 13A is a flow diagram of an example method for identifying an item in a video frame.
  • a user selects an area in the determined video frame to identify the item.
  • the item in the selected area of the determined video frame is identified.
  • the image of the identified item in the selected area of the determined video frame is labeled.
  • In one embodiment, a name (e.g., brand, model) and a price of the identified item are determined. The name and price of the identified item are placed adjacent to the image of the identified item in the determined video frame.
  • FIG. 13B is a flow diagram of another example method for identifying an item in a video frame.
  • a user selects an area in the determined video frame to identify the item.
  • a user input tag is received to help identify the item in the determined video frame.
  • the item in the selected area of the determined video frame is identified based on the user input tag.
  • the image of the identified item in the selected area of the determined video frame is labeled.
  • FIG. 14A is a flow diagram of an example method for providing information on an item in a tagged video frame.
  • a video frame selection is received via the shopping pause feature as previously described.
  • the user selects an identified item in the video frame.
  • The system provides the vendors' and merchants' prices for the identified item.
  • the system allows the user to purchase the identified item selected in the video frame. If the user decides to purchase the identified item, the system receives the purchase selection from the user (including merchant selection).
  • FIG. 14B is a flow diagram of an example method for providing location-based information on an item in a tagged video frame.
  • a video frame selection is received via the shopping pause feature as previously described.
  • the user selects an identified item in the video frame.
  • the system determines a geographic location of the mobile device 132 and offers an incentive from at least one local merchant based on the identified item and the geographic location of the mobile device 132 .
  • the incentive can be a coupon, a discount, or a recommendation.
  • FIG. 15A is a flow chart of an example method for identifying a location-based incentive.
  • the location identification module 502 of the location-based incentive application 208 determines the geographic location of the mobile device 132 of a user.
  • the item identification module 204 of the location-based incentive application 208 identifies an item specified by the user at the geographic location of the mobile device 132 .
  • the local merchant module 702 of the incentive module 506 determines local merchants with at least one incentive.
  • the incentive match module 706 of the incentive module 506 of the location-based incentive application 208 determines whether the identified item as specified by the user corresponds to an item identified in at least one incentive of the local merchants as determined at operation 1506 .
  • the communication module 714 of the incentive module 506 of the location-based incentive application 208 communicates a list of local merchants with incentives for the identified item.
  • FIG. 15B is a flow chart of another example method for identifying a targeted incentive.
  • the item category module 704 of the incentive module 506 of the location-based incentive application 208 determines a category of the identified item.
  • the incentive match module 706 of the incentive module 506 of the location-based incentive application 208 determines whether a category of the identified item as specified by the user corresponds to a category of items identified in at least one incentive of the local merchants as determined at operation 1506 .
  • the communication module 714 of the incentive module 506 of the location-based incentive application 208 communicates a list of local merchants with incentives on similar or related items from the same category of the identified item.
  • FIG. 15C is a flow chart of an example method for expanding a search of local incentives.
  • The communication module 714 communicates to the user that the incentive match module 706 cannot find any incentives from local merchants related to the identified item.
  • The incentive module 506 may then offer the user the option to expand or increase the distance radius preference for local merchants in the user preference module 708.
  • the user preference module 708 may be updated to reflect a new distance radius preference when searching for local merchants with incentives.
  • FIG. 16 shows a diagrammatic representation of a machine in the example form of a computer system 1600 within which a set of instructions may be executed causing the machine to perform any one or more of the methodologies discussed herein.
  • The machine may operate as a standalone device or may be connected (e.g., networked) to other machines.
  • the machine may operate in the capacity of a server or a client machine in server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment.
  • the machine may be a personal computer (PC), a tablet PC, a set-top box (STB), a Personal Digital Assistant (PDA), a cellular telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine.
  • the example computer system 1600 includes a processor 1602 (e.g., a central processing unit (CPU), a graphics processing unit (GPU) or both), a main memory 1604 and a static memory 1606 , which communicate with each other via a bus 1608 .
  • the computer system 1600 may further include a video display unit 1610 (e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT)).
  • the computer system 1600 also includes an alphanumeric input device 1612 (e.g., a keyboard), a user interface (UI) navigation device 1614 (e.g., a mouse), a disk drive unit 1616 , a signal generation device 1618 (e.g., a speaker) and a network interface device 1620 .
  • the disk drive unit 1616 includes a machine-readable medium 1622 on which is stored one or more sets of instructions and data structures (e.g., software 1624 ) embodying or utilized by any one or more of the methodologies or functions described herein.
  • the software 1624 may also reside, completely or at least partially, within the main memory 1604 and/or within the processor 1602 during execution thereof by the computer system 1600 , the main memory 1604 and the processor 1602 also constituting machine-readable media.
  • the software 1624 may further be transmitted or received over a network 1626 via the network interface device 1620 utilizing any one of a number of well-known transfer protocols (e.g., HTTP).
  • While the machine-readable medium 1622 is shown in an example embodiment to be a single medium, the term "machine-readable medium" should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions.
  • the term “machine-readable medium” shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present invention, or that is capable of storing, encoding or carrying data structures utilized by or associated with such a set of instructions.
  • the term “machine-readable medium” shall accordingly be taken to include, but not be limited to, solid-state memories, optical media, and magnetic media.

Abstract

A method and a system generate offers to a user of a mobile device based on items identified in a video frame from the mobile device. A video frame selector module determines a video frame to process from the mobile device. An item identification module identifies an item in the determined video frame and tags the determined video frame with an identification of the item. Offers of the identified item from at least one merchant are generated to the mobile device.

Description

    TECHNICAL FIELD
  • Example embodiments of the present application generally relate to image recognition, and more specifically, to a method and system for identifying items in video frames.
  • BACKGROUND
Mobile devices such as smart phones have become increasingly prevalent. Most smart phones include an optical lens for taking pictures. A user interested in an item, for example at a friend's place or while walking on the street, may use the photo feature on the smart phone to take a picture of the item. Unfortunately, the user of the smart phone has to hold the mobile device steady, and the object being pictured needs to remain static; otherwise, the picture will come out blurry. As such, the user may opt to record a video of a dynamic scene instead of taking pictures.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The present invention is illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings in which:
  • FIG. 1 is a network diagram depicting a network system, according to one embodiment, having a client-server architecture configured for exchanging data over a network;
  • FIG. 2 is a block diagram illustrating an example embodiment of a video processor application;
  • FIG. 3 is a block diagram illustrating an example embodiment of a video frame selector module;
  • FIG. 4 is a block diagram illustrating an example embodiment of an item identification module;
  • FIG. 5 is a block diagram illustrating an example embodiment of a location-based incentive module;
  • FIG. 6 is a block diagram illustrating an example embodiment of a location identification module;
  • FIG. 7 is a block diagram illustrating an example embodiment of an incentive module;
  • FIG. 8 is a table illustrating an example embodiment of a data structure;
  • FIG. 9A is a block diagram illustrating an example of a tagged video frame;
  • FIG. 9B is a block diagram illustrating another example of a tagged video frame;
  • FIG. 10 is a flow diagram of an example method for tagging a video frame with items;
  • FIG. 11 is a flow diagram of an example method for selecting a video frame;
  • FIG. 12 is a flow diagram of an example method for tagging a video frame;
  • FIG. 13A is a flow diagram of an example method for identifying an item in a video frame;
  • FIG. 13B is a flow diagram of another example method for identifying an item in a video frame;
  • FIG. 14A is a flow diagram of an example method for providing information on an item in a tagged video frame;
  • FIG. 14B is a flow diagram of an example method for providing location-based information on an item in a tagged video frame;
  • FIG. 15A is a flow diagram of another example method for identifying a location-based incentive;
  • FIG. 15B is a flow diagram of another example method for identifying a targeted incentive;
  • FIG. 15C is a flow diagram of an example method for expanding a search of local incentives;
FIG. 16 shows a diagrammatic representation of a machine in the example form of a computer system within which a set of instructions may be executed to cause the machine to perform any one or more of the methodologies discussed herein.
  • DETAILED DESCRIPTION
  • Although the present invention has been described with reference to specific example embodiments, it will be evident that various modifications and changes may be made to these embodiments without departing from the broader spirit and scope of the invention. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.
In various embodiments, a method and a system generate offers to a user of a mobile device based on items identified in a video frame from the mobile device. A video frame selector module determines a video frame to process from the mobile device. An item identification module identifies an item in the determined video frame using an image recognition algorithm and tags the determined video frame with an identification of the item. Tags identifying the item can also be placed in the video frame adjacent to the identified item. In one embodiment, the offers simply include offers to buy the identified item through one or more merchants. In another embodiment, the offers include an incentive to a user of the mobile device based on a geographic location of the mobile device. Incentives include, but are not limited to, promotions, discounts, sales, rebates, coupons, and so forth. In another embodiment, the incentive may also include item recommendations.
  • FIG. 1 is a network diagram depicting a network system 100, according to one embodiment, having a client-server architecture configured for exchanging data over a network. For example, the network system 100 may be a publication/publisher system 102 where clients may communicate and exchange data within the network system 100. The data may pertain to various functions (e.g., online item purchases) and aspects (e.g., managing content and user reputation values) associated with the network system 100 and its users. Although illustrated herein as a client-server architecture as an example, other embodiments may include other network architectures, such as a peer-to-peer or distributed network environment.
A data exchange platform, in an example form of a network-based publisher 102, may provide server-side functionality, via a network 104 (e.g., the Internet), to one or more clients. The one or more clients may include users that utilize the network system 100 and, more specifically, the network-based publisher 102, to exchange data over the network 104. These transactions may include transmitting, receiving (communicating) and processing data to, from, and regarding content and users of the network system 100. The data may include, but are not limited to, content and user data such as feedback data; user reputation values; user profiles; user attributes; product and service reviews; product, service, manufacture, and vendor recommendations and identifiers; product and service listings associated with buyers and sellers; auction bids; and transaction data, among other things.
In various embodiments, the data exchanges within the network system 100 may be dependent upon user-selected functions available through one or more client or user interfaces (UIs). The UIs may be associated with a client machine, such as a client machine 106 using a web client 110. The web client 110 may be in communication with the network-based publisher 102 via a web server 120. The UIs may also be associated with a client machine 108 using a programmatic client 112, such as a client application, or a third party server 114 hosting a third party application 116. It can be appreciated that, in various embodiments, the client machine 106, 108, or third party server 114 may be associated with a buyer, a seller, a third party electronic commerce platform, a payment service provider, or a shipping service provider, each in communication with the network-based publisher 102 and optionally each other. The buyers and sellers may be any one of individuals, merchants, or service providers, among other things.
A mobile device 132 may also be in communication with the network-based publisher 102 via a web server 120. The mobile device 132 may include a portable electronic device providing at least some of the functionalities of the client machines 106 and 108. The mobile device 132 may include a third party application 116 (or a web client) configured to communicate with the application server 122. In one embodiment, the mobile device 132 includes a GPS module 134 and an optical lens 136. The GPS module 134 is configured to determine a location of the mobile device 132. The optical lens 136 enables the mobile device 132 to take pictures and videos.
Turning specifically to the network-based publisher 102, an application program interface (API) server 118 and a web server 120 are coupled to, and provide programmatic and web interfaces respectively to, one or more application servers 122. The application servers 122 host one or more publication application(s) 124. The application servers 122 are, in turn, shown to be coupled to one or more database server(s) 126 that facilitate access to one or more database(s) 128.
  • In one embodiment, the web server 120 and the API server 118 communicate and receive data pertaining to listings, transactions, and feedback, among other things, via various user input tools. For example, the web server 120 may send and receive data to and from a toolbar or webpage on a browser application (e.g., web client 110) operating on a client machine (e.g., client machine 106). The API server 118 may send and receive data to and from an application (e.g., client application 112 or third party application 116) running on another client machine (e.g., client machine 108 or third party server 114).
  • A publication application(s) 124 may provide a number of publisher functions and services (e.g., listing, payment, etc.) to users that access the network-based publisher 102. For example, the publication application(s) 124 may provide a number of services and functions to users for listing goods and/or services for sale, facilitating transactions, and reviewing and providing feedback about transactions and associated users. Additionally, the publication application(s) 124 may track and store data and metadata relating to listings, transactions, and user interaction with the network-based publisher 102.
  • FIG. 1 also illustrates a third party application 116 that may execute on a third party server 114 and may have programmatic access to the network-based publisher 102 via the programmatic interface provided by the API server 118. For example, the third party application 116 may use information retrieved from the network-based publisher 102 to support one or more features or functions on a website hosted by the third party. The third party website may, for example, provide one or more listing, feedback, publisher or payment functions that are supported by the relevant applications of the network-based publisher 102.
  • The network-based publisher 102 may provide a multitude of feedback, reputation, aggregation, and listing and price-setting mechanisms whereby a user may be a seller or buyer who lists or buys goods and/or services (e.g., for sale) published on the network-based publisher 102.
  • The publication application(s) 124 are shown to include, among other things, one or more application(s) which support the network-based publisher 102, and more specifically, the listing of goods and/or services for sale, the receipt of feedback in response to a transaction involving a listing, and the generation of reputation values for users based on transaction data between users.
  • The application server 122 may include a video-processor application 130 that communicates with the publication application 124. The video processor application 130 processes video frames sent from the mobile device 132 to identify items contained in a video frame and to provide item listings and generate offers or incentives to the mobile device, as further described below. As items are identified in a processed video frame, the video frame is tagged to allow for a “shopping pause,” whereby a user of the mobile device 132 can pause the video content and learn more about, or purchase, the identified item shown in the video frame.
  • FIG. 2 is a block diagram illustrating an example embodiment of the video processor application 130. The video processor application 130 can include a video frame selector module 202, an item identification module 204, a market price module 206, and a location-based incentive application 208. Each module (or component or sub-module thereof) may be implemented in hardware, software, firmware, or any combination thereof. In an example embodiment, each of the foregoing modules may be implemented by at least one processor.
  • The video frame selector module 202 determines which video frame (from a video clip) to process from the mobile device 132. An embodiment and operation of the video frame selector module 202 is explained in more detail with respect to FIG. 3.
  • The item identification module 204 identifies an item in the selected video frame and tags the determined video frame with an identification of the item. An embodiment and operation of the item identification module 204 is explained in more detail with respect to FIG. 4.
  • The market price module 206 generates offers of the identified item from at least one merchant to the mobile device. For example, the market price module 206 determines a current market price of the identified item using online databases, online price comparison websites, and/or online retailer prices. In one embodiment, the market price module 206 can provide the latest bidding prices from an online auction website for the identified item. In another embodiment, the market price module 206 can provide the price of the identified item sold at retail stores (nearby or online).
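  • By way of a non-limiting illustration, the price aggregation performed by the market price module 206 may be sketched in Python as follows. This is a minimal sketch; the price-source callables and their names are assumptions, since this disclosure does not specify particular data sources or APIs.

```python
from typing import Callable, Dict, Optional

# Each source maps an item identifier to a price quote, or None when the
# source has no listing for the item (all names here are illustrative).
PriceSource = Callable[[str], Optional[float]]

def current_market_prices(item_id: str,
                          sources: Dict[str, PriceSource]) -> Dict[str, float]:
    """Collect quotes for an identified item from several sources, e.g.
    an auction feed, a price-comparison site, and retail listings."""
    quotes: Dict[str, float] = {}
    for name, lookup in sources.items():
        price = lookup(item_id)
        if price is not None:
            quotes[name] = price
    return quotes
```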
  • The location-based incentive application 208 offers incentives from at least one local merchant based on the identified item and a geographic location of the mobile device 132. An embodiment and operation of the location-based incentive application 208 is explained in more detail with respect to FIG. 5.
  • FIG. 3 is a block diagram illustrating an example embodiment of a video frame selector module 202. The video frame selector module 202 comprises a video frame analyzer module 302 and a video frame tag module 304.
  • To efficiently process video frames, the video processor application 130 only processes video frames that exceed a predetermined amount of motion, thereby indicating a change or movement in the video frame subject matter. As such, the video frame analyzer module 302 determines a difference in a scene between a first video frame and a second video frame in a video clip from the mobile device 132. For example, the video may include a subject walking down a street. As such, the subject will be moving relative to the street in the video clip. The video frame analyzer module 302 thus analyzes how much the subject matter has moved between the first video frame and the second video frame.
  • The video frame tag module 304 tags the first or second video frame for item identification when the difference exceeds a predetermined amount of motion. As such, not every video frame is processed for item identification, which preserves resources. Video frames that are to be processed for item identification are tagged for identification purposes. For example, a video frame that has been selected for processing is tagged with a “shopping pause” tag. The tagged video frame is also referred to as the determined or selected video frame. In another embodiment, the video frame tag module 304 first determines whether a video frame contains an item to be identified before tagging the video frame for a shopping pause.
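  • By way of a non-limiting illustration, the frame-selection logic described above may be sketched with OpenCV and NumPy as follows. The mean-absolute-difference motion metric and the threshold value are illustrative assumptions; this disclosure says only that the difference must exceed a predetermined amount.

```python
import cv2
import numpy as np

MOTION_THRESHOLD = 12.0  # illustrative; the disclosure says only "predetermined amount"

def select_frames_for_identification(video_path):
    """Return indices of frames whose scene changed enough from the
    previous frame to be tagged for a "shopping pause"."""
    capture = cv2.VideoCapture(video_path)
    tagged, prev_gray, index = [], None, 0
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if prev_gray is not None:
            # Mean absolute pixel difference as a simple proxy for motion
            # of the subject matter between consecutive frames.
            motion = float(np.mean(cv2.absdiff(gray, prev_gray)))
            if motion > MOTION_THRESHOLD:
                tagged.append(index)
        prev_gray = gray
        index += 1
    capture.release()
    return tagged
```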
  • FIG. 4 is a block diagram illustrating an example embodiment of the item identification module 204. The item identification module 204 includes a scene deconstructor module 402, an image-recognition module 404, an area selector module 406, and a user tag module 408.
  • The scene deconstructor module 402 deconstructs a scene in the determined video frame into several areas and analyzes each area for item identification. For example, the video frame may contain an image of a person with a hat, a handbag, and shoes. The scene deconstructor module 402 separately analyzes the hat in one area, the handbag in another area, and the shoes in a third area.
  • The image-recognition module 404 identifies the item based on a comparison of an image of the item from the determined video frame with a library of item images using an image recognition algorithm. The image-recognition module 404 further labels the image of the identified item in the determined video frame. In another embodiment, the image-recognition module 404 identifies the item in the corresponding area of the determined video frame. In another embodiment, the image-recognition module 404 identifies the item in the selected area of the determined video frame. In one embodiment, the image-recognition module 404 determines a name of the identified item and a price of the identified item, and labels the name and price of the identified item adjacent to the image of the identified item in the determined video frame.
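  • By way of a non-limiting illustration, the comparison against a library of item images may be sketched with OpenCV ORB features as follows. The matching scheme, the score thresholds, and the identify_item helper name are assumptions; this disclosure does not name a particular image recognition algorithm.

```python
import cv2

orb = cv2.ORB_create()
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def _to_gray(image):
    # ORB descriptors are computed on single-channel 8-bit images.
    return image if image.ndim == 2 else cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

def identify_item(region, library):
    """Compare one area of the frame against a library of labeled
    reference images and return the best-matching label, if any."""
    _, query_desc = orb.detectAndCompute(_to_gray(region), None)
    if query_desc is None:
        return None
    best_label, best_score = None, 0
    for label, reference in library.items():
        _, ref_desc = orb.detectAndCompute(_to_gray(reference), None)
        if ref_desc is None:
            continue
        matches = matcher.match(query_desc, ref_desc)
        # Count close descriptor matches as a crude similarity score.
        score = sum(1 for m in matches if m.distance < 40)
        if score > best_score:
            best_label, best_score = label, score
    return best_label if best_score >= 10 else None
```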
  • The area selector module 406 receives a user selection of an area in the determined video frame to identify the item. For example, a user may select an area of the video frame on which to focus. Using the previous example, the user may tap on the image of the hat in the video frame to identify the item that is of interest to the user. In another example, the user may tap and drag a rectangular area in the video frame for the image-recognition module 404 to focus on and analyze items in the selected rectangular area.
  • The user tag module 408 receives a user input tag to help identify the item in the determined video frame. For example, the user may tap on the image of a hat in the video frame and then enter the word “hat” for the image-recognition module 404 to focus its search on hats. The word “hat” may be tagged to the identified item.
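  • By way of illustration, the user input tag can narrow the image search before matching, as in the following sketch (which reuses the hypothetical identify_item helper from the preceding sketch):

```python
def identify_with_user_tag(region, library, user_tag):
    """Restrict the search to library entries whose label contains the
    user-supplied tag (e.g. "hat"), falling back to the full library."""
    narrowed = {label: image for label, image in library.items()
                if user_tag.lower() in label.lower()}
    return identify_item(region, narrowed or library)
```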
  • FIG. 5 is a block diagram illustrating an example embodiment of the location-based incentive application 208. The location-based incentive application 208 has a location identification module 502 and an incentive module 506.
  • The location identification module 502 determines a geographic location of the mobile device 132. The incentive module 506 communicates an incentive from one or more local merchants based on the identified item and the geographic location of the mobile device 132. The incentive can include a coupon, a discount, or a recommendation.
  • In one embodiment, the location-based incentive application 208 receives a communication from the mobile device 132. For example, the communication may include a location of the mobile device 132. Based on the location of the mobile device 132 and the item identified by the item identification module 204, the incentive module 506 consults the database server 126 and database 128 to determine and communicate incentives from local merchants to the mobile device 132.
  • In another embodiment, the incentive module 506 identifies local merchants in the area of the mobile device that have the identified item in stock for sale.
  • FIG. 6 is a block diagram illustrating an example embodiment of the location identification module 502. The location of the mobile device 132 can be determined in many ways. For example, the mobile device 132 may be equipped with a Global Positioning System (GPS) receiver that allows the device to communicate its coordinates or location to a GPS/triangulation module 602 of the location identification module 502. In another example, the location of the mobile device 132 may be determined by triangulation using wireless communication towers and/or wireless nodes (e.g., Wi-Fi hotspots) within wireless signal reach of the mobile device 132. Based on the geographic coordinates, the GPS/triangulation module 602 of the location identification module 502 can determine the geographic location of the mobile device 132 after consulting a mapping database (not shown). Furthermore, the general location of the mobile device 132 can be determined when the user of the mobile device 132 logs onto a local Internet connection, for example at a hotel or coffee shop.
  • The location identification module 502 may also include a location input module 606 configured to determine a geographic location of the mobile device 132 by requesting the user to input an address, city, zip code or other location information. In one embodiment, the user can select a location from a list of locations or a map on the mobile device 132. For example, a user on the mobile device 132 inputs the location of the mobile device 132 via an application or a web browser on the mobile device 132.
  • The location identification module 502 may also include a location-dependent search term module 604. The location of the mobile device 132 can be inferred when the user of the mobile device 132 requests a search on the mobile device using location-dependent search terms. For example, a user inputs a search on his/her mobile device for “Best Japanese Restaurant San Jose.” The location-dependent search term module 604 consults a database (not shown) that can determine the geographic location of the best Japanese restaurant in San Jose. The location-dependent search term module 604 then infers that the user of the mobile device 132 is at that geographic location. In an example embodiment, the location-dependent search term module 604 may infer the location of the user based on the search terms submitted by the user and irrespective of the search results or whether the user actually conducts the search. Using the foregoing example, the location-dependent search term module 604 may parse the search query entered by the user and infer that the user is located in or around San Jose.
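  • A minimal sketch of this inference, assuming an illustrative list of known city names; this disclosure does not specify how the search terms are parsed.

```python
KNOWN_CITIES = {"san jose", "san francisco", "new york"}  # illustrative list

def infer_location_from_query(query):
    """Infer a coarse location from the search terms alone, irrespective
    of the search results (e.g. "Best Japanese Restaurant San Jose")."""
    lowered = query.lower()
    for city in KNOWN_CITIES:
        if city in lowered:
            return city
    return None
```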
  • The location identification module 502 may also include a tag module 608 configured to determine the geographic location of the mobile device 132 based on a tag associated with a unique geographic location. The tag may include, for example, a barcode tag (such as a linear barcode, QR code, or other two-dimensional (2D) barcode) or a Radio Frequency Identification (RFID) tag that is associated with a unique geographic location. For example, a user of the mobile device 132 may use his/her mobile device to scan the tag placed at a landmark or store. The tag is uniquely associated with the geographic location of the landmark or store, and such a relationship can be stored in a database. The tag module 608 can then determine the geographic location of the mobile device 132 based on the tag after consulting the database.
  • FIG. 7 is a block diagram illustrating an example embodiment of the incentive module 506 that may be used to execute the processes described herein. The incentive module 506 includes a local merchant module 702, an item category module 704, an incentive match module 706, a user preference module 708, an incentive receiver module 710, an incentive code generator module 712, and a communication module 714.
  • The local merchant module 702 identifies at least one local merchant having at least one incentive based on the geographic location of the mobile device 132 as determined by the location identification module 502. A local merchant is a merchant or retailer that is located within a predefined distance from the geographic location of the mobile device 132. In one embodiment, the local merchant module 702 identifies at least one local merchant with at least one incentive based on an updated search distance preference as specified in the user preference module 708.
  • It should be noted that the incentive of the local merchant may or may not correspond to the item identified by the user. For example, a local merchant may feature a special sale on shoes while the identified item corresponds to a digital camera. Once all local merchants having incentives are identified based on the geographic location of the mobile device (using a database of incentives), the incentive match module 706 filters the local merchants based on the identified item. In the previous example, the local merchant featuring a sale on shoes may be filtered out of the search results.
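  • By way of a non-limiting illustration, the distance filter applied by the local merchant module 702 may be sketched as follows; the haversine formula and the merchant record fields ("lat", "lon", "incentives") are assumptions, since this disclosure does not prescribe a particular distance computation.

```python
from math import asin, cos, radians, sin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two coordinates, in miles."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 \
        + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 3956 * 2 * asin(sqrt(a))

def local_merchants(merchants, device_lat, device_lon, radius_miles):
    """Keep merchants that have at least one incentive and lie within the
    user's search-distance preference of the mobile device."""
    return [m for m in merchants
            if m["incentives"]
            and haversine_miles(device_lat, device_lon,
                                m["lat"], m["lon"]) <= radius_miles]
```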
  • The item category module 704 determines a category of the item specified by the user and identified by the item identification module 204. For example, the user may specify a particular digital camera. The item category module 704 determines that the item specified by the user falls into the category of electronics, subcategory of cameras.
  • The incentive match module 706 determines whether the identified item specified by the user corresponds to an item identified in at least one incentive of at least one local merchant as determined by local merchant module 702. For example, a user specifies an item with his/her mobile device 132. The item is identified as a specific digital camera. Item identification module 204 generates the brand, model number, color, and other attributes of the specified digital camera. Local merchant module 702 identifies merchants with incentives local to the geographic location of the mobile device 132. Incentive match module 706 matches local merchants with incentives (e.g., sale or discount) on the specific digital camera.
  • In another embodiment, the incentive match module 706 determines whether the category of the item identified by the user corresponds to a category of items as determined by item category module 704 and identified in at least one incentive of at least one local merchant. For example, a user specifies an item with his/her mobile device. The item is identified as a specific digital camera. Item identification module 204 generates the brand, model number, color, and other attributes of the specified digital camera. The item category module 704 determines the category of the identified item: electronics. Local merchant module 702 identifies merchants with incentives local to the geographic location of the mobile device. The incentive match module 706 matches local merchants with incentives (e.g., sale or discount) on electronics or categories related to the digital camera.
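  • The two matching strategies just described (an exact item match, then a category fallback) may be sketched as follows; the incentive and item record fields are illustrative assumptions.

```python
def match_incentives(incentives, item):
    """Prefer incentives naming the identified item exactly (brand and
    model); if none exist, fall back to the item's category."""
    exact = [i for i in incentives
             if i["brand"] == item["brand"] and i["model"] == item["model"]]
    if exact:
        return exact
    return [i for i in incentives if i["category"] == item["category"]]
```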
  • The user preference module 708 provides user-defined preferences used in the process of determining local merchants or brands or category of the items. In one embodiment, the user preference module 708 allows a user to update a search distance preference for local merchants. For example, the user may wish to decrease the radius of the distance preference in a downtown area of a city. Conversely, the user may wish to increase the radius of the distance preference in a suburban or rural area of a city. In another embodiment, user preference module 708 may also allow the user to specify favorite brands of items or favorite merchants or retailers.
  • The incentive code generator module 712 generates a code associated with at least one incentive selected by the user at the mobile device. The code is valid for a predetermined period of time at the corresponding local merchant. For example, a user selects a coupon from a local merchant on his/her mobile device. The incentive code generator module 712 generates a code associated with the coupon and communicates the code to the mobile device of the user. The user takes the code to the corresponding local merchant to redeem the discount. The code can be redeemed by showing or telling the code to a cashier at the checkout register of the local merchant, who may then enter the code at the checkout register to determine its validity and appropriately apply the discount or promotion. The code can also be redeemed by displaying a machine-readable code, such as a bar code, on a screen of the mobile device; the cashier then scans the bar code at the checkout register to determine its validity and appropriately apply the discount or promotion.
  • In one embodiment, the code may be valid for a predetermined period of time (e.g., one day, one week). In another embodiment, the generated code may be uniquely associated with the user of the mobile device and may expire immediately upon usage.
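  • By way of a non-limiting illustration, a code that is unique to one user and one incentive and that expires after a predetermined period might be minted as follows. The HMAC construction, key, and code format are assumptions; this disclosure does not specify how the code is generated.

```python
import hashlib
import hmac
import time

SERVER_SECRET = b"replace-with-a-real-key"  # illustrative placeholder

def generate_incentive_code(user_id, incentive_id, valid_seconds=86400):
    """Return a short redemption code bound to one user and one incentive,
    carrying an expiry timestamp the merchant's register can verify."""
    expires_at = int(time.time()) + valid_seconds
    message = "{}:{}:{}".format(user_id, incentive_id, expires_at).encode()
    digest = hmac.new(SERVER_SECRET, message, hashlib.sha256).hexdigest()
    return "{}-{}".format(digest[:8].upper(), expires_at)
```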
  • The communication module 714 communicates one or more incentives of the identified item from at least one local merchant to the mobile device. For example, a list of local merchants within a preset distance radius (e.g., one mile) of the mobile device is displayed. The list of local merchants may include a sale or discount on the item identified by the user of the mobile device. The list may also include a list of recommended merchants (having an incentive on the identified item) that are located beyond the preset distance radius.
  • In another embodiment, the communication module 714 communicates one or more incentives of the identified category of the items from at least one local merchant to the mobile device. For example, a list of local merchants within a preset distance radius (e.g., a block) of the mobile device is displayed. The list of local merchants may include merchants having a sale or discount on similar or related items to the identified item specified by the user of the mobile device. The list may also include a list of recommended merchants (having an incentive on similar items to the identified item) that are located beyond the preset distance radius.
  • The incentive receiver module 710 collects attributes of incentives from merchants and stores the attributes of the incentives in an incentive database. An example of a data structure of the incentive database is further described in FIG. 8.
  • FIG. 8 is a block diagram illustrating the attributes of an example data structure. In one embodiment, the data structure includes attributes of the incentives for an item. For example, the attributes include a name attribute of the merchant 802, a name attribute of the item 804, a brand attribute of the item 806, a model attribute of the item 808, a category tag of the item 810, a sub-category tag of the item 812, a financial promotion attribute of the item 814, and a financial promotion term attribute of the item 816.
  • The merchant name attribute 802 includes the name of the local merchant (e.g., Joe's Electronic Shop). The item name attribute 804 includes the name of an item (e.g., digital camera XYZ D001). The brand attribute 806 includes the brand name of the item (e.g., brand XYZ). The model attribute 808 includes the model number of the item (e.g., D001). The category tag 810 includes a category metadata associated with the item (e.g., personal electronics). The sub-category tag 812 includes a sub-category metadata associated with the item (e.g., digital camera). The financial promotion attribute 814 includes the sale or discount associated with the item (e.g., 40% off all digital cameras, or 20% off all brand XYZ digital cameras). The financial promotion term 816 includes the terms of the sale or discount associated with the item (e.g., discount expires on xx/xx/xxxx, discount expires one week from today, or discount valid today only).
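  • The attribute layout of FIG. 8 maps naturally onto a record type; the following minimal Python sketch uses illustrative field names.

```python
from dataclasses import dataclass

@dataclass
class IncentiveRecord:
    """One row of the incentive database sketched in FIG. 8."""
    merchant_name: str    # e.g. "Joe's Electronic Shop"
    item_name: str        # e.g. "digital camera XYZ D001"
    brand: str            # e.g. "XYZ"
    model: str            # e.g. "D001"
    category: str         # e.g. "personal electronics"
    sub_category: str     # e.g. "digital camera"
    promotion: str        # e.g. "20% off all brand XYZ digital cameras"
    promotion_term: str   # e.g. "discount valid today only"
```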
  • FIG. 9A is a block diagram illustrating an example of a tagged video frame 900. The video frame 900 has been selected for processing by video frame selector module 202. The item identification module 204 has identified two items (e.g., a hat 902 and a handbag 904) in the video frame 900. The video frame tag module 304 generates a call out bubble on the video frame 900 for each identified item. In one embodiment, a call out bubble may be placed on the video frame adjacent to the respective identified item. For example, call out bubble 906 labels the hat 902 with a market price for the identified item. Similarly, call out bubble 908 labels the handbag 904 with a market price for the identified item.
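  • A minimal sketch of rendering such a call out adjacent to an identified item, using OpenCV drawing primitives as a stand-in for the bubbles of FIG. 9A; the bounding-box input is an assumption.

```python
import cv2

def draw_call_out(frame, label, price, box):
    """Draw a labeled box next to an identified item, e.g.
    draw_call_out(frame, "hat", "$25.00", (40, 30, 120, 80))."""
    x, y, w, h = box
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.putText(frame, "{}: {}".format(label, price), (x, max(y - 10, 15)),
                cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 2)
    return frame
```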
  • FIG. 9B is a block diagram illustrating another example of a tagged video frame 901. The user has selected a particular area within the video frame for the item identification module 204 to process. For example, the user may only be interested in the handbag. As such, the user has delineated a region of interest 910 on the video frame 901 to identify the handbag 904.
  • FIG. 10 is a flow diagram of an example method for tagging a video frame with items. At 1002, it is determined whether a video frame from a mobile device is to be processed. At 1004, items in the determined or selected video frame are identified. At 1006, the video frame may be tagged with an identification of the items in the video frame.
  • FIG. 11 is a flow diagram of an example method for selecting a video frame. At 1102, a difference between a first video frame and a second video frame is determined. At 1104, the difference between the first video frame and the second video frame is compared to a predetermined amount of difference. If the difference exceeds the predetermined amount, the first or second video frame is processed and tagged for item identification at 1106.
  • FIG. 12 is a flow diagram of an example method for tagging a video frame. At 1202, a scene in the determined video frame is deconstructed into a plurality of areas. At 1204, an item from each area is identified based on a comparison of an image of the item from the determined video frame with a library of item images. At 1206, the image of the identified item is labeled in the determined video frame.
  • FIG. 13A is a flow diagram of an example method for identifying an item in a video frame. At 1302, a user selects an area in the determined video frame to identify the item. At 1304, the item in the selected area of the determined video frame is identified. At 1306, the image of the identified item in the selected area of the determined video frame is labeled. In one embodiment, a name (e.g., brand, model) of the identified item and a price of the identified item are determined. The name and price of the identified item are placed adjacent to the image of the identified item in the determined video frame.
  • FIG. 13B is a flow diagram of another example method for identifying an item in a video frame. At 1308, a user selects an area in the determined video frame to identify the item. At 1310, a user input tag is received to help identify the item in the determined video frame. At 1312, the item in the selected area of the determined video frame is identified based on the user input tag. At 1314, the image of the identified item in the selected area of the determined video frame is labeled.
  • FIG. 14A is a flow diagram of an example method for providing information on an item in a tagged video frame. At 1402, a video frame selection is received via the shopping pause feature as previously described. At 1404, the user selects an identified item in the video frame. At 1406, the system provides the vendors' and merchants' prices for the identified item. At 1408, the system allows the user to purchase the identified item selected in the video frame. If the user decides to purchase the identified item, the system receives the purchase selection from the user (including the merchant selection).
  • FIG. 14B is a flow diagram of an example method for providing location-based information on an item in a tagged video frame. At 1402, a video frame selection is received via the shopping pause feature as previously described. At 1404, the user selects an identified item in the video frame. At 1408, the system determines a geographic location of the mobile device 132 and offers an incentive from at least one local merchant based on the identified item and the geographic location of the mobile device 132. The incentive can be a coupon, a discount, or a recommendation.
  • FIG. 15A is a flow chart of an example method for identifying a targeted incentive. At 1502, the location identification module 502 of the location-based incentive application 208 determines the geographic location of the mobile device 132 of a user. At 1504, the item identification module 204 of the location-based incentive application 208 identifies an item specified by the user at the geographic location of the mobile device 132. At 1506, the local merchant module 702 of the incentive module 506 determines local merchants with at least one incentive. At 1508, the incentive match module 706 of the incentive module 506 of the location-based incentive application 208 determines whether the identified item as specified by the user corresponds to an item identified in at least one incentive of the local merchants as determined at operation 1506. At 1510, the communication module 714 of the incentive module 506 of the location-based incentive application 208 communicates a list of local merchants with incentives for the identified item.
  • FIG. 15B is a flow chart of another example method for identifying a targeted incentive. At 1512, if there are no local merchants having incentives on the identified item, the item category module 704 of the incentive module 506 of the location-based incentive application 208 determines a category of the identified item. At 1514, the incentive match module 706 of the incentive module 506 of the location-based incentive application 208 determines whether a category of the identified item as specified by the user corresponds to a category of items identified in at least one incentive of the local merchants as determined at operation 1506. At 1516, the communication module 714 of the incentive module 506 of the location-based incentive application 208 communicates a list of local merchants with incentives on similar or related items from the same category of the identified item.
  • FIG. 15C is a flow chart of an example method for expanding a search of local incentives. At 1518, the communication module 714 of the incentive module 506 communicates that the incentive match module 706 cannot find any incentives from local merchants related to the identified item. At 1520, the incentive module 506 may offer the user the option to expand or increase the distance radius preference for local merchants in the user preference module 708. At 1522, the user preference module 708 may be updated to reflect a new distance radius preference when searching for local merchants with incentives.
  • FIG. 16 shows a diagrammatic representation of a machine in the example form of a computer system 1600 within which a set of instructions may be executed causing the machine to perform any one or more of the methodologies discussed herein. In alternative embodiments, the machine operates as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the machine may operate in the capacity of a server or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment. The machine may be a personal computer (PC), a tablet PC, a set-top box (STB), a Personal Digital Assistant (PDA), a cellular telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine. Further, while only a single machine is illustrated, the term “machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.
  • The example computer system 1600 includes a processor 1602 (e.g., a central processing unit (CPU), a graphics processing unit (GPU) or both), a main memory 1604 and a static memory 1606, which communicate with each other via a bus 1608. The computer system 1600 may further include a video display unit 1610 (e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT)). The computer system 1600 also includes an alphanumeric input device 1612 (e.g., a keyboard), a user interface (UI) navigation device 1614 (e.g., a mouse), a disk drive unit 1616, a signal generation device 1618 (e.g., a speaker) and a network interface device 1620.
  • The disk drive unit 1616 includes a machine-readable medium 1622 on which is stored one or more sets of instructions and data structures (e.g., software 1624) embodying or utilized by any one or more of the methodologies or functions described herein. The software 1624 may also reside, completely or at least partially, within the main memory 1604 and/or within the processor 1602 during execution thereof by the computer system 1600, the main memory 1604 and the processor 1602 also constituting machine-readable media.
  • The software 1624 may further be transmitted or received over a network 1626 via the network interface device 1620 utilizing any one of a number of well-known transfer protocols (e.g., HTTP).
  • While the machine-readable medium 1622 is shown in an example embodiment to be a single medium, the term “machine-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions. The term “machine-readable medium” shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present invention, or that is capable of storing, encoding or carrying data structures utilized by or associated with such a set of instructions. The term “machine-readable medium” shall accordingly be taken to include, but not be limited to, solid-state memories, optical media, and magnetic media.
  • The Abstract of the Disclosure is provided to comply with 37 C.F.R. §1.72(b), requiring an abstract that will allow the reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, it can be seen that various features are grouped together in a single embodiment for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separate embodiment.

Claims (29)

1. A system, comprising:
a processor-implemented video frame selector module configured to determine a video frame to process that is received from a mobile device;
a processor-implemented item identification module configured to identify an item in the determined video frame and to tag the determined video frame with an identification of the item; and
a processor-implemented market module configured to generate offers of the identified item from at least one merchant to the mobile device.
2. The system of claim 1 further comprising:
a processor-implemented location-based incentive module configured to offer an incentive from at least one local merchant based on the identified item and a geographic location of the mobile device.
3. The system of claim 1 wherein the processor-implemented video frame selector module comprises a video frame analyzer module configured to determine a difference between a first video frame and a second video frame, and a video frame tag module configured to tag the first or second video frame for item identification when the difference exceeds a predetermined amount.
4. The system of claim 1 wherein the processor-implemented item identification module comprises:
a processor-implemented scene deconstructor module configured to deconstruct a scene in the determined video frame into a plurality of areas;
a processor-implemented area selector module configured to receive a user selection of an area in the determined video frame to identify the item;
a processor-implemented user tag module configured to receive a user input tag to help identify the item in the determined video frame; and
a processor-implemented image-recognition module configured to identify the item based on a comparison of an image of the item from the determined video frame with a library of item images, and to label the image of the identified item in the determined video frame.
5. The system of claim 4 wherein the processor implemented image-recognition module is further configured to identify the item in the corresponding area of the determined video frame.
6. The system of claim 4 wherein the processor implemented image-recognition module is further configured to identify the item in the selected area in the determined video frame.
7. The system of claim 4 wherein the processor implemented image-recognition module is further configured to determine a name of the identified item and a price of the identified item, and to label the name and price of the identified item adjacent to the image of the identified item in the determined video frame.
8. The system of claim 3 wherein the processor-implemented location-based incentive module comprises:
a processor-implemented location identification module configured to determine the geographic location of the mobile device; and
a processor-implemented incentive module configured to offer an incentive from the at least one local merchant based on the identified item and the geographic location of the mobile device, wherein the incentive comprises a coupon, a discount, or a recommendation.
9. The system of claim 8 wherein the processor-implemented incentive module comprises:
a processor-implemented local merchant module configured to identify the at least one local merchant with at least one incentive based on the geographic location of the mobile device;
a processor-implemented incentive match module configured to determine whether the item identified by the user corresponds to an item identified in the at least one incentive of the at least one local merchant;
a processor-implemented communication module configured to communicate the at least one incentive of the identified item from the at least one local merchant to the mobile device; and
a processor-implemented incentive receiver module configured to receive attributes of incentives from at least one local merchant and store the attributes of the incentives in a database,
wherein the attributes of the incentives for an item comprise at least one of a name attribute of the local merchant, a name attribute of the item, a brand attribute of the item, a model attribute of the item, a category tag of the item, a sub-category tag of the item, a financial promotion attribute of the item, and a financial promotion term attribute of the item.
10. A computer-implemented method comprising:
determining a video frame to process from a mobile device;
identifying an item in the determined video frame;
tagging the determined video frame with an identification of the item; and
generating offers of the identified item from at least one merchant to the mobile device.
11. The computer-implemented method of claim 10 further comprising:
generating offers of the identified item from at least one merchant to the mobile device.
12. The computer-implemented method of claim 10 further comprising:
offering an incentive from at least one local merchant based on the identified item and a geographic location of the mobile device.
13. The computer-implemented method of claim 10 wherein determining the video frame comprises:
determining a difference between a first video frame and a second video frame; and
tagging the first or second video frame for item identification when the difference exceeds a predetermined amount.
14. The computer-implemented method of claim 10 wherein identifying the item comprises:
deconstructing a scene in the determined video frame into a plurality of areas;
receiving a user selection of an area in the determined video frame to identify the item;
receiving a user input tag to help identify the item in the determined video frame;
identifying the item based on a comparison of an image of the item from the determined video frame with a library of item images; and
labeling the image of the identified item in the determined video frame.
15. The computer-implemented method of claim 14 further comprising:
identifying the item in the corresponding area of the determined video frame.
16. The computer-implemented method of claim 14 further comprising:
identifying the item in the selected area in the determined video frame.
17. The computer-implemented method of claim 14 further comprising:
determining a name of the identified item and a price of the identified item; and
labeling the name and price of the identified item adjacent to the image of the identified item in the determined video frame.
18. The computer-implemented method of claim 12 further comprising:
determining a geographic location of the mobile device; and
offering the incentive from the at least one local merchant based on the identified item and the geographic location of the mobile device, wherein the incentive comprises a coupon, a discount, or a recommendation.
19. The computer-implemented method of claim 18 further comprising:
identifying at least one local merchant with at least one incentive based on the geographic location of the mobile device;
determining whether the item identified by the user corresponds to an item identified in the at least one incentive of the at least one local merchant;
communicating the at least one incentive of the identified item from the at least one local merchant to the mobile device;
receiving attributes of incentives from at least one local merchant and storing the attributes of the incentives in a database,
wherein the attributes of the incentives for an item comprise at least one of a name attribute of the local merchant, a name attribute of the item, a brand attribute of the item, a model attribute of the item, a category tag of the item, a sub-category tag of the item, a financial promotion attribute of the item, and a financial promotion term attribute of the item.
20. A non-transitory computer-readable storage medium storing a set of instructions that, when executed by a processor, causes the processor to perform operations, comprising:
determining a video frame to process from a mobile device;
identifying an item in the determined video frame;
tagging the determined video frame with an identification of the item; and
generating offers of the identified item from at least one merchant to the mobile device.
21. The non-transitory computer-readable storage medium of claim 20 further comprising:
generating offers of the identified item from at least one merchant to the mobile device.
22. The non-transitory computer-readable storage medium of claim 20 further comprising:
offering an incentive from at least one local merchant based on the identified item and a geographic location of the mobile device.
23. The non-transitory computer-readable storage medium of claim 20 wherein determining the video frame comprises:
determining a difference between a first video frame and a second video frame; and
tagging the first or second video frame for item identification when the difference exceeds a predetermined amount.
24. The non-transitory computer-readable storage medium of claim 20 wherein identifying the item comprises:
deconstructing a scene in the determined video frame into a plurality of areas;
receiving a user selection of an area in the determined video frame to identify the item;
receiving a user input tag to help identify the item in the determined video frame;
identifying the item based on a comparison of an image of the item from the determined video frame with a library of item images; and
labeling the image of the identified item in the determined video frame.
25. The non-transitory computer-readable storage medium of claim 24 further comprising:
identifying the item in the corresponding area of the determined video frame.
26. The non-transitory computer-readable storage medium of claim 24 further comprising:
identifying the item in the selected area in the determined video frame.
27. The non-transitory computer-readable storage medium of claim 24 further comprising:
determining a name of the identified item and a price of the identified item; and
labeling the name and price of the identified item adjacent to the image of the identified item in the determined video frame.
28. The non-transitory computer-readable storage medium of claim 20 further comprising:
determining a geographic location of the mobile device; and
offering the incentive from the at least one local merchant based on the identified item and the geographic location of the mobile device, wherein the incentive comprises a coupon, a discount, or a recommendation.
29. The non-transitory computer-readable storage medium of claim 28 further comprising:
identifying at least one local merchant with at least one incentive based on the geographic location of the mobile device;
determining whether the item identified by the user corresponds to an item identified in the at least one incentive of the at least one local merchant;
communicating the at least one incentive of the identified item from the at least one local merchant to the mobile device;
receiving attributes of incentives from at least one local merchant and storing the attributes of the incentives in a database,
wherein the attributes of the incentives for an item comprise at least one of a name attribute of the local merchant, a name attribute of the item, a brand attribute of the item, a model attribute of the item, a category tag of the item, a sub-category tag of the item, a financial promotion attribute of the item, and a financial promotion term attribute of the item.
US13/050,721 2011-03-17 2011-03-17 Video processing system for identifying items in video frames Abandoned US20120238254A1 (en)

Priority Applications (12)

Application Number Priority Date Filing Date Title
US13/050,721 US20120238254A1 (en) 2011-03-17 2011-03-17 Video processing system for identifying items in video frames
US13/341,552 US8213916B1 (en) 2011-03-17 2011-12-30 Video processing system for identifying items in video frames
PCT/US2012/029421 WO2012125920A2 (en) 2011-03-17 2012-03-16 Video processing system for identifying items in video frames
ES12757819.3T ES2616841T3 (en) 2011-03-17 2012-03-16 Video processing system to identify items in video frames
CN201280013401XA CN103443816A (en) 2011-03-17 2012-03-16 Video processing system for identifying items in video frames
EP12757819.3A EP2686820B1 (en) 2011-03-17 2012-03-16 Video processing system for identifying items in video frames
AU2012229025A AU2012229025B2 (en) 2011-03-17 2012-03-16 Video processing system for identifying items in video frames
EP16001895.8A EP3131255B1 (en) 2011-03-17 2012-03-16 Video processing system for identifying items in video frames
CA2830412A CA2830412C (en) 2011-03-17 2012-03-16 Video processing system for identifying items in video frames
CN201910203160.9A CN110086770A (en) 2011-03-17 2012-03-16 For identifying the processing system for video of article in the video frame
DK12757819.3T DK2686820T3 (en) 2011-03-17 2012-03-16 Video Processing System to identify elements of video frames
AU2016216533A AU2016216533A1 (en) 2011-03-17 2016-08-15 Video processing system for identifying items in video frames

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US13/050,721 US20120238254A1 (en) 2011-03-17 2011-03-17 Video processing system for identifying items in video frames

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US13/341,552 Continuation US8213916B1 (en) 2011-03-17 2011-12-30 Video processing system for identifying items in video frames

Publications (1)

Publication Number Publication Date
US20120238254A1 true US20120238254A1 (en) 2012-09-20

Family

ID=46320236

Family Applications (2)

Application Number Title Priority Date Filing Date
US13/050,721 Abandoned US20120238254A1 (en) 2011-03-17 2011-03-17 Video processing system for identifying items in video frames
US13/341,552 Active US8213916B1 (en) 2011-03-17 2011-12-30 Video processing system for identifying items in video frames

Family Applications After (1)

Application Number Title Priority Date Filing Date
US13/341,552 Active US8213916B1 (en) 2011-03-17 2011-12-30 Video processing system for identifying items in video frames

Country Status (8)

Country Link
US (2) US20120238254A1 (en)
EP (2) EP2686820B1 (en)
CN (2) CN103443816A (en)
AU (2) AU2012229025B2 (en)
CA (1) CA2830412C (en)
DK (1) DK2686820T3 (en)
ES (1) ES2616841T3 (en)
WO (1) WO2012125920A2 (en)

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130044959A1 (en) * 2011-08-18 2013-02-21 Justin Mitchell Computer-Vision Content Detection for Sponsored Stories
US8909665B2 (en) 2011-08-30 2014-12-09 Microsoft Corporation Subsnippet handling in search results
US20150112832A1 (en) * 2013-10-23 2015-04-23 Wal-Mart Stores, Inc. Employing a portable computerized device to estimate a total expenditure in a retail environment
US9268464B1 (en) * 2011-11-17 2016-02-23 Yahoo! Inc. Link determination and usage using image recognition
WO2016140619A1 (en) * 2015-03-05 2016-09-09 Citrine Wireless Pte Ltd Method and system for providing selecting products
US9672496B2 (en) 2011-08-18 2017-06-06 Facebook, Inc. Computer-vision content detection for connecting objects in media to users
JP6232632B1 (en) * 2016-08-09 2017-11-22 パロニム株式会社 Video playback program, video playback device, video playback method, video distribution system, and metadata creation method
US20180227344A1 (en) * 2013-03-15 2018-08-09 Robert Ernest Troxler Mobile device displaying real time sports statistics
WO2018228955A1 (en) 2017-06-12 2018-12-20 Institut Mines-Telecom Descriptor learning method for the detection and location of objects in a video
US20190080175A1 (en) * 2017-09-14 2019-03-14 Comcast Cable Communications, Llc Methods and systems to identify an object in content
US10405059B2 (en) * 2013-01-02 2019-09-03 IMDB. COM, Inc. Medium, system, and method for identifying collections associated with subjects appearing in a broadcast
US10417321B2 (en) 2016-07-22 2019-09-17 Dropbox, Inc. Live document detection in a captured video stream
WO2019233410A1 (en) * 2018-06-06 2019-12-12 Zhejiang Dahua Technology Co., Ltd. Systems and methods for displaying object box in a video
RU2731362C1 (en) * 2017-03-24 2020-09-02 Су Бум ПАРК Method of providing information on purchases in live video

Families Citing this family (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8627379B2 (en) * 2010-01-07 2014-01-07 Amazon Technologies, Inc. Offering items identified in a media stream
US9538209B1 (en) 2010-03-26 2017-01-03 Amazon Technologies, Inc. Identifying items in a content stream
US20130262291A1 (en) * 2012-03-15 2013-10-03 Flextronics Ap, Llc Universal credit card
US9560415B2 (en) 2013-01-25 2017-01-31 TapShop, LLC Method and system for interactive selection of items for purchase from a video
US9836885B1 (en) 2013-10-25 2017-12-05 Appliance Computing III, Inc. Image-based rendering of real spaces
CN104754377A (en) * 2013-12-27 2015-07-01 阿里巴巴集团控股有限公司 Smart television data processing method, smart television and smart television system
WO2015105804A1 (en) * 2014-01-07 2015-07-16 Hypershow Ltd. System and method for generating and using spatial and temporal metadata
US20150296250A1 (en) * 2014-04-10 2015-10-15 Google Inc. Methods, systems, and media for presenting commerce information relating to video content
CN107124659A (en) * 2014-04-30 2017-09-01 广州市动景计算机科技有限公司 The output intent and device of a kind of Item Information
US20150326935A1 (en) * 2014-05-09 2015-11-12 Mastercard International Incorporated Methods and Systems for Purchasing Products From Media Content Shown on Media Display Devices
US9818048B2 (en) * 2015-01-19 2017-11-14 Ebay Inc. Fine-grained categorization
CN106358092B (en) * 2015-07-13 2019-11-26 阿里巴巴集团控股有限公司 Information processing method and device
CN105159923A (en) * 2015-08-04 2015-12-16 曹政新 Video image based article extraction, query and purchasing method
WO2018048355A1 (en) * 2016-09-08 2018-03-15 Aiq Pte. Ltd. Object detection from visual search queries
CN106254941A (en) * 2016-10-10 2016-12-21 乐视控股(北京)有限公司 Method for processing video frequency and device
US10650262B2 (en) 2016-11-09 2020-05-12 Clicpic, Inc. Electronic system for comparing positions of interest on media items
US20180376212A1 (en) * 2017-06-23 2018-12-27 Sony Corporation Modifying display region for people with vision impairment
US10805676B2 (en) 2017-07-10 2020-10-13 Sony Corporation Modifying display region for people with macular degeneration
US10650702B2 (en) 2017-07-10 2020-05-12 Sony Corporation Modifying display region for people with loss of peripheral vision
US10845954B2 (en) 2017-07-11 2020-11-24 Sony Corporation Presenting audio video display options as list or matrix
CN108319489B (en) 2018-02-13 2020-07-03 Oppo广东移动通信有限公司 Application page starting method and device, storage medium and electronic equipment
KR102194050B1 (en) * 2020-03-05 2020-12-22 이현호 Server, system and method for providing rewards based on streamming service

Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020095333A1 (en) * 2001-01-18 2002-07-18 Nokia Corporation Real-time wireless e-coupon (promotion) definition based on available segment
US20030028873A1 (en) * 2001-08-02 2003-02-06 Thomas Lemmons Post production visual alterations
US6570587B1 (en) * 1996-07-26 2003-05-27 Veon Ltd. System and method and linking information to a video
US20030132298A1 (en) * 1996-09-05 2003-07-17 Jerome Swartz Consumer interactive shopping system
US6642940B1 (en) * 2000-03-03 2003-11-04 Massachusetts Institute Of Technology Management of properties for hyperlinked video
US20050137958A1 (en) * 2003-12-23 2005-06-23 Thomas Huber Advertising methods for advertising time slots and embedded objects
US20090014477A1 (en) * 2007-04-12 2009-01-15 Ronald Brenes Pressure module for dispensing chemical solutions
US20090144772A1 (en) * 2007-11-30 2009-06-04 Google Inc. Video object tag creation and processing
US20090141969A1 (en) * 2007-11-29 2009-06-04 Nec Laboratories America, Inc. Transfer Learning Methods and systems for Feed-Forward Visual Recognition Systems
US20090307013A1 (en) * 2006-02-06 2009-12-10 Itaggit, Inc. Data Tag Creation from a Physical Item Data Record to be Attached to a Physical Item
US7751482B1 (en) * 2004-02-27 2010-07-06 Vbrick Systems, Inc. Phase correlation based motion estimation in hybrid video compression
US20110040760A1 (en) * 2009-07-16 2011-02-17 Bluefin Lab, Inc. Estimating Social Interest in Time-based Media
US20110145051A1 (en) * 2009-12-13 2011-06-16 AisleBuyer LLC Systems and methods for suggesting products for purchase from a retail establishment using a mobile device
US8254699B1 (en) * 2009-02-02 2012-08-28 Google Inc. Automatic large scale video object recognition
US9105011B2 (en) * 2011-03-08 2015-08-11 Bank Of America Corporation Prepopulating application forms using real-time video analysis of identified objects
US9317860B2 (en) * 2011-03-08 2016-04-19 Bank Of America Corporation Collective network of augmented reality users

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10003781B2 (en) * 2006-08-04 2018-06-19 Gula Consulting Limited Liability Company Displaying tags associated with items in a video playback
US20080091555A1 (en) * 2006-10-13 2008-04-17 Ashley Heather User generated style content
US9516251B2 (en) * 2006-10-18 2016-12-06 Grabit Interactive, Inc. Method and apparatus for displaying and enabling the purchase of products during video playback
US7692629B2 (en) * 2006-12-07 2010-04-06 Microsoft Corporation Operating touch screen interfaces
KR100783553B1 (en) * 2007-01-22 2007-12-07 삼성전자주식회사 Mobile device, method for generating group picture of phonebook in the same and method of executing communication event using the group picture
GB0707216D0 (en) * 2007-04-14 2007-05-23 Livesey Carl Interactive shopping platform
US20090003797A1 (en) * 2007-06-29 2009-01-01 Nokia Corporation Method, Apparatus and Computer Program Product for Providing Content Tagging
US20090049413A1 (en) * 2007-08-16 2009-02-19 Nokia Corporation Apparatus and Method for Tagging Items
US7987478B2 (en) * 2007-08-28 2011-07-26 Sony Ericsson Mobile Communications Ab Methods, devices, and computer program products for providing unobtrusive video advertising content
US8843959B2 (en) 2007-09-19 2014-09-23 Orlando McMaster Generating synchronized interactive link maps linking tracked video objects to other multimedia content in real-time
US20090294538A1 (en) * 2008-05-28 2009-12-03 Sony Ericsson Mobile Communications Ab Embedded tags in a media signal
US8237807B2 (en) * 2008-07-24 2012-08-07 Apple Inc. Image capturing device with touch screen for adjusting camera settings
US9232043B2 (en) * 2010-03-03 2016-01-05 Lg Electronics Inc. Mobile terminal and control method thereof

Patent Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6570587B1 (en) * 1996-07-26 2003-05-27 Veon Ltd. System and method and linking information to a video
US20030132298A1 (en) * 1996-09-05 2003-07-17 Jerome Swartz Consumer interactive shopping system
US6642940B1 (en) * 2000-03-03 2003-11-04 Massachusetts Institute Of Technology Management of properties for hyperlinked video
US20020095333A1 (en) * 2001-01-18 2002-07-18 Nokia Corporation Real-time wireless e-coupon (promotion) definition based on available segment
US20030028873A1 (en) * 2001-08-02 2003-02-06 Thomas Lemmons Post production visual alterations
US20050137958A1 (en) * 2003-12-23 2005-06-23 Thomas Huber Advertising methods for advertising time slots and embedded objects
US7751482B1 (en) * 2004-02-27 2010-07-06 Vbrick Systems, Inc. Phase correlation based motion estimation in hybrid video compression
US20090307013A1 (en) * 2006-02-06 2009-12-10 Itaggit, Inc. Data Tag Creation from a Physical Item Data Record to be Attached to a Physical Item
US20090014477A1 (en) * 2007-04-12 2009-01-15 Ronald Brenes Pressure module for dispensing chemical solutions
US20090141969A1 (en) * 2007-11-29 2009-06-04 NEC Laboratories America, Inc. Transfer Learning Methods and Systems for Feed-Forward Visual Recognition Systems
US8345962B2 (en) * 2007-11-29 2013-01-01 NEC Laboratories America, Inc. Transfer learning methods and systems for feed-forward visual recognition systems
US20090144772A1 (en) * 2007-11-30 2009-06-04 Google Inc. Video object tag creation and processing
US8254699B1 (en) * 2009-02-02 2012-08-28 Google Inc. Automatic large scale video object recognition
US20110040760A1 (en) * 2009-07-16 2011-02-17 Bluefin Labs, Inc. Estimating Social Interest in Time-based Media
US20110145051A1 (en) * 2009-12-13 2011-06-16 AisleBuyer LLC Systems and methods for suggesting products for purchase from a retail establishment using a mobile device
US9105011B2 (en) * 2011-03-08 2015-08-11 Bank Of America Corporation Prepopulating application forms using real-time video analysis of identified objects
US9317860B2 (en) * 2011-03-08 2016-04-19 Bank Of America Corporation Collective network of augmented reality users

Non-Patent Citations (16)

* Cited by examiner, † Cited by third party
Title
"Cinema Tools 4 User Manual: Frame Rate Basics," specific author unidentified, copyrighted by Apple, Inc. in 2009, downloaded 3-5-2013 from: http://documentation.apple.com/en/cinematools/usermanual/index.html#chapter=2%26section=5%26tasks=true *
13050721 LayarAugmentedBrowser-2009-YouTube video-1min18sec.pdf (YouTube screenshot at 1 minute, 18 seconds) *
13050721 LayarAugmentedBrowser-2009-YouTube video-1min26sec.pdf (YouTube screenshot at 1 minute, 26 seconds) *
13050721 LayarAugmentedBrowser-2009-YouTube video-1min32sec.pdf (YouTube screenshot at 1 minute, 32 seconds) *
13050721 LayarAugmentedBrowser-2009-YouTube video-1sec.pdf (YouTube screenshot at 1 second) *
13050721 LayarAugmentedBrowser-2009-YouTube video-33sec.pdf (YouTube screenshot at 33 seconds) *
13050721 LayarAugmentedBrowser-2009-YouTube video-38sec.pdf (YouTube screenshot at 38 seconds) *
13050721 LayarAugmentedBrowser-2009-YouTube video-44sec.pdf (YouTube screenshot at 44 seconds) *
13050721 LayarAugmentedBrowser-2009-YouTube video-57sec.pdf (YouTube screenshot at 57 seconds) *
Donna Dewberry One Stroke 10th Anniversary Book and Brushes, from YouTube, downloaded 2 October 2015 from https://www.youtube.com/watch?v=IZXOm8AX8yA, screenshot at 0:11 of 3:29 *
Home Shopping Network, from Wikipedia, downloaded 2 October 2015 from https://en.wikipedia.org/wiki/Home_Shopping_Network *
Human billboard, from Wikipedia, downloaded from http://en.wikipedia.org/wiki/Human_board on 27 October 2014 *
Product placement, from Wikipedia, downloaded from http://en.wikipedia.org/wiki/Product_placement on 4 April 2015 *
Sandwich board, from Wikipedia, downloaded from http://en.wikipedia.org/wiki/Sandwich_board on 27 October 2014 *
Toyama, Kentaro et al., "Geographic Location Tags on Digital Images," MM'03, Nov. 2-8, 2003, Berkeley, California, © 2003 ACM. (downloaded from: http://delivery.acm.org/10.1145/960000/957046/p156-toyama.pdf?key1=957046&key2=8247661921&coll=DL&dl=ACM&CFID=546995&CFTOKEN=16766440 on 5 March 2013) *
YouTube video "Layar, world's first mobile Augmented Reality browser", uploaded to YouTube on 15 June 2009, no author identified, as evidenced by eight screenshots taken from http://www.youtube.com/watch?v=b64_16K2e08 on 6 March 2013, with each screenshot's position indicated by the time clock of the 2-minute, 8-second video *

Cited By (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9135631B2 (en) * 2011-08-18 2015-09-15 Facebook, Inc. Computer-vision content detection for sponsored stories
US9672496B2 (en) 2011-08-18 2017-06-06 Facebook, Inc. Computer-vision content detection for connecting objects in media to users
US20130044959A1 (en) * 2011-08-18 2013-02-21 Justin Mitchell Computer-Vision Content Detection for Sponsored Stories
US8909665B2 (en) 2011-08-30 2014-12-09 Microsoft Corporation Subsnippet handling in search results
US9384269B2 (en) 2011-08-30 2016-07-05 Microsoft Technology Licensing, LLC Subsnippet handling in search results
US9268464B1 (en) * 2011-11-17 2016-02-23 Yahoo! Inc. Link determination and usage using image recognition
US10405059B2 (en) * 2013-01-02 2019-09-03 IMDB.com, Inc. Medium, system, and method for identifying collections associated with subjects appearing in a broadcast
US11799928B2 (en) * 2013-03-15 2023-10-24 International Research Institute (iRI) Inc. Mobile device displaying real time sports statistics
US20180227344A1 (en) * 2013-03-15 2018-08-09 Robert Ernest Troxler Mobile device displaying real time sports statistics
US20150112832A1 (en) * 2013-10-23 2015-04-23 Wal-Mart Stores, Inc. Employing a portable computerized device to estimate a total expenditure in a retail environment
WO2016140619A1 (en) * 2015-03-05 2016-09-09 Citrine Wireless Pte Ltd Method and system for providing selecting products
US11003841B2 (en) 2016-07-22 2021-05-11 Dropbox, Inc. Enhancing documents portrayed in digital images
US11620438B2 (en) 2016-07-22 2023-04-04 Dropbox, Inc. Live document detection in a captured video stream
US10417321B2 (en) 2016-07-22 2019-09-17 Dropbox, Inc. Live document detection in a captured video stream
US10628519B2 (en) 2016-07-22 2020-04-21 Dropbox, Inc. Enhancing documents portrayed in digital images
US11017159B2 (en) 2016-07-22 2021-05-25 Dropbox, Inc. Enhancing documents portrayed in digital images
US11017158B2 (en) 2016-07-22 2021-05-25 Dropbox, Inc. Live document detection in a captured video stream
US11620439B2 (en) 2016-07-22 2023-04-04 Dropbox, Inc. Enhancing documents portrayed in digital images
JP2018026647A (en) * 2016-08-09 2018-02-15 Paronym Inc. Video replay program, video replay device, video replay method, video delivery system and metadata creation method
JP6232632B1 (en) * 2016-08-09 2017-11-22 Paronym Inc. Video playback program, video playback device, video playback method, video distribution system, and metadata creation method
RU2731362C1 (en) * 2017-03-24 2020-09-02 Su Bum Park Method of providing information on purchases in live video
WO2018228955A1 (en) 2017-06-12 2018-12-20 Institut Mines-Telecom Descriptor learning method for the detection and location of objects in a video
US11501110B2 (en) 2017-06-12 2022-11-15 Institut Mines Telecom Descriptor learning method for the detection and location of objects in a video
US20190080175A1 (en) * 2017-09-14 2019-03-14 Comcast Cable Communications, LLC Methods and systems to identify an object in content
WO2019233410A1 (en) * 2018-06-06 2019-12-12 Zhejiang Dahua Technology Co., Ltd. Systems and methods for displaying object box in a video
US11438527B2 (en) 2018-06-06 2022-09-06 Zhejiang Dahua Technology Co., Ltd. Systems and methods for displaying object box in a video

Also Published As

Publication number Publication date
WO2012125920A2 (en) 2012-09-20
ES2616841T3 (en) 2017-06-14
EP2686820A4 (en) 2014-11-12
AU2012229025B2 (en) 2016-05-19
CA2830412C (en) 2017-12-12
EP3131255A1 (en) 2017-02-15
US8213916B1 (en) 2012-07-03
AU2012229025A1 (en) 2013-09-19
WO2012125920A3 (en) 2012-12-27
EP2686820A2 (en) 2014-01-22
CN103443816A (en) 2013-12-11
DK2686820T3 (en) 2017-01-09
CN110086770A (en) 2019-08-02
EP2686820B1 (en) 2016-10-12
EP3131255B1 (en) 2019-10-02
CA2830412A1 (en) 2012-09-20
AU2016216533A1 (en) 2016-09-01

Similar Documents

Publication Publication Date Title
US8213916B1 (en) Video processing system for identifying items in video frames
US8868443B2 (en) Targeted incentive actions based on location and intent
US20200193668A1 (en) Personal Augmented Reality
US20220067821A1 (en) In-store product detection system
US11367127B2 (en) Omnichannel retailing
TWI490808B (en) Smart device assisted commerce
US8833652B2 (en) Product information system and method using a tag and mobile device
US20130036043A1 (en) Image-based product mapping
US20130262231A1 (en) Targeted incentive actions based on the number of people within a geographic locale
US20120239481A1 (en) Digital shoebox
US20130030915A1 (en) Apparatus and method for enhanced in-store shopping services using mobile device
US20170278152A1 (en) Personalized item trading card generation and management
US20210056578A1 (en) Method and system for providing automated on-site merchant coupons
AU2015271902B2 (en) Personal augmented reality

Legal Events

Date Code Title Description
AS Assignment

Owner name: EBAY INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:YANKOVICH, STEVE;MELCHER, RYAN;VERES, ROBERT DEAN;SIGNING DATES FROM 20110310 TO 20110315;REEL/FRAME:026552/0318

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION