US20100153861A1 - Interactive events - Google Patents

Interactive events

Info

Publication number
US20100153861A1
Authority
US
United States
Prior art keywords
event
interactive
client
audience
production center
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/568,666
Inventor
Jeffrey David Henshaw
Alexander Irvin Hopmann
Christopher Andrew Evans
Daniel Evan Socolof
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Deep Rock Drive Partners Inc
Original Assignee
Deep Rock Drive Partners Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Deep Rock Drive Partners Inc
Priority to US12/568,666
Publication of US20100153861A1
Legal status: Abandoned

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L51/00 User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
    • H04L51/04 Real-time or near real-time messaging, e.g. instant messaging [IM]
    • H04L51/046 Interoperability with other network applications or services
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/02 Reservations, e.g. for tickets, services or events
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/06 Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063 Operations research, analysis or management
    • G06Q10/0637 Strategic management or analysis, e.g. setting a goal or target of an organisation; Planning actions based on goals; Analysis or evaluation of effectiveness of goals
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/10 Office automation; Time management
    • G06Q10/109 Time management, e.g. calendars, reminders, meetings or time accounting
    • G06Q10/1093 Calendar-based scheduling for persons or groups
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 Commerce
    • G06Q30/02 Marketing; Price estimation or determination; Fundraising
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 Commerce
    • G06Q30/02 Marketing; Price estimation or determination; Fundraising
    • G06Q30/0241 Advertisements
    • G06Q30/0242 Determining effectiveness of advertisements
    • G06Q30/0246 Traffic
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 Commerce
    • G06Q30/02 Marketing; Price estimation or determination; Fundraising
    • G06Q30/0241 Advertisements
    • G06Q30/0251 Targeted advertisements
    • G06Q30/0252 Targeted advertisements based on events or environment, e.g. weather or festivals
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 Commerce
    • G06Q30/02 Marketing; Price estimation or determination; Fundraising
    • G06Q30/0282 Rating or review of business operators or products
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 Commerce
    • G06Q30/06 Buying, selling or leasing transactions
    • G06Q30/0601 Electronic shopping [e-shopping]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/14 Systems for two-way working
    • H04N7/15 Conference systems
    • H04N7/155 Conference systems involving storage of or access to video conference sessions

Definitions

  • the online interactive event environment 100 includes both a variety of operating environments and a variety of network devices.
  • Operating environments within the online interactive event environment 100 may include, but are not limited to, multiple interactive client endpoints that may attach via a communication network, such as the internet, to a production center and/or one or more performance studios.
  • the production center includes network operations and a datacenter.
  • the performance studio includes an event studio, an event database, an event interface, and an interactive display.
  • the production center and performance studio may be separately connected via a private communication network or via a virtual private network across a public communication network, such as the internet.
  • An interactive client endpoint may represent a variety of consumer devices including, but not limited to, general purpose computer systems, personal digital assistants, digital media players, mobile telephones, video equipment, application specific devices, and other digital communication devices.
  • The performance studios provide executable code and operational data to the interactive client endpoints, directly or indirectly via the production center.
  • Interactive client endpoints can be visitors of the event website, people who own or purchase a ticket, employees of the production company running the web site, or any other type of person or device that may participate in the interactive event.
  • Various multimedia devices may be used to upload a rich variety of media information for or about an event to the event profile. For example, multiple cameras or webcams may be used to collect video images of an event, conduct separate web interviews, and/or provide a video preview of an event. Likewise, multiple microphones may be used to collect sound from the event and/or associated interviews or advertisements.
  • the audience member at the interactive client endpoint joins an ongoing event and initiates interactivity with the event by typing a message, clicking or otherwise choosing an emotapplause image, voting for event presentation lists, selecting a camera angle, or some other method of indicating the message they would like to send.
  • the messages are then sent to a centralized internet web service that adds user information about that audience member such as their name, image, location, source, etc. That information is then stored in a central database or data store such that the web service can index, search, log and recall each request, or aggregated totals of requests.
  • Interactive client applications can then periodically issue requests for the current summary state of the interactivity information. That information includes a set of recent shout out messages and their related metadata, the current aggregate information for emotapplause items, current voting topics and voting choices, and any other status information that is helpful for the client to be able to process this data. Because of the potential quantity of requests coming from audience members, various caching mechanisms can be used to reduce the overhead spent gathering this information on every request. To maintain relevancy it is important that the information sent out to clients be very current, so as to maintain the feeling of interactivity at the event. In one embodiment, shout out messages are not allowed to be more than about 30 seconds old (time they were sent from audience member) and preferably represent the most recent messages received by the system.
  • the response to the interactive client may be encoded in at least one of a variety of different formats, including but not limited to, XML, JSON, CSV, and the like.
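  • The disclosure does not give code for this summary-state service, but the flow lends itself to a brief sketch. The following minimal Python sketch is illustrative only: the in-memory data structures, field names, and the two-second cache window are assumptions, while the roughly 30-second shout out age limit, the aggregation of emotapplause totals, and the JSON encoding follow the description above.

```python
import json
import time
from collections import Counter, deque

SHOUT_OUT_MAX_AGE_S = 30   # shout outs older than ~30 seconds are dropped (per the description)
SUMMARY_CACHE_TTL_S = 2    # assumed short cache window to absorb bursts of client requests

shout_outs = deque(maxlen=500)   # (received_at, sender, message)
emotapplause = Counter()         # aggregate clicks per icon, e.g. {"clap": 1204, "kiss": 87}
_cache = {"expires": 0.0, "body": None}

def record_shout_out(sender, message):
    """Store an incoming shout out together with the time it was received."""
    shout_outs.append((time.time(), sender, message))

def record_emotapplause(icon):
    """Increment the aggregate total for one emotapplause icon."""
    emotapplause[icon] += 1

def current_summary():
    """Return the current interactivity summary as JSON, reusing a very recent cached copy."""
    now = time.time()
    if _cache["body"] is not None and now < _cache["expires"]:
        return _cache["body"]
    recent = [{"sender": s, "message": m}
              for (t, s, m) in shout_outs if now - t <= SHOUT_OUT_MAX_AGE_S]
    body = json.dumps({
        "shout_outs": recent[-20:],          # most recent messages only
        "emotapplause": dict(emotapplause),  # aggregate totals per icon
    })
    _cache["expires"] = now + SUMMARY_CACHE_TTL_S
    _cache["body"] = body
    return body
```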
  • When the interactive audience client or the performance studio client initially receives the data, it presents the information to the performers or audience members in an appropriate way.
  • For the performers, that may mean showing the name of the audience member, their image, their location, and the shout out message itself in an interesting animation.
  • Referring to FIG. 2, a computer system is shown for implementing at least one embodiment of the invention, the system including a computing device 200 in which executable and operational data may be hosted and transmitted to one or more interactive stations via a communication network of the previously described online interactive event environment 100.
  • Computing device 200 may be configured as a client, a server, a mobile device, or any other computing device that interacts with data in a network based collaboration system.
  • computing device 200 typically includes at least one processing unit 220 .
  • the processing unit 220 includes at least one processor.
  • the term “processor”, as used herein, should be interpreted to mean an individual processor, firmware logic, reconfigurable logic, a hardware description language logic configuration, a state machine, an application-specific integrated circuit, a processing core co-disposed in an integrated circuit package with at least one other processing core and/or component, or combinations thereof.
  • the processing unit 220 may be operably connected to system memory 210 .
  • system memory 210 may be non-volatile memory 211 (such as ROM, flash memory, etc.), volatile memory 214 (such as RAM), or some combination of the two.
  • System memory 210 typically includes Basic Input/Output System (BIOS) firmware code 212 , an operating system 215 , one or more applications 216 , and may include program modules and data 217 .
  • A configuration library 218 (e.g., registries), which contains code and data to be shared and changed in a modular or database fashion to provide services to applications 216 and programs 217, is also often included in system memory 210.
  • Computing device 200 may have additional features or functionality.
  • computing device 200 may also have a dedicated graphics rendering device, such as video adapter 230 coupled with at least one display monitor 235 .
  • Computing device 200 may also have a variety of human input device(s) (HID) 259 such as keyboard, mouse, pen, voice input device, touch input device, and the like.
  • human input device (HID) 259 may also include various output devices such as a display monitor 235 , speakers, printer, and the like.
  • Computing device 200 may utilize a variety of ports via port interface 250 to share data including wireless ports 253 , parallel ports 255 , and serial ports 257 . Each of these port types may include further varieties, for example serial ports may include a Universal Serial Bus (USB) port and/or a FireWire/IEEE 1394 port.
  • computing device 200 may also include a storage drive interface 240 for communication with additional data storage devices (removable and/or non-removable) such as, for example, magnetic disk drives 242 , optical disk drives 243 , hard disk drives 244 , tape drives, and other storage devices.
  • Additional storage is illustrated in FIG. 2 by removable magnetic storage 241, removable optical storage 249, and non-removable storage (hard disk drive 244).
  • Computer storage media may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, program modules, or other data.
  • System memory 210 , removable storage and non-removable storage are all examples of computer storage media.
  • Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computing device 200 . Any such computer storage media may be used to store desired information, such as operating system 245 , one or more applications 246 , programs 247 , and/or registries and configuration libraries 248 accessible to computing device 200 .
  • Computing device 200 may also contain a communication connection via port interface 250 and/or network interface card 260 that allows the device 200 to communicate with other remote computing devices 280 , such as over a communication network.
  • the communication network may comprise a local area network (LAN) and/or a wide area network (WAN). Each network may be wired or wireless or combination thereof.
  • the communication network may also comprise other large scale networks including, but not limited to, intranets and extranets, or combinations thereof.
  • the communication network is an interconnected system of networks, one particular example of which is the Internet and the World Wide Web supported on the Internet.
  • a variety of configurations may be used to connect the computing device 200 to the remote computing devices 280 .
  • In FIG. 2, the modem 265 is illustrated as connecting to the remote computing device 280 (e.g., a remote server) via a WAN, while the network interface 260 is illustrated as connecting via a LAN. However, both the network interface 260 and the modem 265 may just as well be coupled to other large scale networks including, but not limited to, a global system of interconnected computer networks (the internet), various intranets and extranets, or combinations thereof.
  • Communication media may typically be embodied by computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and includes any information delivery media.
  • modulated data signal means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal.
  • communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media.
  • computer readable media as used herein includes both storage media and communication media.
  • FIGS. 3 and 4 methods and various operations of the interactive event system, in accordance with at least one embodiment, are described in terms of firmware, software, and/or hardware with reference to flowcharts and/or flow diagrams. More specifically, FIG. 3 is a block and flow diagram schematically showing a portion of various computer systems configured to filter feedback in an exemplary online interactive event environment in accordance with at least one embodiment.
  • FIG. 4 is a flow diagram showing a portion of a method of operation for interactive event data evaluation, categorization, and presentation in accordance with at least one embodiment. Describing a method and/or various operations by reference to a flowchart enables one skilled in the art to develop programs, including instructions to carry out the methods on suitably configured computer systems and electronic devices.
  • Portions of the operations to be performed by an electronic device or computer system may constitute circuits, general purpose processors (e.g., micro-processors, micro-controllers, or digital signal processors (DSPs)), special purpose processors (e.g., application specific integrated circuits (ASICs)), firmware (e.g., firmware that is used by a processor such as a micro-processor, a micro-controller, and/or a digital signal processor), state machines, hardware arrays, reconfigurable hardware, and/or software made up of executable instructions.
  • the executable instructions may be embodied in firmware logic, reconfigurable logic, a hardware description language, a state machine, an application-specific integrated circuit (ASIC), or combinations thereof.
  • At least one of the processors of a suitably configured electronic communication device executes the instructions from a storage medium.
  • the computer-executable instructions may be written in a computer programming language or executable code. If written in a programming language conforming to a recognized standard, such instructions may be executed on a variety of hardware platforms and may interface with a variety of operating systems.
  • the system 300 includes one or more performance studios for producing the underlying content for the event, a production center to produce and monitor the interactive event, and a plurality of interactive client endpoints to generate the interactive content associated with the underlying content of the event.
  • Each of the one or more interactive client endpoints are configured to receive the data transmitted from the performance studio and transmit user-generated interactive feedback associated with the interactive event back to the production center and/or the performance studio.
  • The various multimedia streams received by the client include camera-captured footage of the performing artist and of fans in the studio audience.
  • the user-generated interactive content transmitted by the client may include voting results, shout outs, emotapplause, and other feedback solicited and/or generated from the watching audience.
  • the performance studio may be a customized interactive studio, such as a Deep Rock Drive certified performance studio or a traditional performance studio upgraded with interactive equipment.
  • Each performance studio includes at least one interactive display to receive interactive content, such as voting results, shout outs, emotapplause, and other feedback from the watching audience.
  • the at least one production center is configured to control a variety of network operations and provide a datacenter for the interactive event.
  • the production center monitors the flow of content to the interactive clients to maintain a log of the event and ensure quality reception of the content sent to the client. Quality levels may be adjusted in a variety of ways including bandwidth throttling, data compression, refresh rate manipulation, and adjustment of packet size and/or frequency.
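  • None of these quality adjustments are specified numerically in the disclosure. Purely as an illustration of the kind of logic involved, the Python sketch below picks a bitrate and refresh-rate rung from a hypothetical quality ladder based on a client's measured throughput; all names and numbers are assumptions.

```python
# Hypothetical quality ladder; the description names the knobs (bandwidth, compression,
# refresh rate, packet sizing) but not concrete values, so these are illustrative.
QUALITY_LADDER = [
    {"name": "low",    "video_kbps": 300,  "fps": 15},
    {"name": "medium", "video_kbps": 800,  "fps": 24},
    {"name": "high",   "video_kbps": 1500, "fps": 30},
]

def select_quality(measured_kbps, headroom=0.8):
    """Pick the highest rung whose bitrate fits within the client's measured throughput."""
    budget = measured_kbps * headroom
    chosen = QUALITY_LADDER[0]
    for rung in QUALITY_LADDER:
        if rung["video_kbps"] <= budget:
            chosen = rung
    return chosen

# Example: a client measuring roughly 1 Mbps would be served the "medium" rung.
print(select_quality(1000))
```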
  • the production center may also receive the content transmitted by the interactive client for additional processing, including interactive content sampling, filtering, and transformation.
  • content may be filtered prior to transmission to the performance studio.
  • Filtered content may merely be removed from the feedback stream.
  • filtered content may be replaced with alternative content expressing a similar intent, but in a more acceptable manner.
  • Another form of filtering includes the relative weighting of received responses from the interactive client endpoints. This allows the performer to get a feel for the response of the audience. It will be appreciated by those of ordinary skill in the art and others, that a wide variety of alternate and/or equivalent implementations of interactive content filtration may be substituted for the specific embodiment of filtering as shown.
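  • As one illustration of such relative weighting, the sketch below converts raw per-response counts into fractions of the whole, optionally blending in decayed counts from an earlier window; the decay factor and function names are assumptions, not part of the disclosure.

```python
def weight_responses(current_counts, previous_counts=None, decay=0.5):
    """Blend the newest counts with (decayed) earlier counts, then normalize to fractions."""
    blended = dict(current_counts)
    for key, value in (previous_counts or {}).items():
        blended[key] = blended.get(key, 0) + decay * value
    total = sum(blended.values())
    return {key: value / total for key, value in blended.items()} if total else {}

# Example: 700 claps and 100 kisses in the latest window -> roughly 0.875 vs 0.125.
print(weight_responses({"clap": 700, "kiss": 100}))
```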
  • the illustrated configuration of the system 300 may also include a wide variety of alternate and/or equivalent implementations.
  • the performance studio and the production center may be the same location.
  • some of the interactive clients may also be co-located at the performance studio and/or the production center.
  • the event is established in block 410 .
  • the established event may include information about the performers at the event, size (number of available tickets), ticket sales thresholds, anticipated playlists, online location of the event, and other particulars about the event.
  • tickets or admission codes for the event are issued based on event information.
  • the method 400 begins to determine which interactive clients may have access to the data being transmitted.
  • Query block 440 handles this by determining whether the soliciting client has a ticket or admission code. If not, the soliciting client is encouraged to purchase a ticket in block 420. If the client has a ticket, then they are allowed into the event in block 450.
  • Upon registering with the event coordinators, the interactive client will be allowed to receive the event stream in block 460, including at least one integrated multimedia audio and video stream from the performance studio.
  • the integrated multimedia audio and video stream includes multiple synchronized streams, one for each camera angle.
  • Monitoring block 470 determines whether the event has concluded. If not concluded, the method 400 continues to accept and process interactive inputs from the interactive client, such as requests to change camera angles 482 , voting information 484 including votes regarding upcoming playlists, emotapplause 486 , and shout outs 488 . If the event has concluded, the method 400 directs interactive clients towards after party presentations 490 associated with the event, which may include post videos 494 , post photos 496 , post notes 498 , and other post event offerings.
  • The post videos 494 may include the entire event stream for review by the interactive client.
  • the post photos 496 may include a collection of images from the event and/or publicity shots of the performers at the event.
  • the post notes 498 may include links to additional information about the performers at the event, including future concerts that may be available.
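  • Pulling the blocks of method 400 together, the overall control flow might be sketched as below. The client and event objects and their methods are hypothetical stand-ins for whatever an implementation provides; the block numbers in the comments refer to FIG. 4 as described above.

```python
def run_event(client, event):
    """Rough control flow of method 400 (hypothetical client/event objects)."""
    if not client.has_ticket(event):           # query block 440
        client.offer_ticket_purchase(event)    # block 420
        return
    client.admit(event)                        # block 450
    client.start_event_stream(event)           # block 460: synchronized A/V streams
    while not event.concluded():               # monitoring block 470
        for action in client.pending_actions():
            if action.kind == "camera":
                event.switch_camera(client, action.value)         # block 482
            elif action.kind == "vote":
                event.record_vote(client, action.value)           # block 484
            elif action.kind == "emotapplause":
                event.record_emotapplause(client, action.value)   # block 486
            elif action.kind == "shout_out":
                event.record_shout_out(client, action.value)      # block 488
    client.show_after_party(event)             # block 490: videos 494, photos 496, notes 498
```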
  • Referring to FIGS. 5A-5D, block diagram views of portions of user interfaces (510, 520, 530, 540) are illustrated. Each user interface is generated in an interactive feedback system configured for compelling live event quality via relative interactivity in accordance with various embodiments.
  • portions of user interface 510 are shown illustrating the solicitation on an interactive client of an event attendee to select and transmit emotapplause via an emoticon indicative of a current emotional state of the event attendee.
  • Emotapplause is a mechanism of sending non-verbal communication from the audience members to the performers. By clicking graphical representations of the emotapplause (such as clapping hands, a heart, etc) a message is sent to a centralized service that aggregates all of the feedback from the audience. The performers then see a graphical representation of the aggregated feedback.
  • emotapplause messages may be displayed to the performers based on statistical aggregation of the number of times each emotapplause item is clicked by audience members in accordance with one embodiment. It may be appreciated by those of ordinary skill in the art and others that a variety of algorithms may be used to determine the quantity, size and intensity of the animation that is presented to the performers. For example, if a statistically larger percentage of the audience is clicking one icon in the most recent set of data received from the interactive clients, the associated animation may be larger than the other animations for the less used emotapplause at that moment.
  • If one form of emotapplause is trending up in total number of clicks over a number of recent requests for data from the service, the corresponding animations could also grow in size, quantity, and/or intensity. Similarly, if a trend is downward, the corresponding animations could shrink in size, quantity, and/or intensity.
  • Different animations may be displayed to indicate that a large milestone has been hit, such as when the detected emotapplause images from the audience reach a designated count or a threshold gauging the relative intensity of user actions is reached.
  • multiple animations may be shown simultaneously, and/or different display surfaces may show different sets of animations where the placement of the display surfaces could indicate a higher or lower priority to the performer or audience.
  • The audience member's interface could also show similar animations based on the activity of the overall audience, so they will be able to see how active different emotapplause items are.
  • Various embodiments enable animations to be overlaid on the video stream to allow audience members to see exactly what the performers are seeing.
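  • A small sketch of one possible sizing rule follows: each icon's animation is scaled by its share of recent clicks and nudged up or down by its trend against the previous window. The specific formula and clamping bounds are assumptions; the disclosure only says that share and trend should influence size, quantity, and/or intensity.

```python
def animation_scale(current, previous, icon):
    """Scale an emotapplause animation by its share of recent clicks, adjusted by its trend."""
    total = sum(current.values()) or 1
    share = current.get(icon, 0) / total                         # e.g. 0.7 when 70% are clapping
    prev = previous.get(icon, 0)
    trend = 1.0
    if prev:
        trend = min(max(current.get(icon, 0) / prev, 0.5), 2.0)  # clamp shrink/growth
    return share * trend

# Example: clapping dominates and is growing, so its animation gets the largest scale.
current = {"clap": 700, "kiss": 100, "fist": 50}
previous = {"clap": 400, "kiss": 120, "fist": 60}
for icon in current:
    print(icon, round(animation_scale(current, previous, icon), 3))
```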
  • portions of user interface 520 are shown illustrating the solicitation on an interactive client of an event attendee to select and transmit a prioritized interactive playlist.
  • One of the best ways to keep an audience engaged in an event is to give them some control of how the event unfolds. Providing a voting mechanism allows them to decide what song is played next, what topic is covered next or the audience decision on the outcome of some sporting event or any number of other mechanisms for impacting the flow of the event based on popular vote. Voting can be presented as a list of choices below some header describing what is currently being voted on. Each choice has an option for the audience member to make or change their choice. When they make a choice, it is sent to the service which tallies the votes and provides summary information in the client data requests.
  • It will be appreciated that a wide variety of alternate and/or equivalent voting mechanisms may be substituted for the specific embodiment of voting on a presented playlist as shown.
  • the questions to be voted on can be sent in real time by an administrator, based on input by the performers.
  • the voting results can be presented in real-time to performers and/or audience members.
  • One embodiment allows past ballot results or voting history to be saved for later use and review.
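  • A minimal sketch of the vote handling described above might look like the following; keeping one ballot per member lets a member change their choice, and closed ballots are appended to a history list for later use and review. Function and field names are illustrative assumptions.

```python
from collections import Counter

ballots = {}   # audience_member_id -> current choice (members may change their vote)
history = []   # closed ballots kept for later use and review

def cast_vote(member_id, choice):
    """Record, or change, a member's vote on the current question."""
    ballots[member_id] = choice

def tally():
    """Summary information returned with the client data requests."""
    return dict(Counter(ballots.values()))

def close_ballot(question):
    """Archive the finished vote and reset for the next question."""
    history.append({"question": question, "results": tally()})
    ballots.clear()
```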
  • portions of user interface 530 are shown illustrating the solicitation on an interactive client of an event attendee to provide and transmit a virtual shout out to the performer.
  • Shout outs are text messages sent from the audience members to the performers and other audience members. The intent of the shout out is for the audience members to be able to send a directed message or question to the performers.
  • In addition to the performer seeing the message at the performance venue, the audience members also see a subset of the messages, thus providing a sense of community among all of the audience members.
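  • The subset shown to each audience member is described in the summary as a good random sampling of recent messages; a trivial way to pick such a subset is sketched below (the sample size is an assumption).

```python
import random

def sample_shout_outs(recent_messages, k=10):
    """Pick a random subset of recent shout outs so every sender has a fair chance of being shown."""
    if len(recent_messages) <= k:
        return list(recent_messages)
    return random.sample(recent_messages, k)
```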
  • messages from audience members may be filtered if the same message is sent multiple times in a row to prevent “spamming” of messages to the participants.
  • messages from audience members may also be filtered based on content and length.
  • the audience and/or performers may be shielded from inappropriate content or specific topics.
  • A message can be filtered if it is too long, to prevent situations where information download would be slowed by extra-long messages.
  • One variation allows long messages to be parsed and resent separately, while another discards long messages. Determining which action should be taken may be based in part on the content of the message.
  • specific audience members can be blocked from sending messages if they are found to be consistently sending inappropriate messages and/or “spamming” messages.
  • When messages are blocked, various embodiments allow the audience member to still see their messages as if they were sent, so that they are unaware that the messages they send have been blocked.
  • Messages that are displayed to audience members and/or performers are displayed for a period of time related to the length of the message, so that longer messages are displayed longer while short messages go by faster. This helps the audience and/or the artist to both read and comprehend messages before they disappear. For example, messages like “yay!” take less time to comprehend than more complex messages like “That was amazing, what were you thinking when you wrote that song?”
  • the message animations at event location may be overlaid on the video stream to allow audience members to see exactly what the performers are seeing.
  • When the incoming content is slow, for example at a low attendance event, the client may show messages from farther back in time.
  • one embodiment monitors and limits the length of time that an old message may be used to prevent displayed messages from seeming out of context due to latency since the message was originally sent.
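  • The filtering and display-duration rules above lend themselves to a short sketch. The length limit, banned-pattern list, and timing constants below are placeholders; only the behaviors (repeat suppression, content and length filtering, per-member blocking, and length-proportional display time) come from the description.

```python
import re

MAX_LENGTH = 200                      # placeholder length limit
BLOCKED_MEMBERS = set()               # members found to consistently send inappropriate messages
BANNED_PATTERNS = [re.compile(r"\bexample_banned_word\b", re.IGNORECASE)]  # placeholder content filter
_last_message = {}                    # last message per member, for repeat suppression

def accept_shout_out(member_id, message):
    """Return True if the message should be forwarded; the sender still sees it either way."""
    if member_id in BLOCKED_MEMBERS:
        return False
    if _last_message.get(member_id) == message:   # same message sent repeatedly -> treat as spam
        return False
    _last_message[member_id] = message
    if len(message) > MAX_LENGTH:
        return False
    return not any(p.search(message) for p in BANNED_PATTERNS)

def display_seconds(message, base=3.0, per_char=0.05):
    """Longer messages stay on screen longer so they can be read and understood."""
    return base + per_char * len(message)

# "yay!" displays briefly; a longer question stays up several seconds longer.
print(display_seconds("yay!"),
      display_seconds("That was amazing, what were you thinking when you wrote that song?"))
```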
  • portions of user interface 540 are shown illustrating the solicitation on an interactive client of an event attendee to select and transmit a desired camera angle to the producer. Audience members may have a specialized interest in the performing band and camera angle selection allows the event attendee to choose the position of their virtual seat in the performance hall.
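  • Sending the chosen camera angle can be as simple as one small request from the client. The endpoint path and payload below are assumptions, using a hypothetical HTTP session object with a post method.

```python
def request_camera_angle(session, event_id, camera_id):
    """Send the attendee's chosen 'virtual seat' (camera angle) to the production service."""
    return session.post(f"/events/{event_id}/camera", json={"camera": camera_id})
```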
  • Referring to FIG. 6, a block diagram view of a portion of an interactive client interface 600 of an online interactive event environment is illustrated, showing portions of the presentation during an interactive event in accordance with various embodiments of the present disclosure.
  • The interactive client interface 600 may include a video presentation of the event, an audio presentation of the event, or some combination thereof.
  • the illustrated interactive client interface 600 incorporates each of the previously discussed user interfaces 510 , 520 , 530 , and 540 into the event presentation.
  • the illustrated embodiment also shows event sponsorship of the event. Accordingly, this sponsorship may be sold in accordance with a variety of advertising mechanisms, including but not limited to per event, per song, per minute, per impression, or some combination thereof.
  • an event sponsor may present customized logos and marketing material targeted for the audience of the event.
  • One embodiment provides promotional links on the presentation page of the event. When clicked, another window may open without interrupting the stream.
  • a sponsorship link may change the look of the event interface.
  • Other, more subtle methods of promotion also considered within the scope of the disclosure include use of a watermark, background images, and/or desktop/window wallpaper containing promotional material.
  • Referring to FIG. 7, a block diagram view of a portion of an interactive client interface 700 of after party presentations associated with an online interactive event environment is illustrated in accordance with various embodiments of the present disclosure.
  • the interface 700 shows various after party presentations including links to websites associated with the presenter, sponsors, upcoming events, topical news, photo and video archives, discussion boards, and other information associated with the online interactive event.
  • A playback of the event, such as a highlight reel, may also be available at the after party interactive client interface 700.

Abstract

An interactive event allows clients to provide feedback to the performing artist and/or producers relative to the event being observed. Feedback options include shout outs, emotapplause, and voting.

Description

    TECHNICAL FIELD
  • Embodiments of the present disclosure generally relate to data evaluation, categorization, and presentation. More particularly, the embodiments of the present disclosure relate to systems which deliver live entertainment online via the Internet in various media forms and allow both the performer and the observer to interact.
  • BACKGROUND
  • Attempts to display media on computers date back to the earliest days of computing. However, little progress was made for several decades, primarily due to the high cost, limited capabilities and to a lesser extent compatibilities of available computer hardware. Recently consumer-grade personal computers have become powerful enough to display various types of media, including high quality audio and/or video streams.
  • Streaming multimedia represents one method of media distribution. In essence, streaming multimedia is multimedia that is broadcast by a streaming provider to an end-user. Generally, the term streaming refers to the delivery method of the data rather than to the content. Unfortunately, streaming typically requires tremendous bandwidth and/or incurs latency to cache the data locally. Recent advances in computer networks, combined with powerful home computers and modern operating systems, have made the near universal distribution of streaming media practical and affordable for ordinary consumers. Universal distribution represents multimedia that is constantly received by, and normally presented to, an end-user while it is being delivered by a streaming provider.
  • A stream of media can be on demand or live. On demand streams are stored on a server for a long period of time, and are available to be transmitted at a user's request. Live streams may still use a server to broadcast the event, but are typically only available at one particular time, such as a video stream of a live sporting event, a political debate, educational lecture, or a concert. Live streams may be edited and converted into on demand streams for later content consumption. Current on demand or live streams lose any possibility for constructive feedback from the streaming targets. Essentially, live online presentations to large streaming audiences generally only provide unidirectional information in a manner that is difficult to facilitate observer participation. On demand performances are presented after the fact, preventing the presenter and/or observer(s) from directly altering the previously recorded presentation.
  • SUMMARY
  • In view of the problems in the state of the art, embodiments of the invention are based on the technical problem of optimizing interactive live events, categorization, and presentation in an online environment. While the internet already allows many services for one-way communication and event broadcast, there have been no options for providing real-time, two-way interactivity between audience members and the people creating the event. Systems and methods presented in this disclosure provide this very type of interactivity to create truly compelling live events on the internet.
  • One illustrated and described method provides large scale, real-time interactivity between distributed audience members on the internet and performers in an interactive event. Multiple types of interaction are possible, including direct text communication in the form of shoutouts as well as non-verbal communication, such as Emotapplause, that represents real-world feedback mechanisms like applause, fists in the air, peace signs, or throwing kisses, among others.
  • A system, suitable to solve the problems which at least one embodiment of the invention is based on, generates an interactive online event forum, such as a widget that may be used to solicit feedback from observer(s) and facilitate feedback response by the event presenter and/or publisher, to produce events for consumption by online audiences. The events could be live music performances, sporting events, political meetings, education or informational lectures, news broadcasts, travel logs, game shows, or any other type of event where the “performers” are at a central location or multiple locations and the audience is distributed across the internet, or in any number of locations with internet capable devices for communicating.
  • Feedback may be generated via a client side module to input data on relative performance quality and relative emotional response from the perspective of the observer. After receiving input, the rank-value of a particular monitored response can be calculated and presented back to the performer. In some embodiments, that ranking can be shown or used to sort lists. Monitored responses may include a playlist compiled of proposed songs/material for the artist to consider, subject matter for further discussion, desired topics for debate, questions regarding covered material or stated positions, and relative emotional responses to the content currently being presented. These customized lists or relative feedback of the monitored responses may be used to attract additional observers and/or alter the performance of the presenter(s).
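  • The disclosure does not give a formula for the rank-value. Purely as an illustration, the sketch below combines vote counts with normalized quality and emotional-response scores and uses the result to sort a proposed playlist; the weights and field names are assumptions.

```python
def rank_value(votes, quality, emotion):
    """Illustrative rank-value: vote count scaled by quality and emotional-response scores (0..1)."""
    return votes * (0.5 + 0.25 * quality + 0.25 * emotion)

proposed = [
    {"song": "Song A", "votes": 120, "quality": 0.8, "emotion": 0.9},
    {"song": "Song B", "votes": 200, "quality": 0.4, "emotion": 0.3},
]
ranked = sorted(proposed,
                key=lambda s: rank_value(s["votes"], s["quality"], s["emotion"]),
                reverse=True)
print([s["song"] for s in ranked])   # the higher-ranked proposal is presented first
```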
  • Various types of relative feedback options may include shout outs, emotapplause, and voting. Shout outs are text messages sent from the audience members to the performers and other audience members. The intent of the shout out is for the audience members to be able to send a directed message or question to the performers. In addition to the performer seeing the message at the performance venue, the audience members also see a subset of the messages, thus providing a sense of community among all of the audience members. Because the number of audience members could be very large for a worldwide internet event, there is no guarantee that all messages will be presented to the performers, but due to the mechanism of transferring shout out messages, a good random sampling of messages from all audience members will be presented to both the performers and other audience members.
  • Emotapplause is a mechanism of sending non-verbal communication from the audience members to the performers. By clicking graphical representations of the emotapplause (such as clapping hands, a heart, etc) a message is sent to a centralized service that aggregates all of the feedback from the audience. The performers then see a graphical representation of the aggregated feedback. The actual experience by the performer changes based on how many audience members are using that emotapplause image at that moment, so if 70% of the audience was ‘clapping’ and 10% of the audience was sending kisses, the visualization might include very large clapping hands, or perhaps many clapping hands and a smaller representation of kissing lips.
  • One of the best ways to keep an audience engaged in an event is to give them some control of how the event unfolds. Providing a voting mechanism allows them to decide what song is played next, what topic is covered next or the audience decision on the outcome of some sporting event or any number of other mechanisms for impacting the flow of the event based on popular vote.
  • An event producer may use the ranked lists generated for the client/user interface in a variety of ways including to present audience feedback for the performer and to alter a previously recorded presentation for the individual observer. A producer may solicit feedback by an observer and/or performer according to a variety of factors including prior event experience, available repertoire, relative quality of received feedback, and real time education of audience trends. Producers may also maintain and/or improve participant (performer and/or audience) satisfaction by reviewing the event rankings and comparing the responses with comparable events to identify trends. By decreasing or eliminating any discrepancies with participant expectations, the producer will likely increase the quality of the participant experience and thereby reduce marketing costs associated with bringing new events to market.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Non-limiting and non-exhaustive exemplary embodiments of the present disclosure are described with reference to the following drawings in which:
  • FIG. 1 illustrates a block diagram view of computer systems in an online interactive event environment in accordance with at least one embodiment;
  • FIG. 2 illustrates a block diagram view of components contained in an interactive client system configured in accordance with at least one embodiment;
  • FIG. 3 illustrates a block/flow diagram view of a portion of computer systems to filter feedback in an exemplary online interactive event environment in accordance with at least one embodiment;
  • FIG. 4 illustrates a flow diagram view of a method of a portion of operation for interactive event data evaluation, categorization, and presentation in accordance with at least one embodiment;
  • FIGS. 5A-5D illustrate block diagram views of portions of user interfaces, each generated in an interactive feedback system configured for compelling live event quality via relative interactivity in accordance with various embodiments;
  • FIG. 6 illustrates a block diagram view of a portion of an interactive client interface of an online interactive event environment during presentation of the event in accordance with various embodiments of the present disclosure; and
  • FIG. 7 illustrates a block diagram view of a portion of an interactive client interface of an online interactive event environment for various after party presentations associated with an online interactive event in accordance with various embodiments of the present disclosure.
  • DETAILED DESCRIPTION
  • In the following detailed description, reference is made to the accompanying drawings which form a part hereof wherein like numerals designate like parts throughout, and in which are shown, by way of illustration, specific embodiments in which the disclosure may be practiced. It is to be understood that other embodiments may be utilized and structural or logical changes may be made without departing from the scope of the present disclosure. Therefore, the following detailed description is not to be taken in a limiting sense, and the scope of a portion of the present disclosure is defined by the appended claims and their equivalents.
  • Throughout the specification and claims, the following terms take at least the meanings explicitly associated herein, unless the context clearly dictates otherwise. The meanings identified below are not intended to limit the terms, but merely provide illustrative examples for use of the terms. The meaning of “a,” “an,” and “the” may include reference to both the singular and the plural. Reference in the specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the disclosure. The meaning of “in” may include “in” and “on.” The appearances of the phrases “in one embodiment” or “in an embodiment” in various places in the specification do not necessarily all refer to the same embodiment, but they may. The term “connected” may mean a direct electrical, electro-magnetic, mechanical, logical, or other connection between the items connected, without any electrical, mechanical, logical or other intermediary therebetween. The term “coupled” can mean a direct connection between items, an indirect connection through one or more intermediaries, or communication between items in a manner that may not constitute a connection. The term “circuit” or “circuitry,” as used in any embodiment described herein, can mean a single component or a plurality of components, active and/or passive, discrete or integrated, that are coupled together to provide a desired function and may include, for example, singly or in any combination, hardwired circuitry, programmable circuitry, state machine circuitry, and/or firmware that stores instructions executed by programmable circuitry. The term “signal” can mean at least one current, voltage, charge, data, or other such identifiable quantity.
  • In an effort to clarify comparative phrases used in the specification and the claims of this disclosure, please note that the following phrases take at least the meanings indicated and associated herein, unless the context clearly dictates otherwise. The phrase “A/B” means “A or B”. The phrase “A and/or B” means “(A), (B), or (A and B)”. The phrase “at least one of A, B and C” means “(A), (B), (C), (A and B), (A and C), (B and C) or (A, B and C)”. The phrase “(A) B” means “(A B) or (B)”, that is “A” is optional.
  • In addition, various embodiments depicted in FIG. 1 through FIG. 4 are represented by block diagrams and flow diagrams that illustrate in more detail the scope of the present disclosure. The block diagrams often illustrate certain embodiments of modules for performing various functions of the present invention. In general, the represented modules include therein executable and operational data for operation within a system as depicted in FIG. 1 and/or FIG. 2 in accordance with embodiments of the present disclosure. Various operations of the system may be described as multiple discrete operations in turn, in a manner that may be helpful in understanding embodiments of the present disclosure; however, the order of description should not be construed to imply that these operations are order dependent.
  • As used herein, the term executable code, or merely “executable,” is intended to include any type of computer instruction and computer executable code that may be located within a memory device and/or transmitted as electronic signals over a system bus or network. An identified module of executable code may, for instance, comprise one or more physical or logical blocks of computer instructions which may, for instance, be organized as an object, procedure, or function. Nevertheless, the executables of an identified module need not be located together, but may comprise disparate instructions stored in different locations which together comprise the module and achieve the purpose stated for the module. Indeed, an executable may be a single instruction, or many instructions, and may even be distributed over several different code segments, among different programs, and across several memory devices. Similarly, operational data may be identified and illustrated herein within modules, and may be embodied in any suitable form and organized within any suitable type of data structure. The operational data may be collected as a single data set, or may be distributed over different locations including over different storage devices, and may at least partially exist merely as electronic signals on a system bus or network.
  • Referring now to FIG. 1, a block diagram of various computer systems in an online interactive event environment 100 is shown. Computer systems useful for data evaluation, categorization, and presentation of interactive events are shown in accordance with various embodiments of the present disclosure. The online interactive event environment 100 includes both a variety of operating environments and a variety of network devices. Operating environments within the online interactive event environment 100 may include, but are not limited to, multiple interactive client endpoints that may attach via a communication network, such as the internet, to a production center and/or one or more performance studios. In one embodiment, the production center includes network operations and a datacenter. In one embodiment, the performance studio includes an event studio, an event database, an event interface, and an interactive display. The production center and performance studio may be separately connected via a private communication network or via a virtual private network across a public communication network, such as the internet.
  • An interactive client endpoint may represent a variety of consumer devices including, but not limited to, general purpose computer systems, personal digital assistants, digital media players, mobile telephones, video equipment, application specific devices, and other digital communication devices.
  • Performance studios provide executable code and operational data to the interactive client endpoints, directly or indirectly via the production center. Interactive client endpoints, in accordance with various embodiments, can be visitors of the event website, people who own or purchase a ticket, employees of the production company running the web site, or any other type of person or device that may participate in the interactive event. Various multimedia devices may be used to upload a rich variety of media information for or about an event to the event profile. For example, multiple cameras or webcams may be used to collect video images of an event, conduct separate web interviews, and/or provide a video preview of an event. Likewise, multiple microphones may be used to collect sound from the event and/or associated interviews or advertisements.
  • In one embodiment, the audience member at the interactive client endpoint joins an ongoing event and initiates interactivity with the event by typing a message, clicking or otherwise choosing an emotapplause image, voting for event presentation lists, selecting a camera angle, or some other method of indicating the message they would like to send. The messages are then sent to a centralized internet web service that adds user information about that audience member such as their name, image, location, source, etc. That information is then stored in a central database or data store such that the web service can index, search, log and recall each request, or aggregated totals of requests.
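  • As one concrete illustration of the ingestion path just described, the following Python sketch shows one way a web service might annotate an incoming audience message with user information and store it in a central data store. The class names, fields, and the in-memory store are hypothetical stand-ins and are not taken from the disclosure.

```python
import time
import uuid
from dataclasses import dataclass, field

@dataclass
class AudienceMessage:
    # Hypothetical record combining the raw client message with
    # the user information added by the centralized web service.
    text: str
    kind: str                      # e.g. "shout_out", "emotapplause", "vote"
    user_name: str
    user_image_url: str
    user_location: str
    source: str                    # e.g. "web", "mobile"
    received_at: float = field(default_factory=time.time)
    message_id: str = field(default_factory=lambda: uuid.uuid4().hex)

class MessageStore:
    """In-memory stand-in for the central database or data store."""
    def __init__(self):
        self._messages = []

    def ingest(self, text, kind, user_profile, source="web"):
        # The web service enriches the bare client request with the
        # audience member's name, image, location, and source.
        msg = AudienceMessage(
            text=text,
            kind=kind,
            user_name=user_profile["name"],
            user_image_url=user_profile.get("image", ""),
            user_location=user_profile.get("location", ""),
            source=source,
        )
        self._messages.append(msg)
        return msg.message_id

    def recent(self, kind, max_age_seconds=30.0):
        # Recall the most recent requests of a given kind, supporting
        # the index/search/recall behavior described above.
        cutoff = time.time() - max_age_seconds
        return [m for m in self._messages
                if m.kind == kind and m.received_at >= cutoff]

store = MessageStore()
store.ingest("Play the new single!", "shout_out",
             {"name": "Alex", "location": "Seattle, WA"})
print(len(store.recent("shout_out")))
```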
  • Interactive client applications can then periodically issue requests for the current summary state of the interactivity information. That information includes a set of recent shout out messages and their related metadata, the current aggregate information for emotapplause items, current voting topics and voting choices, and any other status information that is helpful for the client to be able to process this data. Because of the potential quantity of requests coming from audience members, various caching mechanisms can be used to reduce the overhead spent gathering this information on every request. To maintain relevancy, it is important that the information sent out to clients be very current, so as to maintain the feeling of interactivity at the event. In one embodiment, shout out messages are not allowed to be more than about 30 seconds old (measured from the time they were sent by the audience member) and preferably represent the most recent messages received by the system. The response to the interactive client may be encoded in at least one of a variety of different formats, including but not limited to, XML, JSON, CSV, and the like.
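  • The summary-state polling and caching described above might be sketched as follows, building on the hypothetical MessageStore from the previous sketch. The short cache window and the specific response fields are illustrative assumptions; only the roughly 30-second age limit for shout out messages comes from the text.

```python
import json
import time

class SummaryStateService:
    """Caches the aggregate interactivity snapshot so that many client
    polls do not each recompute it from the underlying data store."""

    def __init__(self, store, cache_ttl_seconds=2.0,
                 shout_out_max_age_seconds=30.0):
        self._store = store
        self._cache_ttl = cache_ttl_seconds
        self._shout_out_max_age = shout_out_max_age_seconds
        self._cached = None
        self._cached_at = 0.0

    def current_state(self):
        now = time.time()
        if self._cached is not None and now - self._cached_at < self._cache_ttl:
            return self._cached  # serve the cached snapshot

        shout_outs = self._store.recent(
            "shout_out", max_age_seconds=self._shout_out_max_age)
        state = {
            # Recent shout outs and their related metadata.
            "shout_outs": [
                {"text": m.text, "name": m.user_name,
                 "location": m.user_location}
                for m in shout_outs[-10:]          # most recent messages
            ],
            # Aggregate counts for each emotapplause item.
            "emotapplause": self._aggregate_emotapplause(),
            # Current voting topic and tallies would be filled in here.
            "voting": {},
        }
        self._cached, self._cached_at = state, now
        return state

    def _aggregate_emotapplause(self):
        counts = {}
        for m in self._store.recent("emotapplause"):
            counts[m.text] = counts.get(m.text, 0) + 1
        return counts

    def encode(self, fmt="json"):
        # The response can be encoded in several formats; JSON shown here.
        if fmt == "json":
            return json.dumps(self.current_state())
        raise ValueError("unsupported format: " + fmt)

# Example usage with the MessageStore sketched above:
# service = SummaryStateService(store)
# print(service.encode())
```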
  • In one embodiment, when the interactive audience client or performance studio client initially receives the data, it presents the information to the performers or audience members in an appropriate way. For the performers, that may include showing the name of the audience member, their image, their location, and the shout out message itself in an interesting animation. Some additional options for emotapplause and shout outs are described below with reference to FIG. 5A and FIG. 5C.
  • Referring now to FIG. 2, a computer system is shown for implementing at least one embodiment of the invention, the system including a computing device 200 in which executable and operational data may be hosted and transmitted to one or more interactive stations via a communication network of the previously described online interactive event environment 100. Computing device 200 may be configured as a client, a server, a mobile device, or any other computing device that interacts with data in a network based collaboration system.
  • In a very basic configuration, computing device 200 typically includes at least one processing unit 220. In one embodiment, the processing unit 220 includes at least one processor. As such, the term “processor”, as used herein, should be interpreted to mean an individual processor, firmware logic, reconfigurable logic, a hardware description language logic configuration, a state machine, an application-specific integrated circuit, a processing core co-disposed in an integrated circuit package with at least one other processing core and/or component, or combinations thereof.
  • The processing unit 220 may be operably connected to system memory 210. Depending on the exact configuration and type of computing device, system memory 210 may be non-volatile memory 211 (such as ROM, flash memory, etc.), volatile memory 214 (such as RAM), or some combination of the two. System memory 210 typically includes Basic Input/Output System (BIOS) firmware code 212, an operating system 215, one or more applications 216, and may include program modules and data 217. A configuration library 218 (e.g., registries), which contains code and data to be shared and changed in a modular or database fashion to provide services to applications 216 and programs 217, is also often included in system memory 210.
  • Computing device 200 may have additional features or functionality. For example, computing device 200 may also have a dedicated graphics rendering device, such as video adapter 230 coupled with at least one display monitor 235. Computing device 200 may also have a variety of human input device(s) (HID) 259 such as keyboard, mouse, pen, voice input device, touch input device, and the like. In a broader sense, human input device (HID) 259 may also include various output devices such as a display monitor 235, speakers, printer, and the like. Computing device 200 may utilize a variety of ports via port interface 250 to share data including wireless ports 253, parallel ports 255, and serial ports 257. Each of these port types may include further varieties, for example serial ports may include a Universal Serial Bus (USB) port and/or a FireWire/IEEE 1394 port.
  • In various embodiments, computing device 200 may also include a storage drive interface 240 for communication with additional data storage devices (removable and/or non-removable) such as, for example, magnetic disk drives 242, optical disk drives 243, hard disk drives 244, tape drives, and other storage devices. Such additional storage is illustrated in FIG. 2 by removable magnetic storage 241 and removable optical storage 249 and non-removable storage (hard disk drive 244).
  • Computer storage media may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, program modules, or other data. System memory 210, removable storage and non-removable storage are all examples of computer storage media. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computing device 200. Any such computer storage media may be used to store desired information, such as operating system 245, one or more applications 246, programs 247, and/or registries and configuration libraries 248 accessible to computing device 200.
  • Computing device 200 may also contain a communication connection via port interface 250 and/or network interface card 260 that allows the device 200 to communicate with other remote computing devices 280, such as over a communication network. The communication network may comprise a local area network (LAN) and/or a wide area network (WAN). Each network may be wired or wireless or combination thereof. The communication network may also comprise other large scale networks including, but not limited to, intranets and extranets, or combinations thereof. In one embodiment the communication network is an interconnected system of networks, one particular example of which is the Internet and the World Wide Web supported on the Internet.
  • A variety of configurations may be used to connect the computing device 200 to the remote computing devices 280. For example, although modem 265 is illustrated as connecting to the remote computing device 280, a remote server, via a WAN, and network interface 260 is illustrated as connecting via a LAN, both the network interface 260 and/or the modem 265 may just as well be coupled to other large scale networks including, but not limited to, a global system of interconnected computer networks (internet), various intranets and extranets, or combinations thereof.
  • The information transmitted as data across the previously discussed communication connections is an example of communication media. Communication media may typically be embodied by computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and includes any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media. The term computer readable media as used herein includes both storage media and communication media.
  • Although many of the examples refer to computing devices with a single operating system, file system, and configuration library, the concepts, principles, and examples disclosed below may be extended to provide interactive event functionality across several or many operating systems, file systems, and/or configuration libraries (e.g., registries). Accordingly, it is contemplated that the principles described herein may be applied to these and other computing systems and devices, both existing and yet to be developed, using the methods and principles disclosed herein.
  • Turning now to FIGS. 3 and 4, methods and various operations of the interactive event system, in accordance with at least one embodiment, are described in terms of firmware, software, and/or hardware with reference to flowcharts and/or flow diagrams. More specifically, FIG. 3 is a block and flow diagram schematically showing a portion of various computer systems configured to filter feedback in an exemplary online interactive event environment in accordance with at least one embodiment. FIG. 4 is a flow diagram showing a portion of a method of operation for interactive event data evaluation, categorization, and presentation in accordance with at least one embodiment. Describing a method and/or various operations by reference to a flowchart enables one skilled in the art to develop programs, including instructions to carry out the methods on suitably configured computer systems and electronic devices. In various embodiments, portions of the operations to be performed by an electronic device or computer system may constitute circuits, general purpose processors (e.g., micro-processors, micro-controllers, or digital signal processors (DSPs)), special purpose processors (e.g., application specific integrated circuits or ASICs), firmware (e.g., firmware that is used by a processor such as a micro-processor, a micro-controller, and/or a digital signal processor), state machines, hardware arrays, reconfigurable hardware, and/or software made up of executable instructions. The executable instructions may be embodied in firmware logic, reconfigurable logic, a hardware description language, a state machine, an application-specific integrated circuit (ASIC), or combinations thereof.
  • With respect to various embodiments using a software implementation (e.g., a hardware simulator), at least one of the processors of a suitably configured electronic communication device, such as a computer, executes the instructions from a storage medium. The computer-executable instructions may be written in a computer programming language or executable code. If written in a programming language conforming to a recognized standard, such instructions may be executed on a variety of hardware platforms and may interface with a variety of operating systems. Although the various embodiments are not described with reference to any particular programming language, it will be appreciated that a variety of programming languages may be used to implement the teachings of the embodiments as described herein. Furthermore, it is common in the art to speak of software in one form or another (e.g., program, procedure, process, application, etc.) as taking an action or causing a result. Such expressions are merely a shorthand way of saying that execution of the software by a device causes the processor of the computer to perform an action or produce a result.
  • Referring now to FIG. 3, a block and flow diagram view of a portion of a system 300 configured to filter feedback in an exemplary online interactive event environment is illustrated in accordance with at least one embodiment. The system 300 includes one or more performance studios for producing the underlying content for the event, a production center to produce and monitor the interactive event, and a plurality of interactive client endpoints to generate the interactive content associated with the underlying content of the event.
  • Each of the one or more interactive client endpoints is configured to receive the data transmitted from the performance studio and transmit user-generated interactive feedback associated with the interactive event back to the production center and/or the performance studio. In one embodiment, the various multimedia streams received by the client include camera-captured feedback from the performing artists and fans in the studio audience. Accordingly, the user-generated interactive content transmitted by the client may include voting results, shout outs, emotapplause, and other feedback solicited and/or generated from the watching audience.
  • In various embodiments, the performance studio may be a customized interactive studio, such as a Deep Rock Drive certified performance studio, or a traditional performance studio upgraded with interactive equipment. In one embodiment, each performance studio includes at least one interactive display to receive interactive content, such as voting results, shout outs, emotapplause, and other feedback from the watching audience.
  • The at least one production center is configured to control a variety of network operations and provide a datacenter for the interactive event. In one embodiment, the production center monitors the flow of content to the interactive clients to maintain a log of the event and ensure quality reception of the content sent to the client. Quality levels may be adjusted in a variety of ways including bandwidth throttling, data compression, refresh rate manipulation, and adjustment of packet size and/or frequency. In one embodiment, the production center may also receive the content transmitted by the interactive client for additional processing, including interactive content sampling, filtering, and transformation.
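  • A minimal sketch of the kind of quality adjustment described above is shown below; the profile tiers, thresholds, and parameter values are purely illustrative assumptions and are not taken from the disclosure.

```python
def select_stream_profile(measured_kbps, packet_loss_ratio):
    """Illustrative selection among hypothetical stream profiles.

    Maps a client's measured bandwidth and packet loss onto a bitrate cap,
    compression level, refresh (frame) rate, and packet size, echoing the
    throttling/compression/refresh-rate/packet-size levers noted above."""
    if measured_kbps > 3000 and packet_loss_ratio < 0.01:
        return {"bitrate_kbps": 2500, "compression": "low",
                "refresh_fps": 30, "packet_bytes": 1400}
    if measured_kbps > 1000:
        return {"bitrate_kbps": 800, "compression": "medium",
                "refresh_fps": 24, "packet_bytes": 1200}
    return {"bitrate_kbps": 300, "compression": "high",
            "refresh_fps": 15, "packet_bytes": 900}

print(select_stream_profile(measured_kbps=1500, packet_loss_ratio=0.02))
```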
  • In one embodiment, content may be filtered prior to transmission to the performance studio. Filtered content may merely be removed from the feedback stream. Alternatively, filtered content may be replaced with alternative content expressing a similar intent, but in a more acceptable manner. Another form of filtering includes the relative weighting of received responses from the interactive client endpoints. This allows the performer to get a feel for the response of the audience. It will be appreciated by those of ordinary skill in the art and others, that a wide variety of alternate and/or equivalent implementations of interactive content filtration may be substituted for the specific embodiment of filtering as shown.
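  • The three filtering styles mentioned here, removal, replacement with more acceptable content, and relative weighting of responses, might be sketched as follows; the blocked terms, replacement table, and weighting scheme are hypothetical examples only.

```python
# Hypothetical replacement table and blocked terms; not from the disclosure.
REPLACEMENTS = {"this show stinks": "the audience is restless"}
BLOCKED_TERMS = {"spamword"}

def filter_feedback(messages):
    """Apply the filtering styles described above: drop disallowed items,
    replace others with milder equivalents expressing a similar intent,
    and weight what remains so the performer sees relative volume."""
    kept = []
    for text, endpoint_weight in messages:        # (message, client weight)
        lowered = text.lower()
        if any(term in lowered for term in BLOCKED_TERMS):
            continue                               # removed from the stream
        kept.append((REPLACEMENTS.get(lowered, text), endpoint_weight))

    total = sum(w for _, w in kept) or 1.0
    # Relative weighting: each surviving message's share of the response.
    return [(text, w / total) for text, w in kept]

sample = [("Encore!", 1.0), ("this show stinks", 1.0), ("spamword!!!", 1.0)]
print(filter_feedback(sample))
```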
  • Moreover, the illustrated configuration of the system 300 may also include a wide variety of alternate and/or equivalent implementations. For example, in at least one embodiment, the performance studio and the production center may be the same location. Moreover, in at least one embodiment, some of the interactive clients may also be co-located at the performance studio and/or the production center.
  • Referring now to FIG. 4, a flow diagram view of a portion of a method of operation 400 for interactive event data evaluation, categorization, and presentation is illustrated in accordance with at least one embodiment. Initially, the event is established in block 410. The established event may include information about the performers at the event, size (number of available tickets), ticket sales thresholds, anticipated playlists, online location of the event, and other particulars about the event. In block 420, tickets or admission codes for the event are issued based on event information.
  • Once the event opens in block 430, such as at the beginning of a performance, the method 400 begins to determine which interactive clients may have access to the data being transmitted. Query block 440 handles this by determining whether the soliciting client has a ticket or admission code. If not, the soliciting client is encouraged to purchase a ticket in block 420. If the client has a ticket, then they are allowed into the event in block 450. Upon registering with the event coordinators, the interactive client will be allowed to receive the event stream in block 460, including at least one integrated multimedia audio and video stream from the performance studio. In one embodiment, the integrated multimedia audio and video stream includes multiple synchronized streams, one for each camera angle.
  • Monitoring block 470 determines whether the event has concluded. If not concluded, the method 400 continues to accept and process interactive inputs from the interactive client, such as requests to change camera angles 482, voting information 484 including votes regarding upcoming playlists, emotapplause 486, and shout outs 488. If the event has concluded, the method 400 directs interactive clients towards after party presentations 490 associated with the event, which may include post videos 494, post photos 496, post notes 498, and other post event offerings. In one embodiment, the post videos 494 may include the entire event stream for review by the interactive client. In one embodiment, the post photos 496 may include a collection of images from the event and/or publicity shots of the performers at the event. In one embodiment, the post notes 498 may include links to additional information about the performers at the event, including future concerts that may be available.
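  • A minimal Python sketch of the admission check and interactive input handling of method 400 might look like the following; the function names and data structures are hypothetical and merely mirror blocks 420 through 490 described above.

```python
def admit_client(has_ticket_or_code):
    """Blocks 440/420/450: decide whether the soliciting client is let in."""
    return "admitted" if has_ticket_or_code else "offer_ticket_purchase"

def handle_interactive_input(action, state):
    """Blocks 482-488: route an interactive input from an admitted client."""
    kind, value = action
    if kind == "camera_angle":                     # block 482
        state["camera"] = value
    elif kind == "vote":                           # block 484
        state["votes"][value] = state["votes"].get(value, 0) + 1
    elif kind in ("emotapplause", "shout_out"):    # blocks 486 and 488
        state["feedback"].append(value)
    return state

state = {"camera": 1, "votes": {}, "feedback": []}
print(admit_client(has_ticket_or_code=True))
for action in [("vote", "Song A"), ("camera_angle", 3), ("shout_out", "Encore!")]:
    handle_interactive_input(action, state)
print(state)   # after block 470 concludes, clients move to after party 490
```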
  • Referring now to FIGS. 5A-5D, block diagram views of portions of user interfaces (510, 520, 530, 540) are illustrated. Each user interface is generated in an interactive feedback system configured to provide compelling live event quality via relative interactivity in accordance with various embodiments.
  • In FIG. 5A, portions of user interface 510 are shown illustrating the solicitation on an interactive client of an event attendee to select and transmit emotapplause via an emoticon indicative of a current emotional state of the event attendee. Emotapplause is a mechanism for sending non-verbal communication from the audience members to the performers. By clicking graphical representations of the emotapplause (such as clapping hands, a heart, etc.), a message is sent to a centralized service that aggregates all of the feedback from the audience. The performers then see a graphical representation of the aggregated feedback. The actual experience by the performer changes based on how many audience members are using that emotapplause image at that moment, so if 70% of the audience was ‘clapping’ and 10% of the audience was sending kisses, the visualization might include very large clapping hands, or perhaps many clapping hands and a smaller representation of kissing lips. Other sample emoticons include a lighter, a unity or rock-on fist, a hang-loose or horned devil hand sign, a virtual bra, and clapping. It will be appreciated by those of ordinary skill in the art and others, that a wide variety of alternate and/or equivalent implementations of emotapplause may be substituted for the specific embodiment of emotapplause shown. For example, emotapplause messages may be displayed to the performers based on statistical aggregation of the number of times each emotapplause item is clicked by audience members in accordance with one embodiment. It may be appreciated by those of ordinary skill in the art and others that a variety of algorithms may be used to determine the quantity, size and intensity of the animation that is presented to the performers. For example, if a statistically larger percentage of the audience is clicking one icon in the most recent set of data received from the interactive clients, the associated animation may be larger than the other animations for the less used emotapplause at that moment. Alternatively, in one embodiment, if one form of emotapplause is trending up in total number of clicks over a number of recent requests for data from the service, that could result in the corresponding animations also growing in size, quantity and/or intensity. Similarly, if a trend is downward, the corresponding animations could shrink in size, quantity, and/or intensity. In one embodiment, different animations may be displayed when the detected emotapplause from the audience hits a designated milestone in number or reaches a threshold gauging the relative intensity of user actions. In one embodiment, multiple animations may be shown simultaneously, and/or different display surfaces may show different sets of animations, where the placement of the display surfaces could indicate a higher or lower priority to the performer or audience. In one embodiment, the audience member's interface could also show similar animations based on the activity of the overall audience, so they will be able to see how active different emotapplause items are. Various embodiments enable animations to be overlaid on the video stream to allow audience members to see exactly what the performers are seeing.
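  • One possible, and purely illustrative, algorithm for turning aggregated emotapplause clicks into animation sizes, along the lines of the 70%/10% example above, is sketched below; the share thresholds and size labels are assumptions.

```python
def emotapplause_animation_plan(click_counts):
    """Turn aggregated emotapplause clicks into animation sizes.

    `click_counts` maps an emotapplause item (e.g. "clap", "kiss") to the
    number of audience clicks in the latest data window.  The scaling rule
    here (size proportional to the item's share of all clicks) is only one
    of the many possible algorithms mentioned above."""
    total = sum(click_counts.values())
    if total == 0:
        return {}
    plan = {}
    for item, count in click_counts.items():
        share = count / total
        plan[item] = {
            "share": round(share, 2),
            # Hypothetical mapping from audience share to animation size.
            "size": "large" if share > 0.5 else
                    "medium" if share > 0.2 else "small",
        }
    return plan

# 70% clapping, 10% kisses, 20% lighters -> large claps, small kisses.
print(emotapplause_animation_plan({"clap": 70, "kiss": 10, "lighter": 20}))
```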
  • In FIG. 5B, portions of user interface 520 are shown illustrating the solicitation on an interactive client of an event attendee to select and transmit a prioritized interactive playlist. One of the best ways to keep an audience engaged in an event is to give them some control over how the event unfolds. Providing a voting mechanism allows the audience to decide what song is played next, what topic is covered next, or the outcome of some sporting event, or to influence the flow of the event in any number of other ways based on popular vote. Voting can be presented as a list of choices below a header describing what is currently being voted on. Each choice has an option for the audience member to make or change their choice. When they make a choice, it is sent to the service, which tallies the votes and provides summary information in the client data requests. It will be appreciated by those of ordinary skill in the art and others, that a wide variety of alternate and/or equivalent implementations of voting mechanisms may be substituted for the specific embodiment of voting on a presented playlist as shown. For example, the questions to be voted on can be sent in real time by an administrator, based on input by the performers. In one embodiment, the voting results can be presented in real-time to performers and/or audience members. One embodiment allows past ballot results or voting history to be saved for later use and review.
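  • A minimal sketch of the voting mechanism described above, a header describing what is being voted on, a set of choices, changeable ballots, and summary tallies returned with client data requests, might look like the following; the class and method names are hypothetical.

```python
class VotingTopic:
    """Minimal sketch of a voting topic: a header, a list of choices,
    and per-member ballots that can be made or changed, with summary
    tallies returned on client data requests."""

    def __init__(self, header, choices):
        self.header = header
        self.choices = list(choices)
        self._ballots = {}            # audience member id -> chosen item

    def cast(self, member_id, choice):
        if choice not in self.choices:
            raise ValueError("unknown choice: " + choice)
        self._ballots[member_id] = choice   # making or changing a choice

    def summary(self):
        tallies = {c: 0 for c in self.choices}
        for choice in self._ballots.values():
            tallies[choice] += 1
        return {"header": self.header, "tallies": tallies}

topic = VotingTopic("Pick the next song", ["Song A", "Song B", "Song C"])
topic.cast("fan-1", "Song A")
topic.cast("fan-2", "Song B")
topic.cast("fan-1", "Song B")          # fan-1 changes their vote
print(topic.summary())
```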
  • In FIG. 5C, portions of user interface 530 are shown illustrating the solicitation on an interactive client of an event attendee to provide and transmit a virtual shout out to the performer. Shout outs are text messages sent from the audience members to the performers and other audience members. The intent of the shout out is for the audience members to be able to send a directed message or question to the performers. In addition to the performer seeing the message at the performance venue, the audience members also see a subset of the messages, thus providing a sense of community among all of the audience members. Because the number of audience members could be very large for a worldwide internet event, there is no guarantee that all messages will be presented to the performers, but due to the mechanism of transferring shout out messages, a good random sampling of messages from all audience members will be presented to both the performers and other audience members. It will be appreciated by those of ordinary skill in the art and others, that a wide variety of alternate and/or equivalent implementations of selective instant messaging mechanisms may be substituted for the specific shout out embodiment shown. For example, in one embodiment, messages from audience members may be filtered if the same message is sent multiple times in a row to prevent “spamming” of messages to the participants. Moreover, messages from audience members may also be filtered based on content and length. In one embodiment, the audience and/or performers may be shielded from inappropriate content or specific topics. In one embodiment, a message can be filtered if it is too long, to prevent situations where information download would be slowed by extra long messages. One variation allows long messages to be parsed and resent separately, while another throws out long messages. Determining which action should be taken may be based in part on the content of the message.
  • In one embodiment, specific audience members can be blocked from sending messages if they are found to be consistently sending inappropriate messages and/or “spamming” messages. When messages are blocked, various embodiments allow the audience member to still see their messages as if they were sent, so that they are unaware that the messages they send have been blocked.
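  • The shout out filtering, sampling, and blocking behaviors described in the preceding two paragraphs might be sketched as follows; the length cutoff, sample size, and function names are assumptions, and the handling of blocked members follows the silently-accepted behavior described above.

```python
import random

MAX_LENGTH = 200          # assumed cutoff; the disclosure gives no number
blocked_members = set()   # members found to be consistently inappropriate

def accept_shout_out(member_id, text, last_text_by_member):
    """Apply the shout out filters sketched above: silently ignore blocked
    members, drop immediate repeats ("spamming"), and reject messages that
    are too long.  Returns (stored, shown_to_sender)."""
    if member_id in blocked_members:
        return False, True            # sender still sees it as if sent
    if last_text_by_member.get(member_id) == text:
        return False, True            # repeated message, filtered
    if len(text) > MAX_LENGTH:
        return False, False           # one variation simply throws it out
    last_text_by_member[member_id] = text
    return True, True

def sample_for_display(shout_outs, k=5):
    """Random sampling of stored shout outs, since not every message can
    be presented to the performers or the whole audience."""
    return random.sample(shout_outs, min(k, len(shout_outs)))

history = {}
print(accept_shout_out("fan-1", "Play Song A!", history))
print(sample_for_display(["Encore!", "Hi from Tokyo", "Play Song A!"], k=2))
```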
  • In one embodiment, messages that are displayed to audience members and/or performers are displayed for a period of time related to the length of the message, so that longer messages are displayed longer while short messages go by faster. This helps the audience and/or artist to both read and comprehend messages before they disappear. For example, messages like “yay!” take less time to comprehend than more complex messages like “That was amazing, what were you thinking when you wrote that song?” In one embodiment, the message animations at the event location may be overlaid on the video stream to allow audience members to see exactly what the performers are seeing.
  • In one embodiment, when the incoming content is slow, for example from a low attendance event, the client may show messages from farther back in time. However, one embodiment monitors and limits the length of time that an old message may be used to prevent displayed messages from seeming out of context due to latency since the message was originally sent.
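  • The length-based display time and the staleness limit for older messages described in the two paragraphs above might be sketched as follows; all constants are illustrative assumptions rather than values from the disclosure.

```python
def display_seconds(message, base=2.0, per_char=0.05, maximum=10.0):
    """Short messages like "yay!" disappear quickly, while longer questions
    stay on screen longer; the constants here are purely illustrative."""
    return min(maximum, base + per_char * len(message))

def usable_when_slow(message_age_seconds, max_age_seconds=120.0):
    """During low-attendance events older messages may be shown, but only
    up to an assumed age limit so they do not appear out of context."""
    return message_age_seconds <= max_age_seconds

print(display_seconds("yay!"))
print(display_seconds("That was amazing, what were you thinking "
                      "when you wrote that song?"))
print(usable_when_slow(45.0))
```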
  • In FIG. 5D, portions of user interface 540 are shown illustrating the solicitation on an interactive client of an event attendee to select and transmit a desired camera angle to the producer. Audience members may have a specialized interest in the performing band, and camera angle selection allows the event attendee to choose the position of their virtual seat in the performance hall.
  • Referring now to FIG. 6, a block diagram view of a portion of an interactive client interface 600 of an online interactive event environment is illustrated showing portions of the presentation during an interactive event in accordance with various embodiments of the present disclosure. The interactive client interface 600, in one embodiment, may include a video presentation of the event, an audio presentation of the event, or some combination thereof. The illustrated interactive client interface 600 incorporates each of the previously discussed user interfaces 510, 520, 530, and 540 into the event presentation. In addition, the illustrated embodiment also shows event sponsorship of the event. Accordingly, this sponsorship may be sold in accordance with a variety of advertising mechanisms, including but not limited to per event, per song, per minute, per impression, or some combination thereof. In one embodiment, an event sponsor may present customized logos and marketing material targeted for the audience of the event. One embodiment provides promotional links on the presentation page of the event. When clicked, another window may open without interrupting the stream. Alternatively, a sponsorship link may change the look of the event interface. Other, more subtle methods of promotion also considered within the scope of the disclosure include use of a watermark and/or background images and/or desktop/window wallpaper of promotional material.
  • Referring now to FIG. 7, a block diagram view of a portion of an interactive client interface 700 of after party presentations associated with an online interactive event environment is illustrated in accordance with various embodiments of the present disclosure. The interface 700 shows various after party presentations including links to websites associated with the presenter, sponsors, upcoming events, topical news, photo and video archives, discussion boards, and other information associated with the online interactive event. In various embodiments, a playback of the event, such as a highlight reel, may also be available at the after party interactive client interface 700.
  • The above specification, examples, and data provide a description of the manufacture and use of the composition of the invention. Although specific embodiments have been illustrated and described herein, it will be appreciated by those of ordinary skill in the art and others that a wide variety of alternate and/or equivalent implementations may be substituted for the specific embodiments shown and described without departing from the spirit and scope of the present disclosure. This application is intended to cover any adaptations or variations of the embodiments discussed herein. Therefore, it is manifested and intended that the disclosure be limited only by the claims and the equivalents thereof.

Claims (3)

1. A method for participation in interactive online events, comprising:
requesting admittance to an event from an event production center;
obtaining a ticket for the event from the event production center;
receiving a stream of the event from the event performance studio;
displaying the stream of the event for user observation;
transmitting user initiated feedback messages of an interactive client directed to at least one of the event production center, the event performance studio, other audience clients of the interactive event, and/or back to the interactive client; and
receiving and displaying a stream of the event from the event performance studio.
2. An interactive event system comprising:
a performance studio having at least one interactive display;
a production center for producing an interactive event based on event material received from the performance studio; and
an interactive client having at least one interactive display and configured to generate and transmit event feedback to at least one of the production center, the performance studio, other audience clients of the interactive event, and/or the interactive client.
3. A method of performing for an interactive event, comprising:
a performance studio having at least one interactive display;
a production center for producing an interactive event based on event material received from the performance studio;
an interactive client having at least one interactive display and configured to generate and transmit event feedback to at least one of the production center, the performance studio, other audience clients of the interactive event, and/or the interactive client;
requesting admittance to an event from an event production center;
obtaining a ticket for the event from the event production center;
receiving a stream of the event from the event performance studio;
displaying the stream of the event for user observation;
transmitting user initiated feedback messages of an interactive client directed to at least one of the event production center, the event performance studio, other audience clients of the interactive event, and/or back to the interactive client; and
receiving and displaying a stream of the event from the event performance studio.
US12/568,666 2008-09-26 2009-09-28 Interactive events Abandoned US20100153861A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/568,666 US20100153861A1 (en) 2008-09-26 2009-09-28 Interactive events

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US10070608P 2008-09-26 2008-09-26
US10070408P 2008-09-26 2008-09-26
US10070108P 2008-09-26 2008-09-26
US10070308P 2008-09-26 2008-09-26
US12/568,666 US20100153861A1 (en) 2008-09-26 2009-09-28 Interactive events

Publications (1)

Publication Number Publication Date
US20100153861A1 true US20100153861A1 (en) 2010-06-17

Family

ID=42076479

Family Applications (6)

Application Number Title Priority Date Filing Date
US12/568,666 Abandoned US20100153861A1 (en) 2008-09-26 2009-09-28 Interactive events
US12/586,920 Expired - Fee Related US8442424B2 (en) 2008-09-26 2009-09-28 Interactive live political events
US12/586,923 Expired - Fee Related US9548950B2 (en) 2008-09-26 2009-09-28 Switching camera angles during interactive events
US12/586,921 Abandoned US20100094686A1 (en) 2008-09-26 2009-09-28 Interactive live events
US12/586,922 Abandoned US20100088128A1 (en) 2008-09-26 2009-09-28 Ticket scarcity management for interactive events
US13/894,269 Expired - Fee Related US9160692B2 (en) 2008-09-26 2013-05-14 Interactive live political events

Country Status (1)

Country Link
US (6) US20100153861A1 (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130005465A1 (en) * 2011-06-29 2013-01-03 EarDish Corporation Audio playlist selections and related entertainment systems and methods
US20130097635A1 (en) * 2011-10-13 2013-04-18 Gface Gmbh Interactive remote participation in live entertainment
EP2621165A1 (en) 2012-01-25 2013-07-31 Alcatel Lucent, S.A. Videoconference method and device
US20150052238A1 (en) * 2013-08-19 2015-02-19 Google Inc. Device Compatibility Management
US20160127288A1 (en) * 2014-11-04 2016-05-05 Calay Venture S.à r.l. System and method for inviting users to participate in activities based on interactive recordings
CN109688006A (en) * 2018-12-24 2019-04-26 北京天元特通科技有限公司 Support the high performance network log information distribution method of object set group dynamic instrumentation

Families Citing this family (92)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10742433B2 (en) * 2003-06-16 2020-08-11 Meetup, Inc. Web-based interactive meeting facility, such as for progressive announcements
US20130218688A1 (en) * 2007-09-26 2013-08-22 Aq Media, Inc. Audio-visual navigation and communication dynamic memory architectures
US20090138554A1 (en) * 2007-11-26 2009-05-28 Giuseppe Longobardi Controlling virtual meetings with a feedback history
US9892028B1 (en) 2008-05-16 2018-02-13 On24, Inc. System and method for debugging of webcasting applications during live events
US10430491B1 (en) 2008-05-30 2019-10-01 On24, Inc. System and method for communication between rich internet applications
US20100153861A1 (en) * 2008-09-26 2010-06-17 Deep Rock Drive Partners Inc. Interactive events
US20140176665A1 (en) * 2008-11-24 2014-06-26 Shindig, Inc. Systems and methods for facilitating multi-user events
WO2010080639A2 (en) * 2008-12-18 2010-07-15 Band Crashers, Llc Media systems and methods for providing synchronized multiple streaming camera signals of an event
US8655383B2 (en) * 2009-06-15 2014-02-18 Alpine Electronics, Inc Content delivery system and method
US20140094153A1 (en) * 2012-10-02 2014-04-03 Alpine Audio Now Digital, LLC System and method of interacting with a broadcaster via an application
US11438410B2 (en) 2010-04-07 2022-09-06 On24, Inc. Communication console with component aggregation
US8706812B2 (en) 2010-04-07 2014-04-22 On24, Inc. Communication console with component aggregation
US8667533B2 (en) * 2010-04-22 2014-03-04 Microsoft Corporation Customizing streaming content presentation
US20110276882A1 (en) 2010-05-04 2011-11-10 Kai Buehler Automatic grouping for users experiencing a specific broadcast media
US8606073B2 (en) 2010-05-12 2013-12-10 Woodman Labs, Inc. Broadcast management system
US20110289539A1 (en) * 2010-05-19 2011-11-24 Kim Sarubbi Multimedia content production and distribution platform
US20120004950A1 (en) * 2010-07-01 2012-01-05 Effective Measure System and method for integrated offline audience validation
US20120116789A1 (en) * 2010-11-09 2012-05-10 International Business Machines Corporation Optimizing queue loading through variable admittance fees
US9009194B2 (en) * 2010-12-01 2015-04-14 Democrasoft, Inc. Real time and dynamic voting
US20120191774A1 (en) * 2011-01-25 2012-07-26 Vivek Bhaskaran Virtual dial testing and live polling
US9246957B2 (en) * 2011-03-04 2016-01-26 Viafoura Systems and methods for interactive content generation
AU2012225536B9 (en) 2011-03-07 2014-01-09 Kba2, Inc. Systems and methods for analytic data gathering from image providers at an event or geographic location
US8845429B2 (en) * 2011-05-27 2014-09-30 Microsoft Corporation Interaction hint for interactive video presentations
US10972530B2 (en) 2016-12-30 2021-04-06 Google Llc Audio-based data structure generation
US11087424B1 (en) 2011-06-24 2021-08-10 Google Llc Image recognition-based content item selection
US8688514B1 (en) 2011-06-24 2014-04-01 Google Inc. Ad selection using image data
US8768139B2 (en) 2011-06-27 2014-07-01 First Principles, Inc. System for videotaping and recording a musical group
US8732739B2 (en) 2011-07-18 2014-05-20 Viggle Inc. System and method for tracking and rewarding media and entertainment usage including substantially real time rewards
US9870552B2 (en) * 2011-10-19 2018-01-16 Excalibur Ip, Llc Dynamically updating emoticon pool based on user targeting
US11093692B2 (en) 2011-11-14 2021-08-17 Google Llc Extracting audiovisual features from digital components
US10586127B1 (en) * 2011-11-14 2020-03-10 Google Llc Extracting audiovisual features from content elements on online documents
US20130144716A1 (en) * 2011-12-06 2013-06-06 Sony Network Entertainment International Llc Advertising opportunities for live streaming contents and services
WO2013088584A1 (en) * 2011-12-14 2013-06-20 京セラ株式会社 Display terminal, power control system, and display method
US20130304575A1 (en) * 2012-05-09 2013-11-14 Michael Fetyko Systems, methods and machine-readable media for providing and managing an interface between fans and celebrities within a social network environment
US20140156752A1 (en) * 2012-05-11 2014-06-05 iConnectUS LLC Software applications for interacting with live events and methods of use thereof and creating custom content streams
EP2670156A1 (en) 2012-06-01 2013-12-04 Thomson Licensing Interactive audio/video broadcast system, method for operating the same and user device for operation in the interactive audio/video broadcast system
WO2013191006A1 (en) * 2012-06-22 2013-12-27 ソニー株式会社 Information processing device, information processing method, and terminal device
US10296532B2 (en) * 2012-09-18 2019-05-21 Nokia Technologies Oy Apparatus, method and computer program product for providing access to a content
US20140089401A1 (en) * 2012-09-24 2014-03-27 Google Inc. System and method for camera photo analytics
US8554873B1 (en) * 2012-10-05 2013-10-08 Google Inc. Custom event and attraction suggestions
US8606872B1 (en) * 2012-10-22 2013-12-10 HotSpots U, Inc. Method and apparatus for organizing, packaging, and sharing social content and social affiliations
US20140214522A1 (en) * 2013-01-30 2014-07-31 Robert Alan Skollar Methods and systems for providing online events
US9461958B1 (en) 2013-03-13 2016-10-04 Greenfly, Inc. Methods and system for distributing information via multiple forms of delivery services
US8782140B1 (en) 2013-03-13 2014-07-15 Greenfly Digital, LLC Methods and system for distributing information via multiple forms of delivery services
US9264474B2 (en) 2013-05-07 2016-02-16 KBA2 Inc. System and method of portraying the shifting level of interest in an object or location
US9953085B1 (en) 2013-05-31 2018-04-24 Google Llc Feed upload for search entity based content selection
US11030239B2 (en) 2013-05-31 2021-06-08 Google Llc Audio based entity-action pair based selection
US20150113403A1 (en) * 2013-10-20 2015-04-23 Eric A. Harvey Simultaneously presenting media on devices
US11429781B1 (en) 2013-10-22 2022-08-30 On24, Inc. System and method of annotating presentation timeline with questions, comments and notes using simple user inputs in mobile devices
CN104596929B (en) * 2013-10-31 2017-06-23 国际商业机器公司 Determine the method and apparatus of air quality
US20150170061A1 (en) * 2013-12-17 2015-06-18 Scott J. Glosserman Web ticketing event system
US9900362B2 (en) 2014-02-11 2018-02-20 Kiswe Mobile Inc. Methods and apparatus for reducing latency shift in switching between distinct content streams
US20150350733A1 (en) * 2014-06-02 2015-12-03 Grid News Bureau, LLC Systems and methods for opinion sharing related to live events
WO2015195394A1 (en) * 2014-06-16 2015-12-23 Google Inc. Surfacing live events in search results
US9832500B2 (en) 2014-07-05 2017-11-28 TiltedGlobe LLC System for enabling a virtual theater
CN104158892A (en) * 2014-08-22 2014-11-19 苏州乐聚一堂电子科技有限公司 Raked stage for interaction of concert
US10785325B1 (en) * 2014-09-03 2020-09-22 On24, Inc. Audience binning system and method for webcasting and on-line presentations
WO2016040833A1 (en) 2014-09-12 2016-03-17 Kiswe Mobile Inc. Methods and apparatus for content interaction
CN105988776B (en) * 2015-01-27 2019-11-26 阿里巴巴集团控股有限公司 Release processing method and processing device
US11087264B2 (en) * 2015-03-16 2021-08-10 International Business Machines Corporation Crowdsourcing of meetings
US10116714B2 (en) 2015-06-15 2018-10-30 At&T Intellectual Property I, L.P. Apparatus and method for on-demand multi-device social network experience sharing
US20160379441A1 (en) * 2015-06-23 2016-12-29 Mark Oley Application for enhancing a sport viewing experience on an electronic device
US9817557B2 (en) * 2015-07-22 2017-11-14 Enthrall Sports LLC Interactive audience communication for events
US20170032336A1 (en) * 2015-07-28 2017-02-02 Randy G. Connell Live fan-artist interaction system and method
US11882628B2 (en) 2015-11-23 2024-01-23 Tesla Laboratories, LLC System and method for using a mobile device as an input device for surveys at a live event
US10178710B2 (en) 2015-11-23 2019-01-08 Fevr Tech Llc System and method for using a mobile device as an input device for surveys at a live event
US10176486B2 (en) 2015-11-23 2019-01-08 Tesla Laboratories, LLC System and method for using a mobile device as an input device for surveys at a live event
US9959689B2 (en) 2015-11-23 2018-05-01 Tesla Laboratories Llc System and method for creation of unique identification for use in gathering survey data from a mobile device at a live event
US10386931B2 (en) * 2016-01-27 2019-08-20 Lenovo (Singapore) Pte. Ltd. Toggling between presentation and non-presentation of representations of input
US11050845B2 (en) * 2016-02-25 2021-06-29 At&T Intellectual Property I, L.P. Method and apparatus for providing configurable event content
US10970737B2 (en) * 2016-06-24 2021-04-06 Vet Fest Llc System and process for automatically generating rewards with ticket sales
US9843768B1 (en) * 2016-09-23 2017-12-12 Intel Corporation Audience engagement feedback systems and techniques
US10795560B2 (en) * 2016-09-30 2020-10-06 Disney Enterprises, Inc. System and method for detection and visualization of anomalous media events
US10423822B2 (en) 2017-03-15 2019-09-24 International Business Machines Corporation Video image overlay of an event performance
US11128675B2 (en) 2017-03-20 2021-09-21 At&T Intellectual Property I, L.P. Automatic ad-hoc multimedia conference generator
CN107579959A (en) * 2017-08-22 2018-01-12 广州华多网络科技有限公司 Ballot receiving/transmission method, device and the relevant device of client and server end
WO2019070351A1 (en) 2017-10-03 2019-04-11 Fanmountain Llc Systems, devices, and methods employing the same for enhancing audience engagement in a competition or performance
US11285372B2 (en) 2017-10-03 2022-03-29 Todd Wanke Systems, devices, and methods employing the same for enhancing audience engagement in a competition or performance
US11281723B2 (en) 2017-10-05 2022-03-22 On24, Inc. Widget recommendation for an online event using co-occurrence matrix
US11188822B2 (en) 2017-10-05 2021-11-30 On24, Inc. Attendee engagement determining system and method
US10417500B2 (en) 2017-12-28 2019-09-17 Disney Enterprises, Inc. System and method for automatic generation of sports media highlights
US11076202B2 (en) 2018-04-05 2021-07-27 International Business Machines Corporation Customizing digital content based on consumer context data
CN109120507B (en) * 2018-07-17 2021-04-23 奇酷互联网络科技(深圳)有限公司 Mobile terminal and method and device for realizing instant messaging with fixed terminal
US10693956B1 (en) 2019-04-19 2020-06-23 Greenfly, Inc. Methods and systems for secure information storage and delivery
JP7365212B2 (en) * 2019-12-03 2023-10-19 株式会社ソニー・インタラクティブエンタテインメント Video playback device, video playback system, and video playback method
US11055119B1 (en) * 2020-02-26 2021-07-06 International Business Machines Corporation Feedback responsive interface
US11722589B2 (en) * 2020-04-08 2023-08-08 Huawei Technologies Co., Ltd. Rapid ledger consensus system and method for distributed wireless networks
US20220201370A1 (en) * 2020-12-18 2022-06-23 Sony Group Corporation Simulating audience reactions for performers on camera
US11375380B1 (en) 2021-03-24 2022-06-28 Nearcast Inc. Method and system of a public engagement computing platform
CN113448475A (en) * 2021-06-30 2021-09-28 广州博冠信息科技有限公司 Interaction control method and device for virtual live broadcast room, storage medium and electronic equipment
US20230097459A1 (en) * 2021-08-14 2023-03-30 David Petrosian Mkervali System and method of conducting a mental confrontation in a form of a mobile application or a computer program
EP4246390A1 (en) * 2022-03-17 2023-09-20 Combo Entertainment GmbH A cloud based event management platform and a method thereof

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6519771B1 (en) * 1999-12-14 2003-02-11 Steven Ericsson Zenith System for interactive chat without a keyboard
US7680912B1 (en) * 2000-05-18 2010-03-16 thePlatform, Inc. System and method for managing and provisioning streamed data
US20100094686A1 (en) * 2008-09-26 2010-04-15 Deep Rock Drive Partners Inc. Interactive live events
US20110090347A1 (en) * 2008-12-18 2011-04-21 Band Crashers LLC. Media Systems and Methods for Providing Synchronized Multiple Streaming Camera Signals of an Event

Family Cites Families (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5226177A (en) * 1990-03-27 1993-07-06 Viewfacts, Inc. Real-time wireless audience response system
US20040261127A1 (en) * 1991-11-25 2004-12-23 Actv, Inc. Digital interactive system for providing full interactivity with programming events
ES2394537T3 (en) * 1995-04-24 2013-02-01 United Video Properties, Inc. Procedure and electronic guide system for television programming with remote contracting of products
CA2269778A1 (en) * 1996-09-16 1998-03-19 Advanced Research Solutions, Llc Data correlation and analysis tool
US20020120925A1 (en) * 2000-03-28 2002-08-29 Logan James D. Audio and video program recording, editing and playback systems using metadata
US6317881B1 (en) * 1998-11-04 2001-11-13 Intel Corporation Method and apparatus for collecting and providing viewer feedback to a broadcast
AU2918601A (en) * 1999-11-05 2001-05-14 Donald A. Glaser A method and system for audience participation and selective viewing of various aspects of a theatrical performance, whether opera, symphonic, drama or dance orcombinations and variations thereof
US7196722B2 (en) * 2000-05-18 2007-03-27 Imove, Inc. Multiple camera video system which displays selected images
US7149549B1 (en) * 2000-10-26 2006-12-12 Ortiz Luis M Providing multiple perspectives for a venue activity through an electronic hand held device
US7782363B2 (en) * 2000-06-27 2010-08-24 Front Row Technologies, Llc Providing multiple video perspectives of activities through a data network to a remote multimedia server for selective display by remote viewing audiences
US7796162B2 (en) * 2000-10-26 2010-09-14 Front Row Technologies, Llc Providing multiple synchronized camera views for broadcast from a live venue activity to remote viewers
KR20190096450A (en) * 2000-10-11 2019-08-19 로비 가이드스, 인크. Systems and methods for delivering media content
US20020108125A1 (en) * 2001-02-07 2002-08-08 Joao Raymond Anthony Apparatus and method for facilitating viewer or listener interaction
US20020115453A1 (en) * 2001-02-16 2002-08-22 Poulin Ronald Leon Method and system for location based wireless communication services
US20040064838A1 (en) * 2002-01-08 2004-04-01 Lykke Olesen Method and device for viewing a live performance
US20030208613A1 (en) * 2002-05-02 2003-11-06 Envivio.Com, Inc. Managing user interaction for live multimedia broadcast
US7603321B2 (en) * 2002-05-22 2009-10-13 Gurvey Amy R Electronic system and method coupling live event ticketing and interactive entries with the sale, distribution and transmission of event recordings, mastering system and intelligent terminal designs
US20050024488A1 (en) * 2002-12-20 2005-02-03 Borg Andrew S. Distributed immersive entertainment system
US20070070210A1 (en) * 2003-04-11 2007-03-29 Piccionelli Gregory A Video production with selectable camera angles
US7428000B2 (en) * 2003-06-26 2008-09-23 Microsoft Corp. System and method for distributed meetings
TWI256257B (en) * 2004-03-17 2006-06-01 Era Digital Media Company Ltd Real-time interactive video management system
US20060104600A1 (en) * 2004-11-12 2006-05-18 Sfx Entertainment, Inc. Live concert/event video system and method
US7478334B2 (en) * 2005-01-20 2009-01-13 International Business Machines Corporation Folding text in side conversations
EP1854294A4 (en) * 2005-02-28 2009-11-11 Inlive Interactive Ltd Method and apparatus for conducting real time dialogues with mass viewer audiences during live programs
MX2007011675A (en) * 2005-03-22 2008-11-04 Ticketmaster Apparatus and methods for providing queue messaging over a network.
WO2007016575A2 (en) * 2005-08-01 2007-02-08 Airplay Network, Inc. A live television show utilizing real-time input from a viewing audience
US20070233785A1 (en) * 2006-03-30 2007-10-04 International Business Machines Corporation Communicating using collaboration spaces
US8019815B2 (en) * 2006-04-24 2011-09-13 Keener Jr Ellis Barlow Interactive audio/video method on the internet
US20080046910A1 (en) * 2006-07-31 2008-02-21 Motorola, Inc. Method and system for affecting performances
US8850464B2 (en) * 2006-10-09 2014-09-30 Verizon Patent And Licensing Inc. Systems and methods for real-time interactive television polling
US20080098417A1 (en) * 2006-10-19 2008-04-24 Mehdi Hatamian Viewer participatory television shows in conjunction with a system and method for real-time data collection and statistical assessment
US20080271082A1 (en) * 2007-04-27 2008-10-30 Rebecca Carter User controlled multimedia television broadcast on single channel
US20090052645A1 (en) * 2007-08-22 2009-02-26 Ravi Prakash Bansal Teleconference system with participant feedback
US9131016B2 (en) * 2007-09-11 2015-09-08 Alan Jay Glueckman Method and apparatus for virtual auditorium usable for a conference call or remote live presentation with audience response thereto
US9060094B2 (en) * 2007-09-30 2015-06-16 Optical Fusion, Inc. Individual adjustment of audio and video properties in network conferencing
US9584564B2 (en) * 2007-12-21 2017-02-28 Brighttalk Ltd. Systems and methods for integrating live audio communication in a live web event
US9661275B2 (en) * 2008-06-13 2017-05-23 Scott Gordon Dynamic multi-perspective interactive event visualization system and method

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6519771B1 (en) * 1999-12-14 2003-02-11 Steven Ericsson Zenith System for interactive chat without a keyboard
US7680912B1 (en) * 2000-05-18 2010-03-16 thePlatform, Inc. System and method for managing and provisioning streamed data
US20100094686A1 (en) * 2008-09-26 2010-04-15 Deep Rock Drive Partners Inc. Interactive live events
US20110090347A1 (en) * 2008-12-18 2011-04-21 Band Crashers LLC. Media Systems and Methods for Providing Synchronized Multiple Streaming Camera Signals of an Event

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Timmy Curran Launches Interactive Website on Modlife.com to Connect with Music and Surfing Fans, August 6, 2008, retrieved via Internet at http://www.surfline.com/surf-news/press-release/timmy-curran-launches-interactive-webs *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130005465A1 (en) * 2011-06-29 2013-01-03 EarDish Corporation Audio playlist selections and related entertainment systems and methods
US20130097635A1 (en) * 2011-10-13 2013-04-18 Gface Gmbh Interactive remote participation in live entertainment
US9066144B2 (en) * 2011-10-13 2015-06-23 Crytek Gmbh Interactive remote participation in live entertainment
EP2621165A1 (en) 2012-01-25 2013-07-31 Alcatel Lucent, S.A. Videoconference method and device
US20150052238A1 (en) * 2013-08-19 2015-02-19 Google Inc. Device Compatibility Management
US10331786B2 (en) * 2013-08-19 2019-06-25 Google Llc Device compatibility management
US20160127288A1 (en) * 2014-11-04 2016-05-05 Calay Venture S.à r.l. System and method for inviting users to participate in activities based on interactive recordings
US11165596B2 (en) * 2014-11-04 2021-11-02 Tmrw Foundation Ip S. À R.L. System and method for inviting users to participate in activities based on interactive recordings
CN109688006A (en) * 2018-12-24 2019-04-26 北京天元特通科技有限公司 Support the high performance network log information distribution method of object set group dynamic instrumentation

Also Published As

Publication number Publication date
US20100088128A1 (en) 2010-04-08
US20140237381A1 (en) 2014-08-21
US8442424B2 (en) 2013-05-14
US20120123811A1 (en) 2012-05-17
US20100088159A1 (en) 2010-04-08
US9548950B2 (en) 2017-01-17
US9160692B2 (en) 2015-10-13
US20100094686A1 (en) 2010-04-15

Similar Documents

Publication Publication Date Title
US20100153861A1 (en) Interactive events
Eriksson et al. Spotify teardown: Inside the black box of streaming music
USRE48579E1 (en) Method and apparatus for internet-based interactive programming
JP6161249B2 (en) Mass media social and interactive applications
JP5318116B2 (en) Select ads to present
US20150026602A1 (en) System Network-Enabled Interactive Media Player
US20090063995A1 (en) Real Time Online Interaction Platform
US20090113022A1 (en) Facilitating music collaborations among remote musicians
JP2018523386A (en) Streaming media presentation system
US20030220970A1 (en) Electronic disk jockey service
US20220210514A1 (en) System and process for collaborative digital content generation, publication, distribution, and discovery
CN102216945B (en) Networking with media fingerprints
US11064232B2 (en) Media broadcast system
US11388561B2 (en) Providing a summary of media content to a communication device
CA3160379A1 (en) Method and system for aggregating live streams
CN107172178B (en) A kind of content delivery method and device
US20140233913A1 (en) Cross-platform portable personal video compositing and media content distribution system
US8561081B1 (en) System and method for dynamic brokering of digital content requests
US11792492B2 (en) Milestone determination associated with video presentation
US20120291020A1 (en) Cross-platform portable personal video compositing and media content distribution system
Te et al. Developing e-radio: An online audio streaming application
Santomier et al. Sport new media
Wołk Building an Internet Radio System with Interdisciplinary factored system for automatic content recommendation
Zheng et al. Enhancing virtual event experiences through short video marketing
Välimäki Increasing brand awareness with podcasting: case: The YesFinland Podcast

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION