US20100117934A1 - Moving Lights forming Pixels of a Video


Info

Publication number
US20100117934A1
Authority
US
United States
Prior art keywords
video
lamps
remotely controllable
controlling
control
Prior art date
Legal status
Abandoned
Application number
US12/613,736
Inventor
Jeremiah J. Harris
Current Assignee
Production Resource Group LLC
Original Assignee
Production Resource Group LLC
Application filed by Production Resource Group LLC filed Critical Production Resource Group LLC
Priority to US12/613,736
Assigned to PRODUCTION RESOURCE GROUP LLC reassignment PRODUCTION RESOURCE GROUP LLC ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: HARRIS, JEREMIAH J
Publication of US20100117934A1
Priority to US15/088,422 (US10268356B2)

Classifications

    • G06F3/04847: Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F3/0486: Drag-and-drop
    • G09G3/001: Control arrangements for visual indicators using specific devices, e.g. projection systems
    • G09G3/02: Control arrangements by tracing or scanning a light beam on a screen
    • G11B27/028: Electronic editing of analogue information signals, e.g. audio or video signals, with computer assistance
    • G11B27/102: Programmed access in sequence to addressed parts of tracks of operating record carriers
    • H04N9/3164: Modulator illumination systems using multiple light sources
    • H05B47/155: Coordinated control of two or more light sources
    • G06F3/1423: Digital output to display device controlling a plurality of local displays
    • G06F3/1446: Digital output to display device, display composed of modules, e.g. video walls

Definitions

  • each lamp is controlled according to its own position and the position of the pixel it displays.
  • light 205 may be projected to pixel position 206 .
  • Light 205 is at a known position x,y,z.
  • the location of the video may have an origin shown as O in FIG. 3 , and the pixel 206 is at a specified location relative to that origin.
  • a transformation is then carried out which determines the pan and tilt position of the light based on its known location x,y,z, the position of the origin O, and the position of the pixel 206 relative to that origin.
  • if the video is moved, the pixel will have moved relative to a new origin shown as O1, and the light is thereafter moved to aim at the position of the pixel at 207 .
  • a geometric transformation between the known point in space of the light 205 , and the pixels, can therefore be carried out in this way.
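The pan and tilt computation described above can be sketched as follows. This is a minimal illustration only: the coordinate conventions and the function name `aim_light` are assumptions for the sketch, not taken from the patent.

```python
import math

def aim_light(light_pos, origin, pixel_offset):
    """Compute pan/tilt angles (in degrees) aiming a lamp at a pixel.

    light_pos    -- (x, y, z) of the lamp; z is its height above the stage
    origin       -- (x, y) of the video origin O on the stage plane
    pixel_offset -- (dx, dy) of the pixel relative to the origin O
    """
    lx, ly, lz = light_pos
    # Absolute target point of the pixel on the stage plane.
    tx = origin[0] + pixel_offset[0]
    ty = origin[1] + pixel_offset[1]
    dx, dy = tx - lx, ty - ly
    # Pan: rotation about the vertical axis toward the target.
    pan = math.degrees(math.atan2(dy, dx))
    # Tilt: angle away from straight down; 0 points at the floor
    # directly beneath the lamp, 90 points at the horizon.
    tilt = math.degrees(math.atan2(math.hypot(dx, dy), lz))
    return pan, tilt
```

When the video is moved so that its origin shifts, calling `aim_light` again with the new origin re-aims each lamp at its pixel's new stage position, which is the transformation the bullets describe.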
  • digital lights form superpixels of the displayed video.
  • analog lights, in contrast, form a single pixel per lamp of the displayed video.
  • Another embodiment may allow each of the analog lights to take the place of multiple pixels in a video unit.
  • certain lights, such as the Icon™, allow projecting a split beam, where there is a split between two different colors.
  • the beam can form a pixel with two different colors as shown in FIG. 4 where the light 400 produces a beam 402 which has one pixel 404 on the left and another pixel 406 on the right.
  • the light beam can be divided into four or more pieces, so that each light beam represents 4 pixels.
  • the breakup of the light beam into multiple parts can use a digital light, for example, which can use any kind of spatial light modulator.
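For a lamp whose beam is split into four parts, the four colors could plausibly be taken from the quadrants of the superpixel's source block. The following sketch assumes block averaging and a square, even-sized block; the function name and this choice of reduction are illustrative, not the patent's.

```python
def quadrant_colors(block):
    """Average RGB color of each quadrant of a square pixel block.

    block -- n x n list of (r, g, b) tuples, n even.
    Returns [top-left, top-right, bottom-left, bottom-right] averages,
    one color per part of the four-way split beam.
    """
    n = len(block)
    h = n // 2

    def avg(ys, xs):
        px = [block[y][x] for y in ys for x in xs]
        return tuple(sum(p[i] for p in px) // len(px) for i in range(3))

    top, bottom = range(h), range(h, n)
    left, right = range(h), range(h, n)
    return [avg(top, left), avg(top, right),
            avg(bottom, left), avg(bottom, right)]
```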
  • The various logical blocks and processing elements described herein may be implemented with a general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, or any combination thereof. A general purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine.
  • a processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. These devices may also be used to select values for devices as described herein.
  • a software module may reside in Random Access Memory (RAM), flash memory, Read Only Memory (ROM), Electrically Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
  • An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium.
  • the storage medium may be integral to the processor.
  • the processor and the storage medium may reside in an ASIC.
  • the ASIC may reside in a user terminal.
  • the processor and the storage medium may reside as discrete components in a user terminal.
  • the functions described may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium.
  • Computer-readable media includes both computer storage media and communication media including any medium that facilitates transfer of a computer program from one place to another.
  • a storage media may be any available media that can be accessed by a computer.
  • such computer-readable media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer.
  • any connection is properly termed a computer-readable medium.
  • the software is transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave
  • the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium.
  • Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.
  • the computers described herein may be any kind of computer, either general purpose, or some specific purpose computer such as a workstation.
  • the programs may be written in C, Java, Brew, or any other programming language.
  • the programs may be resident on a storage medium, e.g., magnetic or optical, e.g. the computer hard drive, a removable disk or media such as a memory stick or SD media, or other removable medium.
  • the programs may also be run over a network, for example, with a server or other machine sending signals to the local machine, which allows the local machine to carry out the operations described herein.

Abstract

A video is divided into portions, and the portions are used to control moving lights. A video is displayed where each moving light becomes a superpixel of the eventual displayed video. The superpixels themselves may be formed of pixels.

Description

  • This application claims priority from application No. 61/112,133, filed Nov. 6, 2008, the entirety of which is herewith incorporated by reference.
  • BACKGROUND
  • Stage lights can illuminate an area based on a remote control of the light output. A conventional stage lighting console controls these lights using a format such as DMX 512, or some other comparable format that allows each of the lights to be controlled individually according to a cue. The cue, for example, can be a series of commands to be executed by the light. A conventional show is made by taking different lights, and defining cues for those lights.
  • LED lights, such as LED walls and lights formed of multiple different LEDs, are also known. In addition, a digital light, which has the ability to control a single beam of light on a pixel-by-pixel basis, is itself known. Digital lights can produce pixel-mapped outputs.
  • SUMMARY
  • The present application describes a system that takes plural moving lights, and controls each of the moving lights to act as a single portion of an overall effect. A video can be played using the moving lights, wherein each light in the array of lights forms one portion, e.g., a “superpixel” of the video.
  • Another aspect describes controlling individual moving lights as though they were pixels of the video.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 shows an embodiment;
  • FIG. 2 shows another embodiment with all lights in a linear row;
  • FIG. 3 shows control of the lights; and
  • FIG. 4 shows using a light as multiple pixels.
  • DETAILED DESCRIPTION
  • An embodiment is shown in FIG. 1 with groups of lights. Each of these lights is mapped to a portion of the video, so that the output of each light becomes a portion of the video, referred to herein as being a pixel of the video, even though the pixel might itself be formed of many different items of information. A pixel which itself is made up of many different pixels is referred to herein as a superpixel.
  • FIG. 1 shows a stage environment 100, defining an area to be illuminated by the stage lights. A first group of lights 110 is located at stage left, and a second group of lights at 120 is located at stage right. In this embodiment, the lights are shown in a two-dimensional array. However, the lights can alternatively be linearly located as shown in FIG. 2. Since the pan and tilt positions of the lights can be changed to point the lights in different directions, any shaped group of lights can be directed into a desired rectangular shape of projection. For example, lights can be located linearly along a linear truss, but still form a two dimensional display.
  • Each of the lights within the array 120, such as 121, is individually controllable, both for the amount and color of light that it projects and for its direction. In the embodiment, these lights are pan and tilt controllable lights, which can hence be panned and tilted in any of a number of different directions. The embodiment of FIG. 1 shows only nine lights forming a 3×3 array, but it should be understood that any number of lights can be in this array. For example, the embodiment contemplates a 4×4 array, 5×5 array, 6×6, 7×7, 8×8, 9×9, 10×10, or more generally, an n×n array, where n can be any number between 2 and 2000, for example.
  • A 10×10 array might provide, for example, 100 pixels to display the video. In the embodiment, control is carried out from a remotely located console 130 which produces output signals that control the different lights. The console 130 may produce only a single output, with DMX 512 operating to time division multiplex the signals. For clarity, FIG. 1 shows three separate outputs, but it should be understood that there can just as easily be a single output.
  • The console 130 may include a video selector formed of a graphical user interface 135 that allows selecting the video, here 136. The video can be of any shape or size. In this embodiment, the video is pixel mapped to a 10×10 or 10×8 array, and the pixel mapped video is then mapped to the array 120 of lights. Each pixel in the video is mapped to a moving light, and hence becomes a single pixel on the stage. Hence, pixel 137 is mapped to light 121 and shown as light spot 125 on the stage. The same video, or some other video, may also be mapped to array 110. The video may be mapped to that array in exactly the same way as 120, or in a reverse way, for example.
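The pixel-to-light mapping described above can be sketched as a simple lookup from frame positions to per-light color commands. The function name and data layout here are assumptions for illustration; a real console would translate the resulting colors into DMX 512 channel values for each light.

```python
def map_frame_to_lights(frame, light_grid):
    """Map a pixel-mapped frame onto a grid of moving lights.

    frame      -- rows of (r, g, b) tuples, one pixel per light
    light_grid -- grid of light identifiers, same shape as frame
    Returns a dict of light id -> (r, g, b) color command.
    """
    commands = {}
    for pixel_row, light_row in zip(frame, light_grid):
        for rgb, light in zip(pixel_row, light_row):
            commands[light] = rgb  # this light becomes this pixel
    return commands
```

For a 10×10 array, the frame would simply have 10 rows of 10 pixels, one per moving light.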
  • 150 shows an LED wall, which is also formed of a number of pixels such as 151, each of which is formed by a single LED or group of LEDs, for example. Video 136 can also be mapped to the LED wall 150.
  • For example, this allows each of the different devices, movable stage lights, LEDs, and/or projection screens 160 to be mapped using the same video source.
  • In an embodiment, the lights 121 are movable in pan and tilt directions. This allows the light cluster shown generally as 126 to be moved in any of the directions shown, for example, by the arrows. The lights can be moved further apart to expand the size of the video, moved closer together, or moved within the stage. In one embodiment, the lights 126 are moved to the right in the direction of the arrow 127, until they reach the LED device 150. At that point, the LED device may be driven by the same pixels of the same video, so that the same effect is maintained using a different device. For example, the matrix of lights 110 may begin projecting as the image moves across the stage. Other devices can be controlled in the same way. In this way, the video source itself can drive a wipe of the video across the stage.
  • One of the advantages of this system is based on the realization that there is currently one way of communicating with pixel mapped devices such as LEDs, and another way of communicating with moving lights. In order to make the LEDs red, you send them a video of all red pixels. However, in order to make a light show red, you send each light a cue or other information telling it to be red. This technique allows driving a group of lights as though that group were pixels, in precisely the same way that pixel devices such as LEDs are driven.
  • Another aspect describes a way of using movable stage lights for playing video. Since a video can be played using the stage lights, the video can be moved around on the stage in a different way than in the prior art. Moreover, since each of the lights is movable, the video can not only be moved in pan and tilt directions, but can also be rotated by moving the pointing location of the lights.
  • For example, the video pixels can be rotated in the plane of the paper as shown in FIG. 1. One advantage is that this enables assembling a color motion pattern through video that would take a very long time to carry out on a console using cues. The motion playback is fluid, rather than the chunks of commands that would be given in a series of cues.
  • The “superpixels” shown by the lights can themselves have many items of information. For example, each stage light that is displaying the video may itself be a digital light, capable of displaying a video of 640×480 pixels. In this case, each “pixel” displayed by each stage light is itself a superpixel, formed of many subpixels. In this case, a 10×10 array could display 100 superpixels, each of 640×480 pixels.
  • In the case where the source video has less resolution than that which could be displayed by the 100 superpixels, the resolution displayed by the superpixels can be adjusted, e.g., each displaying lamp can be set so that multiple of its individual pixels display the same source pixel.
  • The arrays can be non-square, for example, in order to accommodate the usual 1:1-1/3 (i.e., 4:3) aspect ratio.
  • Another embodiment may use conventional lights to display each pixel of a video in significantly reduced resolution. This may rely on heavy downsampling of the video: for example, a 640×480 video might be downsampled to 10×8, with each downsampled pixel representing a 64×60 neighborhood of original pixels.
  • The downsampling can use averaging of multiple pixels and replacing the pixel by its average; or replacing the pixel by its median, or any other technique, both for color and for brightness of the downsampled pixel.
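The block-averaging form of downsampling described above might look like the following sketch. The function name and frame layout are assumed for illustration; a median or any other reduction could be substituted for the mean, as the text notes.

```python
def downsample(frame, out_w, out_h):
    """Downsample an RGB frame by block averaging.

    frame -- list of rows, each row a list of (r, g, b) tuples
    Each output pixel is the per-channel mean of its block of
    source pixels, covering both color and brightness.
    """
    in_h, in_w = len(frame), len(frame[0])
    bh, bw = in_h // out_h, in_w // out_w  # source block per output pixel
    out = []
    for oy in range(out_h):
        row = []
        for ox in range(out_w):
            block = [frame[y][x]
                     for y in range(oy * bh, (oy + 1) * bh)
                     for x in range(ox * bw, (ox + 1) * bw)]
            n = len(block)
            row.append(tuple(sum(p[i] for p in block) // n for i in range(3)))
        out.append(row)
    return out
```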
  • Another aspect relates to a very specific new way of forming a console that does not require cues to carry out the operation. FIG. 1 shows this console. According to the system in FIG. 1, there is a graphical user interface 135 that allows selecting of media, here video, to be played. The console also includes a timeline 160, which may start at time 0, or more generally may start at the time “now” 161. The video or other media can be dragged onto the timeline at a specific time, causing the console to then command that the video be played at the specific time set in the timeline. Moreover, at a specific time such as time 162, there may be a number of different items which are substantially simultaneously placed into the cue. FIG. 1 shows the video 136 being placed into the cue. However, there are also other layers at 162, including layers shown such as 163.
  • Another item can be placed in a different layer of the timeline, to be displayed at the same time. For example, a layer may be a see-through layer, a blurring layer, or the like.
  • 170 represents a location line, and a virtual stage is shown as 175. Different areas on that stage, for example the area 126, are shown. The operator can select that area 126 and drag it into the location line adjacent the video 137 and the other items which have been placed into section 162. This produces, for example, a format where the video 137 is associated with the location 126 as shown in the blowup section 173.
  • In addition, another area such as 111 can also be placed in the same area with the same or a different video. Each of these collectively represents a specific time slot here shown as 162. Any number of these timeslots may be provided.
  • In addition, there can be another layer shown generally as 163 which is also at the spot 126. By displaying a video at location 126, all of the lights of the array 120 are automatically assigned to the video, and caused to display the video as shown.
  • Another layer shown as 180 is a modification layer, and may be used for dimmers, colors, or other modifications to the other information previously stated.
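A minimal sketch of the timeline behavior described above, assuming a simple time-keyed store of layered items; all class, method, item, and layer names here are hypothetical, not taken from the patent.

```python
# Hypothetical sketch: items are dragged onto layered timeslots, and at
# each tick the console triggers every item whose slot time matches "now".

class TimelineConsole:
    def __init__(self):
        self.slots = {}  # time -> list of (layer, item, stage_area)

    def place(self, time, layer, item, area=None):
        """Drop an item onto the timeline at a given time and layer."""
        self.slots.setdefault(time, []).append((layer, item, area))

    def tick(self, now):
        """Return the items to trigger at this time, lowest layer first."""
        return sorted(self.slots.get(now, []), key=lambda e: e[0])

console = TimelineConsole()
console.place(162, layer=0, item="video_137", area="area_126")
console.place(162, layer=1, item="blur_layer")
# at time 162, both entries fire together: the video first, the
# modification layer above it
```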
  • The above has shown the lights being arranged in a two-dimensional array 120. However, the lights can also be linearly arranged and still controlled via pan and tilt controls to direct their output into a two-dimensional display. FIG. 2 shows a truss 201, a conventional linear truss with a number of lamps 210 thereon, linearly arranged. The lights are controlled by the controller 130 according to their specific locations, in order to make the linearly-arranged lights appear as a two-dimensional array 126 as shown. In fact, the displaying lamps 210 can be located in any location, on opposite sides of the stage or anywhere, and can be controlled together with other lamps to form the superpixels of the projection.
  • As in the other embodiments, the positions of the lights within the array 126 can be controlled. In this embodiment, each lamp is controlled according to its own position and the position of the pixel it projects. For example, light 205 may be projected to pixel position 206.
  • Light 205 is at a known position x,y,z. The location of the video may have an origin shown as zero in FIG. 3, and the pixel 206 is at a specified location relative to that origin. According to an embodiment, a transformation is carried out which determines the pan and tilt position of the light based on its known location x,y,z, the position of the origin zero, and the position of the pixel 206 relative to the origin zero. At time t2, the video will have moved to a new origin shown as O1, and the light is thereafter moved to the position of the pixel at 207.
  • A geometric transformation between the known point in space of the light 205, and the pixels, can therefore be carried out in this way.
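One plausible form of this geometric transformation is sketched below; the coordinate conventions, angle definitions, and function names are assumptions for illustration, not the patent's own formulation.

```python
import math

# Hypothetical sketch: given a lamp's fixed position (x, y, z) and a target
# pixel position on the stage (video origin plus pixel offset), compute the
# pan and tilt angles that aim the lamp at that pixel.

def pan_tilt(lamp, origin, pixel_offset):
    """lamp: (x, y, z) of the fixture; origin: (x, y, z) of the video's
    zero point; pixel_offset: (dx, dy) of the pixel within the video.
    Returns (pan, tilt) in degrees."""
    tx = origin[0] + pixel_offset[0]
    ty = origin[1] + pixel_offset[1]
    tz = origin[2]
    dx, dy, dz = tx - lamp[0], ty - lamp[1], tz - lamp[2]
    pan = math.degrees(math.atan2(dy, dx))
    tilt = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return pan, tilt

# When the video origin moves at time t2, re-running pan_tilt with the new
# origin yields the new angles, so every lamp tracks its assigned pixel.
```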
  • In the above, digital lights form superpixels of the displayed video, while analog lights each form a single pixel of the displayed video. Another embodiment may allow each of the analog lights to take the place of multiple pixels in the video. For example, certain lights, such as the Icon™, allow projecting a split beam, with a split between two different colors. The beam can form a pixel with two different colors as shown in FIG. 4, where the light 400 produces a beam 402 which has one pixel 404 on the left and another pixel 406 on the right. In similar ways, the light beam can be divided into four or more pieces, so that each light beam represents four pixels. The breakup of the light beam into multiple parts can use a digital light, for example, which can use any kind of spatial light modulator.
  • Although only a few embodiments have been disclosed in detail above, other embodiments are possible and the inventors intend these to be encompassed within this specification. The specification describes specific examples to accomplish a more general goal that may be accomplished in another way. This disclosure is intended to be exemplary, and the claims are intended to cover any modification or alternative which might be predictable to a person having ordinary skill in the art. For example, other lights and controls can be used. While the above describes pixels of a video, it should be understood that this system can represent pixels of a still image.
  • Those of skill would further appreciate that the various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the exemplary embodiments of the invention.
  • The various illustrative logical blocks, modules, and circuits described in connection with the embodiments disclosed herein, may be implemented or performed with a general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. These devices may also be used to select values for devices as described herein.
  • The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in Random Access Memory (RAM), flash memory, Read Only Memory (ROM), Electrically Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor. The processor and the storage medium may reside in an ASIC. The ASIC may reside in a user terminal. In the alternative, the processor and the storage medium may reside as discrete components in a user terminal.
  • In one or more exemplary embodiments, the functions described may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium. Computer-readable media includes both computer storage media and communication media, including any medium that facilitates transfer of a computer program from one place to another. A storage medium may be any available medium that can be accessed by a computer. By way of example, and not limitation, such computer-readable media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. Also, any connection is properly termed a computer-readable medium. For example, if the software is transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.
  • Also, the inventors intend that only those claims which use the words “means for” are intended to be interpreted under 35 USC 112, sixth paragraph. Moreover, no limitations from the specification are intended to be read into any claims, unless those limitations are expressly included in the claims. The computers described herein may be any kind of computer, either general purpose, or some specific purpose computer such as a workstation. The programs may be written in C, Java, Brew, or any other programming language. The programs may be resident on a storage medium, e.g., magnetic or optical, such as a computer hard drive, a removable disk or media such as a memory stick or SD media, or other removable medium. The programs may also be run over a network, for example, with a server or other machine sending signals to the local machine, which allows the local machine to carry out the operations described herein.
  • Where a specific numerical value is mentioned herein, it should be considered that the value may be increased or decreased by 20%, while still staying within the teachings of the present application, unless some different range is specifically mentioned. Where a specified logical sense is used, the opposite logical sense is also intended to be encompassed.
  • The previous description of the disclosed exemplary embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these exemplary embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (21)

1. A method comprising:
first controlling a plurality of remotely controllable lamps using a first control such that each of the remotely controllable lamps creates a separate output at a separate location based on said first control, and at least one of said remotely controllable lamps produces an output with content that is different than another of said remotely controllable lamps based on said first control; and
second controlling the plurality of remotely controllable lamps using a second control at a different time than said first control, such that the remotely controllable lamps collectively produce a frame of a video responsive to said second control at said different time, and subsequently produce another frame of the same video at another time subsequent to said different time.
2. A method as in claim 1, wherein said second controlling produces outputs to the remotely controllable lamps such that each of the lamps produces a portion of said frame of said video.
3. A method as in claim 2, wherein said first controlling and said second controlling also comprises controlling a pointing location of said plurality of remotely controllable lamps in pan and tilt directions.
4. A method as in claim 2, wherein each of said remotely controllable lamps forms a superpixel of said frame of said video, said superpixel having a plurality of pixels being projected thereby.
5. A method as in claim 2, further comprising, during said second controlling, moving each of the plurality of lamps at the same time to thereby move a position of display of the video.
6. A method as in claim 3, further comprising controlling said lamps to collectively form a rectangular shaped array display.
7. A method as in claim 1, further comprising controlling both a projection light and an LED light using the same control form at said first time.
8. A method as in claim 1, further comprising controlling display using a computer by dragging items to be displayed onto a timeline of a user interface, and automatically displaying said items at a time related to a position on said timeline.
9. A method, comprising:
using a computer for forming a user interface having a timeline representing future times;
using the computer for placing items to be displayed on said timeline;
using the computer for automatically determining a current time, automatically determining items which have been placed on the timeline at times corresponding to the current time, and automatically causing at least one remotely controllable light to display the items on the timeline based on said current time matching the time at which said items are on the timeline.
11. A method as in claim 10, wherein said placing items comprises placing items on the timeline to be displayed by multiple different lamps.
12. A method as in claim 11, wherein said placing comprises placing different items into different layers.
13. A method as in claim 12, wherein one layer modifies the video on another layer.
14. A method as in claim 13, wherein said modifies comprises changing an aspect of display of a video.
15. An apparatus, comprising:
a controller device which produces outputs for controlling a plurality of remotely controllable lamps, said controller controlling in a first control mode to produce outputs such that each of the remotely controllable lamps creates a separate output at a separate location based on said first control, and at least one of said remotely controllable lamps displays content that is independent from another of said remotely controllable lamps based on said first control, said controller operating in a second control mode to control the plurality of remotely controllable lamps using a second control at a different time than said first control, such that the remotely controllable lamps collectively produce a first frame of a video responsive to said second control at said different time, and to subsequently produce another frame of the same video at another time subsequent to said different time.
16. An apparatus as in claim 15, wherein said second controlling by said controller produces outputs to the remotely controllable lamps such that each of the lamps produces a portion of said frame of said video.
17. An apparatus as in claim 16, wherein said controller device also controls a pointing location of said plurality of remotely controllable lamps in pan and tilt directions.
18. An apparatus as in claim 15, wherein each of said remotely controllable lamps forms a superpixel of said frame of said video in said second control mode, said superpixel having a plurality of pixels being projected thereby.
19. An apparatus as in claim 16, further comprising, during said second controlling, moving each of the plurality of lamps at the same time to thereby move a position of display of the video.
20. An apparatus as in claim 16, further comprising controlling said lamps to collectively form a rectangular shaped array display.
21. An apparatus as in claim 15, wherein said output controls both a projection light and an LED light using the same control form at said first time.
22. An apparatus as in claim 15, wherein said controller has a timeline, and allows dragging items to be displayed onto the timeline of a user interface, and automatically displaying said items at a time related to a position on said timeline.

Priority Applications (2)

Application Number Priority Date Filing Date Title
US12/613,736 US20100117934A1 (en) 2008-11-06 2009-11-06 Moving Lights forming Pixels of a Video
US15/088,422 US10268356B2 (en) 2008-11-06 2016-04-01 Moving lights forming pixels of a video

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US11213308P 2008-11-06 2008-11-06
US12/613,736 US20100117934A1 (en) 2008-11-06 2009-11-06 Moving Lights forming Pixels of a Video

Publications (1)

Publication Number Publication Date
US20100117934A1 true US20100117934A1 (en) 2010-05-13

Also Published As

Publication number Publication date
US10268356B2 (en) 2019-04-23
US20160283100A1 (en) 2016-09-29
