US20150348411A1 - Information control apparatus, data analyzing apparatus, signal, server, information control system, signal control apparatus, and program - Google Patents


Info

Publication number
US20150348411A1
Authority
US
United States
Prior art keywords
signal
information
unit
vehicle
basis
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/820,495
Inventor
Hideya Inoue
Koji Yamagaki
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nikon Corp
Original Assignee
Nikon Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2010177725A external-priority patent/JP2012038089A/en
Priority claimed from JP2010177724A external-priority patent/JP5724241B2/en
Application filed by Nikon Corp filed Critical Nikon Corp
Priority to US14/820,495 priority Critical patent/US20150348411A1/en
Publication of US20150348411A1 publication Critical patent/US20150348411A1/en
Priority to US15/907,219 priority patent/US10977938B2/en
Priority to US17/225,990 priority patent/US20210225166A1/en

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 Traffic control systems for road vehicles
    • G08G 1/07 Controlling traffic signals
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 Traffic control systems for road vehicles
    • G08G 1/01 Detecting movement of traffic to be counted or controlled
    • G08G 1/04 Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 Traffic control systems for road vehicles
    • G08G 1/07 Controlling traffic signals
    • G08G 1/08 Controlling traffic signals according to detected number or speed of vehicles
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N 7/181 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast, for receiving images from a plurality of remote sources
    • H04N 7/188 Capturing isolated or intermittent images triggered by the occurrence of a predetermined event, e.g. an object reaching a predetermined position

Definitions

  • the present invention relates to an information control apparatus, data analyzing apparatus, signal, server, information control system, signal control apparatus, and program.
  • a television camera apparatus for traffic monitoring in which a camera is mounted in a traffic signal and which takes photographs near the intersection and monitors the traffic situation using the photographed image such as described in Japanese Unexamined Patent Application Publication No. 11-261990.
  • a traffic signal receives a control signal for changing a display state of a signal display unit from a high-order control apparatus and changes the display state of the signal display unit on the basis of the received control signal, for example, such as described in Japanese Unexamined Patent Application Publication No. 10-97696.
  • the television camera apparatus for traffic monitoring disclosed in Japanese Unexamined Patent Application Publication No. 11-261990 displays a photographed image on a monitor, and a person needs to view and confirm the photographed image in order to acquire the information regarding a vehicle within the photographed image, for example.
  • when a person extracts the information regarding a vehicle and the like within the photographed image on the basis of an image photographed at a traffic signal, there has been a problem in that a large amount of time and effort is required.
  • the aspect related to the present invention has been made in view of the above point, and it is an object of the aspect related to the present invention to provide an information control apparatus capable of acquiring the information regarding a vehicle and the like within an image photographed near a traffic signal, data analyzing apparatus, signal, server, information control system, and program.
  • An aspect of the present invention has been made to solve the above-described problems and is characterized in that a determination unit which determines at least an attribute of objects to be analyzed based on the captured image data acquired by an imaging apparatus fixed to a signal, and an output unit which outputs the determination result information of the determination unit to a data analyzing unit that generates an analyzing result information which is at least based on the attributes of the object to be analyzed, are provided.
  • in another aspect, a control unit which changes a display state of a signal display unit fixed to a signal based on an image captured by an imaging unit, and a battery unit which supplies electric power to each component of the own apparatus, are provided.
  • FIG. 1 is a schematic diagram showing the configuration of an information control system related to a first embodiment of the present invention.
  • FIG. 2 is a block diagram showing each configuration of the information control system related to the first embodiment of the present invention.
  • FIG. 3 shows an example of a table of signal surrounding information related to the first embodiment of the present invention.
  • FIG. 4 shows an example of a table of vehicle attribution information related to the first embodiment of the present invention.
  • FIG. 5 shows an example of a table of person attribution information related to the first embodiment of the present invention.
  • FIG. 6 is a flow chart for explaining the processing flow of a signal information control apparatus related to the first embodiment of the present invention.
  • FIG. 7 is a flow chart for explaining the processing flow of a signal information control server related to the first embodiment of the present invention.
  • FIG. 8 is a flow chart for explaining the processing flow of a mobile communication device related to the first embodiment of the present invention.
  • FIG. 9 is a block diagram showing the configuration of a signal system and a signal control apparatus according to a second embodiment of the present invention.
  • FIG. 10 is a state transition diagram illustrating the transition between modes of a signal according to the second embodiment of the present invention.
  • FIG. 11 is an explanatory view showing a crossroads (first example) as an example in the second embodiment of the present invention.
  • FIG. 12 is an explanatory view showing an example of the detected traffic volume in the second embodiment of the present invention.
  • FIG. 13 is an explanatory view showing an example of periods of lighting and extinguishing of a signal lamp in the second embodiment of the present invention.
  • FIG. 14 is an operation diagram showing the operation of the signal according to the second embodiment of the present invention.
  • FIG. 15 is an explanatory view showing a crossroads (second example) as an example in the second embodiment of the present invention.
  • FIG. 1 shows the configuration of an information control system related to the first embodiment.
  • an information control system 100 includes a plurality of signal information control apparatus 2 _ 1 , 2 _ 2 , . . . , 2 _n mounted in a plurality of signals (traffic signals) 1 _ 1 , 1 _ 2 , . . . , 1 _n and a signal information control server 3 .
  • the plurality of signal information control apparatus 2 _ 1 , 2 _ 2 , . . . , 2 _n and the signal information control server 3 are communicably connected to each other through a network NW.
  • the plurality of signal information control apparatus 2 _ 1 , 2 _ 2 , . . . , 2 _n and the signal information control server 3 can communicate with a plurality of mobile communication devices 5 _ 1 , . . . , 5 _m, which is mounted in a plurality of vehicles 4 _ 1 , . . . , 4 _m, through the network NW.
  • a communication device with a car navigation function and the like mounted in a vehicle is described as an example of the mobile communication devices 5 _ 1 , . . . , 5 _m herein, the present invention is not limited to this, and each of the mobile communication devices 5 _ 1 , . . . , 5 _m may be a personal computer or the like of a user in a vehicle.
  • FIG. 2 is a block diagram showing an example of the configuration.
  • an example of a signal applicable to the plurality of signals 1 _ 1 , 1 _ 2 , . . . , 1 _n is set as a signal (traffic signal) 1
  • an example of a signal information control apparatus applicable to the plurality of signal information control apparatus 2 _ 1 , 2 _ 2 , . . . , 2 _n is set as a signal information control apparatus 2
  • an example of a mobile communication device applicable to the plurality of mobile communication devices 5 _ 1 , . . . , 5 _m is set as the mobile communication device 5 .
  • the signal information control apparatus 2 and an imaging apparatus 6 are fixed to the signal 1 .
  • the signal 1 and the signal information control apparatus 2 are connected to each other through an I/F (interface) 71 .
  • the signal information control apparatus 2 and the imaging apparatus 6 are connected to each other through an I/F 72 .
  • This signal 1 includes a signal display unit 11 and a control device 12 .
  • the signal display unit 11 includes light emitting sections which emit light of green (or blue), red, yellow, and the like and is controlled by the control device 12 so that the light emitting section of each color emits light at a predetermined timing.
  • the control device 12 controls the light emitting section of each color of the signal display unit 11 so that predetermined signal display can be performed. This control device 12 may control the light emitting section of each color to emit light according to a timing determined in advance, or may control the light emitting section of each color to emit light according to a control signal input from the signal information control apparatus 2 .
  • the imaging apparatus 6 includes an imaging unit 61 which is a camera capable of capturing a moving image or an image, for example.
  • This imaging unit 61 captures an image or a video image near an intersection where the signal 1 is fixed, and outputs the captured image, which is obtained by imaging, to the signal information control apparatus 2 through the I/F 72 .
  • an object to be analyzed is a vehicle
  • a high-resolution camera capable of detecting the specific features indicating the license plate number, vehicle type, and the like by image processing is used as the imaging apparatus 6 .
  • the imaging unit 61 acquires and outputs the captured image data at intervals of 1/60 second, for example.
  • the signal information control apparatus 2 includes an image processing unit 21 , a determination unit 22 , a signal surrounding information generating unit 23 , a storage unit 24 , a control unit 25 , a communication unit 26 , a temperature sensor 27 , a timepiece unit 28 , and a microphone 29 .
  • a signal ID, which is a unique identification number, is given to the signal information control apparatus 2 .
  • This signal ID is information for identifying each signal information control apparatus 2 and is also information matched with the position where the signal information control apparatus 2 is placed.
  • in the temperature sensor 27 , a sensor section which detects the temperature is mounted in the signal 1 in a state exposed to the outside of the signal 1 ; the temperature sensor 27 detects a temperature near the signal 1 and outputs the temperature information indicating this temperature to the signal surrounding information generating unit 23 .
  • the timepiece unit 28 measures date and time and outputs the information indicating the measured date and time to the signal surrounding information generating unit 23 .
  • the microphone 29 is mounted in the signal 1 in a state exposed to the outside of the signal 1 , and the microphone 29 detects a sound near the signal 1 and outputs the sound information indicating this sound to the signal surrounding information generating unit 23 .
  • the captured image data acquired by the imaging apparatus 6 is input to the image processing unit 21 through the I/F 72 , and the image processing unit 21 detects an object to be analyzed which is present within an image of the captured image data.
  • the image processing unit 21 detects an object which moves (moving object), such as a vehicle or a person, as an object to be analyzed.
  • the image processing unit 21 calculates a motion vector of the captured image data which continues in time series, and detects an image region corresponding to the moving object on the basis of the motion vector and also detects the moving speed of the moving object which is an object to be analyzed.
  • the image processing unit 21 assigns a unique image ID to each item of the input captured image data and also outputs the captured image data, the image ID, and the information indicating the detected moving speed to the signal surrounding information generating unit 23 in a state matched with each other.
  • the information which is output from the image processing unit 21 and which is obtained by matching the captured image data, the image ID, and the information indicating the moving speed with each other is called image processing result information hereinafter.
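The motion-vector computation performed by the image processing unit 21 can be sketched as follows. This is a minimal illustration, not the patent's implementation: a single block is matched between two consecutive frames by exhaustive search, and a moving speed is derived from the displacement using the 1/60-second frame interval stated above. The block size, search range, and metres-per-pixel scale are assumed values.

```python
def block_motion_vector(prev_frame, curr_frame, bx, by, size=4, search=3):
    """Find the displacement (dx, dy) of the size x size block at (bx, by)
    in prev_frame that best matches curr_frame, by exhaustive search within
    +/- search pixels, minimizing the sum of absolute differences (SAD).
    Assumes the search window stays inside the frame (real code would pad)."""
    def sad(dx, dy):
        total = 0
        for y in range(size):
            for x in range(size):
                total += abs(curr_frame[by + dy + y][bx + dx + x]
                             - prev_frame[by + y][bx + x])
        return total
    return min(((dx, dy) for dy in range(-search, search + 1)
                         for dx in range(-search, search + 1)),
               key=lambda v: sad(*v))

def speed_from_vector(dx, dy, metres_per_pixel, fps=60):
    """Frames arrive every 1/60 s, so the per-frame displacement times the
    frame rate, scaled by a (here assumed) metres-per-pixel factor, gives
    a rough moving speed in metres per second."""
    return ((dx * dx + dy * dy) ** 0.5) * metres_per_pixel * fps
```

A real implementation would compute vectors for many blocks and group adjacent moving blocks into the image regions that are handed to the determination unit 22.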
  • the data of an image region corresponding to the object to be analyzed (here, a moving object) is input from the image processing unit 21 to the determination unit 22 .
  • the determination unit 22 determines the information regarding the moving object included in the image region by performing pattern recognition on the data of the image region and outputs the determination result information indicating this determination result.
  • the determination unit 22 determines the number of moving objects, type (a vehicle or a person) of a moving object, and the attribute of a moving object on the basis of the data of the image region corresponding to the object to be analyzed (moving object) detected by the image processing unit 21 and outputs the determination result information indicating the number, types, and attributes of moving objects.
  • the determination unit 22 acquires the attributes of the vehicle, such as the type (a bicycle, a large motorbike, a motor scooter, a sedan type automobile, a minivan type automobile, a light truck, or a heavy truck), a vehicle body color, the license plate number, the number of occupants, driver's sex, driver's age, and the like by pattern recognition, for example.
  • the determination unit 22 acquires the attributes of the person, such as, for example, sex, age, height, clothing, moving method (on foot, a bicycle, or a motorbike), belongings (kinds of belongings, such as a baby carriage or a stick), and the like by pattern recognition.
  • the determination unit 22 determines the weather at the time of imaging by analyzing the captured image data input from the image processing unit 21 . For example, the determination unit 22 determines whether the weather at the time of imaging is sunny, cloudy, rain, or snow on the basis of the brightness, the color of the sky, the existence of rain or snow, and the like in the captured image data.
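The weather determination can be illustrated with a rough brightness-based classifier. The thresholds below are hypothetical and rain/snow streak detection is omitted; the description only states that brightness, sky color, and the presence of rain or snow are used.

```python
def classify_weather(pixels, bright=170, dim=90):
    """Crude weather guess from mean frame brightness alone (thresholds
    are illustrative, not from the patent): bright scenes read as sunny,
    mid-range as cloudy, dark as rain."""
    mean = sum(sum(row) for row in pixels) / (len(pixels) * len(pixels[0]))
    if mean >= bright:
        return "sunny"
    if mean >= dim:
        return "cloudy"
    return "rain"
```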
  • the signal surrounding information generating unit 23 writes the input information in, for example, a table of signal surrounding information shown in FIG. 3 , a table of vehicle attribution information shown in FIG. 4 , and a table of person attribution information shown in FIG. 5 all of which are stored in the storage unit 24 in advance.
  • the image processing result information acquired by the image processing unit 21 and the determination result information acquired by the determination unit 22 are input to the signal surrounding information generating unit 23 , and the signal surrounding information generating unit 23 writes the image processing result information and the determination result information in each corresponding table of the signal surrounding information, the vehicle attribution information, and the person attribution information.
  • the signal surrounding information generating unit 23 writes each item of the captured image data in each corresponding table of the signal surrounding information table, the vehicle attribution information table, and the person attribution information table together with the temperature information indicating the temperature near the signal 1 detected by the temperature sensor 27 , the time information indicating a time measured by the timepiece unit 28 , and the sound information indicating the sound near the signal 1 acquired by the microphone 29 , respectively.
  • the table of signal surrounding information will be described with reference to FIG. 3 .
  • an example of the information based on the captured image data acquired at intervals of 5 minutes among the captured image data continuously acquired at intervals of 1/60 second by the imaging apparatus 6 is shown herein.
  • the present invention is not limited to the above-described configuration, and image processing result information and determination result information based on all items of the captured image data may be made to match the table of signal surrounding information, the table of vehicle attribution information, and the table of person attribution information.
  • the information based on the captured image data acquired at certain fixed intervals may be made to match each table, or only the information acquired on the basis of the captured image data acquired when a moving object is detected by the image processing unit 21 may be made to match each table.
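The selection of which captured image data feeds the tables — one frame per fixed interval, or only frames where a moving object is detected — might be sketched as follows. The 5-minute interval and 1/60-second frame period come from the description above; the function and parameter names are illustrative.

```python
def select_for_tables(samples, interval_s=300, frame_period_s=1 / 60, motion=None):
    """Pick which captured frames are matched to the tables: either one
    frame per interval_s (the 5-minute example of FIG. 3), or, when a
    motion predicate is supplied, only the frames in which the image
    processing unit detected a moving object."""
    if motion is not None:
        return [s for s in samples if motion(s)]
    step = round(interval_s / frame_period_s)  # 18000 frames at 60 fps
    return samples[::step]
```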
  • the table of signal surrounding information is a table in which an image ID, date and time, the number of vehicles, the number of persons, vehicle attribution information, person attribution information, the weather, temperature, a noise level, and a signal lighting color are matched with each other.
  • the date and time is information indicating the date and time at which the captured image data of a corresponding image ID is acquired by the imaging apparatus 6 . Alternatively, the date and time at which the captured image data is input from the imaging apparatus 6 to the signal information control apparatus 2 through the I/F 72 may be used as the imaging timing. The date and time is measured by the timepiece unit 28 .
  • the number of vehicles is the number of vehicles included in the captured image data of the corresponding image ID.
  • the number of persons is the number of persons included in the captured image data of the corresponding image ID.
  • the vehicle attribution information includes a vehicle ID of each vehicle included in the captured image data of the corresponding image ID.
  • This vehicle ID is unique information assigned to each vehicle within an image, and is an identifier for matching each vehicle with its attribution information with reference to the table of vehicle attribution information shown in FIG. 4 .
  • the person attribution information includes a person ID of each person included in the captured image data of the corresponding image ID.
  • This person ID is unique information assigned to each person within an image, and is an identifier for matching each person with its attribution information with reference to the table of person attribution information shown in FIG. 5 .
  • the weather is information indicating the weather determined by the determination unit 22 .
  • the weather information may be the information obtained by the communication unit 26 .
  • the temperature is information indicating a temperature detected by the temperature sensor 27 when the captured image data is imaged by the imaging apparatus 6 .
  • the noise level is information indicating the sound volume of sound information determined by the signal information control apparatus 2 on the basis of the sound information acquired by the microphone 29 .
  • the signal lighting color indicates a color (green, red, yellow) lit by the signal display unit 11 of the signal 1 when the captured image data of the corresponding image ID is imaged.
  • the information indicating the signal lighting color is included in a control signal output from the control unit 25 to the signal 1 , is output from the control unit 25 to the determination unit 22 , and is input from the determination unit 22 to the signal surrounding information generating unit 23 together with determination result information.
  • the table of vehicle attribution information is a table in which a vehicle ID, a license plate number, a vehicle type, a vehicle body color, the number of occupants, driver's sex, driver's age, and traveling speed are matched with each other.
  • the vehicle ID is information which specifies each vehicle included in the captured image data of the corresponding image ID.
  • the license plate number, the vehicle type, the vehicle body color, the number of occupants, the driver's sex, the driver's age, and the traveling speed are information indicating the attributes of a vehicle indicated by the vehicle ID.
  • the person attribution information table is a table in which a person ID, sex, age, height, clothing, moving method (on foot, a bicycle, or a motorbike), belongings (kinds of belongings, such as a baby carriage or a stick), and the walking speed are matched with each other.
  • the person ID is information which specifies each person included in the captured image data of the corresponding image ID.
  • the age, sex, height, clothing, moving method, belongings, and walking speed are information indicating the attributes of a person indicated by the person ID.
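The three tables of FIGS. 3 to 5 can be modeled as linked records, with the vehicle IDs and person IDs in the signal surrounding information acting as keys into the two attribution tables. The field names and types below are illustrative; the numbers of vehicles and persons in FIG. 3 follow from the lengths of the ID lists.

```python
from dataclasses import dataclass, field

@dataclass
class VehicleAttribution:      # one row of the FIG. 4 table
    vehicle_id: str
    license_plate: str
    vehicle_type: str          # e.g. "sedan", "heavy truck"
    body_color: str
    occupants: int
    driver_sex: str
    driver_age: int
    speed_kmh: float

@dataclass
class PersonAttribution:       # one row of the FIG. 5 table
    person_id: str
    sex: str
    age: int
    height_cm: int
    clothing: str
    moving_method: str         # "on foot", "bicycle", "motorbike"
    belongings: str
    walking_speed_kmh: float

@dataclass
class SignalSurroundingRecord:  # one row of the FIG. 3 table
    image_id: str
    date_time: str
    vehicle_ids: list = field(default_factory=list)  # keys into FIG. 4
    person_ids: list = field(default_factory=list)   # keys into FIG. 5
    weather: str = ""
    temperature_c: float = 0.0
    noise_level_db: float = 0.0
    signal_color: str = ""      # "green", "yellow", "red"
```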
  • the storage unit 24 stores a table of signal surrounding information, a table of vehicle attribution information, a table of person attribution information, and the captured image data matched therewith. In addition, the storage unit 24 stores a signal ID assigned in advance to each signal information control apparatus 2 .
  • the control unit 25 generates a control signal for controlling the lighting timing of a light emitting section of each color of the signal display unit 11 so that predetermined signal display of the signal 1 is performed, and outputs the control signal to the signal 1 through the I/F 71 .
  • the communication unit 26 is communicably connected to the signal information control server 3 through the network NW.
  • the communication unit 26 transmits the table of signal surrounding information, the table of vehicle attribution information, the table of person attribution information, and the captured image data, which are stored in the storage unit 24 , to the signal information control server 3 periodically or in response to the request from the signal information control server 3 .
  • the communication unit 26 transmits the transmission information so as to match the signal ID.
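The transmission performed by the communication unit 26 — the stored tables tagged with the apparatus's signal ID so that the server can resolve the signal's installed position — could look like the following. The JSON layout and field names are assumptions, not from the patent.

```python
import json

def build_upload(signal_id, surroundings, vehicles, persons, images=None):
    """Bundle the three stored tables (and optionally the captured image
    data) with the signal ID of the transmitting apparatus, so the
    receiving server can map the upload to the signal's position."""
    return json.dumps({
        "signal_id": signal_id,
        "signal_surrounding": surroundings,
        "vehicle_attribution": vehicles,
        "person_attribution": persons,
        "images": images or [],
    })
```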
  • a power supply unit 73 supplies stored electric power to the signal 1 , the signal information control apparatus 2 , and the imaging apparatus 6 .
  • the signal information control server 3 includes a communication unit 31 , a data analyzing unit 32 , an output unit 33 , and a storage unit 34 .
  • the communication unit 31 is communicably connected to the signal information control apparatus 2 through the network NW.
  • the communication unit 31 outputs to the data analyzing unit 32 the table of signal surrounding information, the table of vehicle attribution information, the table of person attribution information, and the captured image data received from the signal information control apparatus 2 .
  • the data analyzing unit 32 stores the table of signal surrounding information, the table of vehicle attribution information, the table of person attribution information, and the captured image data, which have been received from the signal information control apparatus 2 through the communication unit 31 , in the storage unit 34 .
  • the data analyzing unit 32 performs various kinds of data analyses, which will be described later, on the basis of the information stored in the storage unit 34 and generates the analysis result information, which is based on the attributes of the object to be analyzed, on the basis of the determination result information and the like.
  • the output unit 33 is a display device, such as a liquid crystal display, or a data communication unit that transmits the information, image data, or the like to an external device or the mobile communication device 5 and outputs the analysis result information generated by the data analyzing unit 32 .
  • the output unit 33 transmits the analysis result information to the vehicle.
  • the output unit 33 transmits the captured image data corresponding to a certain vehicle to that vehicle on the basis of the analysis result information.
  • the storage unit 34 stores the table of signal surrounding information, the table of vehicle attribution information, the table of person attribution information, and the captured image data received from the signal information control apparatus 2 .
  • This storage unit 34 includes a table in which each signal ID and the position, at which the signal 1 indicated by the signal ID is placed, are matched with each other.
  • the mobile communication device 5 includes a communication unit 51 , a control unit 52 , and an output unit 53 .
  • the communication unit 51 is communicably connected to the signal information control apparatus 2 and the signal information control server 3 through the network NW.
  • the communication unit 51 outputs to the control unit 52 the captured image data received from the signal information control apparatus 2 or the analysis result information received from the signal information control server 3 .
  • the control unit 52 performs control to output the captured image data and the analysis result information, which have been received through the communication unit 51 , to the output unit 53 .
  • the output unit 53 is a data output unit that outputs the data to a display device or an external display device, for example, and is controlled by the control unit 52 and outputs the captured image data and the analysis result information.
  • FIG. 6 is a flow chart for explaining an example of the processing flow of the signal information control apparatus 2 .
  • the imaging unit 61 of the imaging apparatus 6 images an image near the intersection. Then, the captured image data of the captured image is input to the image processing unit 21 of the signal information control apparatus 2 through the I/F 72 (step ST 1 ).
  • the image processing unit 21 assigns a unique image ID to the input captured image data. For example, the following explanation will be given using the case where the image processing unit 21 assigns an image ID “0002” to the input captured image data as an example.
  • the image processing unit 21 calculates a motion vector of the captured image data (image ID “0002”) and the captured image data (for example, an image with an image ID “0001”) acquired in the past which continues in time series.
  • the image processing unit 21 detects an image region corresponding to the moving object on the basis of the calculated motion vector and also calculates the moving speed of the moving object. For example, the image processing unit 21 detects 20 image regions corresponding to the moving object and calculates the moving speed of each image region.
  • the image processing unit 21 acquires the image processing result information including the captured image data, the image ID “0002”, data indicating the image region corresponding to the moving object (for example, information which specifies corresponding pixels and the pixel value), and information indicating the moving speed of each image region (step ST 2 ).
  • the image processing unit 21 matches the captured image data, the image ID “0002”, and the data indicating the image region corresponding to the moving object with each other and outputs them to the determination unit 22 .
  • the image processing unit 21 matches the captured image data, the image ID “0002”, and the information indicating the moving speed of each image region with each other and outputs them to the signal surrounding information generating unit 23 .
  • the determination unit 22 determines the information regarding the moving object included in the image region by performing pattern recognition on the data of the image region corresponding to the moving object input from the image processing unit 21 and outputs the determination result information indicating the determination result (step ST 3 ).
  • the determination unit 22 determines, by performing pattern recognition for determining the type and the number of moving objects, that the data of the plurality of image regions corresponding to the moving objects indicates 15 vehicles and five persons.
  • the determination unit 22 determines the attributes indicated by the data of the plurality of image regions corresponding to the vehicle, such as the vehicle type of the vehicle, a vehicle body color, the license plate number, the number of occupants, driver's sex, and driver's age, by performing pattern recognition for determining these attributes of the moving object (vehicle) determined in advance. In addition, the determination unit 22 determines the attributes indicated by the data of the plurality of image regions corresponding to a person, such as the age of the person, by performing pattern recognition for determining these attributes of the moving object (person) determined in advance. In addition, the determination unit 22 determines the weather at the time of imaging by analyzing the captured data input from the image processing unit 21 .
  • the signal surrounding information generating unit 23 generates the table of signal surrounding information, the table of vehicle attribution information, and the table of person attribution information on the basis of the image processing result information acquired by the image processing unit 21 , the determination result information acquired by the determination unit 22 , the temperature information indicating the temperature near the signal 1 detected by the temperature sensor 27 , the time information indicating a time measured by the timepiece unit 28 , and the sound information indicating the sound near the signal 1 acquired by the microphone 29 . That is, the signal surrounding information generating unit 23 stores the table of signal surrounding information, the table of vehicle attribution information, and the table of person attribution information in the storage unit 24 so as to match the input information (step ST 4 ).
  • the communication unit 26 gives a signal ID to the table of signal surrounding information, the table of vehicle attribution information, the table of person attribution information, and the captured image data, which are stored in the storage unit 24 , and transmits them to the signal information control server 3 (step ST 5 ).
  • the communication unit 26 determines whether or not the analysis result information, which is a result of data analysis of the signal information control server 3 , has been received (step ST 6 ).
  • the communication unit 26 transmits the analysis result information to the mobile communication device 5 by radio communication (step ST 7 ).
  • when the new captured image data is input from the imaging apparatus 6 (step ST 8 -YES), the signal information control apparatus 2 returns to step ST 2 again to repeat the processing.
  • FIG. 7 is a flow chart for explaining an example of the processing flow of the signal information control server 3 .
  • the communication unit 31 of the signal information control server 3 transmits a signal which requests transmission of the acquired information, for example, to the signal information control apparatus 2 and receives the table of signal surrounding information, the table of vehicle attribution information, the table of person attribution information, and the captured image data received from the signal information control apparatus 2 (step ST 21 ).
  • the communication unit 31 outputs the received information to the data analyzing unit 32 .
  • the data analyzing unit 32 stores the table of signal surrounding information, the table of vehicle attribution information, the table of person attribution information, and the captured image data, which have been received from the signal information control apparatus 2 through the communication unit 31 , in the storage unit 34 and performs desired data analysis, which will be described later, on the basis of the information stored in the storage unit 34 (step ST 22 ).
  • the data analyzing unit 32 transmits the analysis result information to the signal information control apparatus 2 , which is indicated by the signal ID and from which the data has been transmitted in step ST 21 , through the communication unit 31 (step ST 24 ).
  • the data analyzing unit 32 displays an image indicating the analysis result on a display screen of the output unit 33 , for example (step ST 26 ).
  • the output unit 33 may transmit the analysis result information to the corresponding mobile communication device 5 according to the analysis result information, or may transmit the captured image data to the corresponding mobile communication device 5 .
  • the output unit 33 may transmit the analysis result information or the captured image data to the corresponding mobile communication device 5 directly through the network NW, or may transmit the analysis result information or the captured image data to the corresponding mobile communication device 5 indirectly through the signal information control apparatus 2 .
  • when data analysis is not ended (step ST 27 -NO), the process returns to step ST 22 to repeat the processing.
  • FIG. 8 is a flow chart for explaining an example of the processing flow of the mobile communication device 5 .
  • the communication unit 51 of the mobile communication device 5 performs radio communication with the communication unit 26 of the signal information control apparatus 2 .
  • the communication unit 51 of the mobile communication device 5 outputs the information to the control unit 52 .
  • the control unit 52 displays an image indicating the analysis result on a display screen of the output unit 53 , for example (step ST 42 ).
  • the mobile communication device 5 may receive the information or data, which is directly transmitted from the output unit 33 of the signal information control server 3 , without being limited to the above method.
  • the data analyzing unit 32 can execute at least one of the data analyses described below.
  • the data analyzing unit 32 of the signal information control server 3 determines a dangerous area where a traffic accident tends to occur by data analysis, for example.
  • the data analyzing unit 32 calculates the incidence rate of sudden braking by a vehicle, which is an object to be analyzed, on the basis of the determination result information by statistical processing and acquires it as analysis result information. In addition, the data analyzing unit 32 calculates the incidence rate of sudden braking on the basis of the position corresponding to the signal ID by statistical processing for each area and acquires it as analysis result information.
  • the data analyzing unit 32 counts the number of vehicles on which sudden braking is applied, on the basis of the change in the traveling speed of each vehicle indicated in the table of vehicle attribution information.
  • the data analyzing unit 32 performs statistical processing of the rate of vehicles on which sudden braking is applied, for each intersection.
  • the analysis result information indicating, for each area, the rate of vehicles on which sudden braking is applied, acquired as described above, is useful information in that a traffic accident can be predicted to be likely at an intersection with a high rate of sudden braking, and a warning display or the like indicating the danger at that intersection can be performed.
  • the signal information control server 3 may make the signal information control apparatus 2 , which is mounted in the signal 1 at an intersection with a high rate of sudden braking, transmit the information for displaying a warning message to the mobile communication device 5 of a vehicle passing through the intersection, on the basis of the analysis result of the data analyzing unit 32 .
  • the data analyzing unit 32 may detect the traveling speed of each vehicle on the basis of the information in the tables of vehicle attribution information transmitted from the signal information control apparatus 2 of each of the plurality of adjacent signals 1 and may specify a vehicle on which sudden braking is applied.
  • the data analyzing unit 32 determines a vehicle indicating the same license plate number, vehicle type, vehicle body color, and the like to be the same vehicle on the basis of the information of tables of vehicle attribution information received from the signal information control apparatus 2 _ 1 , 2 _ 2 , and 2 _ 3 mounted in the signals 1 _ 1 , 1 _ 2 , and 1 _ 3 which are disposed continuously in the traveling direction of the lane.
  • the data analyzing unit 32 can determine whether or not the vehicle decelerates rapidly by comparing the traveling speed when the vehicle travels between the signals 1 _ 1 and 1 _ 2 , the traveling speed when the vehicle travels between the signals 1 _ 2 and 1 _ 3 , and the traveling speed immediately before the signal 1 _ 3 , and the like.
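The speed comparison across consecutive signals described above can be sketched as follows. This is a minimal illustration, not the embodiment's implementation; the function name, the example speed values, and the 30 km/h drop threshold are assumptions.

```python
# Hypothetical sketch: detecting sudden deceleration from segment speeds
# recorded between consecutive signals (e.g. 1_1-1_2, 1_2-1_3, and just
# before 1_3). The 30 km/h drop threshold is an assumed value.

def is_sudden_braking(segment_speeds_kmh, drop_threshold_kmh=30.0):
    """Return True if the speed drops by more than the threshold
    between any two consecutive segments."""
    for earlier, later in zip(segment_speeds_kmh, segment_speeds_kmh[1:]):
        if earlier - later > drop_threshold_kmh:
            return True
    return False

print(is_sudden_braking([58.0, 55.0, 20.0]))  # large drop before the last signal -> True
print(is_sudden_braking([58.0, 55.0, 52.0]))  # gradual deceleration -> False
```

A vehicle for which this check returns True would then be specified from the table of vehicle attribution information, as described above.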
  • the data analyzing unit 32 may detect a vehicle traveling in a state deviating from its lane, a vehicle traveling while passing other vehicles, a bicycle or a person crossing the roadway, and the like by analyzing the captured image data, and may calculate the detection rate thereof statistically. Such information is useful in that a location at which dangerous driving and the like occur can be discovered.
  • the data analyzing unit 32 may determine a possibility of collision by calculating the traveling speeds of vehicles entering the intersection from opposite directions. For example, the data analyzing unit 32 detects at least two vehicles, each of which is an object to be analyzed and enters the intersection from a different direction, on the basis of the determination result information, calculates the possibility of a collision of the vehicles at the intersection on the basis of the moving speeds of the vehicles, and acquires it as analysis result information.
  • signal information control apparatus 2 _ 11 , 2 _ 12 , and 2 _ 13 which are mounted in signal 1 _ 11 , 1 _ 12 , and 1 _ 13 disposed continuously in the traveling direction of a first lane
  • signal information control apparatus 2 _ 14 , 2 _ 15 , and 2 _ 16 which are mounted in signal 1 _ 14 , 1 _ 15 , and 1 _ 16 disposed continuously in a second lane crossing the first lane at the intersection U.
  • the signals 1 _ 13 and 1 _ 16 are set at the same intersection U and control the flow of traffic in the first and second lanes, respectively.
  • the data analyzing unit 32 determines a vehicle indicating the same license plate number, vehicle type, vehicle body color, and the like to be the same vehicle on the basis of the information of the table of vehicle attribution information and detects a vehicle A entering the intersection U from the first lane and a vehicle B entering the intersection U from the second lane.
  • the data analyzing unit 32 calculates the traveling speeds of the vehicles A and B and determines whether or not the vehicles A and B would enter the intersection U at the same timing if they continue to travel at those traveling speeds.
  • the data analyzing unit 32 determines that the possibility of collision is high and transmits the analysis result information, which indicates transmission of a message prompting slowing down because of the risk of collision, to the signal information control apparatus 2 of the signal 1 in the lane in which the vehicles A and B are traveling.
  • the signal information control apparatus 2 which receives this analysis result information transmits a message, which prompts slowing down because of the risk of collision, to the vehicle A or B traveling in the communications area.
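The timing-based collision check described above can be sketched as follows. The function names, distances, and the 2-second arrival window are assumptions for illustration, not the embodiment's implementation.

```python
# Hypothetical sketch of the collision-possibility check: if two vehicles
# approaching the intersection from different directions would arrive at
# nearly the same time, the possibility of collision is judged to be high.

def time_to_intersection(distance_m, speed_mps):
    """Seconds until a vehicle reaches the intersection at constant speed."""
    if speed_mps <= 0:
        return float("inf")  # a stopped vehicle never arrives
    return distance_m / speed_mps

def collision_risk(dist_a_m, speed_a_mps, dist_b_m, speed_b_mps, window_s=2.0):
    """True when the arrival times of vehicles A and B differ by at most window_s."""
    t_a = time_to_intersection(dist_a_m, speed_a_mps)
    t_b = time_to_intersection(dist_b_m, speed_b_mps)
    return abs(t_a - t_b) <= window_s
```

When such a check indicates a high possibility of collision, the server would transmit the slow-down message to the signals in the lanes of both vehicles, as described above.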
  • the data analyzing unit 32 may determine the existence of traffic congestion and the length of the traffic congestion on the basis of the number of vehicles and the traveling speed in the table of signal surrounding information.
  • the data analyzing unit 32 generates the information regarding road congestion caused by vehicles on the basis of the determination result information and acquires it as analysis result information.
  • the data analyzing unit 32 calculates the length of these vehicles in the traveling direction of the lane. In addition, the data analyzing unit 32 reads the road information stored in the storage unit 34 in advance, specifies the road where the traffic congestion is occurring, and generates the congestion information indicating the road where traffic congestion is occurring. The data analyzing unit 32 transmits the congestion information to the mobile communication device 5 through the network NW.
  • the mobile communication device 5 is a device with a car navigation function, for example.
  • the mobile communication device 5 receives the congestion information and outputs the information indicating that traffic congestion is occurring on the basis of this congestion information.
  • the mobile communication device 5 notifies a user of a path change when it is determined that traffic congestion is occurring at the current position at the time of traveling or in a path to the destination on the basis of the congestion information.
  • the table of signal surrounding information, the table of vehicle attribution information, and the table of person attribution information are useful information in that traffic congestion can be reduced when users change their paths accordingly.
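The congestion-length calculation described above can be sketched as follows. This is a minimal illustration under assumed conventions: each vehicle is represented by a (position along the lane, speed) pair, and the 10 km/h threshold for treating a vehicle as queued is an assumption.

```python
# Hypothetical sketch of congestion detection: vehicles whose traveling
# speed is at or below a threshold are treated as queued, and the queue
# length is the span of their positions along the lane.

def congestion_length_m(vehicles, speed_threshold_kmh=10.0):
    """vehicles: list of (position_m_along_lane, speed_kmh) pairs.
    Returns the length in metres covered by slow-moving vehicles."""
    slow_positions = [pos for pos, speed in vehicles if speed <= speed_threshold_kmh]
    if len(slow_positions) < 2:
        return 0.0  # fewer than two queued vehicles: no measurable queue
    return max(slow_positions) - min(slow_positions)
```

The resulting length, combined with the road information read from the storage unit 34, would identify the road on which congestion is occurring.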
  • the data analyzing unit 32 may detect an illegally parked vehicle on the basis of the traveling speed and the vehicle attribution information of the table of signal surrounding information.
  • the data analyzing unit 32 acquires the information regarding a vehicle in violation of traffic rules, among vehicles which are objects to be analyzed, as analysis result information on the basis of the determination result information.
  • the data analyzing unit 32 detects a parked vehicle on the basis of the traveling speed in a plurality of items of the captured image data.
  • the data analyzing unit 32 reads the information including the license plate number, the vehicle type, and the like, which indicates the attributes included in the vehicle attribution information, from the table of vehicle attribution information and acquires it as traffic violation information.
  • the data analyzing unit 32 can acquire useful information by data analysis in order to crack down on traffic violations, such as speeding, signal violation, and other driving violations, as well as parking violations.
  • when the data analyzing unit 32 detects a vehicle whose traveling speed exceeds the speed limit, the data analyzing unit 32 reads the information specifying the vehicle from the table of vehicle attribution information and acquires it as traffic violation information for speeding.
  • when the data analyzing unit 32 detects a vehicle entering the intersection while the signal 1 indicates a stop, the data analyzing unit 32 reads the information specifying the vehicle from the table of vehicle attribution information and acquires it as traffic violation information for ignoring the signal.
  • when the data analyzing unit 32 observes the traveling path and the traveling speed of each vehicle and detects a vehicle turning right at an intersection where right turns are prohibited, or a vehicle entering the intersection without halting, on the basis of the information transmitted from the plurality of signal information control apparatus 2 , the data analyzing unit 32 reads the information specifying the vehicle from the table of vehicle attribution information and acquires it as traffic violation information.
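The violation checks above can be combined into a single per-vehicle classifier, sketched below. The record fields, the "red" state string, and the labels are assumptions for illustration only.

```python
# Hypothetical sketch: classify one observed vehicle against the traffic
# rules described above (speeding and signal violation).

def classify_violations(speed_kmh, speed_limit_kmh, signal_state, entered_intersection):
    """Return a list of violation labels for one observed vehicle."""
    violations = []
    if speed_kmh > speed_limit_kmh:
        violations.append("speeding")
    if signal_state == "red" and entered_intersection:
        violations.append("signal violation")
    return violations
```

For any vehicle with a non-empty result, the identifying information (license plate number, vehicle type, and the like) would be read from the table of vehicle attribution information, as described above.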
  • driving of each vehicle may be controlled on the basis of the analysis result information acquired by the data analyzing unit 32 as described above.
  • the mobile communication device 5 is connected to a driving unit of the vehicle and controls a traveling direction, a traveling speed, and the like of the vehicle according to the analysis result information received from the signal information control server 3 .
  • the mobile communication device 5 of the corresponding vehicle controls the driving unit of the vehicle to reduce the speed.
  • the data analyzing unit 32 may transmit the captured image data to the mobile communication device 5 together with the analysis result information. Then, the mobile communication device 5 mounted in the vehicle can receive, for example, the congestion information and the captured image data, which is an image of the road under traffic congestion, from the signal information control server 3 .
  • the data analyzing unit 32 receives the information indicating the traveling path of the vehicle from the mobile communication device 5 and acquires, by search, the captured image data transmitted from the signal information control apparatus 2 of the signal 1 corresponding to the road along which the vehicle will travel from now on in this traveling path. Then, the data analyzing unit 32 transmits the captured image data to the mobile communication device 5 . As a result, the user can check the situation of the traveling path by an image.
  • the data analyzing unit 32 of the signal information control server 3 may acquire the information for analyzing a new store open plan by data analysis and output it to a computer which displays the information regarding the new store open plan, for example.
  • This computer displays on a display screen, for example, the trade area information, the property information, and the information indicating the traffic conditions around stores or the information indicating features of passers-by.
  • this computer displays a map designated by the user on the display screen, reads the information relevant to this map from a database, and displays it.
  • the information relevant to this map is information indicating the volume of people passing by or the volume of traffic in this area or the attributes or features regarding this area.
  • the data analyzing unit 32 of the signal information control server 3 acquires the information indicating the features of the area by analysis by associating it with this map.
  • the data analyzing unit 32 acquires the information, which can be used in marketing analysis of the area where the signal is placed, as analysis result information on the basis of the number or attributes of objects to be analyzed based on the determination result information.
  • the data analyzing unit 32 calculates the volume of people passing by or the volume of traffic in the area where the signal 1 is placed, in each period of the day, such as morning, daytime, evening, and nighttime, and acquires it as an analysis result. For example, if the volume of vehicle traffic is large in the morning and evening, it can be estimated that those who commute in vehicles pass through the area with the signal 1 . Therefore, this is useful information in that the features of the trade area can be analyzed.
  • the data analyzing unit 32 acquires the information indicating the features of persons passing through this area in vehicles, as an analysis result, on the basis of the vehicle type, driver's sex, or driver's age in the table of vehicle attribution information. For example, if the vehicle type is a one-box car, the driver's age is in the twenties or thirties, and the sex is female, it can be estimated that there is a high possibility that the driver is a housewife of a large family. Therefore, this is useful information in that the features of the trade area can be analyzed.
  • the data analyzing unit 32 acquires the information indicating the features of persons passing through this area, as an analysis result, on the basis of the sex, age, moving method, and belongings in the table of person attribution information. For example, if the sex is female, the age is "twenties to thirties", and the moving method is "pushing a baby carriage", it can be estimated that there is a high possibility that the person is a housewife with an infant. Therefore, this is useful information in that the features of the trade area can be analyzed.
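The rule-based attribute estimation above can be sketched as follows. The rule, field names, and the returned labels are assumptions that mirror the example in the text, not the embodiment's implementation.

```python
# Hypothetical sketch: estimate a trade-area profile label from person
# attribution data (sex, age, moving method), following the example rule
# given in the description.

def estimate_person_profile(sex, age, moving_method):
    """Estimate a profile label for marketing analysis of the trade area."""
    if sex == "female" and 20 <= age < 40 and moving_method == "pushing a baby carriage":
        return "possibly a housewife with an infant"
    return "unknown"
```

Aggregating such labels over many observed persons would yield the trade-area features described above.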
  • the signal 1 is provided at the place where people or vehicles come and go in many cases. Accordingly, by using the analysis result by data analysis, marketing trends according to the characteristics of the area where the signal 1 is placed can be analyzed from the overall perspective.
  • franchise business such as a convenience store or a pharmacy
  • when opening a new store, a franchise business company performs evaluation and decision-making for the new store open plan using the store location map information, surrounding information, trade area information, population, trade area, expected sales, a layout pattern, and the like, provided as paper materials.
  • a franchise candidate also examines the profitability or growth potential in store opening using the same data.
  • the data analyzing unit 32 may analyze a change in volume of people passing by according to the weather on the basis of the weather, temperature, and the number of people of the table of signal surrounding information.
  • the data analyzing unit 32 may analyze the atmosphere of the area statistically using the information indicating the noise level of the table of signal surrounding information. For example, the area with a low average noise level can be estimated to be a quiet residential area.
  • the storage unit 34 may store the information for performing pattern recognition, and the data analyzing unit 32 may perform pattern recognition on the captured image data transmitted from the signal information control apparatus 2 .
  • the information which specifies the type or brand of clothes may be prepared in advance as pattern information stored in the storage unit 34 , and the data analyzing unit 32 may determine the type or brand of clothing of a person included in the captured image data.
  • the data analyzing unit 32 of the signal information control server 3 may bill the user of the mobile communication device 5 to which the analysis result information was transmitted.
  • the data analyzing unit 32 transmits the analysis result information to the mobile communication device 5 of the user and also stores, in the storage unit 34 , a record that the analysis result information has been transmitted to the user. Then, for example, after the elapse of a fixed period, the data analyzing unit 32 transmits to the billing center or the like the service charges requested of each user, according to the number of times the analysis result information was transmitted to the user, the type of the transmitted information, and the like.
  • This billing center is a center which collects charges for the service of transmitting the analysis result information described above to users.
  • a server which can communicate with the signal information control server 3 is provided in the billing center.
  • This server stores the personal information (for example, a user name or an identification number of a mobile communication device) of a user who has joined the service for transmission of analysis result information and the service content (for example, the type of analysis result information whose transmission is requested by the user), and transmits this information to the signal information control server 3 .
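The per-user billing by transmission count and information type described above can be sketched as follows. The price table, the type names, and the function name are assumptions for illustration.

```python
# Hypothetical sketch: compute a user's service charge from the record of
# transmitted analysis-result types, using an assumed per-type price table.

def compute_service_charge(transmitted_types, price_table):
    """Sum the charge for each analysis-result transmission recorded
    for one user; unknown types are charged nothing."""
    return sum(price_table.get(info_type, 0) for info_type in transmitted_types)

prices = {"congestion": 10, "violation_warning": 30}  # assumed price table
charge = compute_service_charge(["congestion", "congestion", "violation_warning"], prices)
```

The resulting charge per user would then be sent to the billing center, as described above.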
  • the signal information control apparatus 2 can acquire the detailed information, such as the type or attributes of an object to be analyzed, included in an image on the basis of the captured image data photographed by the imaging apparatus 6 .
  • the signal information control apparatus 2 can generate a table of signal surrounding information, a table of vehicle attribution information, and a table of person attribution information matched with the type, attributes, and the like of an object to be analyzed.
  • the data analyzing unit 32 can acquire the characteristics resulting from behavior patterns of people in the area where the signal 1 is placed, by analysis, using the table of signal surrounding information, the table of vehicle attribution information, and the table of person attribution information.
  • the table of signal surrounding information, the table of vehicle attribution information, the table of person attribution information, and the analysis result information acquired in this way may be used to monitor the traffic situation and to understand marketing trends according to economic trends, such as a store open plan, as described above.
  • the present invention is not limited to the embodiments described above, and may have the following configuration.
  • the data analyzing unit 32 may be mounted in each signal information control apparatus 2 .
  • the data analyzing unit 32 mounted in the signal information control apparatus 2 may use the table of signal surrounding information, the table of vehicle attribution information, the table of person attribution information, and the captured image data which are stored in the storage unit 24 by its own signal information control apparatus 2 .
  • the data analyzing unit 32 mounted in the signal information control apparatus 2 may perform data analysis described above using both the information transmitted from other signal information control apparatus 2 and the information of its own storage unit 24 .
  • the present invention is not limited to this, and the signal information control apparatus 2 may be mounted in the signal information control server 3 .
  • the signal information control apparatus 2 mounted in the signal information control server 3 receives the captured image data transmitted from the imaging apparatus 6 mounted in the signal 1 and performs the same processing as described above.
  • the signal information control apparatus 2 may have a configuration including a display device, such as a liquid crystal display or an electroluminescent display panel, and may display the information according to the analysis result information on data analysis received from the signal information control server 3 .
  • the signal 1 may be a movable signal placed in construction sites or the like.
  • the imaging apparatus 6 may be a camera capable of performing imaging in a range of 360°.
  • the signal 1 is not limited to a signal for vehicles; it may also be a signal for a person crossing the road (or roadway) or a signal for a train on a railroad.
  • the mobile communication device 5 is not limited to a device mounted in a vehicle. Alternatively or additionally, the mobile communication device 5 may be a personal digital assistant, a personal computer, or any of various other devices with communication functions. Even when not mounted in a vehicle, the mobile communication device 5 can receive the data from the signal information control apparatus 2 .
  • the object to be analyzed (moving object) detected by the image processing unit 21 is a vehicle or a person
  • the object to be analyzed (moving object) is not limited to this.
  • the object to be analyzed (moving object) may be an animal, an insect, a floating object, and the like.
  • FIG. 9 is a schematic block diagram showing the configuration of a signal system 201 in the second embodiment of the present invention.
  • a signal control apparatus 1100 provided in each of a plurality of signals and a high-order control apparatus 1200 are connected to each other through a communication network 1300 .
  • Hereinafter, the signal control apparatus 1100 is referred to as a signal 1100 .
  • the high-order control apparatus 1200 transmits a control signal, which is for controlling each signal 1100 , to the signal 1100 through the communication network 1300 .
  • the high-order control apparatus 1200 can control each of the plurality of signals 1100 .
  • the plurality of signals 1100 has the same configuration. Therefore, the configuration of one signal 1100 will be described herein.
  • the signal 1100 includes an imaging unit 210 , a control unit 220 , a signal display unit 230 , a power supply unit 240 , and a sound pickup unit 250 .
  • the imaging unit 210 and the control unit 220 are connected to each other through an I/F (interface) 270 .
  • the control unit 220 and the signal display unit 230 are connected to each other through an I/F 271 .
  • the sound pickup unit 250 and the control unit 220 are connected to each other through an I/F (interface) 272 .
  • the signal display unit 230 includes a first light emitting section 231 , a second light emitting section 232 , and a third light emitting section 233 .
  • the first light emitting section 231 lights green (or blue) and indicates that “may move” (hereinafter, referred to as “movable”) at the time of lighting.
  • the second light emitting section 232 lights yellow and indicates that "stop at the stop position at the time of lighting. However, you may move when it is not possible to stop at the stop position" (hereinafter, referred to as "stop").
  • the third light emitting section 233 lights red and indicates that “should not move” (hereinafter, referred to as “not movable”) at the time of lighting.
  • the imaging unit 210 includes an imaging apparatus, such as a CCD (Charge Coupled Device) image sensor or a CMOS (Complementary Metal Oxide Semiconductor) image sensor and outputs a captured image to the control unit 220 through the I/F 270 .
  • the imaging unit 210 is fixed to the signal 1100 .
  • the imaging unit 210 is fixed to the signal 1100 so as to be located at the upper, lower, left, or right side of the signal display unit 230 .
  • this imaging unit 210 may be fixed to the signal 1100 integrally with the signal display unit 230 .
  • the imaging unit 210 performs imaging in the azimuth of 360° in the horizontal direction.
  • the imaging unit 210 may perform photographing in the azimuth of 360° by combining a plurality of imaging apparatuses.
  • the imaging unit 210 may perform photographing in the azimuth of 360° by performing image processing of an image captured through a specular member with a shape of a triangular pyramid, a sphere, or the like.
  • the sound pickup unit 250 is a sound pickup device, such as a microphone, and outputs pickup sound to the control unit 220 through the I/F (interface) 272 .
  • the sound pickup unit 250 may be a sound pickup device which includes a plurality of microphones and which picks up a sound so that the direction of a sound source can be specified.
  • the sound pickup unit 250 is fixed to the signal 1100 .
  • the sound pickup unit 250 is fixed to the signal 1100 so as to be located at the upper, lower, left, or right side of the signal display unit 230 .
  • this sound pickup unit 250 may be fixed to the signal 1100 integrally with the signal display unit 230 .
  • the power supply unit 240 supplies electric power to the imaging unit 210 , the control unit 220 , the signal display unit 230 , and the sound pickup unit 250 which are respective components provided in the signal 1100 .
  • the power supply unit 240 includes a power supply section 241 , a battery section 242 , and a power switching section 243 .
  • Electric power is supplied from the outside of the signal 1100 to the power supply section 241 through a power line.
  • Electric power (electric charge) is accumulated in the battery section 242 , and the battery section 242 outputs the accumulated electric power.
  • the battery section 242 is charged by electric power supplied to the power supply section 241 in a period for which electric power is supplied to the power supply section 241 from the outside.
  • the battery section 242 is a secondary battery, for example.
  • the power switching section 243 supplies electric power selected from either the power supply section 241 or the battery section 242 to each component provided in the signal 1100 .
  • the power switching section 243 detects a voltage or current of electric power supplied to the power supply section 241 and, on the basis of this detected voltage or current, switches the source which supplies electric power to each component provided in the signal 1100 to either the power supply section 241 or the battery section 242 .
  • mode switching by the power switching section 243 will be described using FIG. 10 .
  • the following explanation will be given assuming that the case where electric power is supplied from the power supply section 241 to each component provided in the signal 1100 by the power switching section 243 is called a “first mode” and the case where electric power is supplied from the battery section 242 to each component provided in the signal 1100 by the power switching section 243 is called a “second mode”.
  • the power switching section 243 changes the state to the first mode if first startup conditions are satisfied and changes the state to the second mode if second startup conditions are satisfied.
  • the first startup conditions may be conditions in which a voltage or current of electric power supplied to the power supply section 241 is larger than the threshold value set in advance, for example.
  • the second startup conditions may be conditions in which a voltage or current of electric power supplied to the power supply section 241 is equal to or lower than the threshold value set in advance, for example. That is, the first startup conditions are conditions in which electric power from the outside is supplied to the signal 1100 , and the second startup conditions are conditions in which electric power from the outside is not supplied to the signal 1100 .
  • the power switching section 243 detects a voltage or current of electric power supplied to the power supply section 241 when electric power is supplied from the power supply section 241 to each component provided in the signal 1100 (when the state is the first mode). Then, when the detected voltage or current becomes equal to or lower than a threshold value set in advance (when the second condition is satisfied), the power switching section 243 changes the power supply to the battery section 242 (changes the state to the second mode).
  • the second condition corresponds to a case where a power line through which electric power is supplied to the signal 1100 is cut or a case where the facility which supplies electric power to the signal 1100 fails in a disaster or the like.
  • the power switching section 243 detects a voltage or current of electric power supplied to the power supply section 241 when electric power is supplied from the battery section 242 to each component provided in the signal 1100 (when the state is the second mode). Then, when the detected voltage or current becomes equal to or higher than a threshold value set in advance (when the first condition is satisfied), the power switching section 243 changes the power supply to the power supply section 241 (changes the state to the first mode).
  • the first condition corresponds to a case where the cut power line through which electric power is supplied to the signal 1100 is connected or a case where the failure of the facility which supplies electric power to the signal 1100 ends by restoration work after a disaster.
  • the power switching section 243 changes the state between the first and second modes.
  • the power switching section 243 outputs to the control unit 220 the information indicating that the state is changed from the first mode to the second mode and the information indicating that the state is changed from the second mode to the first mode.
  • the power switching section 243 outputs to the control unit 220 the information indicating that the current state is the first mode or the second mode. Using this information, the control unit 220 can determine whether the current mode is the first mode or the second mode.
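The mode switching described above can be sketched as follows. This is a minimal illustration, assuming a single voltage threshold; the class name, method names, and threshold value are hypothetical and do not appear in the embodiment.

```python
# Hypothetical sketch of the power switching section 243; the class
# name, threshold value, and method names are assumptions.

FIRST_MODE = "first"    # powered by the power supply section 241 (external power)
SECOND_MODE = "second"  # powered by the battery section 242

class PowerSwitchingSection:
    def __init__(self, threshold_volts=90.0):
        self.threshold = threshold_volts
        self.mode = FIRST_MODE
        self.transitions = []  # information reported to the control unit 220

    def update(self, supply_voltage):
        """Detect the supply voltage and switch modes accordingly."""
        if self.mode == FIRST_MODE and supply_voltage <= self.threshold:
            # second startup condition: external power lost (cut line, failure)
            self.mode = SECOND_MODE
            self.transitions.append("first->second")
        elif self.mode == SECOND_MODE and supply_voltage > self.threshold:
            # first startup condition: external power restored
            self.mode = FIRST_MODE
            self.transitions.append("second->first")
        return self.mode
```

The `transitions` list stands in for the mode-change information that the power switching section 243 outputs to the control unit 220.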
  • the control unit 220 includes a detection section 221 , a display control section 222 , and a communication section 223 .
  • the communication section 223 receives a control signal for changing the display state of the signal display unit 230 from the high-order control apparatus 1200 through the communication network 1300 .
  • This control signal is control information indicating “movable”, “stop”, or “not movable”, for example. That is, this control signal is control information indicating that the first light emitting section 231 , the second light emitting section 232 , or the third light emitting section 233 is made to emit light.
  • the display control section 222 changes the display state of the signal display unit 230 through the I/F 271 on the basis of the control signal received from the high-order control apparatus 1200 by the communication section 223 .
  • the display control section 222 makes the first light emitting section 231 provided in the signal display unit 230 light and extinguishes the second and third light emitting sections 232 and 233 .
  • the detection section 221 detects the volume of traffic on the basis of an image captured by the imaging unit 210 .
  • the display control section 222 changes the display state of the signal display unit 230 on the basis of the volume of traffic detected by the detection section 221 .
  • the detection section 221 detects the volume of traffic in each of a plurality of lanes on the basis of an image of the plurality of lanes captured by the imaging unit 210 .
  • the detection section 221 detects vehicles one by one in each lane by image processing or a pattern matching technique, as an example.
  • the detection section 221 detects the volume of traffic in each of the plurality of lanes by detecting the number of vehicles per unit time which travel in each lane.
  • the display control section 222 changes the display state of the signal display unit 230 on the basis of a result of comparison of the volumes of traffic in the plurality of lanes detected by the detection section 221 .
  • the display control section 222 changes the display state of the signal display unit 230 on the basis of the result of comparison of the volumes of traffic in the plurality of lanes detected by the detection section 221 so that vehicles traveling in the lane with the high volume of traffic can move with priority over vehicles traveling in the lane with the low volume of traffic.
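The per-lane detection and comparison just described can be roughly sketched as follows. The recognition of individual vehicles from captured images is assumed to have been done elsewhere (e.g. by pattern matching); the function names and the `(lane, vehicle_id)` input format are illustrative assumptions.

```python
from collections import Counter

def detect_traffic_volume(vehicle_detections):
    """Count distinct vehicles per lane over a unit time.
    `vehicle_detections` is a list of (lane, vehicle_id) pairs
    produced by image processing of the captured frames."""
    seen = set()
    volume = Counter()
    for lane, vehicle_id in vehicle_detections:
        if (lane, vehicle_id) not in seen:  # count each vehicle once
            seen.add((lane, vehicle_id))
            volume[lane] += 1
    return volume

def priority_lane(volume):
    """The lane whose vehicles move with priority: the busiest one."""
    return max(volume, key=volume.get)
```

Because both signals observe the same lanes over the same period, both would compute the same `volume` and hence the same priority lane.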
  • the control unit 220 changes the display state of the signal display unit 230 , which is fixed to the signal 1100 , on the basis of the control signal received from the high-order control apparatus 1200 by the communication section 223 .
  • the control unit 220 changes the display state of the signal display unit 230 , which is fixed to the signal 1100 , on the basis of an image captured by the imaging unit 210 .
  • the lane A which is one lane of the two lanes is a lane which is a one-way street from right to left on the plane of FIG. 11 .
  • the lane B which is one lane of the two lanes is a lane which is a one-way street from top to bottom on the plane of FIG. 11 .
  • the lanes A and B cross each other at the crossroads CRS.
  • a signal 1100 - 1 is placed before the crossroads CRS in the lane A.
  • a signal 1100 - 2 is placed before the crossroads CRS in the lane B.
  • the signals 1100 - 1 and 1100 - 2 have the same configuration as the signal 1100 described above using FIG. 9 .
  • in FIG. 11 , there is a stop line L 1 before the signal 1100 - 1 and a stop line L 2 before the signal 1100 - 2 at the crossroads CRS. Therefore, at the crossroads CRS, a vehicle stops before the stop line L 1 and a vehicle stops before the stop line L 2 according to the signal display units 230 of the signals 1100 - 1 and 1100 - 2 .
  • the signal display unit 230 of each of the signals 1100 - 1 and 1100 - 2 includes the first light emitting section 231 , which lights green (or blue) indicating “movable”, and the third light emitting section 233 , which lights red indicating “not movable”.
  • control unit 220 of each of the signals 1100 - 1 and 1100 - 2 executes the extinguishing of the first light emitting section 231 and the lighting of the third light emitting section 233 almost simultaneously for the signal display unit 230 provided in each signal.
  • control unit 220 of each of the signals 1100 - 1 and 1100 - 2 executes the lighting of the first light emitting section 231 and the extinguishing of the third light emitting section 233 almost simultaneously for the signal display unit 230 provided in each signal.
  • each of the signals 1100 - 1 and 1100 - 2 performs imaging in the azimuth of 360° in the horizontal direction, as described above. Therefore, the signals 1100 - 1 and 1100 - 2 can detect the volumes of traffic in the lanes A and B, respectively.
  • the signals 1100 - 1 and 1100 - 2 repeat display states of “movable” and “not movable” alternately through each signal display unit 230 , for example. Explanation herein will be given assuming that periods of the display states of “movable” and “not movable” are the same in length of time.
  • the signal 1100 - 1 indicates “movable” through its own signal display unit 230
  • the signal 1100 - 2 indicates “not movable” through its own signal display unit 230
  • the signal 1100 - 1 indicates “not movable” through its own signal display unit 230
  • the signal 1100 - 2 indicates “movable” through its own signal display unit 230 .
  • the signals 1100 - 1 and 1100 - 2 repeat the same operation in periods T 3 , T 4 , . . . .
  • the periods T 1 , T 2 , T 3 , T 4 , . . . are assumed to be the same in length of time.
  • the detection section 221 of the signal 1100 - 1 detects the volume of traffic in each of the plurality of lanes on the basis of an image of the plurality of lanes captured by the imaging unit 210 of the signal 1100 - 1 .
  • the detection section 221 of the signal 1100 - 2 detects the volume of traffic in each of the plurality of lanes on the basis of an image of the plurality of lanes captured by the imaging unit 210 of the signal 1100 - 2 .
  • the detection section 221 of the signal 1100 - 1 and the detection section 221 of the signal 1100 - 2 detect that five vehicles have passed through the lane A and no vehicle has passed through the lane B in the period T 1 , respectively.
  • the detection section 221 of the signal 1100 - 1 and the detection section 221 of the signal 1100 - 2 detect that no vehicle has passed through the lane A and one vehicle has passed through the lane B in the period T 2 , respectively.
  • the detection section 221 of the signal 1100 - 1 and the detection section 221 of the signal 1100 - 2 detect the volumes of traffic in the lanes A and B, respectively.
  • the display control section 222 of the signal 1100 - 1 compares the volumes of traffic in the plurality of lanes detected by the detection section 221 of the signal 1100 - 1 , and determines that the volume of traffic in the lane A is larger than that in the lane B in this case.
  • the display control section 222 of the signal 1100 - 1 may compare the volume of traffic in each lane on the basis of the sum or the average number of vehicles traveling in each lane in a period set in advance, such as the period T 1 to the period T 4 .
  • the display control section 222 of the signal 1100 - 1 changes the display state of the signal display unit 230 on the basis of this result so that vehicles traveling in the lane with the high volume of traffic can move with priority over vehicles traveling in the lane with the low volume of traffic. That is, in this case, since the volume of traffic in the lane A is larger than that in the lane B, the display control section 222 of the signal 1100 - 1 changes the display state of the signal display unit 230 so that vehicles traveling in the lane A can move with priority over vehicles traveling in the lane B.
  • the display control section 222 of the signal 1100 - 2 compares the volumes of traffic in the plurality of lanes detected by the detection section 221 of the signal 1100 - 2 , and determines that the volume of traffic in the lane A is larger than that in the lane B in this case in the same manner as the display control section 222 of the signal 1100 - 1 does. Then, the display control section 222 of the signal 1100 - 2 changes the display state of the signal display unit 230 on the basis of this result so that vehicles traveling in the lane with the high volume of traffic can move with priority over vehicles traveling in the lane with the low volume of traffic.
  • the display control section 222 of the signal 1100 - 2 changes the display state of the signal display unit 230 so that vehicles traveling in the lane A can move with priority over vehicles traveling in the lane B in the same manner as the display control section 222 of the signal 1100 - 1 does.
  • the volumes of traffic in a plurality of lanes are measured as the same values as long as the measurement period is the same, even if the measurement apparatuses are different, like the signal 1100 - 1 and the signal 1100 - 2 . Therefore, the values of the volumes of traffic in the plurality of lanes detected by the detection section 221 of the signal 1100 - 1 and the detection section 221 of the signal 1100 - 2 are the same.
  • each of the display control section 222 of the signal 1100 - 1 and the display control section 222 of the signal 1100 - 2 can change the display state of the signal display unit 230 so that vehicles traveling in the lane A can move with priority over vehicles traveling in the lane B.
  • the signals 1100 - 1 and 1100 - 2 operate independently of each other and even when the signals 1100 - 1 and 1100 - 2 do not receive control signals from the high-order control apparatus, it is possible to change the display state of the signal display unit 230 appropriately on the basis of the volume of traffic detected for each of the plurality of lanes.
  • the display control section 222 of the signal 1100 - 1 and the display control section 222 of the signal 1100 - 2 perform the following operations, as an example, when changing the display state of the signal display unit 230 so that vehicles traveling in the lane A can move with priority over vehicles traveling in the lane B.
  • the display control section 222 of each of the signals 1100 - 1 and 1100 - 2 changes the display state of the signal display unit 230 , on the basis of the volume of traffic in each of the plurality of lanes detected by the detection section 221 of the corresponding signal 1100 , so that a period for which a vehicle can move in the lane with the high volume of traffic becomes longer than that in the lane with the low volume of traffic as the ratio or difference of the volumes of traffic in the lanes increases.
  • the display control section 222 of each of the signals 1100 - 1 and 1100 - 2 changes the display state of the signal display unit 230 , on the basis of the volume of traffic in each of the plurality of lanes detected by the detection section 221 of the corresponding signal 1100 , so that a period for which a vehicle cannot move in the lane with the low volume of traffic becomes longer than that in the lane with the high volume of traffic as the ratio or difference of the volumes of traffic in the lanes increases.
  • each display control section 222 of each of the signals 1100 - 1 and 1100 - 2 changes a period of the display state by the signal display unit 230 as shown in FIG. 13 , as an example.
  • each display control section 222 changes a period of the display state by the signal display unit 230 so that the time lengths of periods T 11 and T 13 , in which the signal display unit 230 (lane A) of the signal 1100 - 1 indicates “movable” and the signal display unit 230 (lane B) of the signal 1100 - 2 indicates “not movable” become longer than those of the periods T 1 to T 4 shown in FIG. 12 .
  • each display control section 222 changes a period of the display state by the signal display unit 230 so that the time lengths of periods T 12 and T 14 , in which the signal display unit 230 (lane A) of the signal 1100 - 1 indicates “not movable” and the signal display unit 230 (lane B) of the signal 1100 - 2 indicates “movable” become shorter than those of the periods T 1 to T 4 shown in FIG. 12 . Subsequently, each display control section 222 repeats changing the display state of each signal display unit 230 in the same manner as in the case of the periods T 11 to T 14 until the volume of traffic changes.
  • the display control section 222 of the signal 1100 can change the display state of the signal display unit 230 on the basis of the volume of traffic in each of the plurality of lanes detected by the detection section 221 so that vehicles traveling in the lane with the high volume of traffic can move with priority over vehicles traveling in the lane with the low volume of traffic.
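One way to realize the period adjustment described above is to split a fixed cycle in proportion to the deviation of the traffic shares from an even split, so the difference between the two "movable" periods grows with the ratio of the volumes. The formula and the `gain` parameter are illustrative assumptions, not taken from the embodiment.

```python
def adjust_periods(base_period, volume_a, volume_b, gain=0.5):
    """Return the "movable" period for lane A and for lane B: the lane
    with the higher volume of traffic gets a longer period, and the
    difference grows with the imbalance between the volumes."""
    total = volume_a + volume_b
    if total == 0:
        return base_period, base_period  # no traffic: keep equal periods
    share_a = volume_a / total
    # deviation from a 50/50 split drives the lengthening/shortening
    delta = gain * base_period * (share_a - 0.5) * 2
    return base_period + delta, base_period - delta
```

With equal volumes the two periods stay equal, matching the equal-length periods T 1 to T 4 before the adjustment.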
  • the operation of the signal 1100 in FIG. 14 is an operation of the signal 1100 when changing from the first mode to the second mode, for example, as described using FIG. 10 .
  • the imaging unit 210 captures an image and outputs the captured image to the control unit 220 through the I/F 270 (step S 10 ). Then, the detection section 221 of the control unit 220 detects the volume of traffic on the basis of the image captured by the imaging unit 210 (step S 20 ). Then, the display control section 222 of the control unit 220 compares the volume of traffic in each of the plurality of lanes detected by the detection section 221 (step S 30 ).
  • the display control section 222 of the control unit 220 then performs signal control by changing the display state of the signal display unit 230 on the basis of the comparison result in step S 30 (step S 40 ).
  • in step S 40 , for example, the display control section 222 of the control unit 220 changes the display state of the signal display unit 230 on the basis of the result of comparison of the volumes of traffic in the plurality of lanes detected by the detection section 221 so that vehicles traveling in the lane with the high volume of traffic can move with priority over vehicles traveling in the lane with the low volume of traffic.
  • the display control section 222 of the control unit 220 changes the display state of the signal display unit 230 through the I/F 271 .
  • the signal 1100 repeats the processing from step S 10 until it changes to the first mode.
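The flow of steps S 10 to S 40 can be sketched as a loop. The four callables are hypothetical stand-ins for the imaging unit 210, the detection section 221, the signal display unit 230, and the mode check; the loop exits when the signal returns to the first mode.

```python
def signal_control_loop(capture_image, detect_volumes, apply_display, in_second_mode):
    """Repeat steps S10-S40 while the signal stays in the second mode."""
    while in_second_mode():
        image = capture_image()                  # step S10: capture an image
        volumes = detect_volumes(image)          # step S20: detect traffic volume
        busiest = max(volumes, key=volumes.get)  # step S30: compare the lanes
        apply_display(busiest)                   # step S40: change the display state
```

Driving the loop with canned inputs exercises one control cycle per iteration until the mode check reports the first mode again.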
  • the signal 1100 changes the display state of the signal display unit 230 fixed to the signal 1100 on the basis of the image captured by the imaging unit 210 . Therefore, the signal 1100 according to the present embodiment can appropriately perform control to change the display state of the signal display unit 230 even when the signal 1100 cannot communicate with the high-order control apparatus and even when the signal 1100 cannot receive a control signal from the high-order control apparatus.
  • control unit 220 provided in each signal 1100 changes the display state of each signal display unit 230 so that the display state of the signal 1100 - 1 is different from the display state of the signal 1100 - 2 .
  • the signals 1100 - 1 and 1100 - 2 need to control the signal display unit 230 of each signal 1100 using the control unit 220 of the corresponding signal 1100 so that a vehicle traveling in the lane A and a vehicle traveling in the lane B do not collide with each other at the crossroads CRS.
  • the control unit 220 of each signal 1100 needs to control the signal display unit 230 of the signal 1100 using the control unit 220 of the corresponding signal 1100 so that a vehicle traveling in the lane A and a vehicle traveling in the lane B do not move toward the crossroads CRS at the same timing.
  • both the display state of the signal 1100 - 1 and the display state of the signal 1100 - 2 must not indicate “movable” at the same timing.
  • both the display state of the signal 1100 - 1 and the display state of the signal 1100 - 2 may indicate “not movable”.
  • control unit 220 provided in each signal 1100 changes the display state of each signal display unit 230 such that the display state of the signal 1100 - 1 is different from the display state of the signal 1100 - 2 and both the display states do not indicate “movable” at the same timing.
  • a vehicle traveling in the lane A and a vehicle traveling in the lane B can alternately enter the crossroads CRS without the collision between the vehicle traveling in the lane A and the vehicle traveling in the lane B at the crossroads CRS.
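The safety constraint at the crossroads CRS reduces to a simple invariant; a minimal check, with hypothetical names, might look like:

```python
def valid_display_states(state_a, state_b):
    """The two crossing directions must never both indicate "movable"
    at the same timing; both indicating "not movable" is allowed."""
    return not (state_a == "movable" and state_b == "movable")
```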
  • each device measures the value of the physical quantity called the volume of traffic in each of the plurality of lanes as the same value even if the apparatuses which measure the volume of traffic are different, like the signal 1100 - 1 and the signal 1100 - 2 .
  • the start time and the end time of a period for which the number of vehicles is measured, such as the period T 1 , need to be synchronized between the signals 1100 - 1 and 1100 - 2 .
  • the signals 1100 - 1 and 1100 - 2 may be made able to detect a reference time by time measurement using a radio-controlled timepiece or by time measurement using a GPS (Global Positioning System). Then, each of the signals 1100 - 1 and 1100 - 2 measures an elapsed time from the detected reference time using an internal timepiece section. Then, each of the signals 1100 - 1 and 1100 - 2 determines the start time and the end time of a period for which the number of vehicles is measured, such as the period T 1 , on the basis of the elapsed time measured by its timepiece section. In this way, the signals 1100 - 1 and 1100 - 2 may synchronize the start time and the end time of the period T 1 , for which the number of vehicles is measured, between the signals 1100 - 1 and 1100 - 2 .
  • the signals 1100 - 1 and 1100 - 2 can measure the value of the physical quantity, which is called the volume of traffic in each of the plurality of lanes, as the same value. Accordingly, the signals 1100 - 1 and 1100 - 2 can change the display state of the signal display unit 230 appropriately.
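The synchronization scheme above can be sketched as deriving the measurement window from the shared reference time; two signals computing this independently obtain the same window even if they sample their internal clocks at different moments within the period. Names and units are illustrative.

```python
def measurement_window(reference_time, elapsed_seconds, period_length):
    """Start and end times of the measurement period (such as T1)
    containing `elapsed_seconds`, counted from the reference time
    obtained via GPS or a radio-controlled timepiece."""
    index = int(elapsed_seconds // period_length)
    start = reference_time + index * period_length
    return start, start + period_length
```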
  • the signals 1100 - 1 and 1100 - 2 compared the volumes of traffic in a plurality of lanes when the periods from the period T 1 to the period T 4 elapsed and changed the display state of the signal display unit 230 on the basis of the comparison result.
  • the timing at which the volumes of traffic in the plurality of lanes are compared is not limited to this.
  • the signals 1100 - 1 and 1100 - 2 may compare the volumes of traffic in the plurality of lanes every period set in advance.
  • This “period set in advance” may be an arbitrary period set in advance.
  • this “period set in advance” may be set on the basis of a period, for which the display state of the signal display unit 230 is changed, by the control unit 220 of each signal 1100 , as in the periods T 1 to T 4 .
  • this “period set in advance” may be each of the periods T 1 to T 4 .
  • each of the signals 1100 - 1 and 1100 - 2 may compare the number of traveling vehicles in the lane A with the number of stopped vehicles in the lane B in the period T 1 and change the display state of the signal display unit 230 on the basis of this comparison result.
  • each of the signals 1100 - 1 and 1100 - 2 compares the number of traveling vehicles in the lane A with the number of stopped vehicles in the lane B in each of the periods T 2 , T 3 , T 4 , . . . as in the case of the period T 1 . Then, each of the signals 1100 - 1 and 1100 - 2 may change the display state of the signal display unit 230 on the basis of the comparison result so that vehicles traveling in the lane with the high volume of traffic can move with priority over vehicles traveling in the lane with the low volume of traffic.
  • the signals 1100 - 1 and 1100 - 2 may compare the volumes of traffic in the plurality of lanes.
  • the detection section 221 may detect the number of vehicles, which is the number of vehicles stopped before the crossroads CRS and is the number of vehicles stopped in each lane, as the volume of traffic.
  • the display control section 222 may change the display state of the signal display unit 230 so as to reduce the number of stopped vehicles on the basis of the detected stopped vehicles.
  • the signal 1100 according to the present embodiment is not limited to such a case of two lanes which are one-way streets, and may cope with an arbitrary number of lanes which are not one-way streets.
  • an example of an arbitrary number of lanes which are not one-way streets will be described using FIG. 15 .
  • the same reference numerals are given to sections corresponding to each section in FIG. 9 or 11 , and the explanation will be omitted.
  • lanes A 1 and A 2 which are opposite lanes and lanes B 1 and B 2 which are opposite lanes cross each other at the crossroads CRS.
  • signals 1100 - 1 and 1100 - 3 and signals 1100 - 2 and 1100 - 4 are placed before the crossroads CRS in the lanes A 1 and A 2 and the lanes B 1 and B 2 .
  • the signals 1100 - 1 , 1100 - 2 , 1100 - 3 , and 1100 - 4 have the same configuration as the signal 1100 described using FIG. 9 , as in the case of FIG. 11 .
  • the signals 1100 - 1 , 1100 - 2 , 1100 - 3 , and 1100 - 4 need to control the signal display unit 230 of each signal 1100 by the control unit 220 of the corresponding signal 1100 so that a vehicle traveling in the lane called the lane A 1 or the lane A 2 and a vehicle traveling in the lane called the lane B 1 or the lane B 2 do not collide with each other at the crossroads CRS, as in the case of FIG. 11 .
  • control unit 220 of each signal 1100 needs to control the signal display unit 230 of the signal 1100 using the control unit 220 of the corresponding signal 1100 so that a vehicle traveling in the lane called the lane A 1 or the lane A 2 and a vehicle traveling in the lane called the lane B 1 or the lane B 2 do not move toward the crossroads CRS at the same timing.
  • the control unit 220 of each of the signals 1100 - 1 and 1100 - 3 changes the display state of the signal display unit 230 to the same display state at the same timing.
  • the control unit 220 of each of the signals 1100 - 2 and 1100 - 4 changes the display state of the signal display unit 230 to the same display state at the same timing.
  • the control unit 220 provided in each signal 1100 changes the display state of each signal display unit 230 so that the display states of the signals 1100 - 1 and 1100 - 3 are different from the display states of the signals 1100 - 2 and 1100 - 4 .
  • both the display states of the signals 1100 - 1 and 1100 - 3 and the display states of the signals 1100 - 2 and 1100 - 4 must not indicate “movable” at the same timing. This is because vehicles traveling in different lanes may collide with each other at the crossroads CRS if both the display states of the signals 1100 - 1 and 1100 - 3 and the display states of the signals 1100 - 2 and 1100 - 4 indicate “movable” at the same timing.
  • both the display states of the signals 1100 - 1 and 1100 - 3 and the display states of the signals 1100 - 2 and 1100 - 4 may indicate “not movable”.
  • control unit 220 provided in each signal 1100 changes the display state of each signal display unit 230 such that the display states of the signals 1100 - 1 and 1100 - 3 are different from the display states of the signals 1100 - 2 and 1100 - 4 and the display states of the signals 1100 - 1 and 1100 - 3 and the display states of the signals 1100 - 2 and 1100 - 4 do not indicate “movable” at the same timing.
  • the control unit 220 of each of the signals 1100 - 1 and 1100 - 3 changes the display state of the signal display unit 230 as in the case of the lane A described using FIG. 12 and the control unit 220 of each of the signals 1100 - 2 and 1100 - 4 changes the display state of the signal display unit 230 as in the case of the lane B described using FIG. 12 .
  • therefore, also in the case of FIG. 15 , the signals 1100 - 1 to 1100 - 4 can appropriately perform control to change the display state of the signal display unit even when the signals 1100 - 1 to 1100 - 4 cannot communicate with the high-order control apparatus and even when the signals 1100 - 1 to 1100 - 4 cannot receive a control signal from the high-order control apparatus, as in the case of FIG. 12 .
  • there may be a vehicle which turns right at the crossroads CRS from the lane A 1 to travel in the lane B 2 . That is, a vehicle which turns right at the crossroads CRS may also be present.
  • the display control section 222 changes the display state of the signal display unit 230 so that priority is given to right turn (right turn signal). Then, for example, it is possible to reduce an increase in the number of vehicles which cannot turn right since vehicles traveling in the lane A 1 cannot turn right at the crossroads CRS and stop near the stop line L 1 . Accordingly, the display control section 222 can reduce (alleviate) the traffic congestion caused by turning right at the crossroads CRS.
  • the display control section 222 changes the display state of the signal display unit 230 so that a right turn signal time is adjusted. As a result, a vehicle easily turns right at the crossroads CRS. In addition, it is possible to reduce an increase in the number of vehicles which cannot turn right since vehicles cannot turn right at the crossroads CRS and stop near the stop line L 1 . Accordingly, the display control section 222 can reduce the traffic congestion caused by turning right at the crossroads CRS.
  • the detection section 221 detects a vehicle turning right on the basis of a captured image.
  • the detection section 221 detects a vehicle, which is stopped within the crossroads CRS or before the crossroads CRS and which is located at the right end of the lane, as a vehicle turning right on the basis of a captured image.
  • a direction indicator, which indicates a direction by blinking when a vehicle changes course, is provided in a vehicle.
  • the detection section 221 may detect a vehicle turning right by determining whether or not a vehicle turns right on the basis of an image of the direction indicator provided in the vehicle, the image being captured by the imaging unit 210 .
  • the detection section 221 may detect a vehicle turning right by arbitrarily combining such methods of detecting a vehicle turning right.
  • the display control section 222 changes the display state of the signal display unit 230 so that priority is given to right turn or the right turn signal time is adjusted.
  • the phrase “display control section 222 changes the display state of the signal display unit 230 so that priority is given to right turn (right turn signal)” means that the display control section 222 changes the display state of the signal display unit 230 so that a vehicle turning right can move with priority over a vehicle which does not turn right or over the case where there is no vehicle turning right at the crossroads CRS.
  • the “vehicle which does not turn right” referred to herein is a vehicle going straight in the lane without changing the course or a vehicle turning left, for example.
  • the signal display unit 230 includes a light emitting section corresponding to right turn.
  • the display control section 222 changes the display state of the signal display unit 230 so that priority is given to a vehicle turning right by controlling the display state of the light emitting section corresponding to right turn.
  • the phrase “display control section 222 changes the display state of the signal display unit 230 so that the right turn signal time is adjusted” means that, for example, the display control section 222 adjusts a time for changing the display state of the signal display unit 230 so that the period of time for a vehicle turning right becomes longer compared to a case where there is no vehicle turning right or vehicles do not turn right at the crossroads CRS.
  • the phrase also means that, for example, the display control section 222 changes the display state of the signal display unit 230 by controlling the display state of a light emitting section corresponding to right turn so that the period of time for a vehicle turning right becomes longer compared to a case where there is no vehicle turning right or vehicles do not turn right at the crossroads CRS.
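A simple way to adjust the right turn signal time as described is to extend the right-turn period in proportion to the number of detected waiting vehicles, up to a cap; the per-vehicle extension and the cap are illustrative assumptions.

```python
def right_turn_signal_time(base_time, waiting_right_turners, per_vehicle=2.0, max_extra=20.0):
    """Lengthen the right-turn period when vehicles waiting to turn
    right are detected, so the queue near the stop line shrinks."""
    extra = min(per_vehicle * waiting_right_turners, max_extra)
    return base_time + extra
```

With no waiting vehicles the period stays at its base value, matching the case where there is no vehicle turning right.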
  • the detection section 221 may detect the type of a vehicle on the basis of a captured image. For example, an emergency vehicle has a red light. Therefore, the detection section 221 may determine whether or not a vehicle has a red light on the basis of a captured image and detect the type of the vehicle as an emergency vehicle when the vehicle has a red light.
  • This emergency vehicle is an automobile for firefighting, an automobile for emergencies, or a police car, for example.
  • the emergency vehicle may have a predetermined color, such as red or white. Therefore, the detection section 221 may determine whether or not the vehicle type is an emergency vehicle by combining the color of a vehicle with the criteria for determining whether or not the vehicle has a red light. For example, the detection section 221 determines whether the color of a vehicle is red or white on the basis of a captured image. In addition, when the color of a vehicle is red or white and the vehicle has a red light, the detection section 221 may detect the type of the vehicle as an emergency vehicle.
  • the display control section 222 may change the display state of the signal display unit 230 so that the emergency vehicle can move with priority over vehicles whose vehicle types are not emergency vehicles.
  • the emergency vehicle can move in the lane with priority over vehicles whose vehicle types are not emergency vehicles.
  • the detection section 221 may detect the type of a vehicle on the basis of a captured image and a sound picked up by the sound pickup unit 250 . For example, when the type of a vehicle is an emergency vehicle, this vehicle may sound the siren. Therefore, the detection section 221 may determine whether or not a siren sound is included in the sound picked up by the sound pickup unit 250 and determine whether or not the type of a vehicle is an emergency vehicle by combining this determination result and the determination result based on the captured image described above.
  • the detection section 221 can detect the direction of an emergency vehicle including a sound source which sounds the siren. In this case, the detection section 221 can detect an emergency vehicle more accurately on the basis of the detected direction of the emergency vehicle and the direction of the emergency vehicle detected on the basis of the image.
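The agreement between the image-based direction and the siren-based direction could be checked as below. This is a minimal sketch: the bearing representation (degrees) and the 20-degree tolerance are assumptions, not values from the embodiment.

```python
# Sketch of the direction fusion described above: accept the emergency-
# vehicle detection only when the bearing estimated from the captured image
# and the bearing of the siren sound source agree within a tolerance.
# The tolerance value is an illustrative assumption.

def bearings_agree(image_deg: float, siren_deg: float, tol_deg: float = 20.0) -> bool:
    diff = abs(image_deg - siren_deg) % 360.0
    diff = min(diff, 360.0 - diff)   # shortest angular distance on the circle
    return diff <= tol_deg

print(bearings_agree(10.0, 355.0))  # 15 degrees apart -> directions agree
```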
  • the display control section 222 can change the display state of the signal display unit 230 more accurately so that the emergency vehicle can move preferentially in the second mode.
  • the display control section 222 can also estimate more accurately the lane in which the emergency vehicle will travel. Accordingly, in the second mode, the display control section 222 can change the display state of the signal display unit 230 so that the emergency vehicle can move preferentially. In this manner, the emergency vehicle can travel in the lane preferentially.
  • each light emitting section corresponding to one signal may include a plurality of light emitting elements.
  • This light emitting section corresponding to one signal is the first light emitting section 231 , the second light emitting section 232 , or the third light emitting section 233 described above.
  • the plurality of light emitting elements is a plurality of LEDs (Light Emitting Diodes). That is, an emission method of the signal display unit 230 of the signal 1100 described above may be a method using an LED.
  • when making a light emitting section of the signal display unit 230 light, the display control section 222 of the control unit 220 makes some of the plurality of light emitting elements provided in the light emitting section emit light or be extinguished, while making the remaining light emitting elements emit light, such that the position of the place of emission or extinguishing in the light emitting section changes. That is, instead of lighting all regions of the signal color that is to light, the display control section 222 of the control unit 220 makes the plurality of light emitting elements emit light while moving only some lighting regions (or extinguished regions).
  • when the display control section 222 of the control unit 220 makes the light emitting section of the signal display unit 230 light as described above, the place of emission in the light emitting section moves instead of the section simply lighting. Therefore, even if the afternoon sun, the morning sun, or the like shines into the signal 1100, it becomes easy for a user who observes the signal 1100 to see which of the plurality of light emitting sections provided in the signal display unit 230 emits light.
  • the display control section 222 of the control unit 220 may make some of the plurality of light emitting elements provided in the light emitting section emit light or be extinguished, while making the remaining light emitting elements emit light, such that the place of emission or extinguishing rotates, moves, enlarges, or is reduced in the light emitting section. In this case, it becomes even easier for a user who observes the signal 1100 to see which of the plurality of light emitting sections provided in the signal display unit 230 emits light, even if the afternoon sun, the morning sun, or the like shines into the signal 1100.
  • the display control section 222 of the control unit 220 may make the light emitting section emit light such that the brightness of an extinguished light emitting element, that is, a light emitting element in the lit light emitting section that is not currently emitting, changes.
  • the display control section 222 of the control unit 220 may make some light emitting elements emit light or be extinguished as described above while changing the brightness of the extinguished light emitting element sequentially from low brightness to high brightness, instead of simply maintaining the extinguished state.
  • the display control section 222 of the control unit 220 may repeat this brightness change by changing the brightness of the light emitting element sequentially from low brightness to high brightness and then from high brightness back to low brightness.
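The moving emission/extinguishing place and the brightness ramp described above can be sketched as follows, modeling one light emitting section as a ring of LEDs in which a single "extinguished" position rotates each frame while its brightness ramps up and back down. The ring model, frame counts, and ramp steps are illustrative assumptions.

```python
# Minimal sketch of the lighting pattern described above: one extinguished
# position rotates around a ring of LEDs, and the extinguished LED's
# brightness ramps instead of staying fully dark.

def led_frame(n_leds: int, frame: int,
              ramp=(0.0, 0.25, 0.5, 0.75, 1.0, 0.75, 0.5, 0.25)):
    """Return per-LED brightness (1.0 = fully lit) for one animation frame."""
    off_pos = frame % n_leds          # the moving "gap" rotates around the ring
    dim = ramp[frame % len(ramp)]     # the gap's brightness ramps up and down
    return [dim if i == off_pos else 1.0 for i in range(n_leds)]

for f in range(3):
    print(led_frame(6, f))
```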
  • although the respective signals 1100 operate independently of each other in the above explanation, the respective signals 1100 may communicate with each other.
  • the respective signals 1100 may communicate with each other through the communication network 1300 described using FIG. 9 .
  • communication through the communication network 1300 may be impossible in a disaster. Therefore, the signals 1100 may also communicate with each other through a communication network different from the communication network 1300 .
  • the communication network in this case may be a radio communication network.
  • At least the signals 1100 - 1 and 1100 - 2 related to the crossroads CRS can communicate with each other through such a communication network.
  • at least the signals 1100 - 1 to 1100 - 4 related to the crossroads CRS can communicate with each other.
  • each signal 1100 changes the display state of each signal display unit 230 so that vehicles traveling in different lanes do not collide with each other at the crossroads CRS, as described using FIG. 11 or 15 .
  • the control unit 220 provided in each signal 1100 changes the display state of each signal display unit 230 so that the display states of the signals 1100 corresponding to the lanes, which are not opposite lanes, are different and the display states do not indicate “movable” at the same timing.
  • the imaging unit 210 may be fixed to the signal 1100 so as to image at least the lane in which traffic is controlled by the display state of the signal display unit 230. That is, the signal 1100 detects the volume of traffic only in the lane in which traffic is controlled by the signal 1100. Then, the detected volume of traffic is transmitted to another signal 1100. In this way, the signals 1100 related to the crossroads can, between them, detect the volume of traffic in all the lanes related to the crossroads.
  • each signal 1100 can detect the volume of traffic in each lane on the basis of an image captured by the imaging unit 210 of each signal. Therefore, as described using FIG. 11 or 15 , the signal 1100 can appropriately perform control to change the display state of the signal display unit even when the signal 1100 cannot communicate with the high-order control apparatus and even when the signal 1100 cannot receive a control signal from the high-order control apparatus.
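One way the exchanged traffic volumes could drive the display states is to split a signal cycle between the two conflicting directions in proportion to the detected volumes, so that non-opposite lanes are never "movable" at the same time. The cycle length, minimum green time, and function names below are assumptions for illustration, not values from the embodiment.

```python
# Hedged sketch of the coordination described above: each signal shares the
# traffic volume it detected for its own lane, and the "movable" time of one
# cycle is split between the two conflicting directions in proportion to
# those volumes. Cycle length and minimum green are illustrative assumptions.

def split_green(vol_ns: int, vol_ew: int, cycle_s: int = 120, min_green_s: int = 20):
    """Split one cycle between the north-south and east-west directions."""
    total = vol_ns + vol_ew
    if total == 0:
        return cycle_s // 2, cycle_s - cycle_s // 2   # no traffic: even split
    green_ns = round(cycle_s * vol_ns / total)
    green_ns = max(min_green_s, min(cycle_s - min_green_s, green_ns))
    return green_ns, cycle_s - green_ns

print(split_green(90, 30))  # the busier direction gets the longer green
```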
  • the present invention is not limited to this.
  • the signals 1100 placed in a line one by one in the same lane may communicate with each other.
  • each of the signals 1100 placed in a line may control its own signal display unit 230 so that vehicles traveling in the same lane can pass through the successive signals 1100 without any of those signals 1100 making the vehicles “not movable”.
  • a radio communication unit provided in each signal 1100 may perform relay transmission.
  • in this way, not only can the signals 1100 placed adjacent to each other communicate with each other, but also the signals 1100 located far from each other can communicate with each other.
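The relay transmission between signals that are out of direct radio range can be sketched as a hop-by-hop search over the adjacency of signals. The adjacency map and the breadth-first strategy below are illustrative; the embodiment does not specify a routing method.

```python
# Illustrative relay-transmission sketch: signals reach distant signals by
# hopping through adjacent ones. A breadth-first search finds one relay path.
from collections import deque

def relay_path(links: dict, src: str, dst: str):
    """Return one hop-by-hop path from src to dst, or None if unreachable."""
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in links.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

# signals 1100-1 .. 1100-4 placed in a line, each reaching only its neighbors
links = {"1100-1": ["1100-2"], "1100-2": ["1100-1", "1100-3"],
         "1100-3": ["1100-2", "1100-4"], "1100-4": ["1100-3"]}
print(relay_path(links, "1100-1", "1100-4"))  # multi-hop path via 1100-2, 1100-3
```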
  • in the above explanation, the power switching section 243 changes the state between the first and second modes on the basis of a voltage or current of the electric power supplied to the power supply section 241.
  • however, the power switching section 243 is not limited to this and may change the state between the first and second modes on the basis of a control signal for mode change.
  • This control signal for mode change may be received through radio communication, for example.
  • this control signal for mode change may be broadcast in a disaster or the like. Therefore, each signal 1100 can change the mode reliably without depending on a voltage or current of electric power supplied to the power supply section 241 in a disaster.
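A minimal sketch of this mode selection, assuming a supply-voltage threshold and a boolean broadcast flag (both hypothetical, not specified in the embodiment):

```python
# Sketch of the mode change logic described above: enter the second
# (autonomous) mode either when the supplied voltage degrades or when a
# broadcast mode-change control signal is received, e.g. in a disaster.
# The 80% threshold and nominal voltage are illustrative assumptions.

def select_mode(supply_voltage: float, mode_change_signal: bool,
                nominal_voltage: float = 100.0) -> int:
    """Return 1 (normal, externally controlled) or 2 (autonomous) mode."""
    if mode_change_signal:
        return 2                       # broadcast override, independent of power
    if supply_voltage < 0.8 * nominal_voltage:
        return 2                       # external power lost or degraded
    return 1

print(select_mode(100.0, False))  # normal supply, no override -> mode 1
print(select_mode(100.0, True))   # broadcast mode-change signal -> mode 2
```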
  • the signal 1100 may be a temporary signal for construction.
  • the signal 1100 may operate by supply of electric power only from the battery section 242 .
  • the signal 1100 changes the display state of the signal display unit 230 fixed to the signal 1100 on the basis of an image captured by the imaging unit 210. Therefore, just by installing the signal 1100, which is such a temporary signal for construction, in the lane, the user can appropriately perform control to change the display state of the signal display unit 230 even when there is no communication between the high-order control apparatus 1200 and the signal 1100 and a control signal cannot be received from the high-order control apparatus 1200. Accordingly, the signal 1100 can control traffic appropriately. For this reason, such a signal 1100 is suitable for construction.
  • the power supply unit 240 of the signal 1100 may supply electric power only from the battery section 242 to each component provided in the signal 1100 .
  • the power switching section 243 may change the state to the second mode when electric power is first supplied, and may remain in the second mode thereafter.
  • when the signal 1100 is a temporary signal for construction, electric power may not be supplied from the outside.
  • because the power supply unit 240 of the signal 1100 supplies electric power only from the battery section 242 to each component provided in the signal 1100, the display control section 222 can change the display state of the signal display unit 230 even if electric power is not supplied from the outside in this manner.
  • the display control section 222 changes the display state of the signal display unit 230 through the I/F 271 on the basis of a control signal received from the high-order control apparatus 1200 by the communication section 223 .
  • the method used when the display control section 222 changes the display state of the signal display unit 230 in the first mode is not limited to this.
  • the display control section 222 may change the display state of the signal display unit 230 on the basis of the volume of traffic detected by the detection section 221 . That is, also in the first mode, the display control section 222 may change the display state of the signal display unit 230 on the basis of the volume of traffic detected by the detection section 221 , in the same manner as in the case of the second mode.
  • the display control section 222 may change the display state of the signal display unit 230 on the basis of a predetermined timing set in advance.
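The first-mode options above amount to a fallback order: a control signal from the high-order control apparatus when one is available, otherwise the detected traffic volume, otherwise a predetermined timing. A sketch with hypothetical names and return values:

```python
# Illustrative fallback order for the first mode described above. The tuple
# tags and argument names are assumptions made for this sketch.

def choose_display_basis(control_signal, detected_volume):
    """Pick what the display state change should be based on, in priority order."""
    if control_signal is not None:
        return ("control_signal", control_signal)   # from high-order apparatus
    if detected_volume is not None:
        return ("traffic_volume", detected_volume)  # from the detection section
    return ("fixed_timing", None)                   # predetermined timing

print(choose_display_basis(None, 42))    # no high-order signal: use the volume
print(choose_display_basis(None, None))  # neither: fall back to fixed timing
```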
  • the signal 1100 according to the present embodiment may also be applied to the case where vehicles pass through the right side of the lane in the same manner as in the case where vehicles pass through the left side of the lane.
  • in the case of right-hand traffic, the right turn described above for the case of left-hand traffic becomes a left turn.
  • that is, the right turn in the case of left-hand traffic or the left turn in the case of right-hand traffic described above means that a vehicle changes course so as to cross the opposite lane to the lane in which the vehicle has traveled until now.
  • the “display control section 222 changes the display state of the signal display unit 230 so that priority is given to right turn” described above means that “when a vehicle changes course so as to cross the opposite lane to the lane in which the vehicle has traveled until now, the control unit 220 (or the display control section 222) changes the display state of the signal display unit 230 on the basis of an image captured by the imaging unit 210 so that priority is given to the vehicle which changes course.”
  • the “display control section 222 changes the display state of the signal display unit 230 so that the right turn signal time is adjusted” described above means that “when a vehicle changes course so as to cross the opposite lane to the lane in which the vehicle has traveled until now, the control unit 220 (or the display control section 222) changes the display state of the signal display unit 230 on the basis of an image captured by the imaging unit 210 so that the period of ‘movable’ time for the vehicle which changes course is adjusted.”
  • the signal 1100 according to the present embodiment can make the vehicle easily change the course so as to cross the opposite lane to the lane in which the vehicle has traveled until now in both the cases of left-hand traffic and right-hand traffic.
  • the signal 1100 according to the present embodiment can reduce (alleviate) traffic congestion caused when vehicles change course so as to cross the opposite lane to the lane in which the vehicles have traveled until now.
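The adjustment of the "movable" period for course-crossing vehicles could, for example, scale with the number of waiting turning vehicles counted in the captured image. All constants below are illustrative assumptions, not values from the embodiment:

```python
# Hedged sketch: lengthen the "movable" (turn arrow) period with the number
# of vehicles waiting to cross the opposite lane, as counted in the captured
# image, up to a cap. Base, per-vehicle, and maximum seconds are assumptions.

def turn_arrow_seconds(waiting_turn_vehicles: int,
                       base_s: int = 10, per_vehicle_s: int = 3,
                       max_s: int = 40) -> int:
    return min(max_s, base_s + per_vehicle_s * waiting_turn_vehicles)

print(turn_arrow_seconds(0))   # no queue: base period only
print(turn_arrow_seconds(15))  # long queue: capped at the maximum
```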
  • the signal 1100 according to the present embodiment corresponds to both left-hand traffic and right-hand traffic.
  • the control unit 220 may determine to which of the left-hand traffic and the right-hand traffic the signal 1100 corresponds on the basis of the setting.
  • the signal 1100 may determine whether the lane in which traffic is controlled by the signal 1100 is left-hand traffic or right-hand traffic on the basis of an image captured by the imaging unit 210 . In addition, on the basis of this determination result, the signal 1100 may set to which of left-hand traffic and right-hand traffic the signal 1100 corresponds, through its own setting unit.
  • the signal 1100 may also be a signal when a person crosses the road (otherwise, roadway) or a signal for a train on the railroad without being limited to only the vehicle.
  • each component provided in the signal control apparatus 1100 in FIG. 9 may be realized by dedicated hardware.
  • each component may be configured by a memory and a CPU (central processing unit) and the function may be realized by loading a program for realizing the function of each component into the memory and executing it.
  • processing by each component may be executed by recording a program for realizing the embodiment of the present invention in a computer-readable recording medium, reading the program recorded in the recording medium into a computer system, and executing the read program.
  • the “computer system” referred to herein may include hardware, such as an OS (Operating System) or a surrounding device.
  • the “computer system” may also include a homepage presenting environment (or display environment) if a WWW system is used.
  • the “computer-readable recording medium” refers to writable nonvolatile memories such as a flexible disk, a magneto-optical disc, a ROM, and a flash memory, portable media such as a CD-ROM, and a recording device such as a hard disk built in a computer system.
  • the “computer-readable recording medium” may also include a recording medium that stores a program dynamically for a short period of time, such as a network like the Internet or a communication line like a telephone line when a program is transmitted through it, and a recording medium that stores a program for a predetermined period of time, such as a volatile memory (for example, a DRAM (Dynamic Random Access Memory)) in a computer system serving as a server or a client in that case.
  • the program may be transmitted from a computer system, which has a storage device or the like that stores the program, to another computer system through a transmission medium or a transmission wave in the transmission medium.
  • the ‘transmission medium’ that transmits a program refers to a medium with a function of transmitting information like a network (communication network), such as the Internet, or a communication line, such as a telephone line.
  • the above program may be a program for realizing some of the functions described above or may be a so-called difference file (difference program) capable of realizing the above functions by combination with a program already recorded in the computer system.

Abstract

An information control apparatus including a determination unit that determines at least an attribute of an object to be analyzed on the basis of captured image data acquired by an imaging apparatus fixed to a signal and that generates determination result information. The information control apparatus further includes a surrounding information generating unit that generates a surrounding information table of the signal which corresponds to the determination result information, a data analyzing unit that generates analysis result information on the basis of the surrounding information table generated by the surrounding information generating unit, and an output unit that outputs the analysis result information to an external apparatus.

Description

    CROSS-REFERENCE TO RELATED APPLICATION
  • This application is a non-provisional application claiming priority to and the benefit of U.S. provisional application Nos. 61/508,026, filed on Jul. 14, 2011, and 61/508,536, filed on Jul. 15, 2011, and U.S. non-provisional application Ser. No. 13/198,676, filed on Aug. 4, 2011. This application also claims priority to and the benefit of Japanese Patent Application Nos. 2010-177724, filed on Aug. 6, 2010, and 2010-177725, filed on Aug. 6, 2010. The entire contents of these applications are incorporated herein by reference.
  • BACKGROUND
  • 1. Field of the Invention
  • The present invention relates to an information control apparatus, a data analyzing apparatus, a signal, a server, an information control system, a signal control apparatus, and a program.
  • 2. Description of Related Art
  • For example, there is a television camera apparatus for traffic monitoring in which a camera is mounted in a traffic signal and which takes photographs near the intersection and monitors the traffic situation using the photographed image such as described in Japanese Unexamined Patent Application Publication No. 11-261990.
  • In addition, conventionally, a traffic signal receives a control signal for changing a display state of a signal display unit from a high-order control apparatus and changes the display state of the signal display unit on the basis of the received control signal, for example, such as described in Japanese Unexamined Patent Application Publication No. 10-97696.
  • SUMMARY
  • However, the television camera apparatus for traffic monitoring disclosed in Japanese Unexamined Patent Application Publication No. 11-261990 displays a photographed image on a monitor, and a person needs to view and confirm the photographed image in order to acquire the information regarding a vehicle within the photographed image, for example. When a person extracts the information regarding a vehicle and the like within the photographed image on the basis of an image photographed in a traffic signal, there has been a problem in that a large amount of time and effort are required.
  • Moreover, in the traffic signal disclosed in Japanese Unexamined Patent Application Publication No. 10-97696, there is a problem in that the display state of the signal display unit cannot be appropriately changed when a control signal cannot be received from the high-order control apparatus at the time of disaster, for example.
  • The aspect related to the present invention has been made in view of the above point, and it is an object of the aspect related to the present invention to provide an information control apparatus capable of acquiring the information regarding a vehicle and the like within an image photographed near a traffic signal, data analyzing apparatus, signal, server, information control system, and program.
  • It is another object of the aspect related to the present invention to provide signal control apparatus, traffic signal, and program that are capable of appropriately changing a display state of a signal display unit even if a control signal for changing the display state of the signal display unit cannot be received.
  • An aspect of the present invention has been made to solve the above-described problems and is characterized in that a determination unit, which determines at least an attribute of an object to be analyzed on the basis of captured image data acquired by an imaging apparatus fixed to a signal, and an output unit, which outputs the determination result information of the determination unit to a data analyzing unit that generates analysis result information based on at least the attribute of the object to be analyzed, are provided.
  • In addition, another aspect of the present invention is characterized in that a control unit which changes a display state of a signal display unit fixed to a signal based on an image captured by an imaging unit, and a battery unit which supplies electric power to each component of the own apparatus, are provided.
  • According to an aspect of the present invention, it is possible to acquire the information regarding a vehicle and the like within an image photographed near a signal.
  • In addition, according to another aspect of the present invention, it is possible to appropriately change a display state of a signal display unit even if a control signal for changing the display state of the signal display unit cannot be received.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a schematic diagram showing the configuration of an information control system related to a first embodiment of the present invention.
  • FIG. 2 is a block diagram showing each configuration of the information control system related to the first embodiment of the present invention.
  • FIG. 3 shows an example of a table of signal surrounding information related to the first embodiment of the present invention.
  • FIG. 4 shows an example of a table of vehicle attribution information related to the first embodiment of the present invention.
  • FIG. 5 shows an example of a table of person attribution information related to the first embodiment of the present invention.
  • FIG. 6 is a flow chart for explaining the processing flow of a signal information control apparatus related to the first embodiment of the present invention.
  • FIG. 7 is a flow chart for explaining the processing flow of a signal information control server related to the first embodiment of the present invention.
  • FIG. 8 is a flow chart for explaining the processing flow of a mobile communication device related to the first embodiment of the present invention.
  • FIG. 9 is a block diagram showing the configuration of a signal system and a signal control apparatus according to a second embodiment of the present invention.
  • FIG. 10 is a state transition diagram illustrating the transition between modes of a signal according to the second embodiment of the present invention.
  • FIG. 11 is an explanatory view showing a crossroads (first example) as an example in the second embodiment of the present invention.
  • FIG. 12 is an explanatory view showing an example of the detected traffic volume in the second embodiment of the present invention.
  • FIG. 13 is an explanatory view showing an example of periods of lighting and extinguishing of a signal lamp in the second embodiment of the present invention.
  • FIG. 14 is an operation diagram showing the operation of the signal according to the second embodiment of the present invention.
  • FIG. 15 is an explanatory view showing a crossroads (second example) as an example in the second embodiment of the present invention.
  • DESCRIPTION OF EMBODIMENTS First Embodiment
  • A first embodiment of the present invention will be described in detail with reference to the drawings. FIG. 1 shows the configuration of an information control system related to the first embodiment.
  • As shown in FIG. 1, an information control system 100 includes a plurality of signal information control apparatus 2_1, 2_2, . . . , 2_n mounted in a plurality of signals (traffic signals) 1_1, 1_2, . . . , 1_n and a signal information control server 3. The plurality of signal information control apparatus 2_1, 2_2, . . . , 2_n and the signal information control server 3 are communicably connected to each other through a network NW.
  • In addition, the plurality of signal information control apparatus 2_1, 2_2, . . . , 2_n and the signal information control server 3 can communicate with a plurality of mobile communication devices 5_1, . . . , 5_m, which is mounted in a plurality of vehicles 4_1, . . . , 4_m, through the network NW. In addition, although a communication device with a car navigation function and the like mounted in a vehicle is described as an example of the mobile communication devices 5_1, . . . , 5_m herein, the present invention is not limited to this, and each of the mobile communication devices 5_1, . . . , 5_m may be a personal computer or the like of a user in a vehicle.
  • Next, the configuration of the plurality of signals 1_1, 1_2, . . . , 1_n, the plurality of signal information control apparatus 2_1, 2_2, . . . , 2_n, and the mobile communication devices 5_1, . . . , 5_m will be described with reference to FIG. 2. FIG. 2 is a block diagram showing an example of the configuration.
  • Moreover, for convenience of explanation, the following explanation will be given in a state where an example of a signal applicable to the plurality of signals 1_1, 1_2, . . . , 1_n is set as a signal (traffic signal) 1, an example of a signal information control apparatus applicable to the plurality of signal information control apparatus 2_1, 2_2, . . . , 2_n is set as a signal information control apparatus 2, and an example of a mobile communication device applicable to the plurality of mobile communication devices 5_1, . . . , 5_m is set as a mobile communication device 5.
  • As shown in FIG. 2, the signal information control apparatus 2 and an imaging apparatus 6 are fixed to the signal 1. The signal 1 and the signal information control apparatus 2 are connected to each other through an I/F (interface) 71. The signal information control apparatus 2 and the imaging apparatus 6 are connected to each other through an I/F 72.
  • This signal 1 includes a signal display unit 11 and a control device 12. The signal display unit 11 includes light emitting sections which emit light of green (or blue), red, yellow, and the like and is controlled by the control device 12 so that the light emitting section of each color emits light at a predetermined timing. The control device 12 controls the light emitting section of each color of the signal display unit 11 so that predetermined signal display can be performed. This control device 12 may control the light emitting section of each color to emit light according to a timing determined in advance, or may control the light emitting section of each color to emit light according to a control signal input from the signal information control apparatus 2.
  • The imaging apparatus 6 includes an imaging unit 61 which is a camera capable of capturing a moving image or an image, for example. This imaging unit 61 captures an image or a video image near an intersection where the signal 1 is fixed, and outputs the captured image, which is obtained by imaging, to the signal information control apparatus 2 through the I/F 72. In addition, when an object to be analyzed is a vehicle, a high-resolution camera capable of detecting the specific features indicating the license plate number, vehicle type, and the like by image processing is used as the imaging apparatus 6. In addition, it is preferable for the imaging apparatus 6 to have sensitivity high enough to detect the specific features of an object to be analyzed even in the case of imaging at night, and to detect those features even in the case of a vehicle traveling at high speed.
  • The imaging unit 61 acquires and outputs the captured image data at intervals of 1/60 second, for example.
  • The signal information control apparatus 2 includes an image processing unit 21, a determination unit 22, a signal surrounding information generating unit 23, a storage unit 24, a control unit 25, a communication unit 26, a temperature sensor 27, a timepiece unit 28, and a microphone 29. A signal ID (identification) which is a unique identification number is given to the signal information control apparatus 2. This signal ID is information for identifying each signal information control apparatus 2 and is also information matched with the position where the signal information control apparatus 2 is placed.
  • In the temperature sensor 27, a sensor section which detects the temperature is mounted in the signal 1 in a state exposed to the outside of the signal 1, and the temperature sensor 27 detects a temperature near the signal 1 and outputs the temperature information indicating this temperature to the signal surrounding information generating unit 23.
  • The timepiece unit 28 measures date and time and outputs the information indicating the measured date and time to the signal surrounding information generating unit 23.
  • The microphone 29 is mounted in the signal 1 in a state exposed to the outside of the signal 1, and the microphone 29 detects a sound near the signal 1 and outputs the sound information indicating this sound to the signal surrounding information generating unit 23.
  • The captured image data acquired by the imaging apparatus 6 is input to the image processing unit 21 through the I/F 72, and the image processing unit 21 detects an object to be analyzed which is present within an image of the captured image data. For example, the image processing unit 21 detects an object which moves (moving object), such as a vehicle or a person, as an object to be analyzed.
  • In addition, the image processing unit 21 calculates a motion vector of the captured image data which continues in time series, and detects an image region corresponding to the moving object on the basis of the motion vector and also detects the moving speed of the moving object which is an object to be analyzed.
  • The image processing unit 21 assigns a unique image ID to each item of the input captured image data and also outputs the captured image data, the image ID, and the information indicating the detected moving speed to the signal surrounding information generating unit 23 in a state matched with each other. In addition, the information which is output from the image processing unit 21 and which is obtained by matching the captured image data, the image ID, and the information indicating the moving speed with each other is called image processing result information hereinafter.
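The speed detection from time-series captured image data can be illustrated as follows: given an object's pixel position in two successive frames captured at 1/60-second intervals, the displacement yields a speed estimate. The pixels-to-meters calibration is an assumed value, and the actual motion-vector computation the image processing unit 21 performs is not shown.

```python
# Illustrative stand-in for the motion-vector based speed detection above:
# estimate a moving object's speed from its pixel displacement between two
# frames captured 1/60 second apart. meters_per_pixel is an assumed
# calibration, not a value from the embodiment.

FRAME_INTERVAL_S = 1.0 / 60.0

def estimate_speed(pos_a, pos_b, meters_per_pixel: float = 0.05) -> float:
    """Speed in m/s from two successive frame positions (pixel coordinates)."""
    dx, dy = pos_b[0] - pos_a[0], pos_b[1] - pos_a[1]
    pixels = (dx * dx + dy * dy) ** 0.5
    return pixels * meters_per_pixel / FRAME_INTERVAL_S

# a vehicle moving 5 pixels to the right between two consecutive frames
print(round(estimate_speed((100, 50), (105, 50)), 1))  # speed in m/s
```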
  • The data of an image region corresponding to the object to be analyzed (here, a moving object) is input from the image processing unit 21 to the determination unit 22. The determination unit 22 determines the information regarding the moving object included in the image region by performing pattern recognition on the data of the image region and outputs the determination result information indicating this determination result.
  • For example, the determination unit 22 determines the number of moving objects, type (a vehicle or a person) of a moving object, and the attribute of a moving object on the basis of the data of the image region corresponding to the object to be analyzed (moving object) detected by the image processing unit 21 and outputs the determination result information indicating the number, types, and attributes of moving objects.
  • That is, when the type of an object to be analyzed is a vehicle, the determination unit 22 acquires the attributes of the vehicle, such as the type (a bicycle, a large motorbike, a motor scooter, a sedan type automobile, a minivan type automobile, a light truck, or a heavy truck), a vehicle body color, the license plate number, the number of occupants, driver's sex, driver's age, and the like by pattern recognition, for example. When the type of an object to be analyzed is a person, the determination unit 22 acquires the attributes of the person, such as, for example, sex, age, height, clothing, moving method (on foot, a bicycle, or a motorbike), belongings (kinds of belongings, such as a baby carriage or a stick), and the like by pattern recognition.
  • In addition, the determination unit 22 determines the weather at the time of imaging by analyzing the captured image data input from the image processing unit 21. For example, the determination unit 22 determines whether the weather at the time of imaging is sunny, cloudy, rain, or snow on the basis of the brightness and the color of the sky in the captured image data, the presence of rain or snow in the image, and the like.
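The weather determination above can be sketched as follows. This is a minimal illustration in Python; the thresholds, the feature names (mean brightness, blue ratio of the sky region), and the use of temperature to separate rain from snow are assumptions for the sake of the example, not values from the specification.

```python
# Hypothetical sketch of the weather determination of the determination
# unit 22.  All thresholds and features below are illustrative
# assumptions, not values taken from the specification.

def classify_weather(mean_brightness: float, sky_blue_ratio: float,
                     precipitation_detected: bool, temperature_c: float) -> str:
    """Classify the weather at imaging time as sunny, cloudy, rain, or snow."""
    if precipitation_detected:
        # Separate rain from snow by the ambient temperature (assumption).
        return "snow" if temperature_c <= 0.0 else "rain"
    if mean_brightness > 0.6 and sky_blue_ratio > 0.5:
        return "sunny"
    return "cloudy"
```

In practice the precipitation cue and sky statistics would come from image analysis of the captured image data, and the temperature from the temperature sensor 27.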
  • The signal surrounding information generating unit 23 writes the input information in, for example, a table of signal surrounding information shown in FIG. 3, a table of vehicle attribution information shown in FIG. 4, and a table of person attribution information shown in FIG. 5 all of which are stored in the storage unit 24 in advance.
  • The image processing result information acquired by the image processing unit 21 and the determination result information acquired by the determination unit 22 are input to the signal surrounding information generating unit 23, and the signal surrounding information generating unit 23 writes the image processing result information and the determination result information in each corresponding table of the signal surrounding information, the vehicle attribution information, and the person attribution information.
  • In addition, the signal surrounding information generating unit 23 writes each item of the captured image data in each corresponding table of the signal surrounding information table, the vehicle attribution information table, and the person attribution information table together with the temperature information indicating the temperature near the signal 1 detected by the temperature sensor 27, the time information indicating a time measured by the timepiece unit 28, and the sound information indicating the sound near the signal 1 acquired by the microphone 29, respectively.
  • Here, the table of signal surrounding information will be described with reference to FIG. 3. Herein, an example is shown of information based on captured image data sampled at 5-minute intervals from the captured image data continuously acquired at 1/60-second intervals by the imaging apparatus 6. The present invention is not limited to this configuration: image processing result information and determination result information based on all items of the captured image data may be matched with the table of signal surrounding information, the table of vehicle attribution information, and the table of person attribution information. Alternatively, information based on captured image data acquired at certain fixed intervals may be matched with each table, or only information acquired on the basis of captured image data acquired when a moving object is detected by the image processing unit 21 may be matched with each table.
  • As shown in FIG. 3, the table of signal surrounding information is a table in which an image ID, date and time, the number of vehicles, the number of persons, vehicle attribution information, person attribution information, the weather, temperature, a noise level, and a signal lighting color are matched with each other.
  • The date and time is information indicating the date and time at which the captured image data of the corresponding image ID was acquired by the imaging apparatus 6, as measured by the timepiece unit 28. Alternatively, the date and time at which the captured image data is input from the imaging apparatus 6 to the signal information control apparatus 2 through the I/F 72 may be used as the imaging timing.
  • The number of vehicles is the number of vehicles included in the captured image data of the corresponding image ID.
  • The number of persons is the number of persons included in the captured image data of the corresponding image ID.
  • The vehicle attribution information includes a vehicle ID of each vehicle included in the captured image data of the corresponding image ID. This vehicle ID is unique information assigned to each vehicle within an image, and is an identifier for matching each vehicle with its attribution information with reference to the table of vehicle attribution information shown in FIG. 4.
  • The person attribution information includes a person ID of each person included in the captured image data of the corresponding image ID. This person ID is unique information assigned to each person within an image, and is an identifier for matching each person with its attribution information with reference to the table of person attribution information shown in FIG. 5.
  • The weather is information indicating the weather determined by the determination unit 22. In addition, when the information indicating the weather of an area where the signal 1 is placed is received from an external server, which is connected thereto through the network NW, by the communication unit 26, the weather information may be the information obtained by the communication unit 26.
  • The temperature is information indicating a temperature detected by the temperature sensor 27 when the captured image data is imaged by the imaging apparatus 6.
  • The noise level is information indicating the sound volume of sound information determined by the signal information control apparatus 2 on the basis of the sound information acquired by the microphone 29.
  • The signal lighting color indicates a color (green, red, yellow) lit by the signal display unit 11 of the signal 1 when the captured image data of the corresponding image ID is imaged. The information indicating the signal lighting color is included in a control signal output from the control unit 25 to the signal 1, is output from the control unit 25 to the determination unit 22, and is input from the determination unit 22 to the signal surrounding information generating unit 23 together with determination result information.
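The columns of the table of signal surrounding information described above can be represented, as a minimal sketch assuming Python, by a record type such as the following; the field names mirror the columns of FIG. 3 but are otherwise illustrative.

```python
# Illustrative record layout for one row of the table of signal
# surrounding information (FIG. 3).  Field names are assumptions that
# mirror the columns described in the text.

from dataclasses import dataclass, field
from typing import List

@dataclass
class SignalSurroundingRecord:
    image_id: str                    # unique ID assigned by the image processing unit 21
    date_time: str                   # measured by the timepiece unit 28
    num_vehicles: int                # vehicles in the captured image
    num_persons: int                 # persons in the captured image
    vehicle_ids: List[str] = field(default_factory=list)  # keys into the FIG. 4 table
    person_ids: List[str] = field(default_factory=list)   # keys into the FIG. 5 table
    weather: str = ""                # determined by the determination unit 22
    temperature_c: float = 0.0       # from the temperature sensor 27
    noise_level_db: float = 0.0      # derived from the microphone 29
    signal_lighting_color: str = ""  # green / yellow / red at imaging time

# One hypothetical row:
record = SignalSurroundingRecord("0002", "2010-08-06 10:05", 15, 5,
                                 ["V001"], ["P001"], "sunny", 28.5, 62.0, "green")
```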
  • Next, the table of vehicle attribution information will be described with reference to FIG. 4.
  • As shown in FIG. 4, the table of vehicle attribution information is a table in which a vehicle ID, a license plate number, a vehicle type, a vehicle body color, the number of occupants, driver's sex, driver's age, and traveling speed are matched with each other.
  • The vehicle ID is information which specifies each vehicle included in the captured image data of the corresponding image ID.
  • The license plate number, the vehicle type, the vehicle body color, the number of occupants, the driver's sex, the driver's age, and the traveling speed are information indicating the attributes of a vehicle indicated by the vehicle ID.
  • Next, the table of person attribution information will be described with reference to FIG. 5.
  • As shown in FIG. 5, the person attribution information table is a table in which a person ID, sex, age, height, clothing, moving method (on foot, a bicycle, or a motorbike), belongings (kinds of belongings, such as a baby carriage or a stick), and the walking speed are matched with each other.
  • The person ID is information which specifies each person included in the captured image data of the corresponding image ID.
  • The age, sex, height, clothing, moving method, belongings, and walking speed are information indicating the attributes of a person indicated by the person ID.
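As described above, the vehicle ID and person ID act as keys from the table of signal surrounding information into the attribution tables of FIG. 4 and FIG. 5. A hypothetical lookup, with illustrative field names and IDs, might look like this:

```python
# Illustrative vehicle attribution table (FIG. 4) keyed by vehicle ID,
# and a lookup that resolves the vehicle IDs stored in a signal
# surrounding record.  IDs and field names are assumptions.

vehicle_attribution = {
    "V001": {"license_plate": "12-34", "vehicle_type": "sedan",
             "body_color": "white", "occupants": 2,
             "traveling_speed_kmh": 42.0},
}

def attributes_for(vehicle_ids, table=vehicle_attribution):
    """Resolve each vehicle ID to its attribution row; unknown IDs are skipped."""
    return [table[v] for v in vehicle_ids if v in table]

rows = attributes_for(["V001", "V999"])  # "V999" has no attribution row
```

The table of person attribution information would be resolved the same way via the person ID.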
  • Referring back to FIG. 2, the storage unit 24 stores a table of signal surrounding information, a table of vehicle attribution information, a table of person attribution information, and the captured image data matched therewith. In addition, the storage unit 24 stores a signal ID assigned in advance to each signal information control apparatus 2.
  • The control unit 25 generates a control signal for controlling the lighting timing of a light emitting section of each color of the signal display unit 11 so that predetermined signal display of the signal 1 is performed, and outputs the control signal to the signal 1 through the I/F 71.
  • The communication unit 26 is communicably connected to the signal information control server 3 through the network NW. The communication unit 26 transmits the table of signal surrounding information, the table of vehicle attribution information, the table of person attribution information, and the captured image data, which are stored in the storage unit 24, to the signal information control server 3 periodically or in response to a request from the signal information control server 3. The communication unit 26 transmits this information matched with the signal ID.
  • A power supply unit 73 supplies stored electric power to the signal 1, the signal information control apparatus 2, and the imaging apparatus 6.
  • The signal information control server 3 includes a communication unit 31, a data analyzing unit 32, an output unit 33, and a storage unit 34.
  • The communication unit 31 is communicably connected to the signal information control apparatus 2 through the network NW. The communication unit 31 outputs to the data analyzing unit 32 the table of signal surrounding information, the table of vehicle attribution information, the table of person attribution information, and the captured image data received from the signal information control apparatus 2.
  • The data analyzing unit 32 stores the table of signal surrounding information, the table of vehicle attribution information, the table of person attribution information, and the captured image data, which have been received from the signal information control apparatus 2 through the communication unit 31, in the storage unit 34. The data analyzing unit 32 performs various kinds of data analyses, which will be described later, on the basis of the information stored in the storage unit 34 and generates the analysis result information, which is based on the attributes of the object to be analyzed, on the basis of the determination result information and the like.
  • The output unit 33 is, for example, a display device such as a liquid crystal display, or a data communication unit that transmits information, image data, or the like to an external device or the mobile communication device 5, and it outputs the analysis result information generated by the data analyzing unit 32. For example, when the data analyzing unit 32 generates analysis result information corresponding to a certain vehicle, the output unit 33 transmits the analysis result information to that vehicle. In addition, the output unit 33 transmits the captured image data corresponding to that vehicle to the vehicle on the basis of the analysis result information.
  • The storage unit 34 stores the table of signal surrounding information, the table of vehicle attribution information, the table of person attribution information, and the captured image data received from the signal information control apparatus 2. This storage unit 34 includes a table in which each signal ID and the position, at which the signal 1 indicated by the signal ID is placed, are matched with each other.
  • The mobile communication device 5 includes a communication unit 51, a control unit 52, and an output unit 53.
  • The communication unit 51 is communicably connected to the signal information control apparatus 2 and the signal information control server 3 through the network NW. The communication unit 51 outputs to the control unit 52 the captured image data received from the signal information control apparatus 2 or the analysis result information received from the signal information control server 3.
  • The control unit 52 performs control to output the captured image data and the analysis result information, which have been received through the communication unit 51, to the output unit 53.
  • The output unit 53 is, for example, a display device or a data output unit that outputs data to an external display device; it is controlled by the control unit 52 and outputs the captured image data and the analysis result information.
  • Next, processing of the signal information control apparatus 2 will be described with reference to FIG. 6. FIG. 6 is a flow chart for explaining an example of the processing flow of the signal information control apparatus 2.
  • As shown in FIG. 6, the imaging unit 61 of the imaging apparatus 6 images an image near the intersection. Then, the captured image data of the captured image is input to the image processing unit 21 of the signal information control apparatus 2 through the I/F 72 (step ST1). The image processing unit 21 assigns a unique image ID to the input captured image data. For example, the following explanation will be given using the case where the image processing unit 21 assigns an image ID “0002” to the input captured image data as an example.
  • The image processing unit 21 calculates a motion vector of the captured image data (image ID “0002”) and the captured image data (for example, an image with an image ID “0001”) acquired in the past which continues in time series. The image processing unit 21 detects an image region corresponding to the moving object on the basis of the calculated motion vector and also calculates the moving speed of the moving object. For example, the image processing unit 21 detects 20 image regions corresponding to the moving object and calculates the moving speed of each image region.
  • That is, the image processing unit 21 acquires the image processing result information including the captured image data, the image ID “0002”, data indicating the image region corresponding to the moving object (for example, information which specifies corresponding pixels and the pixel value), and information indicating the moving speed of each image region (step ST2).
  • Then, the image processing unit 21 matches the captured image data, the image ID “0002”, and the data indicating the image region corresponding to the moving object with each other and outputs them to the determination unit 22. In addition, the image processing unit 21 matches the captured image data, the image ID “0002”, and the information indicating the moving speed of each image region with each other and outputs them to the signal surrounding information generating unit 23.
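The moving-speed calculation of steps ST1 and ST2 can be illustrated with a hypothetical helper that converts the displacement of an image region's centroid between two consecutive frames (1/60 second apart, per the imaging interval stated above) into a speed. The metres-per-pixel scale factor, which would in practice depend on the camera geometry, is an assumption.

```python
# Hypothetical moving-speed estimate from the motion vector of an image
# region between two consecutive frames.  The metres-per-pixel scale is
# an illustrative assumption.

FRAME_INTERVAL_S = 1.0 / 60.0  # imaging interval of the imaging apparatus 6

def estimate_speed_kmh(prev_xy, curr_xy, metres_per_pixel=0.05):
    """Speed of an image region whose centroid moved from prev_xy to curr_xy."""
    dx = curr_xy[0] - prev_xy[0]
    dy = curr_xy[1] - prev_xy[1]
    pixels = (dx * dx + dy * dy) ** 0.5
    metres_per_second = pixels * metres_per_pixel / FRAME_INTERVAL_S
    return metres_per_second * 3.6  # convert m/s to km/h

# An object displaced 3 pixels between frames at 0.05 m/pixel:
speed = estimate_speed_kmh((100, 200), (103, 200))
```

The image processing unit 21 would apply such a calculation to each of the detected image regions (for example, the 20 regions mentioned above).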
  • Then, the determination unit 22 determines the information regarding the moving object included in the image region by performing pattern recognition on the data of the image region corresponding to the moving object input from the image processing unit 21 and outputs the determination result information indicating the determination result (step ST3).
  • For example, by performing pattern recognition for determining the type and the number of moving objects, the determination unit 22 determines that the data of the plurality of image regions corresponding to the moving objects indicates 15 vehicles and five persons.
  • In addition, the determination unit 22 determines the attributes indicated by the data of the plurality of image regions corresponding to a vehicle, such as the vehicle type, vehicle body color, license plate number, number of occupants, driver's sex, and driver's age, by performing pattern recognition for these predetermined attributes of the moving object (vehicle). Similarly, the determination unit 22 determines the attributes indicated by the data of the plurality of image regions corresponding to a person, such as the age of the person, by performing pattern recognition for these predetermined attributes of the moving object (person). In addition, the determination unit 22 determines the weather at the time of imaging by analyzing the captured image data input from the image processing unit 21.
  • Then, the signal surrounding information generating unit 23 generates the table of signal surrounding information, the table of vehicle attribution information, and the table of person attribution information on the basis of the image processing result information acquired by the image processing unit 21, the determination result information acquired by the determination unit 22, the temperature information indicating the temperature near the signal 1 detected by the temperature sensor 27, the time information indicating a time measured by the timepiece unit 28, and the sound information indicating the sound near the signal 1 acquired by the microphone 29. That is, the signal surrounding information generating unit 23 stores the table of signal surrounding information, the table of vehicle attribution information, and the table of person attribution information in the storage unit 24 so as to match the input information (step ST4).
  • Then, the communication unit 26 gives a signal ID to the table of signal surrounding information, the table of vehicle attribution information, the table of person attribution information, and the captured image data, which are stored in the storage unit 24, and transmits them to the signal information control server 3 (step ST5).
  • Then, the communication unit 26 determines whether or not the analysis result information, which is a result of data analysis of the signal information control server 3, has been received (step ST6). When the analysis result information has been received (step ST6-YES), the communication unit 26 transmits the analysis result information to the mobile communication device 5 by radio communication (step ST7).
  • Then, when the new captured image data is input from the imaging apparatus 6 (step ST8-YES), the signal information control apparatus 2 returns to step ST2 again to repeat processing.
  • Next, processing of the signal information control server 3 will be described with reference to FIG. 7. FIG. 7 is a flow chart for explaining an example of the processing flow of the signal information control server 3.
  • As shown in FIG. 7, the communication unit 31 of the signal information control server 3 transmits a signal which requests transmission of the acquired information, for example, to the signal information control apparatus 2 and receives the table of signal surrounding information, the table of vehicle attribution information, the table of person attribution information, and the captured image data received from the signal information control apparatus 2 (step ST21). The communication unit 31 outputs the received information to the data analyzing unit 32.
  • The data analyzing unit 32 stores the table of signal surrounding information, the table of vehicle attribution information, the table of person attribution information, and the captured image data, which have been received from the signal information control apparatus 2 through the communication unit 31, in the storage unit 34 and performs desired data analysis, which will be described later, on the basis of the information stored in the storage unit 34 (step ST22).
  • When transmitting the analysis result information acquired by data analysis to the mobile communication device 5 (step ST23-YES), the data analyzing unit 32 transmits the analysis result information to the signal information control apparatus 2, which is indicated by the signal ID and from which the data has been transmitted in step ST21, through the communication unit 31 (step ST24).
  • In addition, when transmitting the analysis result information acquired by data analysis to the output unit 33 (step ST25-YES), the data analyzing unit 32 displays an image indicating the analysis result on a display screen of the output unit 33, for example (step ST26). In addition, as described above, the output unit 33 may transmit the analysis result information to the corresponding mobile communication device 5 according to the analysis result information, or may transmit the captured image data to the corresponding mobile communication device 5. In this case, the output unit 33 may transmit the analysis result information or the captured image data to the corresponding mobile communication device 5 directly through the network NW, or may transmit the analysis result information or the captured image data to the corresponding mobile communication device 5 indirectly through the signal information control apparatus 2.
  • Here, when data analysis is not ended, the process returns to step ST22 to repeat processing (step ST27-NO).
  • Next, processing of the mobile communication device 5 will be described with reference to FIG. 8. FIG. 8 is a flow chart for explaining an example of the processing flow of the mobile communication device 5.
  • The communication unit 51 of the mobile communication device 5 performs radio communication with the communication unit 26 of the signal information control apparatus 2. When the transmission information is received in the area of communication with the signal information control apparatus 2 (step ST41-YES), the communication unit 51 of the mobile communication device 5 outputs the information to the control unit 52.
  • For example, when the communication unit 51 receives from the signal information control apparatus 2 the analysis result information which is transmitted from the signal information control server 3 to the signal information control apparatus 2, the control unit 52 displays an image indicating the analysis result on a display screen of the output unit 53, for example (step ST42).
  • Moreover, as described above, the mobile communication device 5 may receive the information or data, which is directly transmitted from the output unit 33 of the signal information control server 3, without being limited to the above method.
  • Here, data analysis by the data analyzing unit 32 of the signal information control server 3 will be described. The data analyzing unit 32 can execute at least one of the data analyses described below.
  • <Data Analysis for Monitoring the Driving Conditions of a Traveling Vehicle>
  • The data analyzing unit 32 of the signal information control server 3 determines a dangerous area where a traffic accident tends to occur by data analysis, for example.
  • For example, the data analyzing unit 32 calculates the incidence rate of sudden braking by a vehicle, which is an object to be analyzed, on the basis of the determination result information by statistical processing and acquires it as analysis result information. In addition, the data analyzing unit 32 calculates the incidence rate of sudden braking on the basis of the position corresponding to the signal ID by statistical processing for each area and acquires it as analysis result information.
  • Specifically, the data analyzing unit 32 counts the number of vehicles that brake suddenly from the change in the traveling speed of each vehicle, on the basis of the traveling speed in the table of vehicle attribution information. The data analyzing unit 32 then statistically processes the rate of suddenly braking vehicles for each intersection.
  • The analysis result information indicating the rate of suddenly braking vehicles in each area, acquired as described above, is useful in that a traffic accident can be predicted at an intersection with a high rate of sudden braking, and a warning display or the like indicating the danger at that intersection can be performed.
  • In addition, on the basis of the analysis result of the data analyzing unit 32, the signal information control server 3 may cause the signal information control apparatus 2, which is mounted in the signal 1 at an intersection with a high rate of sudden braking, to transmit information for displaying a warning message to the mobile communication device 5 of a vehicle passing through the intersection.
  • In addition, the data analyzing unit 32 may detect the traveling speed of a vehicle on the basis of the information in the tables of vehicle attribution information transmitted from the signal information control apparatuses 2 of a plurality of adjacent signals 1 and may specify a vehicle that brakes suddenly.
  • For example, the data analyzing unit 32 determines a vehicle indicating the same license plate number, vehicle type, vehicle body color, and the like to be the same vehicle on the basis of the information of tables of vehicle attribution information received from the signal information control apparatus 2_1, 2_2, and 2_3 mounted in the signals 1_1, 1_2, and 1_3 which are disposed continuously in the traveling direction of the lane. The data analyzing unit 32 can determine whether or not the vehicle decelerates rapidly by comparing the traveling speed when the vehicle travels between the signals 1_1 and 1_2, the traveling speed when the vehicle travels between the signals 1_2 and 1_3, and the traveling speed immediately before the signal 1_3, and the like.
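The sudden-braking check above — comparing the traveling speed of the same vehicle over successive signal-to-signal segments — can be sketched as follows. This is a minimal illustration assuming Python; the 30 km/h drop threshold is an assumption, not a value from the specification.

```python
# Hypothetical sudden-braking detection from a sequence of traveling
# speeds of the same vehicle, e.g. between signals 1_1/1_2, between
# 1_2/1_3, and immediately before 1_3.  The drop threshold is an
# illustrative assumption.

def detect_sudden_braking(speeds_kmh, drop_threshold_kmh=30.0):
    """Return True if the speed falls by more than the threshold
    between two consecutive readings."""
    for prev, curr in zip(speeds_kmh, speeds_kmh[1:]):
        if prev - curr > drop_threshold_kmh:
            return True
    return False
```

The vehicle matching itself (same license plate number, vehicle type, vehicle body color) is what allows speeds from different signal information control apparatuses to be treated as one sequence.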
  • In addition, without being limited to vehicles that brake suddenly, the data analyzing unit 32 may detect, by analyzing the captured image data, a vehicle deviating from the middle of its lane, a vehicle passing other vehicles, a bicycle or a person crossing the roadway, and the like, and may statistically process the detection rate thereof. Such information is useful in that locations at which dangerous driving and the like occur can be discovered.
  • In addition, the data analyzing unit 32 may determine a possibility of collision by calculating the traveling speeds of vehicles entering the intersection from opposite directions. For example, the data analyzing unit 32 detects, on the basis of the determination result information, at least two vehicles, each of which is an object to be analyzed and enters the intersection from a different direction, calculates the possibility of collision of the vehicles at the intersection on the basis of the moving speeds of the vehicles, and acquires it as analysis result information.
  • Specifically, it is assumed that there are signal information control apparatuses 2_11, 2_12, and 2_13, which are mounted in signals 1_11, 1_12, and 1_13 disposed continuously in the traveling direction of a first lane, and signal information control apparatuses 2_14, 2_15, and 2_16, which are mounted in signals 1_14, 1_15, and 1_16 disposed continuously in a second lane crossing the first lane at an intersection U. In this case, the signals 1_13 and 1_16 are set at the same intersection U and control the flow of traffic in the first and second lanes, respectively.
  • The data analyzing unit 32 determines a vehicle indicating the same license plate number, vehicle type, vehicle body color, and the like to be the same vehicle on the basis of the information of the table of vehicle attribution information and detects a vehicle A entering the intersection U from the first lane and a vehicle B entering the intersection U from the second lane. The data analyzing unit 32 calculates the traveling speeds of the vehicles A and B and determines whether or not the timing at which the vehicles A and B enter the intersection U is the same when the vehicles A and B enter the intersection U at the traveling speeds. When the entrance timing is the same, the data analyzing unit 32 determines that the possibility of collision is high and transmits the analysis result information, which indicates transmission of a message prompting slowing down because of the risk of collision, to the signal information control apparatus 2 of the signal 1 in the lane in which the vehicles A and B are traveling. The signal information control apparatus 2 which receives this analysis result information transmits a message, which prompts slowing down because of the risk of collision, to the vehicle A or B traveling in the communications area.
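The timing comparison above can be sketched as follows. This is a minimal illustration assuming Python; the distances, speeds, and the margin within which two arrivals count as "the same timing" are assumptions for the example.

```python
# Hypothetical collision check for two vehicles A and B approaching the
# intersection U from crossing lanes.  Arrival times within margin_s of
# each other are treated as "the same timing" (assumption).

def collision_possible(dist_a_m: float, speed_a_kmh: float,
                       dist_b_m: float, speed_b_kmh: float,
                       margin_s: float = 2.0) -> bool:
    t_a = dist_a_m / (speed_a_kmh / 3.6)  # seconds until A enters U
    t_b = dist_b_m / (speed_b_kmh / 3.6)  # seconds until B enters U
    return abs(t_a - t_b) <= margin_s
```

When this check returns True, the message prompting slowing down would be sent to the signal information control apparatuses 2 of the lanes in which A and B are traveling.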
  • In addition, the data analyzing unit 32 may calculate the existence of traffic congestion and the length of traffic congestion on the basis of the number of vehicles and the traveling speed of the table of signal surrounding information.
  • For example, the data analyzing unit 32 generates the information regarding road congestion caused by vehicles on the basis of the determination result information and acquires it as analysis result information.
  • When a plurality of vehicles whose traveling speeds are equal to or lower than a fixed speed are detected, the data analyzing unit 32 calculates the length of the line of these vehicles in the traveling direction of the lane. In addition, the data analyzing unit 32 reads the road information stored in the storage unit 34 in advance, specifies the road where traffic congestion is occurring, and generates congestion information indicating that road. The data analyzing unit 32 transmits the congestion information to the mobile communication device 5 through the network NW.
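The congestion-length calculation can be sketched as follows. This is an illustration assuming Python; the 10 km/h slow-speed threshold and the representation of each vehicle by a position along the lane are assumptions.

```python
# Hypothetical congestion-length estimate: the extent, along the
# traveling direction of the lane, spanned by vehicles moving at or
# below a fixed slow speed.  Threshold and positions are illustrative.

def congestion_length_m(speeds_kmh, positions_m, slow_threshold_kmh=10.0):
    """positions_m[i] is the distance of vehicle i along the lane."""
    slow = [p for s, p in zip(speeds_kmh, positions_m)
            if s <= slow_threshold_kmh]
    if len(slow) < 2:
        return 0.0  # no queue: fewer than two slow vehicles
    return max(slow) - min(slow)
```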
  • The mobile communication device 5 is a device with a car navigation function, for example. The mobile communication device 5 receives the congestion information and outputs information indicating that traffic congestion is occurring on the basis of this congestion information. In addition, the mobile communication device 5 notifies the user of a path change when it determines, on the basis of the congestion information, that traffic congestion is occurring at the current position or on a path to the destination. The table of signal surrounding information, the table of vehicle attribution information, and the table of person attribution information are useful in that traffic congestion can be reduced when users change their paths accordingly.
  • In addition, the data analyzing unit 32 may detect an illegally parked vehicle on the basis of the traveling speed and the vehicle attribution information of the table of signal surrounding information.
  • For example, the data analyzing unit 32 acquires the information regarding a vehicle in violation of traffic rules, among vehicles which are objects to be analyzed, as analysis result information on the basis of the determination result information.
  • The data analyzing unit 32 detects a parked vehicle from the determination result of the determination unit 22 on the basis of the traveling speed in a plurality of items of the captured image data. When it is determined that the vehicle has been parked in a no-parking area for longer than a predetermined period of time, the data analyzing unit 32 reads the information including the license plate number, the vehicle type, and the like, which indicates the attributes included in the vehicle attribution information, from the table of vehicle attribution information and acquires it as traffic violation information.
  • In addition, the data analyzing unit 32 can acquire useful information by data analysis in order to crack down on traffic violations, such as speeding, signal violation, and other driving violations, as well as parking violations.
  • For example, when it is determined that a specific vehicle is traveling at a speed equal to or higher than the regulation speed on the basis of the information transmitted from the plurality of signal information control apparatuses 2, the data analyzing unit 32 reads the information specifying the vehicle from the table of vehicle attribution information and acquires it as traffic violation information for speeding.
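The speeding determination from two signal passages can be sketched as follows. This is a minimal illustration assuming Python; the distance between the two signals and the passage timestamps would come from the tables transmitted by two signal information control apparatuses, and the values here are hypothetical.

```python
# Hypothetical speeding check: average speed of the same vehicle
# (matched by license plate number, etc.) between two signals a known
# distance apart.

def average_speed_kmh(distance_m: float, t_first_s: float,
                      t_second_s: float) -> float:
    """Average speed over the segment, converted from m/s to km/h."""
    return distance_m / (t_second_s - t_first_s) * 3.6

def is_speeding(distance_m: float, t_first_s: float, t_second_s: float,
                limit_kmh: float) -> bool:
    return average_speed_kmh(distance_m, t_first_s, t_second_s) > limit_kmh
```

For example, 500 m covered in 20 seconds corresponds to an average of 90 km/h, which exceeds a 60 km/h regulation speed.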
  • In addition, when it is determined, on the basis of the signal lighting color in the table of signal surrounding information transmitted from the signal information control apparatus 2, that a vehicle has passed through the intersection while the signal was red, the data analyzing unit 32 reads the information specifying the vehicle from the table of vehicle attribution information and acquires it as traffic violation information for ignoring the signal.
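The signal-violation check can be illustrated as follows, assuming Python. The phase-schedule representation is an assumption for the example; in the specification, the lighting color at imaging time is recorded directly in the table of signal surrounding information.

```python
# Hypothetical red-light check: was the signal red at the moment the
# vehicle entered the intersection?  The phase schedule format
# (start, end, color) is an illustrative assumption.

def red_light_violation(entry_time_s, signal_phases):
    """signal_phases: list of (start_s, end_s, color) tuples."""
    for start, end, color in signal_phases:
        if start <= entry_time_s < end:
            return color == "red"
    return False  # entry time outside the known schedule

phases = [(0, 30, "green"), (30, 33, "yellow"), (33, 60, "red")]
```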
  • In addition, when the data analyzing unit 32 observes the traveling path and the traveling speed of each vehicle and detects a vehicle turning right at an intersection where a right turn is prohibited, or a vehicle entering the intersection without stopping, on the basis of the information transmitted from the plurality of signal information control apparatuses 2, the data analyzing unit 32 reads the information specifying the vehicle from the table of vehicle attribution information and acquires it as traffic violation information.
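The red-light violation check described above can be sketched as follows. This is only an illustration: the record layouts and field names (`lighting_color`, `passed_intersection`, `vehicle_id`) are assumptions, not the data format defined by the specification, which keeps this information in the table of signal surrounding information and the table of vehicle attribution information.

```python
# Hypothetical sketch: select attribution records for vehicles observed
# passing through the intersection while the signal lighting color was red.
def detect_red_light_violations(surrounding_records, vehicle_attribution):
    """surrounding_records: list of observation dicts (assumed layout).
    vehicle_attribution: vehicle_id -> attribute dict (e.g. license plate)."""
    violations = []
    for record in surrounding_records:
        if record["lighting_color"] == "red" and record["passed_intersection"]:
            attrs = vehicle_attribution.get(record["vehicle_id"])
            if attrs is not None:
                violations.append(attrs)  # acquired as traffic violation information
    return violations
```

The same filtering pattern applies to speeding (compare a recorded speed against the regulation speed) and to right-turn or no-stop violations (compare the observed traveling path against the rules for the intersection).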
  • In addition, driving of each vehicle may be controlled on the basis of the analysis result information acquired by the data analyzing unit 32 as described above. In this case, the mobile communication device 5 is connected to a driving unit of the vehicle and controls a traveling direction, a traveling speed, and the like of the vehicle according to the analysis result information received from the signal information control server 3. For example, when the analysis result information of the data analyzing unit 32 indicates that the possibility of a collision is high, the mobile communication device 5 of the corresponding vehicle controls the driving unit of the vehicle to reduce the speed.
  • In addition, the data analyzing unit 32 may transmit the captured image data to the mobile communication device 5 together with the analysis result information. Then, the mobile communication device 5 mounted in the vehicle can receive, for example, the congestion information and the captured image data, which is an image of the road under traffic congestion, from the signal information control server 3.
  • In addition, even if there is no traffic congestion, the data analyzing unit 32 receives the information indicating the traveling path of the vehicle from the mobile communication device 5 and acquires, by search, the captured image data transmitted from the signal information control apparatus 2 of the signal 1 which corresponds to the road along which the vehicle will travel next on this traveling path. Then, the data analyzing unit 32 transmits the captured image data to the mobile communication device 5. As a result, the user can check the situation along the traveling path by an image.
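The search described above can be sketched as a simple lookup. This is a minimal illustration under assumed names: the mapping of signal identifiers to captured image data, and the representation of the traveling path as an ordered list of signal identifiers, are assumptions for the sketch, not part of the specification.

```python
# Hypothetical sketch: collect captured image data for the signals 1 that
# lie along the vehicle's remaining traveling path, in path order.
def images_along_path(path_signal_ids, images_by_signal):
    """path_signal_ids: ordered ids of signals on the traveling path (assumed).
    images_by_signal: signal id -> latest captured image data (assumed)."""
    return [images_by_signal[sid]
            for sid in path_signal_ids
            if sid in images_by_signal]
```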
  • <Data Analysis for Acquiring Data Usable in a Store Open Plan>
  • The data analyzing unit 32 of the signal information control server 3 may acquire the information for analyzing a new store open plan by data analysis and output it to a computer which displays the information regarding the new store open plan, for example.
  • This computer displays on a display screen, for example, the trade area information, the property information, and the information indicating the traffic conditions around stores or the information indicating features of passers-by.
  • For example, this computer displays a map designated by the user on the display screen, reads the information relevant to this map from a database, and displays it. The information relevant to this map is information indicating the volume of people passing by or the volume of traffic in this area or the attributes or features regarding this area.
  • The data analyzing unit 32 of the signal information control server 3 acquires, by analysis, the information indicating the features of the area in association with this map.
  • For example, the data analyzing unit 32 acquires the information, which can be used in marketing analysis of the area where the signal is placed, as analysis result information on the basis of the number or attributes of objects to be analyzed based on the determination result information.
  • On the basis of the information indicating the date and time in the table of signal surrounding information and the information indicating the number of vehicles and the number of people, the data analyzing unit 32 calculates the volume of people passing by or the volume of traffic in the area where the signal 1 is placed during each period of the day, such as morning, daytime, evening, and nighttime, and acquires it as an analysis result. For example, if the volume of vehicle traffic is large in the morning and evening, it can be estimated that those who commute in vehicles pass through the area with the signal 1. Therefore, this is useful information in that the features of the trade area can be analyzed.
  • In addition, the data analyzing unit 32 acquires the information indicating the features of persons passing through this area in vehicles, as an analysis result, on the basis of the vehicle type, driver's sex, or driver's age in the table of vehicle attribution information. For example, if the vehicle type is a one-box car, the driver's age is in the twenties or thirties, and the sex is female, it can be estimated that there is a high possibility that the driver is a housewife of a large family. Therefore, this is useful information in that the features of the trade area can be analyzed.
  • In addition, the data analyzing unit 32 acquires the information indicating the features of persons passing through this area, as an analysis result, on the basis of the sex, age, moving method, and belongings in the table of person attribution information. For example, if the sex is female, the age is "twenties to thirties", and the moving method is "pushing a baby carriage", it can be estimated that there is a high possibility that the person is a housewife with an infant. Therefore, this is useful information in that the features of the trade area can be analyzed.
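The period-of-day tally described above can be sketched as a simple aggregation. The record layout (an hour of the day plus vehicle and people counts) and the boundaries of the morning, daytime, evening, and nighttime periods are assumptions for illustration; the specification leaves these details to the table of signal surrounding information.

```python
# Hypothetical sketch: tally the volume of traffic and volume of people
# passing by per period of the day from signal-surrounding records.
def traffic_by_period(records):
    """records: list of dicts with 'hour', 'vehicles', 'people' keys (assumed)."""
    def period(hour):
        # Assumed period boundaries; not specified by the document.
        if 5 <= hour < 10:
            return "morning"
        if 10 <= hour < 17:
            return "daytime"
        if 17 <= hour < 21:
            return "evening"
        return "nighttime"

    totals = {}
    for r in records:
        bucket = totals.setdefault(period(r["hour"]), {"vehicles": 0, "people": 0})
        bucket["vehicles"] += r["vehicles"]
        bucket["people"] += r["people"]
    return totals
```

A large morning and evening vehicle total in the result would support the commuter-area estimate made in the text.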
  • In addition, the signal 1 is provided at the place where people or vehicles come and go in many cases. Accordingly, by using the analysis result by data analysis, marketing trends according to the characteristics of the area where the signal 1 is placed can be analyzed from the overall perspective.
  • As a result, in a franchise business, such as a convenience store or a pharmacy, it is possible to make a plan for opening a new store at a favorable location. For example, when opening a new store, a franchise business company performs evaluation and decision-making at the time of new store open planning using the store location map information, surrounding information, trade area information, population, trade area, expected sales, a layout pattern, and the like in paper media materials. In addition, a franchise candidate also examines the profitability or growth potential of the store opening using the same data.
  • As described above, by acquiring the information indicating the characteristics of the area from the captured image data by each signal information control apparatus 2, various kinds of information regarding a new store open plan for supporting a new store open plan can be acquired.
  • In addition, the data analyzing unit 32 may analyze a change in volume of people passing by according to the weather on the basis of the weather, temperature, and the number of people of the table of signal surrounding information.
  • In addition, the data analyzing unit 32 may analyze the atmosphere of the area statistically using the information indicating the noise level of the table of signal surrounding information. For example, the area with a low average noise level can be estimated to be a quiet residential area.
  • In addition, the storage unit 34 may store the information for performing pattern recognition, and the data analyzing unit 32 may perform pattern recognition on the captured image data transmitted from the signal information control apparatus 2. For example, the information which specifies the type or brand of clothes may be prepared in advance as pattern information stored in the storage unit 34, and the data analyzing unit 32 may determine the type or brand of clothing of a person included in the captured image data.
  • <Billing for Analysis Result Information Transmission>
  • The data analyzing unit 32 of the signal information control server 3 may bill the user of the mobile communication device 5 to which the analysis result information was transmitted.
  • For example, when analysis result information to be transmitted to a user who has joined in advance the service for transmission of specific analysis result information is acquired as a result of data analysis, the data analyzing unit 32 transmits the analysis result information to the mobile communication device 5 of the user and also records in the storage unit 34 that the analysis result information has been transmitted to the user. Then, for example, after a fixed period has elapsed, the data analyzing unit 32 transmits the service charges, payment of which is requested from each user, to a billing center or the like according to the number of times the analysis result information was transmitted to each user, the type of the transmitted information, and the like. This billing center collects the charges for the service which transmits the analysis result information described previously to users. A server which can communicate with the signal information control server 3 is provided in the billing center. This server stores the personal information (for example, a user name or an identification number of a mobile communication device) of a user who has joined the service for transmission of analysis result information and the service content (for example, the type of analysis result information whose transmission is requested by the user), and transmits this information to the signal information control server 3.
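The charge calculation described above (a fee per transmission, varying with the type of transmitted information) can be sketched as follows. The log and rate representations are assumptions for illustration; the specification only says that charges depend on the number of transmissions and the information type.

```python
# Hypothetical sketch: total the service charges per user from a log of
# (user id, information type) transmission events and a per-type rate table.
def compute_charges(transmission_log, rates):
    """transmission_log: list of (user_id, info_type) tuples (assumed).
    rates: info_type -> charge per transmission (assumed)."""
    charges = {}
    for user_id, info_type in transmission_log:
        charges[user_id] = charges.get(user_id, 0) + rates[info_type]
    return charges
```

The resulting per-user totals correspond to the amounts the data analyzing unit 32 would report to the billing center after the fixed period.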
  • Since the imaging apparatus 6 related to the present embodiment has high resolution and high sensitivity as described above, the signal information control apparatus 2 can acquire the detailed information, such as the type or attributes of an object to be analyzed, included in an image on the basis of the captured image data photographed by the imaging apparatus 6. As a result, the signal information control apparatus 2 can generate a table of signal surrounding information, a table of vehicle attribution information, and a table of person attribution information matched with the type, attributes, and the like of an object to be analyzed. In addition, the data analyzing unit 32 can acquire the characteristics resulting from behavior patterns of people in the area where the signal 1 is placed, by analysis, using the table of signal surrounding information, the table of vehicle attribution information, and the table of person attribution information.
  • Therefore, the table of signal surrounding information, the table of vehicle attribution information, the table of person attribution information, and the analysis result information acquired in this way may be used to monitor the traffic situation and to understand marketing trends according to economic trends, such as a store open plan, as described above.
  • According to an aspect related to the present invention, it is possible to comprehensively acquire the information regarding the area by acquiring such useful information from the signal 1 placed in most areas where people live.
  • In addition, the present invention is not limited to the embodiments described above, and may have the following configuration.
  • For example, although the configuration where the data analyzing unit 32 is mounted in the signal information control server 3 has been described as an example, the data analyzing unit 32 may be mounted in each signal information control apparatus 2. In this case, the data analyzing unit 32 mounted in the signal information control apparatus 2 may use the table of signal surrounding information, the table of vehicle attribution information, the table of person attribution information, and the captured image data which are stored in the storage unit 24 by its own signal information control apparatus 2. In addition, the data analyzing unit 32 mounted in the signal information control apparatus 2 may perform the data analysis described above using both the information transmitted from other signal information control apparatuses 2 and the information of its own storage unit 24.
  • As a result, since communication with the signal information control server 3 can be reduced, the processing speed can be improved.
  • In addition, although the configuration in which the signal information control apparatus 2 is mounted in the signal 1 has been described, the present invention is not limited to this, and the signal information control apparatus 2 may be mounted in the signal information control server 3. In this case, the signal information control apparatus 2 mounted in the signal information control server 3 receives the captured image data transmitted from the imaging apparatus 6 mounted in the signal 1 and performs the same processing as described above.
  • In addition, the signal information control apparatus 2 may have a configuration including a display device, such as a liquid crystal display or an electroluminescent display panel, and may display the information according to the analysis result information on data analysis received from the signal information control server 3.
  • In addition, the signal 1 may be a movable signal placed in construction sites or the like. The imaging apparatus 6 may be a camera capable of performing imaging in a range of 360°.
  • In addition, although only a signal for vehicles has been described as the signal 1 above, the signal 1 is not limited to vehicles and may also be a pedestrian signal used when a person crosses a road (or roadway) or a railroad signal for trains.
  • The mobile communication device 5 is not limited to a device mounted in a vehicle. Alternatively or additionally, the mobile communication device 5 may be a personal digital assistant, a personal computer, or any of various other devices with communication functions. Even when the user is not in a vehicle, the data from the signal information control apparatus 2 can be received through the mobile communication device 5.
  • In addition, although the case where the object to be analyzed (moving object) detected by the image processing unit 21 is a vehicle or a person has been described, the object to be analyzed (moving object) is not limited to this. For example, the object to be analyzed (moving object) may be an animal, an insect, a floating object, and the like.
  • Second Embodiment
  • Hereinafter, a second embodiment of the present invention will be described with reference to the drawings. FIG. 9 is a schematic block diagram showing the configuration of a signal system 201 in the second embodiment of the present invention. In the signal system 201, a signal control apparatus 1100 provided in each of a plurality of signals and a high-order control apparatus 1200 are connected to each other through a communication network 1300. Hereinafter, explanation will be given referring to the signal control apparatus 1100 as a signal 1100.
  • Here, the explanation will be given assuming that the plurality of signals 1100 are placed on the road and each of them is identified by the identification information. The high-order control apparatus 1200 transmits a control signal, which is for controlling each signal 1100, to the signal 1100 through the communication network 1300. As a result, the high-order control apparatus 1200 can control each of the plurality of signals 1100.
  • The plurality of signals 1100 have the same configuration. Therefore, the configuration of one signal 1100 will be described herein. The signal 1100 includes an imaging unit 210, a control unit 220, a signal display unit 230, a power supply unit 240, and a sound pickup unit 250. The imaging unit 210 and the control unit 220 are connected to each other through an I/F (interface) 270. The control unit 220 and the signal display unit 230 are connected to each other through an I/F 271. The sound pickup unit 250 and the control unit 220 are connected to each other through an I/F (interface) 272.
  • The signal display unit 230 includes a first light emitting section 231, a second light emitting section 232, and a third light emitting section 233. The first light emitting section 231 lights green (or blue) and indicates "may move" (hereinafter, referred to as "movable") at the time of lighting. The second light emitting section 232 lights yellow and indicates, at the time of lighting, "stop at the stop position; however, you may move when it is not possible to stop at the stop position" (hereinafter, referred to as "stop"). The third light emitting section 233 lights red and indicates "should not move" (hereinafter, referred to as "not movable") at the time of lighting.
  • The imaging unit 210 includes an imaging apparatus, such as a CCD (Charge Coupled Device) image sensor or a CMOS (Complementary Metal Oxide Semiconductor) image sensor and outputs a captured image to the control unit 220 through the I/F 270. The imaging unit 210 is fixed to the signal 1100. For example, the imaging unit 210 is fixed to the signal 1100 so as to be located at the upper, lower, left, or right side of the signal display unit 230. In addition, this imaging unit 210 may be fixed to the signal 1100 integrally with the signal display unit 230.
  • As an example, the imaging unit 210 performs imaging in the azimuth of 360° in the horizontal direction. In this case, the imaging unit 210 may perform photographing in the azimuth of 360° by combining a plurality of imaging apparatuses. In addition, the imaging unit 210 may perform photographing in the azimuth of 360° by performing image processing of an image captured through a specular member with a shape of a triangular pyramid, a sphere, or the like.
  • The sound pickup unit 250 is a sound pickup device, such as a microphone, and outputs pickup sound to the control unit 220 through the I/F (interface) 272. For example, the sound pickup unit 250 may be a sound pickup device which includes a plurality of microphones and which picks up a sound so that the direction of a sound source can be specified. The sound pickup unit 250 is fixed to the signal 1100. For example, the sound pickup unit 250 is fixed to the signal 1100 so as to be located at the upper, lower, left, or right side of the signal display unit 230. In addition, this sound pickup unit 250 may be fixed to the signal 1100 integrally with the signal display unit 230.
  • The power supply unit 240 supplies electric power to the imaging unit 210, the control unit 220, the signal display unit 230, and the sound pickup unit 250 which are respective components provided in the signal 1100.
  • The power supply unit 240 includes a power supply section 241, a battery section 242, and a power switching section 243. Electric power is supplied from the outside of the signal 1100 to the power supply section 241 through a power line. Electric power (electric charge) is accumulated in the battery section 242, and the battery section 242 outputs the accumulated electric power. For example, the battery section 242 is charged by electric power supplied to the power supply section 241 in a period for which electric power is supplied to the power supply section 241 from the outside. The battery section 242 is a secondary battery, for example.
  • For example, the power switching section 243 supplies electric power selected from either the power supply section 241 or the battery section 242 to each component provided in the signal 1100. The power switching section 243 detects a voltage or current of electric power supplied to the power supply section 241 and, on the basis of this detected voltage or current, switches the source which supplies electric power to each component provided in the signal 1100 to either the power supply section 241 or the battery section 242.
  • Here, mode switching by the power switching section 243 will be described using FIG. 10. In addition, the following explanation will be given assuming that the case where electric power is supplied from the power supply section 241 to each component provided in the signal 1100 by the power switching section 243 is called a “first mode” and the case where electric power is supplied from the battery section 242 to each component provided in the signal 1100 by the power switching section 243 is called a “second mode”.
  • First, when the signal 1100 starts (in FIG. 10, in the case of a state “startup”), the power switching section 243 changes the state to the first mode if first startup conditions are satisfied and changes the state to the second mode if second startup conditions are satisfied.
  • The first startup conditions may be conditions in which a voltage or current of electric power supplied to the power supply section 241 is larger than the threshold value set in advance, for example. In addition, the second startup conditions may be conditions in which a voltage or current of electric power supplied to the power supply section 241 is equal to or lower than the threshold value set in advance, for example. That is, the first startup conditions are conditions in which electric power from the outside is supplied to the signal 1100, and the second startup conditions are conditions in which electric power from the outside is not supplied to the signal 1100.
  • Then, for example, the power switching section 243 detects a voltage or current of electric power supplied to the power supply section 241 when electric power is supplied from the power supply section 241 to each component provided in the signal 1100 (when the state is the first mode). Then, when the detected voltage or current becomes equal to or lower than a threshold value set in advance (when the second condition is satisfied), the power switching section 243 changes the power supply to the battery section 242 (changes the state to the second mode).
  • For example, the second condition corresponds to a case where a power line through which electric power is supplied to the signal 1100 is cut or a case where the facility which supplies electric power to the signal 1100 fails in a disaster or the like.
  • In addition, the power switching section 243 detects a voltage or current of electric power supplied to the power supply section 241 when electric power is supplied from the battery section 242 to each component provided in the signal 1100 (when the state is the second mode). Then, when the detected voltage or current becomes equal to or higher than a threshold value set in advance (when the first condition is satisfied), the power switching section 243 changes the power supply to the power supply section 241 (changes the state to the first mode).
  • For example, the first condition corresponds to a case where the cut power line through which electric power is supplied to the signal 1100 is connected or a case where the failure of the facility which supplies electric power to the signal 1100 ends by restoration work after a disaster.
  • After that, the power switching section 243 changes the state between the first and second modes. In addition, the power switching section 243 outputs to the control unit 220 the information indicating that the state is changed from the first mode to the second mode and the information indicating that the state is changed from the second mode to the first mode. Alternatively, the power switching section 243 outputs to the control unit 220 the information indicating that the current state is the first mode or the second mode. Using this information, the control unit 220 can determine whether the current mode is the first mode or the second mode.
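The mode transitions described above amount to a small state machine driven by the detected voltage. The following sketch models it under the simplifying assumption of a single shared threshold and a voltage measurement (the specification allows a voltage or a current, and does not state whether the two conditions use the same threshold value); the class and attribute names are illustrative.

```python
class PowerSwitcher:
    """Hypothetical model of the power switching section 243: selects the
    first mode (external supply via power supply section 241) or the second
    mode (battery section 242) by comparing the detected supply voltage
    against a preset threshold."""
    FIRST_MODE = "first"    # power from the power supply section 241
    SECOND_MODE = "second"  # power from the battery section 242

    def __init__(self, threshold, startup_voltage):
        # At startup: first mode if external power is present, else second.
        self.threshold = threshold
        self.mode = (self.FIRST_MODE if startup_voltage > threshold
                     else self.SECOND_MODE)

    def update(self, detected_voltage):
        if self.mode == self.FIRST_MODE and detected_voltage <= self.threshold:
            self.mode = self.SECOND_MODE   # e.g. power line cut, facility failure
        elif self.mode == self.SECOND_MODE and detected_voltage > self.threshold:
            self.mode = self.FIRST_MODE    # e.g. external supply restored
        return self.mode
```

The control unit 220 would read `mode` (or the change notifications) to decide whether to follow the high-order control apparatus 1200 or to operate autonomously.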
  • Returning to the explanation of FIG. 9, the control unit 220 includes a detection section 221, a display control section 222, and a communication section 223. The communication section 223 receives a control signal for changing the display state of the signal display unit 230 from the high-order control apparatus 1200 through the communication network 1300. This control signal is control information indicating “movable”, “stop”, or “not movable”, for example. That is, this control signal is control information indicating that the first light emitting section 231, the second light emitting section 232, or the third light emitting section 233 is made to emit light.
  • In the first mode, the display control section 222 changes the display state of the signal display unit 230 through the I/F 271 on the basis of the control signal received from the high-order control apparatus 1200 by the communication section 223. For example, in the first mode, when the control information indicating that the first light emitting section 231 is made to light is received from the high-order control apparatus 1200 through the communication section 223, the display control section 222 makes the first light emitting section 231 provided in the signal display unit 230 light and extinguishes the second and third light emitting sections 232 and 233.
  • The detection section 221 detects the volume of traffic on the basis of an image captured by the imaging unit 210. In the second mode, the display control section 222 changes the display state of the signal display unit 230 on the basis of the volume of traffic detected by the detection section 221.
  • For example, the detection section 221 detects the volume of traffic in each of a plurality of lanes on the basis of an image of the plurality of lanes captured by the imaging unit 210. In this case, for the image of the plurality of lanes captured by the imaging unit 210, the detection section 221 detects vehicles one by one in each lane by image processing or a pattern matching technique, as an example. In addition, the detection section 221 detects the volume of traffic in each of the plurality of lanes by detecting the number of vehicles per unit time which travel in each lane.
  • The display control section 222 changes the display state of the signal display unit 230 on the basis of a result of comparison of the volumes of traffic in the plurality of lanes detected by the detection section 221. In this case, the display control section 222 changes the display state of the signal display unit 230 on the basis of the result of comparison of the volumes of traffic in the plurality of lanes detected by the detection section 221 so that vehicles traveling in the lane with the high volume of traffic can move with priority over vehicles traveling in the lane with the low volume of traffic.
  • In this way, in the first mode, the control unit 220 changes the display state of the signal display unit 230, which is fixed to the signal 1100, on the basis of the control signal received from the high-order control apparatus 1200 by the communication section 223. In addition, in the second mode, the control unit 220 changes the display state of the signal display unit 230, which is fixed to the signal 1100, on the basis of an image captured by the imaging unit 210.
  • Next, an example where the display control section 222 changes the display state of the signal display unit 230 on the basis of the volume of traffic detected by the detection section 221 in the second mode will be described using FIGS. 11 to 13.
  • Here, as shown in FIG. 11, a case of two lanes which are one-way streets will be described. The lane A which is one lane of the two lanes is a lane which is a one-way street from right to left on the plane of FIG. 11. The lane B which is one lane of the two lanes is a lane which is a one-way street from top to bottom on the plane of FIG. 11. The lanes A and B cross each other at the crossroads CRS.
  • In addition, in FIG. 11, a signal 1100-1 is placed before the crossroads CRS in the lane A. In addition, a signal 1100-2 is placed before the crossroads CRS in the lane B. The signals 1100-1 and 1100-2 have the same configuration as the signal 1100 described above using FIG. 9.
  • Moreover, in FIG. 11, there is a stop line L1 before the signal 1100-1 and a stop line L2 before the signal 1100-2 at the crossroads CRS. Therefore, at the crossroads CRS, a vehicle stops before the stop line L1 and a vehicle stops before the stop line L2 according to the signal display units 230 of the signals 1100-1 and 1100-2.
  • Next, the volume of traffic detected by the signal 1100 will be described using FIG. 12. Here, a case will be described where the signal display unit 230 of each of the signals 1100-1 and 1100-2 includes the first light emitting section 231, which lights green (or blue) indicating "movable", and the third light emitting section 233, which lights red indicating "not movable".
  • In addition, explanation herein will be given assuming that the control unit 220 of each of the signals 1100-1 and 1100-2 executes the extinguishing of the first light emitting section 231 and the lighting of the third light emitting section 233 almost simultaneously for the signal display unit 230 provided in each signal. Moreover, on the contrary, explanation will be given assuming that the control unit 220 of each of the signals 1100-1 and 1100-2 executes the lighting of the first light emitting section 231 and the extinguishing of the third light emitting section 233 almost simultaneously for the signal display unit 230 provided in each signal.
  • In addition, each of the signals 1100-1 and 1100-2 performs imaging in the azimuth of 360° in the horizontal direction, as described above. Therefore, the signals 1100-1 and 1100-2 can detect the volumes of traffic in the lanes A and B, respectively.
  • Here, as shown in FIG. 12, the signals 1100-1 and 1100-2 repeat display states of “movable” and “not movable” alternately through each signal display unit 230, for example. Explanation herein will be given assuming that periods of the display states of “movable” and “not movable” are the same in length of time.
  • For example, in a period T1, the signal 1100-1 indicates "movable" through its own signal display unit 230, and the signal 1100-2 indicates "not movable" through its own signal display unit 230. Then, in a period T2, the signal 1100-1 indicates "not movable" through its own signal display unit 230, and the signal 1100-2 indicates "movable" through its own signal display unit 230. Subsequently, the signals 1100-1 and 1100-2 repeat the same operation in periods T3, T4, . . . . In addition, the periods T1, T2, T3, T4, . . . are assumed to be the same in length of time.
  • Here, the detection section 221 of the signal 1100-1 detects the volume of traffic in each of the plurality of lanes on the basis of an image of the plurality of lanes captured by the imaging unit 210 of the signal 1100-1. In addition, the detection section 221 of the signal 1100-2 detects the volume of traffic in each of the plurality of lanes on the basis of an image of the plurality of lanes captured by the imaging unit 210 of the signal 1100-2.
  • For example, as shown in FIG. 12, the detection section 221 of the signal 1100-1 and the detection section 221 of the signal 1100-2 detect that five vehicles have passed through the lane A and no vehicle has passed through the lane B in the period T1, respectively. In addition, the detection section 221 of the signal 1100-1 and the detection section 221 of the signal 1100-2 detect that no vehicle has passed through the lane A and one vehicle has passed through the lane B in the period T2, respectively. Similarly, also in the periods T3, T4, . . . , the detection section 221 of the signal 1100-1 and the detection section 221 of the signal 1100-2 detect the volumes of traffic in the lanes A and B, respectively.
  • Then, the display control section 222 of the signal 1100-1 compares the volumes of traffic in the plurality of lanes detected by the detection section 221 of the signal 1100-1, and determines that the volume of traffic in the lane A is larger than that in the lane B in this case. For example, the display control section 222 of the signal 1100-1 may compare the volume of traffic in each lane on the basis of the sum or the average number of vehicles traveling in each lane in a period set in advance, such as the period T1 to the period T4.
  • Then, the display control section 222 of the signal 1100-1 changes the display state of the signal display unit 230 on the basis of this result so that vehicles traveling in the lane with the high volume of traffic can move with priority over vehicles traveling in the lane with the low volume of traffic. That is, in this case, since the volume of traffic in the lane A is larger than that in the lane B, the display control section 222 of the signal 1100-1 changes the display state of the signal display unit 230 so that vehicles traveling in the lane A can move with priority over vehicles traveling in the lane B.
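  • The comparison performed by the display control section 222 could be expressed as the following sketch (not the patent's actual implementation; `priority_lane` and the count data are hypothetical names, with the counts taken from the FIG. 12 example, where lane A sees 5, 0, 5, 0 vehicles over the periods T1 to T4 and lane B sees 0, 1, 0, 1):

```python
def priority_lane(counts_by_lane):
    """Return the lane with the largest total volume of traffic.

    counts_by_lane maps a lane name to the per-period vehicle counts
    detected over a period set in advance (here, T1 to T4).
    """
    totals = {lane: sum(counts) for lane, counts in counts_by_lane.items()}
    return max(totals, key=totals.get)

# Counts taken from the FIG. 12 example.
counts = {"A": [5, 0, 5, 0], "B": [0, 1, 0, 1]}
print(priority_lane(counts))  # prints "A": lane A carries the larger volume
```

  • The same function would work equally with averages instead of sums, since both lanes are summed over the same number of periods.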
  • On the other hand, the display control section 222 of the signal 1100-2 compares the volumes of traffic in the plurality of lanes detected by the detection section 221 of the signal 1100-2, and determines that the volume of traffic in the lane A is larger than that in the lane B in this case in the same manner as the display control section 222 of the signal 1100-1 does. Then, the display control section 222 of the signal 1100-2 changes the display state of the signal display unit 230 on the basis of this result so that vehicles traveling in the lane with the high volume of traffic can move with priority over vehicles traveling in the lane with the low volume of traffic. That is, in this case, since the volume of traffic in the lane A is larger than that in the lane B, the display control section 222 of the signal 1100-2 changes the display state of the signal display unit 230 so that vehicles traveling in the lane A can move with priority over vehicles traveling in the lane B in the same manner as the display control section 222 of the signal 1100-1 does.
  • Here, a physical quantity called the volume of traffic in each of a plurality of lanes has the same value as long as the measurement period is the same, even if the measurement apparatuses are different, like the signal 1100-1 and the signal 1100-2. Therefore, the values of the volumes of traffic in the plurality of lanes detected by the detection section 221 of the signal 1100-1 and the detection section 221 of the signal 1100-2 are the same. Accordingly, on the basis of the volume of traffic detected for each of the plurality of lanes, each of the display control section 222 of the signal 1100-1 and the display control section 222 of the signal 1100-2 can change the display state of the signal display unit 230 so that vehicles traveling in the lane A can move with priority over vehicles traveling in the lane B. Thus, even when the signals 1100-1 and 1100-2 operate independently of each other and even when the signals 1100-1 and 1100-2 do not receive control signals from the high-order control apparatus, it is possible to change the display state of the signal display unit 230 appropriately on the basis of the volume of traffic detected for each of the plurality of lanes.
  • In addition, the display control section 222 of the signal 1100-1 and the display control section 222 of the signal 1100-2 perform the following operations, as an example, when changing the display state of the signal display unit 230 so that vehicles traveling in the lane A can move with priority over vehicles traveling in the lane B.
  • For example, the display control section 222 of each of the signals 1100-1 and 1100-2 changes the display state of the signal display unit 230, on the basis of the volume of traffic in each of the plurality of lanes detected by the detection section 221 of the corresponding signal 1100, so that a period for which a vehicle can move in the lane with the high volume of traffic becomes longer than that in the lane with the low volume of traffic as the ratio or difference of the volumes of traffic in the lanes increases.
  • In addition, on the contrary, the display control section 222 of each of the signals 1100-1 and 1100-2 changes the display state of the signal display unit 230, on the basis of the volume of traffic in each of the plurality of lanes detected by the detection section 221 of the corresponding signal 1100, so that a period for which a vehicle cannot move in the lane with the low volume of traffic becomes longer than that in the lane with the high volume of traffic as the ratio or difference of the volumes of traffic in the lanes increases.
  • In the case of FIG. 12 described above, the display control section 222 of each of the signals 1100-1 and 1100-2 changes a period of the display state by the signal display unit 230 as shown in FIG. 13, as an example. For example, each display control section 222 changes a period of the display state by the signal display unit 230 so that the time lengths of the periods T11 and T13, in which the signal display unit 230 (lane A) of the signal 1100-1 indicates “movable” and the signal display unit 230 (lane B) of the signal 1100-2 indicates “not movable”, become longer than those of the periods T1 to T4 shown in FIG. 12.
  • On the contrary, each display control section 222 changes a period of the display state by the signal display unit 230 so that the time lengths of the periods T12 and T14, in which the signal display unit 230 (lane A) of the signal 1100-1 indicates “not movable” and the signal display unit 230 (lane B) of the signal 1100-2 indicates “movable”, become shorter than those of the periods T1 to T4 shown in FIG. 12. Subsequently, each display control section 222 repeats changing the display state of each signal display unit 230 in the same manner as in the case of the periods T11 to T14 until the volume of traffic changes.
  • Thus, the display control section 222 of the signal 1100 can change the display state of the signal display unit 230 on the basis of the volume of traffic in each of the plurality of lanes detected by the detection section 221 so that vehicles traveling in the lane with the high volume of traffic can move with priority over vehicles traveling in the lane with the low volume of traffic.
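  • One hypothetical way to realize this adjustment (a sketch under assumptions, not the claimed method) is to lengthen the “movable” period of the busier lane, and shorten that of the quieter lane, by an amount that grows with the difference of the volumes; the function name and all constants below are assumptions:

```python
def adjusted_periods(base, vol_high, vol_low, step=5, cap=60):
    """Return ('movable' seconds for the busy lane, for the quiet lane).

    The adjustment grows with the difference of the traffic volumes,
    bounded so the busy lane never exceeds `cap` seconds and the quiet
    lane never drops below `step` seconds. All constants are hypothetical.
    """
    delta = min((vol_high - vol_low) * step, cap - base)
    return base + delta, max(base - delta, step)

# With a 30-second base period, 5 vehicles vs. 1 vehicle widens the
# split to 50 seconds "movable" for the busy lane and 10 for the quiet one.
print(adjusted_periods(30, 5, 1))  # prints (50, 10)
```

  • A ratio of the volumes could be used in place of the difference with the same overall shape: the larger the imbalance, the longer the priority period.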
  • <Operation of the Signal 1100 when Changing to the Second Mode>
  • Next, an operation of the signal 1100 in the second mode will be described using FIG. 14. In addition, the operation of the signal 1100 in FIG. 14 is an operation of the signal 1100 when changing from the first mode to the second mode, for example, as described using FIG. 10.
  • First, the imaging unit 210 captures an image and outputs the captured image to the control unit 220 through the I/F 270 (step S10). Then, the detection section 221 of the control unit 220 detects the volume of traffic on the basis of the image captured by the imaging unit 210 (step S20). Then, the display control section 222 of the control unit 220 compares the volume of traffic in each of the plurality of lanes detected by the detection section 221 (step S30).
  • Then, the display control section 222 of the control unit 220 performs signal control by changing the display state of the signal display unit 230 on the basis of the comparison result in step S30 (step S40). In step S40, for example, the display control section 222 of the control unit 220 changes the display state of the signal display unit 230 on the basis of the result of comparison of the volumes of traffic in the plurality of lanes detected by the detection section 221 so that vehicles traveling in the lane with the high volume of traffic can move with priority over vehicles traveling in the lane with the low volume of traffic. In addition, in this step S40, the display control section 222 of the control unit 220 changes the display state of the signal display unit 230 through the I/F 271.
  • Subsequently, the signal 1100 repeats the processing from step S10 until it changes to the first mode.
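  • The loop of steps S10 to S40 can be sketched as below; the callables are stand-ins for the imaging unit 210, the detection section 221, and the signal display unit 230 (all hypothetical names, not the patent's interfaces):

```python
def second_mode_loop(capture_image, detect_volumes, set_display, in_second_mode):
    """Repeat steps S10-S40 until the signal changes back to the first mode.

    capture_image  -- stand-in for the imaging unit 210 (step S10)
    detect_volumes -- stand-in for the detection section 221 (step S20)
    set_display    -- stand-in for the signal display unit 230 (step S40)
    """
    while in_second_mode():
        image = capture_image()                  # step S10: capture an image
        volumes = detect_volumes(image)          # step S20: detect the volumes
        busiest = max(volumes, key=volumes.get)  # step S30: compare the lanes
        set_display(priority=busiest)            # step S40: signal control
```

  • In this sketch the comparison of step S30 simply picks the busiest lane; the loop condition models the change back to the first mode.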
  • In this way, in the second mode, the signal 1100 changes the display state of the signal display unit 230 fixed to the signal 1100 on the basis of the image captured by the imaging unit 210. Therefore, the signal 1100 according to the present embodiment can appropriately perform control to change the display state of the signal display unit 230 even when the signal 1100 cannot communicate with the high-order control apparatus and even when the signal 1100 cannot receive a control signal from the high-order control apparatus.
  • In addition, as shown in FIG. 12 or 13, the control unit 220 provided in each signal 1100 changes the display state of each signal display unit 230 so that the display state of the signal 1100-1 is different from the display state of the signal 1100-2.
  • Meanwhile, in the case of FIG. 11, the signals 1100-1 and 1100-2 need to control the signal display unit 230 of each signal 1100 using the control unit 220 of the corresponding signal 1100 so that a vehicle traveling in the lane A and a vehicle traveling in the lane B do not collide with each other at the crossroads CRS. For example, the control unit 220 of each signal 1100 needs to control the signal display unit 230 of the signal 1100 using the control unit 220 of the corresponding signal 1100 so that a vehicle traveling in the lane A and a vehicle traveling in the lane B do not move toward the crossroads CRS at the same timing.
  • That is, it is not allowed that both the display state of the signal 1100-1 and the display state of the signal 1100-2 indicate “movable” at the same timing. In addition, both the display state of the signal 1100-1 and the display state of the signal 1100-2 may indicate “not movable”.
  • Therefore, the control unit 220 provided in each signal 1100 changes the display state of each signal display unit 230 such that the display state of the signal 1100-1 is different from the display state of the signal 1100-2 and both the display states do not indicate “movable” at the same timing.
  • As a result, a vehicle traveling in the lane A and a vehicle traveling in the lane B can alternately enter the crossroads CRS without the collision between the vehicle traveling in the lane A and the vehicle traveling in the lane B at the crossroads CRS.
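  • This constraint can be stated as a small predicate (a hypothetical sketch): of the possible display-state pairs for the two signals, only those in which at most one signal indicates “movable” are permitted.

```python
def states_are_safe(state_1, state_2):
    """True when the pair of display states cannot send vehicles from the
    lane A and the lane B toward the crossroads CRS at the same timing."""
    # Both signals indicating "movable" at the same timing is not allowed;
    # both indicating "not movable" is allowed.
    return not (state_1 == "movable" and state_2 == "movable")

assert states_are_safe("movable", "not movable")
assert states_are_safe("not movable", "not movable")
assert not states_are_safe("movable", "movable")
```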
  • Meanwhile, as described above using FIG. 12, it is necessary for each apparatus to measure the value of the physical quantity called the volume of traffic in each of the plurality of lanes as the same value, even if the apparatuses which measure the volume of traffic are different, like the signal 1100-1 and the signal 1100-2. In order to do so, the start time and the end time of a period for which the number of vehicles is measured, such as the period T1, need to be synchronized between the signals 1100-1 and 1100-2.
  • For this synchronization, the signals 1100-1 and 1100-2 may be made to be able to detect a reference time by time measurement using a radio-controlled timepiece or by time measurement using a GPS (Global Positioning System). Then, each of the signals 1100-1 and 1100-2 measures an elapsed time from the detected reference time using a timepiece section present thereinside. Then, each of the signals 1100-1 and 1100-2 determines the start time and the end time of a period for which the number of vehicles is measured, such as the period T1, on the basis of the elapsed time measured by its timepiece section. In this way, the signals 1100-1 and 1100-2 may synchronize, between them, the start time and the end time of the period T1 for which the number of vehicles is measured.
  • Thus, even if the devices which measure the volume of traffic are different, the signals 1100-1 and 1100-2 can measure the value of the physical quantity called the volume of traffic in each of the plurality of lanes as the same value. Accordingly, the signals 1100-1 and 1100-2 can change the display state of the signal display unit 230 appropriately.
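  • Under the assumption that both signals can obtain the same reference time (from GPS or a radio-controlled timepiece), the synchronization can be sketched as follows: each signal derives the boundaries of the measurement periods purely from the shared reference, so the start and end times agree without any communication between the signals.

```python
def period_window(now, reference, period_length):
    """Return (start, end) of the measurement period containing `now`,
    counted from the shared reference time. Two signals holding the same
    reference compute identical windows, so their vehicle counts cover
    exactly the same interval and are directly comparable."""
    index = int((now - reference) // period_length)
    start = reference + index * period_length
    return start, start + period_length

# Two signals with the same reference agree on the window boundaries.
print(period_window(125.0, 0.0, 30.0))  # prints (120.0, 150.0)
```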
  • In addition, in the explanation of FIG. 12, the signals 1100-1 and 1100-2 compared the volumes of traffic in a plurality of lanes when the periods from the period T1 to the period T4 elapsed and changed the display state of the signal display unit 230 on the basis of the comparison result. However, the timing at which the volumes of traffic in the plurality of lanes are compared is not limited to this. For example, the signals 1100-1 and 1100-2 may compare the volumes of traffic in the plurality of lanes every period set in advance.
  • This “period set in advance” may be an arbitrary period set in advance. In addition, this “period set in advance” may be set on the basis of a period, for which the display state of the signal display unit 230 is changed, by the control unit 220 of each signal 1100, as in the periods T1 to T4.
  • In addition, this “period set in advance” may be each of the periods T1 to T4. For example, each of the signals 1100-1 and 1100-2 may compare the number of traveling vehicles in the lane A with the number of stopped vehicles in the lane B in the period T1 and change the display state of the signal display unit 230 on the basis of this comparison result.
  • In addition, each of the signals 1100-1 and 1100-2 compares the number of traveling vehicles in the lane A with the number of stopped vehicles in the lane B in each of the periods T2, T3, T4, . . . as in the case of the period T1. Then, each of the signals 1100-1 and 1100-2 may change the display state of the signal display unit 230 on the basis of the comparison result so that vehicles traveling in the lane with the high volume of traffic can move with priority over vehicles traveling in the lane with the low volume of traffic.
  • In addition, when the sum of the number of vehicles traveling in a plurality of lanes becomes equal to or larger than the number set in advance, the signals 1100-1 and 1100-2 may compare the volumes of traffic in the plurality of lanes.
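  • This count-triggered variant could look like the following sketch, where the threshold stands for the hypothetical “number set in advance”:

```python
def should_compare(vehicle_counts, threshold=10):
    """True once the sum of vehicles over all lanes reaches the number
    set in advance, triggering a comparison of the traffic volumes."""
    return sum(vehicle_counts.values()) >= threshold

assert should_compare({"A": 6, "B": 5})      # 11 vehicles: compare now
assert not should_compare({"A": 1, "B": 2})  # too few vehicles so far
```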
  • In addition, although the number of vehicles traveling in the lane or the number of vehicles traveling for a unit time has been described herein, the volume of traffic is not limited to these. For example, the detection section 221 may detect, as the volume of traffic, the number of vehicles stopped before the crossroads CRS in each lane. In this case, the display control section 222 may change the display state of the signal display unit 230 so as to reduce the number of stopped vehicles, according to the number of stopped vehicles.
  • <Case of a Plurality of Lanes>
  • In addition, although the case of two lanes which are one-way streets has been described in the above explanation using FIG. 11, the signal 1100 according to the present embodiment is not limited to such a case of two lanes which are one-way streets, and may cope with an arbitrary number of lanes which are not one-way streets.
  • Here, an example of an arbitrary number of lanes which are not one-way streets will be described using FIG. 15. In FIG. 15, the same reference numerals are given to sections corresponding to each section in FIG. 9 or 11, and the explanation will be omitted.
  • In FIG. 15, lanes A1 and A2 which are opposite lanes and lanes B1 and B2 which are opposite lanes cross each other at the crossroads CRS. In addition, signals 1100-1 and 1100-3 and signals 1100-2 and 1100-4 are placed before the crossroads CRS in the lanes A1 and A2 and the lanes B1 and B2. The signals 1100-1, 1100-2, 1100-3, and 1100-4 have the same configuration as the signal 1100 described using FIG. 9, as in the case of FIG. 11.
  • Moreover, in FIG. 15, there is a stop line L1 before the signal 1100-1 and a stop line L3 before the signal 1100-3 at the crossroads CRS. In addition, there is a stop line L2 before the signal 1100-2 and a stop line L4 before the signal 1100-4 at the crossroads CRS. For this reason, a vehicle in each lane stops before the stop lines L1 to L4 according to the lighting of the third light emitting section of the signal display unit 230 of each of the signals 1100-1 to 1100-4.
  • Meanwhile, also in the case of FIG. 15, the signals 1100-1, 1100-2, 1100-3, and 1100-4 need to control the signal display unit 230 of each signal 1100 by the control unit 220 of the corresponding signal 1100 so that a vehicle traveling in the lane called the lane A1 or the lane A2 and a vehicle traveling in the lane called the lane B1 or the lane B2 do not collide with each other at the crossroads CRS, as in the case of FIG. 11.
  • For example, the control unit 220 of each signal 1100 needs to control the signal display unit 230 of the signal 1100 using the control unit 220 of the corresponding signal 1100 so that a vehicle traveling in the lane called the lane A1 or the lane A2 and a vehicle traveling in the lane called the lane B1 or the lane B2 do not move toward the crossroads CRS at the same timing.
  • Therefore, as an example, the control unit 220 of each of the signals 1100-1 and 1100-3 changes the display state of the signal display unit 230 to the same display state at the same timing. In addition, the control unit 220 of each of the signals 1100-2 and 1100-4 changes the display state of the signal display unit 230 to the same display state at the same timing. In addition, the control unit 220 provided in each signal 1100 changes the display state of each signal display unit 230 so that the display states of the signals 1100-1 and 1100-3 are different from the display states of the signals 1100-2 and 1100-4.
  • Moreover, as described above, it is not allowed that both the display states of the signals 1100-1 and 1100-3 and the display states of the signals 1100-2 and 1100-4 indicate “movable” at the same timing. This is because vehicles traveling in different lanes may collide with each other at the crossroads CRS if both the display states of the signals 1100-1 and 1100-3 and the display states of the signals 1100-2 and 1100-4 indicate “movable” at the same timing. In addition, both the display states of the signals 1100-1 and 1100-3 and the display states of the signals 1100-2 and 1100-4 may indicate “not movable”.
  • Therefore, the control unit 220 provided in each signal 1100 changes the display state of each signal display unit 230 such that the display states of the signals 1100-1 and 1100-3 are different from the display states of the signals 1100-2 and 1100-4 and the display states of the signals 1100-1 and 1100-3 and the display states of the signals 1100-2 and 1100-4 do not indicate “movable” at the same timing.
  • Accordingly, also in this case, the control unit 220 of each of the signals 1100-1 and 1100-3 changes the display state of the signal display unit 230 as in the case of the lane A described using FIG. 12 and the control unit 220 of each of the signals 1100-2 and 1100-4 changes the display state of the signal display unit 230 as in the case of the lane B described using FIG. 12. Therefore, also in the case of FIG. 15, the signals 1100-1 to 1100-4 can appropriately perform control to change the display state of the signal display unit even when the signals 1100-1 to 1100-4 cannot communicate with the high-order control apparatus and even when the signals 1100-1 to 1100-4 cannot receive a control signal from the high-order control apparatus, as in the case of FIG. 12.
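  • The phase grouping for the four signals of FIG. 15 can be summarized in a sketch: signals 1100-1 and 1100-3 always share one display state, signals 1100-2 and 1100-4 share the other, and the two groups never indicate “movable” at the same timing (the dictionary keys below stand for the signal suffixes and are a hypothetical encoding):

```python
def phase_states(group_a_movable):
    """Display states for the signals 1100-1 to 1100-4 at the crossroads.

    Signals -1 and -3 (lanes A1/A2) share one state, signals -2 and -4
    (lanes B1/B2) share the opposite state, so the crossing lane groups
    are never both "movable" at the same timing.
    """
    a = "movable" if group_a_movable else "not movable"
    b = "not movable" if group_a_movable else "movable"
    return {1: a, 3: a, 2: b, 4: b}

# Opposite lanes agree, crossing lanes differ.
print(phase_states(True))
```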
  • In addition, as indicated by the reference numeral P in FIG. 15, there may be a vehicle which turns right at the crossroads CRS from the lane A1 to travel in the lane B2. That is, a vehicle which turns right at the crossroads CRS may also be present.
  • Therefore, the display control section 222 changes the display state of the signal display unit 230 so that priority is given to a right turn (right turn signal). Then, for example, it is possible to reduce an increase in the number of vehicles which cannot turn right at the crossroads CRS and therefore stop near the stop line L1 while traveling in the lane A1. Accordingly, the display control section 222 can reduce (alleviate) the traffic congestion caused by turning right at the crossroads CRS.
  • In addition, the display control section 222 changes the display state of the signal display unit 230 so that the right turn signal time is adjusted. As a result, a vehicle can turn right at the crossroads CRS more easily. In addition, it is possible to reduce an increase in the number of vehicles which cannot turn right at the crossroads CRS and therefore stop near the stop line L1. Accordingly, the display control section 222 can reduce the traffic congestion caused by turning right at the crossroads CRS.
  • In addition, when the display control section 222 changes the display state of the signal display unit 230 as described above so that priority is given to right turn or the right turn signal time is adjusted, the detection section 221 detects a vehicle turning right on the basis of a captured image.
  • For example, the detection section 221 detects a vehicle, which is stopped within the crossroads CRS or before the crossroads CRS and which is located at the right end of the lane, as a vehicle turning right on the basis of a captured image. In addition, it is assumed that a direction indicator, which indicates a direction by blinking when a vehicle changes course and which is provided in the vehicle, is imaged by the imaging unit 210. In this case, the detection section 221 may detect a vehicle turning right by determining whether or not the vehicle turns right on the basis of an image of the direction indicator provided in the vehicle, the image being captured by the imaging unit 210. In addition, the detection section 221 may detect a vehicle turning right by arbitrarily combining such methods of detecting a vehicle turning right.
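  • Combining these cues might be sketched as follows (the cue names are hypothetical booleans; in practice each cue would come from image analysis of the captured image and of the direction indicator):

```python
def is_turning_right(stopped_at_crossroads, at_right_end, right_indicator_blinking):
    """Treat a vehicle as turning right when it is stopped within or before
    the crossroads CRS at the right end of the lane, or when its direction
    indicator shows a right turn. The cues may be combined arbitrarily."""
    return (stopped_at_crossroads and at_right_end) or right_indicator_blinking

assert is_turning_right(True, True, False)   # position cue alone suffices
assert is_turning_right(False, False, True)  # blinking indicator suffices
assert not is_turning_right(True, False, False)
```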
  • In addition, as described above, when a vehicle turning right is detected by the detection section 221, the display control section 222 changes the display state of the signal display unit 230 so that priority is given to right turn or the right turn signal time is adjusted.
  • In addition, the “display control section 222 changes the display state of the signal display unit 230 so that priority is given to right turn (right turn signal)” means that the display control section 222 changes the display state of the signal display unit 230 so that a vehicle turning right can move with priority over a vehicle which does not turn right, or over the case where there is no vehicle turning right at the crossroads CRS. The “vehicle which does not turn right” referred to herein is a vehicle going straight in the lane without changing course or a vehicle turning left, for example.
  • As an example, it is assumed that the signal display unit 230 includes a light emitting section corresponding to right turn. In this case, the display control section 222 changes the display state of the signal display unit 230 so that priority is given to a vehicle turning right by controlling the display state of the light emitting section corresponding to right turn.
  • In addition, the “display control section 222 changes the display state of the signal display unit 230 so that the right turn signal time is adjusted” means that, for example, the display control section 222 adjusts a time for changing the display state of the signal display unit 230 so that the period of time for the vehicle turning right becomes longer compared to a case where there is no vehicle turning right or the vehicle does not turn right at the crossroads CRS.
  • As an example, it is assumed that the signal display unit 230 includes a light emitting section corresponding to right turn. In this case, the “display control section 222 changes the display state of the signal display unit 230 so that the right turn signal time is adjusted” means that, for example, the display control section 222 changes the display state of the signal display unit 230 by controlling the display state of the light emitting section corresponding to right turn, so that the period of time for the vehicle turning right becomes longer compared to a case where there is no vehicle turning right or the vehicle does not turn right at the crossroads CRS.
  • <Emergency Vehicle Priority>
  • In addition, the detection section 221 may detect the type of a vehicle on the basis of a captured image. For example, an emergency vehicle has a red light. Therefore, the detection section 221 may determine whether or not a vehicle has a red light on the basis of a captured image and detect the type of the vehicle as an emergency vehicle when the vehicle has a red light. This emergency vehicle is an automobile for firefighting, an automobile for emergencies, or a police car, for example.
  • In addition, the emergency vehicle may have a predetermined color, such as red or white. Therefore, the detection section 221 may determine whether or not the vehicle type is an emergency vehicle by combining the color of a vehicle with the criteria for determining whether or not the vehicle has a red light. For example, the detection section 221 determines whether the color of a vehicle is red or white on the basis of a captured image. In addition, when the color of a vehicle is red or white and the vehicle has a red light, the detection section 221 may detect the type of the vehicle as an emergency vehicle.
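  • The combined criterion could be sketched as below (the color names and the rule itself are assumptions drawn from this paragraph, not a claimed classifier):

```python
EMERGENCY_BODY_COLORS = {"red", "white"}  # predetermined colors (assumed)

def is_emergency_vehicle(has_red_light, body_color):
    """Detect an emergency vehicle by combining the red-light criterion
    with the predetermined body color, as described in the text."""
    return has_red_light and body_color in EMERGENCY_BODY_COLORS

assert is_emergency_vehicle(True, "red")
assert is_emergency_vehicle(True, "white")
assert not is_emergency_vehicle(True, "blue")
assert not is_emergency_vehicle(False, "red")
```

  • In an actual system both inputs would themselves be the outputs of image analysis of the captured image; here they are taken as given booleans and strings.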
  • Moreover, in the second mode, when the detected type of the vehicle is an emergency vehicle, the display control section 222 may change the display state of the signal display unit 230 so that the emergency vehicle can move with priority over vehicles whose vehicle types are not emergency vehicles. As a result, the emergency vehicle can move in the lane with priority over vehicles whose vehicle types are not emergency vehicles. Thus, it is preferable in a disaster or the like that an emergency vehicle can travel in the lane preferentially.
  • In addition, when detecting the type of a vehicle on the basis of a captured image, the detection section 221 may detect the type of a vehicle on the basis of a captured image and a sound picked up by the sound pickup unit 250. For example, when the type of a vehicle is an emergency vehicle, this vehicle may sound the siren. Therefore, the detection section 221 may determine whether or not a siren sound is included in the sound picked up by the sound pickup unit 250 and determine whether or not the type of a vehicle is an emergency vehicle by combining this determination result and the determination result based on the captured image described above.
  • In addition, when the sound pickup unit 250 is a sound pickup device which picks up the sound so that the direction of a sound source can be specified, the detection section 221 can detect the direction of an emergency vehicle including a sound source which sounds the siren. In this case, the detection section 221 can detect an emergency vehicle more accurately on the basis of the detected direction of the emergency vehicle and the direction of the emergency vehicle detected on the basis of the image.
  • Since it is possible to detect an emergency vehicle more accurately as described above, the display control section 222 can change the display state of the signal display unit 230 more accurately so that the emergency vehicle can move preferentially in the second mode. In addition, since the direction of an emergency vehicle can be seen more accurately through the detection section 221, the display control section 222 can also estimate the lane, in which the emergency vehicle travels from now, more accurately. Accordingly, in the second mode, the display control section 222 can change the display state of the signal display unit 230 so that the emergency vehicle can move preferentially. In this manner, the emergency vehicle can travel in the lane preferentially. Thus, it is preferable in a disaster or the like that an emergency vehicle can travel in the lane preferentially.
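  • Cross-checking the two direction estimates might be sketched like this; the bearings (in degrees) and the tolerance are hypothetical quantities, not values from the patent:

```python
def directions_agree(image_bearing, siren_bearing, tolerance=15.0):
    """True when the bearing of the emergency vehicle estimated from the
    image agrees, within a tolerance, with the bearing of the siren sound
    source picked up by the sound pickup unit 250. Agreement strengthens
    the detection. Bearings are degrees on a circle, so the comparison
    wraps around 360."""
    diff = abs(image_bearing - siren_bearing) % 360.0
    return min(diff, 360.0 - diff) <= tolerance

assert directions_agree(10.0, 355.0)    # 15 degrees apart across north
assert not directions_agree(0.0, 90.0)  # perpendicular bearings disagree
```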
  • <Measures Against Afternoon Sun or Morning Sun>
  • In the signal display unit 230 that is described above by using FIG. 9, each light emitting section corresponding to one signal may include a plurality of light emitting elements. This light emitting section corresponding to one signal is the first light emitting section 231, the second light emitting section 232, or the third light emitting section 233 described above. In addition, as an example, the plurality of light emitting elements is a plurality of LEDs (Light Emitting Diodes). That is, an emission method of the signal display unit 230 of the signal 1100 described above may be a method using an LED.
  • Moreover, in this case, when making the light emitting section of the signal display unit 230 light, the display control section 222 of the control unit 220 makes some of the plurality of light emitting elements provided in the light emitting section emit light or be extinguished, while making the plurality of light emitting elements emit light, such that the position of the place of emission or extinguishing in the light emitting section changes. That is, the display control section 222 of the control unit 220 makes the plurality of light emitting elements emit light by moving only some lighting regions (or extinguished regions) instead of lighting all regions of the respective colors of a signal which can light.
  • For example, when the afternoon sun, the morning sun, or the like shines into the signal 1100, it becomes difficult for a user who observes the signal 1100 to see which of the plurality of light emitting sections provided in the signal display unit 230 emits light. On the other hand, when the display control section 222 of the control unit 220 makes the light emitting section of the signal display unit 230 light as described above, the emission place of the light emitting section that lights moves instead of simply lighting. Therefore, even if the afternoon sun, the morning sun, or the like shines into the signal 1100, it becomes easy for a user who observes the signal 1100 to see which of the plurality of light emitting sections provided in the signal display unit 230 emits light.
  • Moreover, when making the light emitting section of the signal display unit 230 light, the display control section 222 of the control unit 220 makes some of the plurality of light emitting elements provided in the light emitting section emit light or be extinguished, while making the plurality of light emitting elements emit light, such that the place of emission or extinguishing rotates, moves, enlarges, or is reduced in the light emitting section. In this case, it becomes even easier for a user who observes the signal 1100 to see which of the plurality of light emitting sections provided in the signal display unit 230 emits light, even if the afternoon sun, the morning sun, or the like shines into the signal 1100.
  • In addition, when making the light emitting section of the signal display unit 230 light, the display control section 222 of the control unit 220 makes the light emitting section emit light such that the brightness of an extinguished light emitting element, which is a light emitting element provided in the light emitting section that lights, changes. For example, when making the light emitting section of the signal display unit 230 light, the display control section 222 of the control unit 220 may make some light emitting elements emit light or extinguished as described above while changing the brightness of the extinguished light emitting element sequentially from the low brightness to the high brightness instead of simply maintaining the extinguished state. In addition, the display control section 222 of the control unit 220 may repeat this brightness change by changing the brightness of the light emitting element sequentially from the low brightness to the high brightness and then changing the brightness of the light emitting element sequentially from the high brightness to the low brightness.
  • As a result, it becomes easier for a user who observes the signal 1100 to see which of the plurality of light emitting sections provided in the signal display unit 230 emits light even if the afternoon sun, the morning sun, or the like shines into the signal 1100.
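  • The moving emission place might be sketched as a ring of LEDs in which a small extinguished window rotates one element per step (the element count and window size are assumptions for illustration):

```python
def led_pattern(num_leds, step, window=2):
    """Return which LEDs of a circular light emitting section are lit.

    A contiguous window of `window` extinguished LEDs moves around the
    ring as `step` advances, so the place of extinguishing rotates and
    the lit section stays recognizable even when the afternoon sun or
    the morning sun shines into the signal.
    """
    off = {(step + i) % num_leds for i in range(window)}
    return [i not in off for i in range(num_leds)]

# At step 0 the first two LEDs are dark; at step 1 the dark window rotates.
print(led_pattern(8, 0))  # prints [False, False, True, True, True, True, True, True]
```

  • The same scheme covers the enlarging or shrinking variant by varying `window` over time instead of (or in addition to) `step`.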
  • In addition, although the respective signals 1100 operate independently of each other in the above explanation, the respective signals 1100 may communicate with each other. For example, the respective signals 1100 may communicate with each other through the communication network 1300 described using FIG. 9. In addition, communication through the communication network 1300 may be impossible in a disaster. Therefore, the signals 1100 may also communicate with each other through a communication network different from the communication network 1300. The communication network in this case may be a radio communication network.
  • In addition, in the case of FIG. 11, it is preferable that at least the signals 1100-1 and 1100-2 related to the crossroads CRS can communicate with each other through such a communication network. In addition, in the case of FIG. 15, it is preferable that at least the signals 1100-1 to 1100-4 related to the crossroads CRS can communicate with each other.
  • Moreover, also in this case, each signal 1100 changes the display state of each signal display unit 230 so that vehicles traveling in different lanes do not collide with each other at the crossroads CRS, as described using FIG. 11 or 15. For example, the control unit 220 provided in each signal 1100 changes the display state of each signal display unit 230 so that the display states of the signals 1100 corresponding to the lanes, which are not opposite lanes, are different and the display states do not indicate “movable” at the same timing.
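A minimal sketch of the safety constraint in the preceding paragraph, assuming a hypothetical two-phase crossroads where the north-south and east-west lanes conflict. The lane names and the `CONFLICTS` table are illustrative assumptions, not part of the embodiment:

```python
# Pairs of lanes that are not opposite lanes and therefore must never
# both indicate "movable" at the same timing.
CONFLICTS = {("NS", "EW"), ("EW", "NS")}

def is_safe(states):
    """states maps a lane to its display state, 'movable' or 'not movable'.
    Return True only if no two conflicting lanes are 'movable' together."""
    movable = [lane for lane, s in states.items() if s == "movable"]
    return not any((a, b) in CONFLICTS for a in movable for b in movable)
```

Each signal 1100 (or the group of communicating signals) would only adopt display states for which such a check holds.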
  • Thus, as long as the signals 1100 related to the crossroads can communicate with each other, the detected traffic-volume information may be transmitted and received between them. In this case, in each signal 1100, the imaging unit 210 may be fixed to the signal 1100 so as to image at least the lane in which traffic is controlled by the display state of the signal display unit 230. That is, each signal 1100 detects the volume of traffic only in the lane in which traffic is controlled by that signal 1100 and transmits the detected volume of traffic to the other signals 1100. In this way, all of the signals 1100 related to the crossroads can obtain the volume of traffic of all the lanes related to the crossroads.
  • Also in this case, each signal 1100 can detect the volume of traffic in each lane on the basis of an image captured by the imaging unit 210 of each signal. Therefore, as described using FIG. 11 or 15, the signal 1100 can appropriately perform control to change the display state of the signal display unit even when the signal 1100 cannot communicate with the high-order control apparatus and even when the signal 1100 cannot receive a control signal from the high-order control apparatus.
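One plausible way for each signal to use the traffic volumes exchanged as above is to split the signal cycle in proportion to demand. This is only a sketch under assumed cycle-length and minimum-green values; the embodiment does not specify an allocation rule:

```python
def allocate_green_time(volumes, cycle_s=90, min_green_s=10):
    """Split a signal cycle among lanes in proportion to the traffic
    volume each signal detected from its own captured images, while
    guaranteeing every lane a minimum 'movable' time."""
    total = sum(volumes.values()) or 1  # avoid division by zero
    spare = cycle_s - min_green_s * len(volumes)
    return {lane: min_green_s + spare * v // total
            for lane, v in volumes.items()}
```

For example, `allocate_green_time({"NS": 30, "EW": 10})` gives the busier lane the larger share of the cycle, which matches the passage's point that control remains appropriate without the high-order control apparatus.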
  • In addition, although the case where the signals 1100 related to the crossroads communicate with each other has been described herein, the present invention is not limited to this. For example, the signals 1100 installed in series along the same lane may communicate with each other. Accordingly, each of these signals 1100 may control its own signal display unit 230 so that vehicles traveling in the same lane can pass through all of the signals 1100 installed in series without being made "not movable" by any of them.
  • In addition, when the signals 1100 installed in series communicate with each other, a radio communication unit provided in each signal 1100 may perform relay transmission. Thus, not only signals 1100 placed adjacent to each other but also signals 1100 located far from each other can communicate with each other.
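The coordination of signals installed in series so that a vehicle is never stopped is what traffic engineering calls a "green wave", and it can be sketched as offset timing under an assumed travel speed. The spacing and speed figures below are illustrative assumptions, not values from the embodiment:

```python
def green_offsets(spacings_m, speed_mps=14.0):
    """For signals installed one after another along the same lane,
    compute the delay (seconds) after the first signal turns 'movable'
    at which each following signal should turn 'movable', so that a
    vehicle traveling at `speed_mps` is never stopped by any signal.
    `spacings_m` lists the distances between consecutive signals."""
    offsets = [0.0]
    for distance in spacings_m:
        offsets.append(offsets[-1] + distance / speed_mps)
    return offsets
```

Each signal could receive its offset over the relayed radio link described above and switch its signal display unit 230 accordingly.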
  • In addition, in the explanation of FIG. 10, the power switching section 243 changed the state between the first and second modes on the basis of a voltage or current of electric power supplied to the power supply section 241. However, without being limited to this, the power switching section 243 may change the state between the first and second modes on the basis of a control signal for mode change. This control signal for mode change may be received through radio communication, for example, and may be broadcast in a disaster or the like. Therefore, in a disaster, each signal 1100 can change the mode reliably without depending on the voltage or current of the electric power supplied to the power supply section 241.
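The mode selection described above can be sketched as follows. The 80% voltage threshold and the mode names are assumptions for illustration; the embodiment says only that the switch is based on a voltage or current, or on a received control signal for mode change:

```python
def select_mode(supply_voltage, nominal=100.0, broadcast_mode=None):
    """Choose the operating mode of the power switching section.
    A broadcast mode-change control signal (e.g. sent by radio in a
    disaster) takes precedence; otherwise judge the external supply
    and switch to the battery-driven second mode when it drops."""
    if broadcast_mode in ("first", "second"):
        return broadcast_mode  # mode change commanded externally
    # Assumed threshold: treat <80% of nominal voltage as supply failure.
    return "first" if supply_voltage >= nominal * 0.8 else "second"
```

With this precedence, a broadcast command in a disaster overrides whatever the local supply measurement says, which is the reliability property the passage emphasizes.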
  • In addition, the signal 1100 may be a temporary signal for construction. In this case, the signal 1100 may operate by supply of electric power only from the battery section 242. Also in this case, the signal 1100 changes the display state of the signal display unit 230 fixed to the signal 1100 on the basis of an image captured by the imaging unit 210. Therefore, simply by installing such a temporary signal 1100 in the lane, the user can appropriately perform control to change the display state of the signal display unit 230 even when there is no communication between the high-order control apparatus 1200 and the signal 1100 and it is not possible to receive a control signal from the high-order control apparatus 1200. Accordingly, the signal 1100 can control traffic appropriately. For this reason, such a signal 1100 is suitable for construction.
  • In addition, when the signal 1100 is a temporary signal for construction, the power supply unit 240 of the signal 1100 may supply electric power only from the battery section 242 to each component provided in the signal 1100. In this case, in the mode described using FIG. 10, the power switching section 243 may change the state to the second mode after electric power is supplied and may then remain in the second mode thereafter.
  • For example, when the signal 1100 is a temporary signal for construction, electric power may not be supplied from the outside. As described above, since the power supply unit 240 of the signal 1100 supplies electric power only from the battery section 242 to each component provided in the signal 1100, the display control section 222 can change the display state of the signal display unit 230 even when electric power is not supplied from the outside.
  • Moreover, as described using FIGS. 9 and 10, in the first mode, the display control section 222 changes the display state of the signal display unit 230 through the I/F 271 on the basis of a control signal received from the high-order control apparatus 1200 by the communication section 223. However, the method used when the display control section 222 changes the display state of the signal display unit 230 in the first mode is not limited to this.
  • For example, in the first mode, the display control section 222 may change the display state of the signal display unit 230 on the basis of the volume of traffic detected by the detection section 221. That is, also in the first mode, the display control section 222 may change the display state of the signal display unit 230 on the basis of the volume of traffic detected by the detection section 221, in the same manner as in the case of the second mode.
  • In addition, in the first mode, the display control section 222 may change the display state of the signal display unit 230 on the basis of a predetermined timing set in advance.
  • In addition, although the case where vehicles pass through the left side of the lane has been described above, the signal 1100 according to the present embodiment may also be applied to the case where vehicles pass through the right side of the lane in the same manner. In the case of right-hand traffic, the right turn described above for left-hand traffic corresponds to a left turn. The right turn in the case of left-hand traffic or the left turn in the case of right-hand traffic described above means that a vehicle changes course so as to cross the lane opposite to the lane in which the vehicle has traveled until now.
  • In this case, the “display control section 222 changes the display state of the signal display unit 230 so that priority is given to right turn” described above means that “when a vehicle changes course so as to cross the opposite lane to the lane in which the vehicle has traveled until now, the control unit 220 (or the display control section 222) changes the display state of the signal display unit 230 on the basis of an image captured by the imaging unit 210 so that priority is given to the vehicle which changes course.”
  • In addition, the “display control section 222 changes the display state of the signal display unit 230 so that the right turn signal time is adjusted” described above means that “when a vehicle changes course so as to cross the opposite lane to the lane in which the vehicle has traveled until now, the control unit 220 (or the display control section 222) changes the display state of the signal display unit 230 on the basis of an image captured by the imaging unit 210 so that the period of ‘movable’ time for the vehicle which changes course is adjusted.”
  • In this way, when a vehicle changes course so as to cross the opposite lane to the lane in which the vehicle has traveled until now, the signal 1100 according to the present embodiment can make it easy for the vehicle to change course in this manner, in both the case of left-hand traffic and the case of right-hand traffic. As a result, in both cases, the signal 1100 according to the present embodiment can reduce (alleviate) the traffic congestion caused when vehicles change course so as to cross the opposite lane.
  • In addition, when the signal 1100 according to the present embodiment corresponds to both left-hand traffic and right-hand traffic, it is possible to set, through a setting unit provided in the signal 1100, to which of left-hand traffic and right-hand traffic the signal 1100 corresponds, when electric power is supplied to the signal 1100 or when the signal 1100 is shipped. In addition, the control unit 220 may determine to which of left-hand traffic and right-hand traffic the signal 1100 corresponds on the basis of the setting.
  • In addition, the signal 1100 may determine whether the lane in which traffic is controlled by the signal 1100 is left-hand traffic or right-hand traffic on the basis of an image captured by the imaging unit 210. In addition, on the basis of this determination result, the signal 1100 may set to which of left-hand traffic and right-hand traffic the signal 1100 corresponds, through its own setting unit.
  • In addition, although only a signal for a vehicle has been described as the signal 1100 above, the signal 1100 is not limited to a signal for a vehicle, and may also be a signal for a person crossing a road (or roadway) or a signal for a train on a railroad.
  • In addition, each component provided in the signal control apparatus 1100 in FIG. 9, such as the detection section 221, the display control section 222, the communication section 223, and the power switching section 243, may be realized by dedicated hardware. In addition, each component may be configured by a memory and a CPU (central processing unit) and the function may be realized by loading a program for realizing the function of each component into the memory and executing it.
  • In addition, processing by each component may be executed by recording a program for realizing the embodiment of the present invention in a computer-readable recording medium, reading the program recorded in the recording medium into a computer system, and executing the read program. In addition, the “computer system” referred to herein may include an OS (Operating System) and hardware such as peripheral devices.
  • In addition, the “computer system” may also include a homepage presenting environment (or display environment) if a WWW system is used. The “computer-readable recording medium” refers to writable nonvolatile memories such as a flexible disk, a magneto-optical disc, a ROM, and a flash memory, to portable media such as a CD-ROM, and to a recording device such as a hard disk built into a computer system. In addition, the “computer-readable recording medium” may include a medium that holds a program dynamically for a short period of time, such as a network like the Internet or a communication line like a telephone line when a program is transmitted through it, and a medium that holds a program for a predetermined period of time, such as a volatile memory (for example, a DRAM (Dynamic Random Access Memory)) in a computer system serving as a server or a client in that case. In addition, the program may be transmitted from a computer system, which has a storage device or the like that stores the program, to another computer system through a transmission medium or by a transmission wave in the transmission medium. Here, the “transmission medium” that transmits a program refers to a medium having a function of transmitting information, such as a network (communication network) like the Internet or a communication line like a telephone line. In addition, the above program may be a program for realizing some of the functions described above, or may be a so-called difference file (difference program) capable of realizing the above functions in combination with a program already recorded in the computer system.
  • While the embodiments of the present invention have been described in detail with reference to the drawings, the specific configuration is not limited to the above-described embodiments, and design and the like which do not depart from the spirit of the present invention are also included.

Claims (15)

What is claimed is:
1. An information control apparatus comprising:
a determination unit that determines at least an attribute of an object to be analyzed on the basis of captured image data acquired by an imaging apparatus fixed to a signal and that generates determination result information;
a surrounding information generating unit that generates a surrounding information table of the signal which corresponds to the determination result information;
a data analyzing unit that generates analysis result information on the basis of the surrounding information table generated by the surrounding information generating unit; and
an output unit that outputs the analysis result information to an external apparatus.
2. The information control apparatus according to claim 1,
wherein the surrounding information table includes an ID which is unique information assigned to each piece of the determination result information.
3. The information control apparatus according to claim 1,
wherein the surrounding information table includes a signal surrounding information table, a vehicle attribution information table, and a person attribution information table, and
wherein the signal surrounding information table includes a unique image ID which is assigned to each of the captured image data, the vehicle attribution information table includes a unique vehicle ID which is assigned to a vehicle in each of the image data, and the person attribution information table includes a unique person ID which is assigned to a person in each of the image data.
4. The information control apparatus according to claim 2,
wherein the signal surrounding information table, the vehicle attribution information table, the person attribution information table, and the captured image data are transmitted and received in a state in which a signal ID is given to each of the signal surrounding information table, the vehicle attribution information table, the person attribution information table, and the captured image data, the signal ID being information matched with a position of the signal where the captured image data was acquired.
5. The information control apparatus according to claim 1, further comprising:
an image processing unit that calculates a motion vector of the captured image data which continues in time series, and detects a moving speed of the object to be analyzed on the basis of the motion vector.
6. The information control apparatus according to claim 5,
wherein the image processing unit assigns a unique image ID to each of the captured image data and outputs the captured image data, the image ID, and information indicating the detected moving speed to the surrounding information generating unit in a state in which the captured image data, the image ID, and the information indicating the detected moving speed are matched with each other.
7. The information control apparatus according to claim 1,
wherein the output unit outputs the analysis result information to a device which is mounted on a vehicle passing through a position where the signal is placed.
8. The information control apparatus according to claim 7,
wherein the output unit transmits the captured image data, which corresponds to the vehicle, to the vehicle on the basis of the analysis result information.
9. The information control apparatus according to claim 7,
wherein the captured image data includes at least one of an image of a road under traffic congestion and an image of a traveling path of the vehicle.
10. The information control apparatus according to claim 7,
wherein the output unit outputs the analysis result information and control information to the device mounted on a vehicle which is controlled on the basis of control information supplied from the outside, and
wherein the data analyzing unit generates the control information on the basis of the analysis result information.
11. The information control apparatus according to claim 7,
wherein the determination unit outputs positions and moving speeds of each of a plurality of the vehicles to the data analyzing unit when it is determined that the attribute of the object to be analyzed is a vehicle and the number of objects to be analyzed is plural, the positions and moving speeds of the plurality of the vehicles being included in the determination result information,
wherein the data analyzing unit generates the analysis result information on the basis of the positions and moving speeds of the plurality of vehicles which are included in the determination result information output from the determination unit, the analysis result information including information indicating a presence of a collision between at least two vehicles among the plurality of the vehicles, and
wherein the output unit outputs the information indicating the presence of the collision which is included in the analysis result information generated by the data analyzing unit to a device mounted on a vehicle passing through a position where the signal is placed.
12. A data analyzing apparatus comprising:
a data analyzing unit that is connected to an output unit, which outputs analysis result information indicating a result determined by the data analyzing unit on the basis of a surrounding information table generated by a surrounding information generating unit, the surrounding information generating unit generating the surrounding information table of a signal which corresponds to determination result information determined by a determination unit, the determination unit determining at least an attribute of an object to be analyzed on the basis of captured image data acquired by an imaging apparatus fixed to the signal and generating the determination result information.
13. A server comprising the information control apparatus according to claim 1.
14. An information control system comprising:
a plurality of signals each including an information control apparatus comprising:
a determination unit that determines at least an attribute of an object to be analyzed on the basis of captured image data acquired by an imaging apparatus fixed to a signal and that generates determination result information;
a surrounding information generating unit that generates a surrounding information table of the signal which corresponds to the determination result information;
a data analyzing unit that generates analysis result information on the basis of the surrounding information table generated by the surrounding information generating unit; and
an output unit that outputs the analysis result information to an external apparatus;
the data analyzing unit according to claim 12; and
a server that communicates with the plurality of signals.
15. A non-transitory storage medium containing a program for causing a computer to:
determine at least an attribute of an object to be analyzed on the basis of captured image data acquired by an imaging apparatus fixed to a signal and generate determination result information;
generate a surrounding information table of the signal which corresponds to the determination result information;
generate analysis result information on the basis of the surrounding information table; and
output the analysis result information to an external apparatus.
US14/820,495 2010-08-06 2015-08-06 Information control apparatus, data analyzing apparatus, signal, server, information control system, signal control apparatus, and program Abandoned US20150348411A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US14/820,495 US20150348411A1 (en) 2010-08-06 2015-08-06 Information control apparatus, data analyzing apparatus, signal, server, information control system, signal control apparatus, and program
US15/907,219 US10977938B2 (en) 2010-08-06 2018-02-27 Signal control apparatus and signal having the same
US17/225,990 US20210225166A1 (en) 2010-08-06 2021-04-08 Information control system and vehicle having mobile communication device

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
JP2010-177724 2010-08-06
JP2010177725A JP2012038089A (en) 2010-08-06 2010-08-06 Information management device, data analysis device, signal, server, information management system, and program
JP2010-177725 2010-08-06
JP2010177724A JP5724241B2 (en) 2010-08-06 2010-08-06 Traffic light control device, traffic light, and program
US201161508026P 2011-07-14 2011-07-14
US201161508536P 2011-07-15 2011-07-15
US13/198,676 US20120033123A1 (en) 2010-08-06 2011-08-04 Information control apparatus, data analyzing apparatus, signal, server, information control system, signal control apparatus, and program
US14/820,495 US20150348411A1 (en) 2010-08-06 2015-08-06 Information control apparatus, data analyzing apparatus, signal, server, information control system, signal control apparatus, and program

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US13/198,676 Division US20120033123A1 (en) 2010-08-06 2011-08-04 Information control apparatus, data analyzing apparatus, signal, server, information control system, signal control apparatus, and program

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/907,219 Division US10977938B2 (en) 2010-08-06 2018-02-27 Signal control apparatus and signal having the same

Publications (1)

Publication Number Publication Date
US20150348411A1 true US20150348411A1 (en) 2015-12-03

Family

ID=45555894

Family Applications (4)

Application Number Title Priority Date Filing Date
US13/198,676 Abandoned US20120033123A1 (en) 2010-08-06 2011-08-04 Information control apparatus, data analyzing apparatus, signal, server, information control system, signal control apparatus, and program
US14/820,495 Abandoned US20150348411A1 (en) 2010-08-06 2015-08-06 Information control apparatus, data analyzing apparatus, signal, server, information control system, signal control apparatus, and program
US15/907,219 Active 2031-09-30 US10977938B2 (en) 2010-08-06 2018-02-27 Signal control apparatus and signal having the same
US17/225,990 Pending US20210225166A1 (en) 2010-08-06 2021-04-08 Information control system and vehicle having mobile communication device

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US13/198,676 Abandoned US20120033123A1 (en) 2010-08-06 2011-08-04 Information control apparatus, data analyzing apparatus, signal, server, information control system, signal control apparatus, and program

Family Applications After (2)

Application Number Title Priority Date Filing Date
US15/907,219 Active 2031-09-30 US10977938B2 (en) 2010-08-06 2018-02-27 Signal control apparatus and signal having the same
US17/225,990 Pending US20210225166A1 (en) 2010-08-06 2021-04-08 Information control system and vehicle having mobile communication device

Country Status (3)

Country Link
US (4) US20120033123A1 (en)
CN (3) CN105336179B (en)
WO (1) WO2012018109A1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10223910B2 (en) * 2016-03-22 2019-03-05 Korea University Research And Business Foundation Method and apparatus for collecting traffic information from big data of outside image of vehicle
US20190355019A1 (en) * 2016-12-05 2019-11-21 Sony Corporation Information processing apparatus and information processing system
CN111798677A (en) * 2020-07-15 2020-10-20 安徽达尔智能控制系统股份有限公司 Traffic incident monitoring and commanding system based on road video
US20220392247A1 (en) * 2019-11-18 2022-12-08 Positive One Corporation Assignment control apparatus, assignment control system, and assignment control method

Families Citing this family (61)

Publication number Priority date Publication date Assignee Title
TWI493478B (en) * 2012-03-21 2015-07-21 Altek Corp License plate image-pickup device and image exposure adjustment method thereof
JP2013196507A (en) * 2012-03-21 2013-09-30 Toshiba Corp Vehicle detector and vehicle determination method
US10408632B2 (en) * 2012-12-27 2019-09-10 Harman International Industries, Inc. Vehicle navigation
JP6180163B2 (en) * 2013-04-10 2017-08-16 株式会社京三製作所 Sound add-on device for the visually impaired
CN104751654B (en) 2013-12-31 2017-09-26 中国移动通信集团公司 A kind of traffic control method, network side equipment and terminal
US9428202B2 (en) * 2014-04-15 2016-08-30 Via Rail Canada Inc. Train safety system
US9563814B1 (en) 2014-06-27 2017-02-07 Blinker, Inc. Method and apparatus for recovering a vehicle identification number from an image
US10515285B2 (en) 2014-06-27 2019-12-24 Blinker, Inc. Method and apparatus for blocking information from an image
US10572758B1 (en) 2014-06-27 2020-02-25 Blinker, Inc. Method and apparatus for receiving a financing offer from an image
US9754171B1 (en) 2014-06-27 2017-09-05 Blinker, Inc. Method and apparatus for receiving vehicle information from an image and posting the vehicle information to a website
US10579892B1 (en) 2014-06-27 2020-03-03 Blinker, Inc. Method and apparatus for recovering license plate information from an image
US9779318B1 (en) 2014-06-27 2017-10-03 Blinker, Inc. Method and apparatus for verifying vehicle ownership from an image
US10867327B1 (en) 2014-06-27 2020-12-15 Blinker, Inc. System and method for electronic processing of vehicle transactions based on image detection of vehicle license plate
US10733471B1 (en) 2014-06-27 2020-08-04 Blinker, Inc. Method and apparatus for receiving recall information from an image
US9760776B1 (en) 2014-06-27 2017-09-12 Blinker, Inc. Method and apparatus for obtaining a vehicle history report from an image
US9589202B1 (en) 2014-06-27 2017-03-07 Blinker, Inc. Method and apparatus for receiving an insurance quote from an image
US9600733B1 (en) 2014-06-27 2017-03-21 Blinker, Inc. Method and apparatus for receiving car parts data from an image
US9607236B1 (en) 2014-06-27 2017-03-28 Blinker, Inc. Method and apparatus for providing loan verification from an image
US9589201B1 (en) 2014-06-27 2017-03-07 Blinker, Inc. Method and apparatus for recovering a vehicle value from an image
US9558419B1 (en) 2014-06-27 2017-01-31 Blinker, Inc. Method and apparatus for receiving a location of a vehicle service center from an image
US10540564B2 (en) 2014-06-27 2020-01-21 Blinker, Inc. Method and apparatus for identifying vehicle information from an image
US9773184B1 (en) 2014-06-27 2017-09-26 Blinker, Inc. Method and apparatus for receiving a broadcast radio service offer from an image
US9818154B1 (en) 2014-06-27 2017-11-14 Blinker, Inc. System and method for electronic processing of vehicle transactions based on image detection of vehicle license plate
US9594971B1 (en) 2014-06-27 2017-03-14 Blinker, Inc. Method and apparatus for receiving listings of similar vehicles from an image
US9892337B1 (en) 2014-06-27 2018-02-13 Blinker, Inc. Method and apparatus for receiving a refinancing offer from an image
US9832706B2 (en) * 2014-07-30 2017-11-28 Nec Corporation Information dissemination in a multi-technology communication network
CN105894830B (en) * 2014-10-21 2018-12-04 苏州现代文化发展有限公司 A kind of traffic intelligent command and control method and system
JP6421580B2 (en) 2014-12-15 2018-11-14 住友電気工業株式会社 Traffic signal control device, computer program, and traffic signal control method
JP6791117B2 (en) * 2015-02-23 2020-11-25 住友電気工業株式会社 Traffic index generator, traffic index generation method and computer program
JP6418089B2 (en) * 2015-07-08 2018-11-07 オムロン株式会社 Image processing apparatus, traffic management system including the same, and image processing method
JP6524846B2 (en) * 2015-08-04 2019-06-05 オムロン株式会社 Vehicle identification device and vehicle identification system provided with the same
CN105070076B (en) * 2015-09-30 2017-12-12 招商局重庆交通科研设计院有限公司 A kind of special vehicle leased circuit method and system for planning based on V2I
CN105389989A (en) * 2015-11-13 2016-03-09 合肥安奎思成套设备有限公司 Transportation safety online monitoring system
US9868393B2 (en) * 2015-12-10 2018-01-16 International Business Machines Corporation Vehicle accident avoidance system
EP3398182A1 (en) 2015-12-31 2018-11-07 Robert Bosch GmbH Intelligent distributed vision traffic marker and method thereof
US10906463B2 (en) 2016-02-01 2021-02-02 Magna Electronics Inc. Vehicle adaptive lighting system
JP6662653B2 (en) * 2016-02-04 2020-03-11 ソフトバンク株式会社 Road traffic survey system
CN105741572B (en) * 2016-04-01 2018-07-10 苏州玄禾物联网科技有限公司 Maximum vehicle flowrate period traffic lights control method based on Internet of Things
DE102016213013A1 (en) * 2016-07-15 2018-01-18 Robert Bosch Gmbh A method and apparatus for controlling traffic to reduce air pollution
US10109185B1 (en) * 2016-07-25 2018-10-23 360fly, Inc. Method and apparatus for traffic monitoring based on traffic images
JP6892242B2 (en) * 2016-10-27 2021-06-23 株式会社日本マイクロニクス Control server and control system
JP7047769B2 (en) * 2016-12-15 2022-04-05 日本電気株式会社 Information processing system, information processing method and information processing program
KR102471072B1 (en) * 2016-12-21 2022-11-25 삼성전자주식회사 Electronic apparatus and operating method for the same
JP6333497B1 (en) * 2017-02-03 2018-05-30 三菱電機株式会社 Information acquisition system and equipment
CN107146422A (en) * 2017-06-14 2017-09-08 泉州市联控自动化科技有限公司 One kind control intelligent traffic signal device
EP3654276B1 (en) * 2017-07-13 2023-09-13 Nec Corporation Analysis device, analysis method, and program
JP6992342B2 (en) * 2017-09-13 2022-01-13 富士フイルムビジネスイノベーション株式会社 Information processing equipment and programs
EP3506153A1 (en) * 2017-12-28 2019-07-03 Canon Kabushiki Kaisha Information processing apparatus, system, method, and non-transitory computer-readable storage medium
US11270580B2 (en) * 2018-02-23 2022-03-08 Sumitomo Electric Industries, Ltd. Traffic signal control apparatus, traffic signal control method, and computer program
US11107347B2 (en) 2018-04-27 2021-08-31 Cubic Corporation Adaptively controlling traffic movements for driver safety
JP7147255B2 (en) 2018-05-11 2022-10-05 トヨタ自動車株式会社 image display device
IL307426A (en) * 2018-05-16 2023-12-01 Notraffic Ltd System and method for using v2x and sensor data
JP7073991B2 (en) * 2018-09-05 2022-05-24 トヨタ自動車株式会社 Peripheral display device for vehicles
JP7115277B2 (en) * 2018-12-10 2022-08-09 トヨタ自動車株式会社 Behavior monitoring device, behavior monitoring system, and behavior monitoring program
JP2020095565A (en) * 2018-12-14 2020-06-18 トヨタ自動車株式会社 Information processing system, program, and method for processing information
DE102018251778A1 (en) * 2018-12-28 2020-07-02 Robert Bosch Gmbh Method for assisting a motor vehicle
CN109816987B (en) * 2019-01-24 2022-02-18 苏州清听声学科技有限公司 Electronic police law enforcement snapshot system for automobile whistling and snapshot method thereof
CN109714441A (en) * 2019-02-28 2019-05-03 云南开放大学 A kind of vehicle detecting system and method
CN110910639A (en) * 2019-11-25 2020-03-24 福建工程学院 Method and system for traffic facility abnormal information authentication based on blockchain technology
CN114170821B (en) * 2021-12-16 2023-03-31 阿波罗智联(北京)科技有限公司 Signal machine performance detection method and device and traffic signal lamp control system
TWI815566B (en) * 2022-07-20 2023-09-11 信昌宏工業股份有限公司 Electronic equipment cloud management system

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5734337A (en) * 1995-11-01 1998-03-31 Kupersmit; Carl Vehicle speed monitoring system
JP2001291184A (en) * 2000-04-05 2001-10-19 Faasuto Create:Kk Traffic volume investigating system
US20020082806A1 (en) * 1995-01-13 2002-06-27 Kaub Alan R. Traffic safety prediction model
US20050267651A1 (en) * 2004-01-15 2005-12-01 Guillermo Arango System and method for knowledge-based emergency response
US6989766B2 (en) * 2003-12-23 2006-01-24 International Business Machines Corporation Smart traffic signal system
US20060095199A1 (en) * 2004-11-03 2006-05-04 Lagassey Paul J Modular intelligent transportation system
US20080015772A1 (en) * 2006-07-13 2008-01-17 Denso Corporation Drive-assist information providing system for driver of vehicle

Family Cites Families (66)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3009299B2 (en) 1992-05-28 2000-02-14 松下電器産業株式会社 Car road guidance system
JPH0635896A (en) 1992-07-13 1994-02-10 Toshiba Corp Sales forecasting device
US5434927A (en) * 1993-12-08 1995-07-18 Minnesota Mining And Manufacturing Company Method and apparatus for machine vision classification and tracking
JP2806260B2 (en) 1994-04-28 1998-09-30 日本サミコン株式会社 Traffic lights for one side
EP0680028B1 (en) 1994-04-28 2000-06-28 Nihon Samicon Co. Ltd. Traffic control system for directing alternating one-way passing of vehicles around a road-work site section
US5774569A (en) * 1994-07-25 1998-06-30 Waldenmaier; H. Eugene W. Surveillance system
JP3384425B2 (en) * 1995-08-18 2003-03-10 日本電信電話株式会社 Moving object monitoring and measurement equipment
JPH1097696A (en) 1996-09-20 1998-04-14 Toyo Commun Equip Co Ltd Signal control system
JPH11203590A (en) * 1997-05-28 1999-07-30 Hitachi Denshi Ltd Television system for traffic monitor
JPH1153694A (en) * 1997-07-31 1999-02-26 Toyota Motor Corp Intersection warning device
JPH11261990A (en) 1998-03-12 1999-09-24 Hitachi Denshi Ltd Traffic monitor television camera device
US6466260B1 (en) * 1997-11-13 2002-10-15 Hitachi Denshi Kabushiki Kaisha Traffic surveillance system
JPH11203589A (en) * 1998-01-16 1999-07-30 Omron Corp Traffic image pickup device and traffic monitoring device
JP2000172987A (en) * 1998-12-02 2000-06-23 Mitsubishi Electric Corp Controller for road construction traffic signal
AU1544300A (en) * 1998-12-16 2000-07-03 9022-6523 Quebec Inc. Traffic light backup system using light-emitting diodes
JP2000207676A (en) 1999-01-08 2000-07-28 Nec Corp Traffic accident detector
JP2001155292A (en) 1999-12-01 2001-06-08 Yamatake Corp Traffic state evaluation display method
JP2001155289A (en) 1999-12-01 2001-06-08 Sony Corp Information communication system and its method
US6587778B2 (en) * 1999-12-17 2003-07-01 Itt Manufacturing Enterprises, Inc. Generalized adaptive signal control method and system
JP2001184592A (en) * 1999-12-24 2001-07-06 Hitachi Ltd Vehicle passing supporting device
JP2001229487A (en) 2000-02-15 2001-08-24 Minolta Co Ltd Traffic monitor device
JP2002073940A (en) 2000-08-31 2002-03-12 Fuji Xerox System Service Co Ltd Store opening plan support device
JP2002092797A (en) 2000-09-18 2002-03-29 Nippon Signal Co Ltd:The Traffic information providing system
JP2002140799A (en) 2000-10-31 2002-05-17 Natl Inst For Land & Infrastructure Management Mlit Method for helping prevention of collision in intersection
JP2003022493A (en) * 2001-07-05 2003-01-24 Ntt Docomo Shikoku Inc Signal system, information providing and gathering method using signal, information providing and gathering program using signal, and computer-readable recording medium
JP2003030703A (en) * 2001-07-19 2003-01-31 Mitsubishi Heavy Ind Ltd Camera apparatus and accounting system
JP3779229B2 (en) * 2002-04-01 2006-05-24 住友電気工業株式会社 Identification method, identification device, and traffic control system
CN1186750C (en) * 2002-06-03 2005-01-26 昆明利普机器视觉工程有限公司 A traffic flow detection system based on visual vehicle optical characteristic recognition and matching
JP4048292B2 (en) 2002-06-14 2008-02-20 松下電器産業株式会社 Traffic information display system, captured image transmission device, in-vehicle information display device and storage medium
JP2004185399A (en) * 2002-12-04 2004-07-02 Nippon Telegr & Teleph Corp <Ntt> Traffic environment management server
JP2004348469A (en) * 2003-05-22 2004-12-09 Mitsubishi Heavy Ind Ltd Traffic signal neglect detecting system
JP2005159691A (en) 2003-11-26 2005-06-16 Hitachi Ltd Supervisory system
JP4391839B2 (en) * 2004-01-30 2009-12-24 富士通株式会社 Shooting condition setting program, shooting condition setting method, and shooting condition setting apparatus
US7102538B2 (en) * 2004-04-05 2006-09-05 Kuo-Chin Chen LED signal light
US7333012B1 (en) 2004-05-25 2008-02-19 Martin Khang Nguyen Vehicle monitoring and control using radio frequency identification
JP2006113682A (en) * 2004-10-12 2006-04-27 Toyota Motor Corp Traffic signal controller
JP4701690B2 (en) 2004-11-29 2011-06-15 住友電気工業株式会社 Traffic imaging apparatus and traffic monitoring system
US7317406B2 (en) * 2005-02-03 2008-01-08 Toyota Technical Center Usa, Inc. Infrastructure-based collision warning using artificial intelligence
JP4525915B2 (en) 2005-02-16 2010-08-18 株式会社デンソー Driving assistance device
JP2007042042A (en) * 2005-07-30 2007-02-15 Sigma Denki Kogyo Kk Road traffic information acquisition method, road traffic information display device, and fixing method for LED
JP2007047894A (en) * 2005-08-08 2007-02-22 Omron Corp Signal control system and signal controller
JP2009510827A (en) * 2005-09-27 2009-03-12 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Motion detection device
JP2007148849A (en) * 2005-11-29 2007-06-14 Matsushita Electric Ind Co Ltd Signal control system
CN1776768A (en) * 2005-12-08 2006-05-24 曾佑国 Solar wireless intelligent traffic signal lamp control system
US20070276600A1 (en) * 2006-03-06 2007-11-29 King Timothy I Intersection collision warning system
JP2007316827A (en) 2006-05-24 2007-12-06 Toyota Motor Corp Intersection traffic control system
US20070273552A1 (en) * 2006-05-24 2007-11-29 Bellsouth Intellectual Property Corporation Control of traffic flow by sensing traffic states
US7423551B1 (en) * 2006-08-15 2008-09-09 Sharrow John A Method and apparatus for controlling temporary traffic signals
CN101140704A (en) 2006-09-08 2008-03-12 松下电器产业株式会社 Direction testing apparatus
JP4743058B2 (en) 2006-09-14 2011-08-10 住友電気工業株式会社 Traffic signal controller
JP4935287B2 (en) 2006-10-10 2012-05-23 住友電気工業株式会社 Traffic signal controller, traffic signal control system
US8204278B2 (en) * 2006-11-28 2012-06-19 Fujitsu Limited Image recognition method
JP4780043B2 (en) 2007-06-08 2011-09-28 株式会社デンソー Transportation system, road side device, vehicle side device
JP2007280425A (en) * 2007-07-10 2007-10-25 Sumitomo Electric Ind Ltd Emergency vehicle priority control system and control device
JP5053776B2 (en) 2007-09-14 2012-10-17 株式会社デンソー Vehicular visibility support system, in-vehicle device, and information distribution device
US8577605B2 (en) * 2007-10-18 2013-11-05 International Business Machines Corporation Vehicle feedback method and system
JP4930321B2 (en) * 2007-10-25 2012-05-16 株式会社デンソー Potential danger point detection device and in-vehicle danger point notification device
JP2009129006A (en) 2007-11-20 2009-06-11 Yokogawa Electric Corp Image analysis apparatus and image analysis system
CN101470955A (en) * 2007-12-26 2009-07-01 奥城同立科技开发(北京)有限公司 Integrated control system for road junction traffic
CN101232335A (en) * 2007-12-28 2008-07-30 深圳市同洲电子股份有限公司 System and method for processing traffic information
CN101409016A (en) * 2008-02-01 2009-04-15 浙江通衢数码科技有限公司 Control method for urban road traffic
JP4992755B2 (en) 2008-02-25 2012-08-08 株式会社デンソー Intersection driving support system, in-vehicle equipment, and roadside equipment
JP2008217813A (en) * 2008-04-08 2008-09-18 Sumitomo Electric Ind Ltd Collision information providing device and method
JP2010003242A (en) 2008-06-23 2010-01-07 Toyota Motor Corp Communication system
CN101388145B (en) * 2008-11-06 2010-09-15 北京汇大基业科技有限公司 Auto alarming method and device for traffic safety
GB2469679B (en) * 2009-04-23 2012-05-02 Imagination Tech Ltd Object tracking using momentum and acceleration vectors in a motion estimation system

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10223910B2 (en) * 2016-03-22 2019-03-05 Korea University Research And Business Foundation Method and apparatus for collecting traffic information from big data of outside image of vehicle
US20190355019A1 (en) * 2016-12-05 2019-11-21 Sony Corporation Information processing apparatus and information processing system
US20220392247A1 (en) * 2019-11-18 2022-12-08 Positive One Corporation Assignment control apparatus, assignment control system, and assignment control method
US11842557B2 (en) * 2019-11-18 2023-12-12 Positive One Corporation Assignment control apparatus, assignment control system, and assignment control method
CN111798677A (en) * 2020-07-15 2020-10-20 安徽达尔智能控制系统股份有限公司 Traffic incident monitoring and commanding system based on road video

Also Published As

Publication number Publication date
WO2012018109A1 (en) 2012-02-09
CN103069465A (en) 2013-04-24
US20210225166A1 (en) 2021-07-22
US20180190113A1 (en) 2018-07-05
CN105869411B (en) 2019-08-30
US10977938B2 (en) 2021-04-13
CN103069465B (en) 2016-05-04
US20120033123A1 (en) 2012-02-09
CN105336179B (en) 2019-01-01
CN105869411A (en) 2016-08-17
CN105336179A (en) 2016-02-17

Similar Documents

Publication Publication Date Title
US20210225166A1 (en) Information control system and vehicle having mobile communication device
KR101976110B1 (en) IoT standard-based parking guidance optimization system for smart city
US9631941B2 (en) Information providing system and information providing method
US20190196494A1 (en) Autonomous driving system and autonomous driving method
JP6456838B2 (en) Street device health monitoring
CN112562408A (en) Global path planning method, device and system based on vehicle-road cooperation
JP2012038089A (en) Information management device, data analysis device, signal, server, information management system, and program
KR101176947B1 (en) Crowd information serving system using CCTV
US20230343208A1 (en) Pedestrian device, information collection device, base station device, positioning method, user management method, information collection method, and facility monitoring method
CN114446056B (en) Vehicle information code generation and vehicle passing control method, device and equipment
CN105139661A (en) Traffic detection and early warning system and method
CN113574574B (en) Mobile object monitoring system, control server for mobile object monitoring system, and mobile object monitoring method
KR20240008966A (en) Parking sharing system for maximum use of parking spaces
KR102297966B1 (en) Crosswalk walking safety and intelligent smart safety system based on CCTV and IoT sensors
Alam et al. Integration of smart parking in distributed ITS architecture
KR20090090049A (en) Apparatus and method for real-time traffic information service
JP2023171455A (en) Route prediction device, in-vehicle device therewith, route prediction system, route prediction method, and computer program
JP7459781B2 (en) CONTROL DEVICE, METHOD, AND PROGRAM
JP2015215905A (en) Information management device, data analysis device, server, information management system and program
JP2020094959A (en) Route search device, method for searching for route, and route search program
JP7297325B2 (en) vending machine system
JP2017120657A (en) Information management device, data analysis device, signal apparatus, server, information management system, and program
KR102399018B1 (en) bus stop system
WO2023166675A1 (en) Monitoring device, monitoring system, monitoring method and recording medium
KR102366199B1 (en) An apparatus and method for determining u-turn of a vehicle in a prohibited area

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION