WO2015029315A1 - Terminal device, control method, and program - Google Patents

Terminal device, control method, and program

Info

Publication number
WO2015029315A1
WO2015029315A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
unit
terminal device
information
packet signal
Prior art date
Application number
PCT/JP2014/003763
Other languages
French (fr)
Japanese (ja)
Inventor
俊朗 中莖
Original Assignee
パナソニックIpマネジメント株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by パナソニックIpマネジメント株式会社
Publication of WO2015029315A1 publication Critical patent/WO2015029315A1/en

Classifications

    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/16 - Anti-collision systems
    • G08G1/166 - Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04W - WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 - Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/30 - Services specially adapted for particular environments, situations or purposes
    • H04W4/40 - Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P]
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/16 - Anti-collision systems
    • G08G1/161 - Decentralised systems, e.g. inter-vehicle communication

Definitions

  • the present invention relates to communication technology, and more particularly to a terminal device that transmits and receives a signal including predetermined information.
  • Conventionally, in the field of automobiles, a driving support system has been proposed that, in order to prevent the own vehicle from falling into a dangerous driving state, automatically outputs an alarm or performs an avoidance operation for avoiding danger.
  • In the driving support system, in order to determine the control state for the alarm output or the avoidance operation, the running state of the own vehicle or the presence of an obstacle ahead is detected based on various information or captured images acquired by radio communication from equipment provided on the road side (see, for example, Patent Document 1).
  • However, if the penetration rate of the roadside equipment that transmits such information or captured images, and of the in-vehicle equipment that receives such information or captured images from the roadside equipment, is low, the running state of the own vehicle or the presence of an obstacle ahead is not sufficiently detected. As a result, danger avoidance is not sufficiently achieved.
  • the present invention has been made in view of such circumstances, and an object thereof is to provide a technique for suppressing the occurrence of an accident in a vehicle even when the penetration rate of the apparatus is low.
  • In order to solve this problem, a terminal device according to an aspect of the present invention is a terminal device that can be mounted on a vehicle, and includes: a first acquisition unit that acquires position information of the vehicle; a second acquisition unit that acquires detection information of an object existing behind the vehicle; a generation unit that generates a packet signal including the position information acquired by the first acquisition unit and the detection information acquired by the second acquisition unit; and a notification unit that broadcasts the packet signal generated by the generation unit.
  • the occurrence of a vehicle accident can be suppressed even when the penetration rate of the device is low.
  • FIG. 1 is a diagram showing the configuration of a communication system according to an embodiment of the present invention. FIG. 2 is a diagram showing the configuration of the base station apparatus of FIG. 1.
  • FIGS. 3A to 3D are diagrams showing the frame formats defined in the communication system of FIG. 1. FIG. 4 is a diagram showing another configuration of the communication system of FIG. 1. FIG. 5 is a diagram showing the configuration of the terminal device mounted in a vehicle of FIG. 1.
  • FIGS. 6A and 6B are diagrams showing an outline of processing in the second acquisition unit of FIG. 5.
  • FIGS. 7A and 7B are diagrams showing another outline of processing in the second acquisition unit of FIG. 5. FIG. 8 is a diagram showing the data structure of a table stored in the second acquisition unit of FIG. 5. FIG. 9 is a flowchart showing the reception procedure in the terminal device of FIG. 5.
  • FIG. 10 is a diagram showing an outline of processing of the terminal device according to a modification.
  • Embodiments of the present invention relate to a communication system that performs vehicle-to-vehicle communication between terminal devices mounted on a vehicle, and also executes road-to-vehicle communication from a base station device installed at an intersection or the like to a terminal device.
  • a communication system is also called ITS (Intelligent Transport Systems).
  • ITS is stipulated, for example, in the standard for the 700 MHz band intelligent transport systems (Association of Radio Industries and Businesses).
  • Like a wireless LAN (Local Area Network) compliant with a standard such as IEEE 802.11, the communication system uses an access control function called CSMA/CA (Carrier Sense Multiple Access with Collision Avoidance). Therefore, the same radio channel is shared by a plurality of terminal devices.
  • the terminal device broadcasts a packet signal that stores information such as the speed or position of the vehicle.
  • the other terminal device receives the packet signal and recognizes the approach of the vehicle based on the above-described information.
  • Here, in order to reduce interference between road-to-vehicle communication and vehicle-to-vehicle communication, the base station apparatus repeatedly defines a frame including a plurality of subframes. For road-to-vehicle communication, the base station apparatus selects one of the plurality of subframes and broadcasts a packet signal storing control information and the like during a period at the head of the selected subframe.
  • The control information includes information on the period during which the base station apparatus broadcasts packet signals (hereinafter referred to as the "road-vehicle transmission period").
  • The terminal device specifies the road-vehicle transmission period based on the control information, and broadcasts a packet signal by the CSMA method in a period other than the road-vehicle transmission period (hereinafter referred to as the "vehicle transmission period").
  • road-to-vehicle communication and vehicle-to-vehicle communication are time-division multiplexed.
  • Note that a terminal device that cannot receive the control information from the base station device, that is, a terminal device that exists outside the area formed by the base station device, transmits packet signals by the CSMA method regardless of the frame configuration.
  • a vehicle equipped with a terminal device is provided with a sensor for detecting a rear object.
  • An object is a vehicle, a pedestrian, or the like.
  • In addition to the position information of the vehicle, the terminal device also includes information on the detected rear object (hereinafter referred to as "detection information") in the packet signal, and broadcasts the packet signal.
  • the terminal device that has received this packet signal recognizes not only the presence of the vehicle on which the transmission source terminal device is mounted, but also the presence of an object that exists behind the vehicle. That is, even if a terminal device is not mounted on a vehicle or the like that is an object, its presence is recognized.
  • FIG. 1 shows a configuration of a communication system 100 according to an embodiment of the present invention. This corresponds to a case where one intersection is viewed from above.
  • The communication system 100 includes a base station device 10; a first vehicle 12a, a second vehicle 12b, a third vehicle 12c, a fourth vehicle 12d, a fifth vehicle 12e, a sixth vehicle 12f, a seventh vehicle 12g, and an eighth vehicle 12h, collectively referred to as the vehicle 12; and a network 202.
  • Here, although shown only for the first vehicle 12a, each vehicle 12 is equipped with a terminal device 14.
  • An area 212 is formed around the base station apparatus 10, and an outside area 214 is formed outside the area 212.
  • As illustrated, the road running in the horizontal direction of the drawing, that is, in the left-right direction, intersects the road running in the vertical direction of the drawing, that is, in the up-down direction, at the central portion.
  • Here, the upper side of the drawing corresponds to the direction "north", the left side to "west", the lower side to "south", and the right side to "east".
  • The intersection of the two roads is the "intersection".
  • the first vehicle 12a and the second vehicle 12b are traveling from left to right
  • the third vehicle 12c and the fourth vehicle 12d are traveling from right to left
  • the fifth vehicle 12e and the sixth vehicle 12f are traveling from the top to the bottom
  • the seventh vehicle 12g and the eighth vehicle 12h are traveling from the bottom to the top.
  • the base station apparatus 10 is fixedly installed at an intersection.
  • the base station device 10 controls communication between terminal devices.
  • Based on a signal received from a GPS (Global Positioning System) satellite (not shown) or a frame formed by another base station device 10 (not shown), the base station device 10 repeatedly generates a frame including a plurality of subframes.
  • the road vehicle transmission period can be set at the head of each subframe.
  • the base station apparatus 10 selects a subframe in which the road and vehicle transmission period is not set by another base station apparatus 10 from among a plurality of subframes in the frame.
  • the base station apparatus 10 sets a road and vehicle transmission period at the beginning of the selected subframe.
  • the base station apparatus 10 notifies the packet signal in the set road and vehicle transmission period.
  • a plurality of packet signals may be notified.
  • the packet signal includes, for example, accident information, traffic jam information, signal information, and the like. Note that the packet signal also includes information related to the timing when the road and vehicle transmission period is set and control information related to the frame.
  • the terminal device 14 is mounted on the vehicle 12 and movable as described above. Moreover, the terminal device 14 can be held by a pedestrian.
  • When receiving a packet signal from the base station device 10, the terminal device 14 estimates that it exists in the area 212.
  • When it exists in the area 212, the terminal device 14 generates a frame based on the control information included in the packet signal, in particular the information on the timing at which the road and vehicle transmission period is set and the information on the frame.
  • the frame generated in each of the plurality of terminal devices 14 is synchronized with the frame generated in the base station device 10.
  • the terminal device 14 notifies the packet signal in the vehicle transmission period that is a period different from the road and vehicle transmission period.
  • CSMA / CA is executed in the vehicle transmission period.
  • On the other hand, when the terminal device 14 estimates that it exists in the outside area 214 because it does not receive a packet signal from the base station device 10, it broadcasts a packet signal by executing CSMA/CA regardless of the frame configuration.
  • FIG. 2 shows the configuration of the base station apparatus 10.
  • the base station apparatus 10 includes an antenna 20, an RF unit 22, a modem unit 24, a processing unit 26, a control unit 28, and a network communication unit 30.
  • the processing unit 26 includes a frame defining unit 32, a selecting unit 34, and a generating unit 36.
  • the RF unit 22 receives a packet signal from the terminal device 14 (not shown) or another base station device 10 by the antenna 20 as a reception process.
  • the RF unit 22 performs frequency conversion on the received radio frequency packet signal to generate a baseband packet signal. Further, the RF unit 22 outputs a baseband packet signal to the modem unit 24.
  • In general, a baseband packet signal is formed by an in-phase component and a quadrature component, so two signal lines should be shown; here, however, only one signal line is shown for clarity of the figure.
  • the RF unit 22 also includes an LNA (Low Noise Amplifier), a mixer, an AGC, and an A / D conversion unit.
  • the RF unit 22 performs frequency conversion on the baseband packet signal input from the modem unit 24 as a transmission process, and generates a radio frequency packet signal. Further, the RF unit 22 transmits a radio frequency packet signal from the antenna 20 during the road-vehicle transmission period.
  • the RF unit 22 also includes a PA (Power Amplifier), a mixer, and a D / A conversion unit. For example, the 700 MHz band is used as the radio frequency.
  • the modem unit 24 demodulates the baseband packet signal from the RF unit 22 as a reception process. Further, the modem unit 24 outputs the demodulated result to the processing unit 26. The modem unit 24 also modulates the data from the processing unit 26 as a transmission process. Further, the modem unit 24 outputs the modulated result to the RF unit 22 as a baseband packet signal.
  • Since the communication system 100 uses the OFDM (Orthogonal Frequency Division Multiplexing) modulation scheme, the modem unit 24 also executes FFT (Fast Fourier Transform) as reception processing and IFFT (Inverse Fast Fourier Transform) as transmission processing.
  • the frame defining unit 32 receives a signal from a GPS satellite (not shown), and acquires time information based on the received signal.
  • the frame defining unit 32 generates a plurality of frames based on the time information. For example, the frame defining unit 32 generates ten “100 msec” frames by dividing the “1 sec” period into ten on the basis of the timing indicated by the time information. By repeating such processing, the frame is defined to be repeated.
  • the frame defining unit 32 may detect control information from the demodulation result and generate a frame based on the detected control information. Such processing corresponds to generating a frame synchronized with the timing of the frame formed by another base station apparatus 10.
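  • As an illustrative sketch only (not part of the disclosure), the timing derivation described above, in which the "1 sec" period indicated by the GPS time information is divided into ten "100 msec" frames and each frame into N subframes, could look as follows in Python; the function name and the value N = 8 (taken from the example of FIG. 3A below) are assumptions:

        # Sketch: locate the current frame and subframe from a GPS-derived time in milliseconds.
        FRAME_MS = 100.0          # one frame = 100 msec; ten frames per "1 sec" period
        N_SUBFRAMES = 8           # example value; N may be other than 8
        SUBFRAME_MS = FRAME_MS / N_SUBFRAMES

        def frame_and_subframe(gps_time_ms: float) -> tuple:
            """Return (frame index 0..9 within the second, subframe index 0..N-1)."""
            ms_in_second = gps_time_ms % 1000.0
            frame_idx = int(ms_in_second // FRAME_MS)
            subframe_idx = int((ms_in_second % FRAME_MS) // SUBFRAME_MS)
            return frame_idx, subframe_idx

        # Example: frame_and_subframe(1234.0) -> (2, 2)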
  • FIGS. 3A to 3D show frame formats defined in the communication system 100.
  • FIG. 3A shows the structure of the frame.
  • The frame is formed of N subframes, indicated as the first subframe to the Nth subframe.
  • In other words, the frame is formed by time-multiplexing a plurality of subframes that can be used for broadcasting. Here, N is 8 as an example, but N may be other than 8.
  • The selection unit 34 selects, from among the plurality of subframes included in the frame, a subframe in which a road and vehicle transmission period is to be set. More specifically, the selection unit 34 receives the frame defined by the frame defining unit 32. The selection unit 34 receives an instruction regarding the subframe to be selected via an interface (not shown), and selects the subframe corresponding to the instruction. Apart from this, the selection unit 34 may select a subframe automatically. At that time, the selection unit 34 receives demodulation results originating from other base station devices 10 or terminal devices 14 (not shown) via the RF unit 22 and the modem unit 24. The selection unit 34 extracts, from the received demodulation results, those originating from other base station devices 10. By identifying the subframes in which such demodulation results were received, the selection unit 34 identifies the subframes in which no such demodulation result was received.
  • When such unused subframes exist, the selection unit 34 selects one of them at random.
  • Otherwise, the selection unit 34 acquires the reception power corresponding to the demodulation results and gives priority to a subframe with low reception power.
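  • A minimal sketch of this selection policy (Python; the names are assumptions, and the received-power values are taken to come from the demodulation results mentioned above):

        import random

        def select_subframe(num_subframes, observed_rx_power):
            """observed_rx_power maps subframe index -> received power for subframes in which
            a road and vehicle transmission period from another base station was demodulated."""
            unused = [i for i in range(num_subframes) if i not in observed_rx_power]
            if unused:
                # A subframe not yet used for a road and vehicle transmission period is preferred.
                return random.choice(unused)
            # Otherwise, priority is given to the subframe with the lowest received power.
            return min(observed_rx_power, key=observed_rx_power.get)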
  • FIG. 3B shows a configuration of a frame generated by the first base station apparatus 10a (not shown).
  • the first base station apparatus 10a sets a road and vehicle transmission period at the beginning of the first subframe.
  • The first base station apparatus 10a sets a vehicle transmission period following the road and vehicle transmission period in the first subframe.
  • The vehicle transmission period is a period during which the terminal device 14 can broadcast a packet signal. That is, it is specified that the first base station apparatus 10a can broadcast a packet signal in the road and vehicle transmission period, which is the head period of the first subframe, and that the terminal device 14 can broadcast a packet signal in the vehicle transmission periods other than the road and vehicle transmission period in the frame. Furthermore, the first base station apparatus 10a sets only vehicle transmission periods from the second subframe to the Nth subframe.
  • FIG. 3C shows a configuration of a frame generated by the second base station apparatus 10b (not shown).
  • the second base station apparatus 10b sets a road and vehicle transmission period at the beginning of the second subframe.
  • The second base station apparatus 10b sets vehicle transmission periods following the road and vehicle transmission period in the second subframe, and in the first subframe and the third through Nth subframes.
  • FIG. 3D shows a configuration of a frame generated by a third base station apparatus 10c (not shown).
  • the third base station apparatus 10c sets a road and vehicle transmission period at the beginning of the third subframe.
  • The third base station apparatus 10c sets vehicle transmission periods following the road and vehicle transmission period in the third subframe, and in the first subframe, the second subframe, and the fourth through Nth subframes.
  • the plurality of base station apparatuses 10 select different subframes, and set the road and vehicle transmission period at the head portion of the selected subframe.
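  • To illustrate FIGS. 3B to 3D, a sketch that lays out one frame for a given base station apparatus; the subframe length and the length of the road and vehicle transmission period are placeholders, since the concrete durations are not given in the text:

        def frame_layout(n_subframes, rvp_subframe, rvp_len_ms, subframe_len_ms=12.5):
            """Return (start_ms, end_ms, kind) periods for one frame: the road and vehicle
            transmission period occupies the head of the selected subframe and all remaining
            time consists of vehicle transmission periods."""
            periods = []
            for i in range(n_subframes):
                start = i * subframe_len_ms
                if i == rvp_subframe:
                    periods.append((start, start + rvp_len_ms, "road-vehicle"))
                    periods.append((start + rvp_len_ms, start + subframe_len_ms, "vehicle"))
                else:
                    periods.append((start, start + subframe_len_ms, "vehicle"))
            return periods

        # e.g. the first base station apparatus of FIG. 3B: road-vehicle period at the head of subframe 0
        # layout = frame_layout(8, 0, rvp_len_ms=3.0)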
  • the selection unit 34 outputs the selected subframe number to the generation unit 36.
  • the generation unit 36 receives a subframe number from the selection unit 34.
  • the generation unit 36 sets a road and vehicle transmission period in the subframe of the received subframe number, and generates a packet signal to be notified during the road and vehicle transmission period.
  • When a plurality of packet signals are to be broadcast in the road and vehicle transmission period, the generation unit 36 generates all of them.
  • the packet signal is composed of control information and a payload.
  • the control information includes a subframe number in which a road and vehicle transmission period is set.
  • the payload includes, for example, accident information, traffic jam information, signal information, and the like. These data are acquired from the network 202 (not shown) by the network communication unit 30.
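  • A sketch of the packet signal described above as a simple data structure (Python; the field names are assumptions, and the concrete bit-level format is not specified in the text):

        from dataclasses import dataclass, field
        from typing import List

        @dataclass
        class RoadVehiclePacket:
            """Broadcast packet of the base station apparatus 10: control information plus payload."""
            rvp_subframe_number: int                 # subframe number in which the road and vehicle transmission period is set
            rvp_timing_info: float                   # timing information for the road and vehicle transmission period
            frame_info: dict = field(default_factory=dict)     # control information related to the frame
            payload: List[str] = field(default_factory=list)   # e.g. accident, congestion, signal information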
  • the processing unit 26 broadcasts the packet signal to the modem unit 24 and the RF unit 22 during the road and vehicle transmission period.
  • the control unit 28 controls processing of the entire base station device 10.
  • In terms of hardware, this configuration can be realized by the CPU, memory, or other LSI of an arbitrary computer; in terms of software, it is realized by a program or the like loaded into the memory. Here, functional blocks realized by their cooperation are drawn.
  • Accordingly, those skilled in the art will understand that these functional blocks can be realized in various forms by hardware only or by a combination of hardware and software.
  • FIG. 4 shows another configuration of the communication system 100.
  • the base station apparatus 10 is omitted, and an intersection is arranged at the center.
  • the arrangement of vehicles 12 (first vehicle 12a to seventh vehicle 12g) is different from that in FIG.
  • the first vehicle 12a, the third vehicle 12c, and the seventh vehicle 12g are traveling from left to right
  • the second vehicle 12b, the fourth vehicle 12d, and the fifth vehicle 12e are traveling from right to left.
  • the sixth vehicle 12f is traveling from top to bottom.
  • the first vehicle 12a and the second vehicle 12b are in a state of waiting for a right turn.
  • Viewed from the first vehicle 12a, the fourth vehicle 12d and the fifth vehicle 12e exist in a blind spot, and viewed from the second vehicle 12b, the third vehicle 12c and the seventh vehicle 12g exist in a blind spot.
  • the terminal device 14 is mounted on the first vehicle 12a, the second vehicle 12b, and the sixth vehicle 12f.
  • the terminal devices 14 mounted on the first vehicle 12a, the second vehicle 12b, and the sixth vehicle 12f are the first terminal device 14a, the second terminal device 14b, and the sixth terminal device 14f, respectively.
  • the first vehicle 12a is also equipped with a rear sensor and a front sensor, which form a first rear beam 90a and a first front beam 92a, respectively.
  • the second vehicle 12b and the sixth vehicle 12f are the same as the first vehicle 12a, and a second rear beam 90b, a second front beam 92b, a sixth rear beam 90f, and a sixth front beam 92f are formed. Note that the terminal device 14 is not mounted on the remaining vehicles 12.
  • the third vehicle 12c and the seventh vehicle 12g are detected by the first rear beam 90a, and detection information indicating that is acquired by the first terminal device 14a.
  • The first terminal device 14a also acquires the position information of the first vehicle 12a, and broadcasts a packet signal including the position information and the detection information.
  • the second terminal device 14b receives the packet signal from the first terminal device 14a, and extracts position information and detection information from the packet signal.
  • the second terminal device 14b recognizes the presence of the first vehicle 12a from the position information, and recognizes the presence of the third vehicle 12c and the seventh vehicle 12g from the detection information. As a result, even if the terminal device 14 is not mounted on the third vehicle 12c and the seventh vehicle 12g, the presence thereof is informed. The same applies to the packet signal from the second terminal device 14b to the first terminal device 14a.
  • In addition, the presence of the first pedestrian 16a, the second pedestrian 16b, and the third pedestrian 16c, who are walking on the pedestrian crossing, is detected by the sixth front beam 92f, and detection information indicating this is acquired by the sixth terminal device 14f.
  • The sixth terminal device 14f also reports the information of the pedestrians 16 by broadcasting a packet signal including the position information and the detection information.
  • FIG. 5 shows the configuration of the terminal device 14 mounted on the vehicle 12.
  • As described above, the terminal device 14 may also be held by a pedestrian; in that case, the vehicle 12 here may be read as including such a pedestrian.
  • the terminal device 14 includes an antenna 50, an RF unit 52, a modem unit 54, a processing unit 56, and a control unit 58.
  • the processing unit 56 includes a timing specifying unit 60, a transfer determination unit 62, a first acquisition unit 64, a generation unit 66, a notification unit 70, a second acquisition unit 80, and a third acquisition unit 82.
  • the timing specifying unit 60 includes an extraction unit 72 and a carrier sense unit 74.
  • the rear sensor 84 is connected to the second acquisition unit 80
  • the front sensor 86 is connected to the third acquisition unit 82.
  • The antenna 50, the RF unit 52, and the modem unit 54 execute the same processing as the antenna 20, the RF unit 22, and the modem unit 24 in FIG. 2. Here, the differences will mainly be described.
  • In the reception processing, the modem unit 54 and the processing unit 56 receive a packet signal from another terminal device 14 or from the base station device 10 (not shown). As described above, the modem unit 54 and the processing unit 56 receive a packet signal from the base station device 10 in the road and vehicle transmission period, and receive packet signals from other terminal devices 14 in the vehicle transmission period.
  • When the demodulation result from the modem unit 54 is a packet signal from the base station device 10 (not shown), the extraction unit 72 specifies the timing of the subframe in which the road and vehicle transmission period is arranged. In that case, the extraction unit 72 estimates that the terminal device exists in the area 212 of FIG. 1. The extraction unit 72 generates a frame based on the subframe timing and the content of the message header of the packet signal. As a result, the extraction unit 72 generates a frame synchronized with the frame formed in the base station device 10. When the source of the packet signal is another terminal device 14, the extraction unit 72 omits the synchronized frame generation processing. When the terminal device exists in the area 212, the extraction unit 72 specifies the road and vehicle transmission periods in use and then specifies the remaining vehicle transmission periods. The extraction unit 72 outputs information on the frame and subframe timing and on the vehicle transmission periods to the carrier sense unit 74.
  • On the other hand, when the extraction unit 72 has not received a packet signal from the base station apparatus 10, that is, when a frame synchronized with the base station apparatus 10 has not been generated, the extraction unit 72 estimates that the terminal device exists in the outside area 214 of FIG. 1. In that case, the extraction unit 72 selects a timing unrelated to the frame configuration, and instructs the carrier sense unit 74 to execute carrier sense unrelated to the frame configuration.
  • the carrier sense unit 74 receives information on frame and subframe timing and vehicle transmission period from the extraction unit 72.
  • the carrier sense unit 74 determines the transmission timing by starting CSMA / CA within the vehicle transmission period. This is equivalent to setting NAV (Network Allocation Vector) for the road and vehicle transmission period and performing carrier sense outside the period in which NAV is set.
  • When the terminal device exists in the outside area 214, the carrier sense unit 74 determines the transmission timing by executing CSMA/CA without considering the frame configuration.
  • The carrier sense unit 74 notifies the modem unit 54 and the RF unit 52 of the determined transmission timing, and the packet signal generated by the generation unit 66, described later, is broadcast at that timing.
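  • A rough sketch of this transmission-timing decision (Python; the carrier-sense callable and the millisecond granularity are assumptions, not the disclosed implementation):

        def determine_tx_timing(in_area, vehicle_periods, channel_idle, horizon_ms=1000):
            """Pick the first carrier-sensed-idle instant at which broadcast is allowed.

            in_area         : True when a frame synchronized with the base station exists
            vehicle_periods : list of (start_ms, end_ms); the road and vehicle transmission
                              period is excluded, which corresponds to setting NAV there
            channel_idle    : callable(t_ms) -> bool, the carrier-sense result at t_ms
            """
            for t in range(horizon_ms):
                if not channel_idle(t):
                    continue
                if not in_area:
                    # Outside area 214: CSMA/CA regardless of the frame configuration.
                    return t
                if any(start <= t < end for start, end in vehicle_periods):
                    return t
            return None  # no transmission opportunity found within the horizon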
  • the transfer determination unit 62 controls transfer of control information.
  • the transfer determination unit 62 extracts information to be transferred from the control information.
  • the transfer determination unit 62 generates information to be transferred based on the extracted information. Here, the description of this process is omitted.
  • the transfer determination unit 62 outputs information to be transferred, that is, a part of the control information, to the generation unit 66.
  • The first acquisition unit 64 includes a GPS receiver, a gyroscope, a vehicle speed sensor, and the like (not shown), and, based on data supplied from them, acquires the presence position, the traveling direction, the moving speed, and the like (hereinafter collectively referred to as "position information") of the vehicle 12 (not shown), that is, the vehicle 12 on which the terminal device 14 is mounted.
  • the existence position is indicated by latitude and longitude. That is, the presence position of the vehicle 12 is shown as an absolute value. Since a known technique may be used for these acquisitions, description thereof is omitted here.
  • the GPS receiver, gyroscope, vehicle speed sensor, and the like may be outside the terminal device 14.
  • the first acquisition unit 64 outputs the position information to the generation unit 66.
  • the rear sensor 84 is installed at the rear part of the vehicle 12 and detects the presence of an object existing behind the vehicle 12.
  • the rear sensor 84 is, for example, a millimeter wave radar, and its detection range is shown as a first rear beam 90a in FIG. Note that the first rear beam 90a is described for the sake of clarity, and the presence of an object existing in a portion outside the first rear beam 90a may be detected.
  • the rear sensor 84 detects the position of the object by a known technique.
  • the rear sensor 84 may be an image capturing device, and may detect an object from the captured image. At that time, the type of the object may be detected by image processing. The types are classified as pedestrians, motorcycles, small / medium-sized vehicles, and large vehicles.
  • the rear sensor 84 outputs information on the detected object (hereinafter referred to as “detection information”) to the second acquisition unit 80.
  • Note that both the position and the type of the object may be included in the detection information, or at least one of them, for example only the position, may be included in the detection information.
  • the second acquisition unit 80 acquires detection information of an object existing behind the vehicle 12 from the rear sensor 84.
  • FIGS. 6A and 6B show an outline of processing in the second acquisition unit 80.
  • FIG. 6A shows the area formed behind the vehicle 12. This is the area in which an object can be detected by the rear beam 90. As illustrated, the area is divided into a first area 120, a second area 122, a third area 124, and a fourth area 126.
  • the second acquisition unit 80 specifies whether the position of the object is included in the first area 120, the second area 122, the third area 124, or the fourth area 126.
  • FIG. 6B shows the data structure of the table stored in the second acquisition unit 80. As shown, a content column 130 and a code column 132 are included. The second acquisition unit 80 derives a code based on the identified area.
  • FIGS. 7A and 7B show another outline of processing in the second acquisition unit 80.
  • FIG. 7A shows the intervals formed behind the vehicle 12. These are obtained by discretely dividing the distance from the vehicle 12. As illustrated, the intervals are divided into a first interval 140, a second interval 142, a third interval 144, a fourth interval 146, and a fifth interval 148 in the direction away from the vehicle 12.
  • the second acquisition unit 80 specifies whether the position of the object is included in the first interval 140, the second interval 142, the third interval 144, the fourth interval 146, or the fifth interval 148.
  • FIG. 7B shows the data structure of another table stored in the second acquisition unit 80. As shown, a content column 150 and a code column 152 are included.
  • The second acquisition unit 80 derives a code based on the specified interval. The description now returns to FIG. 5.
  • the second acquisition unit 80 includes the derived combination of the two codes in the detection information instead of the position of the object. That is, in the detection information, the presence position of the object is indicated as a relative value from the vehicle 12, and the presence position of the object is indicated as code information.
  • FIG. 8 shows the data structure of yet another table stored in the second acquisition unit 80.
  • a content column 160 and a code column 162 are included.
  • the second acquisition unit 80 derives a code from the type of object.
  • This table is used when an imaging device is used as the rear sensor 84 and the type of the object is detected.
  • the second acquisition unit 80 includes the derived code in the detection information instead of the type of the object. That is, the detection information indicates the type of the object as code information.
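  • The encoding described with FIGS. 6 to 8 can be sketched as follows; the numeric code values are placeholders, since the actual values are defined only by the tables in the figures:

        # Placeholder code tables standing in for FIGS. 6B, 7B and 8.
        AREA_CODE = {"first area 120": 0, "second area 122": 1,
                     "third area 124": 2, "fourth area 126": 3}
        INTERVAL_CODE = {"first interval 140": 0, "second interval 142": 1,
                         "third interval 144": 2, "fourth interval 146": 3,
                         "fifth interval 148": 4}
        TYPE_CODE = {"pedestrian": 0, "motorcycle": 1,
                     "small/medium vehicle": 2, "large vehicle": 3}

        def encode_detection(area, interval, obj_type=None):
            """Build coded detection information for one detected object: the position is
            carried as a relative value from the own vehicle (area code plus distance-interval
            code); the type code is included only when the rear sensor can classify the object."""
            info = {"area": AREA_CODE[area], "interval": INTERVAL_CODE[interval]}
            if obj_type is not None:
                info["type"] = TYPE_CODE[obj_type]
            return info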
  • the front sensor 86 is installed in the front part of the vehicle 12 and detects the presence of an object existing in front of the vehicle 12.
  • The third acquisition unit 82 acquires detection information of an object existing in front of the vehicle 12 from the front sensor 86. Since the third acquisition unit 82 and the front sensor 86 need only perform processing similar to that of the second acquisition unit 80 and the rear sensor 84, the description thereof is omitted here.
  • the generation unit 66 receives position information from the first acquisition unit 64, detection information from the second acquisition unit 80, and detection information from the third acquisition unit 82. Further, the generation unit 66 receives part of the control information from the transfer determination unit 62. The generating unit 66 generates a packet signal by storing a part of the received control information in the control information and storing the position information and the detection information in the payload. As described above, the processing unit 56, the modem unit 54, and the RF unit 52 notify the packet signal generated by the generation unit 66.
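  • A sketch of the generation unit 66 as described above (Python; the dictionary layout is an assumption used only to show which information goes where):

        def build_packet(control_info_part, position_info, rear_detections, front_detections):
            """Part of the received control information is stored in the control field;
            the position information and the detection information are stored in the payload."""
            return {
                "control": control_info_part,
                "payload": {
                    "position": position_info,            # absolute position, traveling direction, speed
                    "rear_detections": rear_detections,   # coded detection information from the second acquisition unit 80
                    "front_detections": front_detections, # coded detection information from the third acquisition unit 82
                },
            }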
  • the extraction unit 72 extracts the position information and detection information included in the packet signal from the other terminal device 14 and outputs the position information and detection information to the notification unit 70.
  • the notification unit 70 acquires position information and detection information from the extraction unit 72.
  • the notification unit 70 detects the approach of the other vehicle 12 based on the position information of the other terminal device 14 acquired from the extraction unit 72 and the position information input from the first acquisition unit 64.
  • the notification unit 70 notifies the driver of the approach of another vehicle 12 through a monitor or a speaker.
  • the notification unit 70 specifies the position and type of the object based on various codes included in the detection information.
  • the content is specified from the code by executing the reverse process of the second acquisition unit 80.
  • Since the position of the object is indicated as a relative position with respect to the other vehicle 12, the notification unit 70 identifies the position of the object by applying that relative position to the position of the other vehicle 12 derived as described above. Because of the encoding on the transmitting side, the identified position of the object is not exact. Furthermore, the notification unit 70 also notifies the driver of the position and type of the identified object.
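  • The reverse processing mentioned above amounts to inverting the code tables on the receiving side; a sketch with the same placeholder values as before (assumed, not the disclosed tables):

        TYPE_CODE = {"pedestrian": 0, "motorcycle": 1,
                     "small/medium vehicle": 2, "large vehicle": 3}   # placeholder values

        def invert(table):
            """Reverse a code table so that content can be recovered from a received code."""
            return {code: content for content, code in table.items()}

        TYPE_BY_CODE = invert(TYPE_CODE)
        # e.g. TYPE_BY_CODE[2] -> "small/medium vehicle"; the same inversion applies to the
        # area and distance-interval tables, after which the relative position is applied to
        # the absolute position of the other vehicle 12.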
  • the control unit 58 controls the operation of the terminal device 14.
  • FIG. 9 is a flowchart showing a reception procedure in the terminal device 14.
  • The extraction unit 72 extracts the position information from the received packet signal (S10), and the notification unit 70 specifies the position of the other vehicle 12 (S12).
  • If detection information is included in the packet signal (Y in S14), the extraction unit 72 extracts the detection information from the packet signal (S16), and the notification unit 70 specifies the position of the object (S18).
  • If detection information is not included in the packet signal (N in S14), the processing is terminated.
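  • A sketch of this reception procedure (Python; the packet layout follows the earlier sketch and the notify object with its two methods is hypothetical):

        def on_packet_received(packet, notify):
            """Step numbers follow the flowchart of FIG. 9."""
            position = packet["payload"]["position"]             # S10: extract position information
            notify.vehicle_position(position)                    # S12: specify the other vehicle's position
            detections = packet["payload"].get("rear_detections") or []
            if detections:                                       # S14: detection information included?
                for coded in detections:                         # S16: extract detection information
                    notify.object_position(position, coded)      # S18: specify the object position
            # If no detection information is included (N in S14), processing simply ends.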
  • According to the present embodiment, since the detection information of the rear object is broadcast in addition to the position information, the presence of the rear object can be notified even if the rear object is not equipped with a terminal device.
  • Since the presence of the rear object is notified, the occurrence of a vehicle accident can be suppressed even when the penetration rate of the terminal device is low.
  • Since the presence position of the vehicle is indicated as an absolute value while the presence position of the object is indicated as a relative value from the vehicle, the amount of information can be reduced.
  • Since the presence position of the object is indicated as code information, the amount of information can be reduced.
  • Since the type of the object is indicated as code information, the amount of information can be reduced.
  • Since the detection information of the object ahead is also broadcast, the occurrence of a vehicle accident can be further suppressed.
  • FIG. 10 shows another processing outline in the second acquisition unit 80 and the third acquisition unit 82.
  • FIG. 10 shows absolute area positions referenced to the vehicle position, with the vehicle 12 at the center. These are area positions obtained by discretely dividing the direction (azimuth) and the distance from the vehicle 12. As illustrated, the area positions are divided and defined according to the absolute direction and distance from the vehicle 12, and each area position is assigned a code such as north areas N1 to N15, east areas E1 to E9, south areas S1 to S15, and west areas W1 to W9.
  • the direction of the vehicle 12 is determined from an electronic compass (not shown) mounted on the vehicle 12 or a travel locus by GPS.
  • the vehicle 12 detects an object in the detection range 90F by the front sensor 86, and detects an object in the detection range 90R by the rear sensor 84.
  • the second acquisition unit 80 determines whether the object detected by the rear sensor 84 is included in any of the area positions of the north areas N1 to N15, the east areas E1 to E9, the south areas S1 to S15, and the west areas W1 to W9. Identify.
  • the third acquisition unit 82 determines whether the object detected by the front sensor 86 is included in any of the area positions of the north areas N1 to N15, the east areas E1 to E9, the south areas S1 to S15, and the west areas W1 to W9. Identify.
  • the second acquisition unit 80 and the third acquisition unit 82 include the derived code information in each detection information. That is, in the detection information at this time, the detection position of the object is indicated as direction and distance information from the vehicle 12, and the presence position of the object is indicated as code information. Since the code information indicates an absolute area position with respect to the vehicle position, the detection information for the rear object and the detection information for the front object can be used in common without distinguishing them.
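  • A very rough sketch of this mapping (Python). The disclosed grid has 15 north/south area positions and 9 east/west area positions whose exact layout appears only in FIG. 10, so the quadrant boundaries, ring edges, and numbering below are placeholders:

        def absolute_area_code(bearing_deg, distance_m, ring_edges_m=(10, 20, 40, 80)):
            """Map an object's absolute direction and distance from the vehicle 12 to a code
            such as N1..N15, E1..E9, S1..S15, W1..W9. The absolute bearing is assumed to be
            derived using the vehicle direction from the electronic compass or the GPS travel locus."""
            bearing = bearing_deg % 360
            if bearing >= 315 or bearing < 45:
                quadrant = "N"
            elif bearing < 135:
                quadrant = "E"
            elif bearing < 225:
                quadrant = "S"
            else:
                quadrant = "W"
            ring = 1 + sum(1 for edge in ring_edges_m if distance_m > edge)
            return quadrant + str(ring)

        # e.g. absolute_area_code(10, 35) -> "N3"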
  • the generation unit 66 receives position information from the first acquisition unit 64, detection information from the second acquisition unit 80, and detection information from the third acquisition unit 82.
  • the subsequent processing is the same as in the previous embodiment.
  • According to this modification, since the position of the object detected around the other vehicle is indicated as absolute area position information (code information) referenced to the position information of the other vehicle, the amount of information can be reduced, and mapping of the object onto a map becomes easy.
  • In the embodiment, the second acquisition unit 80 acquires the detection information of the rear object, and the third acquisition unit 82 acquires the detection information of the front object.
  • However, the present invention is not limited to this; for example, the detection information of the front object may not be acquired by the third acquisition unit 82. In that case, the front sensor 86 and the third acquisition unit 82 may be omitted. According to this modification, the configuration can be simplified.
  • In the embodiment, the detection information includes the position and the type of the object. However, the present invention is not limited to this; for example, the type of the object may not be included, and only the position of the object may be included. According to this modification, the amount of information can be reduced.
  • In the embodiment, the detection information is encoded. However, the present invention is not limited to this; for example, the encoding may not be performed. In that case, the position of the object is notified as it is, so that an accurate position can be notified.
  • A terminal device according to an aspect of the present invention is a terminal device that can be mounted on a vehicle, and includes a first acquisition unit that acquires position information of the vehicle, a second acquisition unit that acquires detection information of an object existing behind the vehicle, a generation unit that generates a packet signal including the position information acquired by the first acquisition unit and the detection information acquired by the second acquisition unit, and a notification unit that broadcasts the packet signal generated by the generation unit.
  • According to this aspect, since the detection information of the rear object is also broadcast, the presence of the rear object is notified even if the rear object is not equipped with a terminal device, so that the occurrence of a vehicle accident can be suppressed.
  • In the detection information acquired by the second acquisition unit, the presence position of the vehicle may be indicated as an absolute value while the position of the object may be indicated as a relative value from the vehicle. In this case, the amount of information can be reduced.
  • the presence position of the object may be indicated as code information. In this case, the amount of information can be reduced.
  • the type of the object may be indicated as code information. In this case, the amount of information can be reduced.
  • a third acquisition unit that acquires detection information of an object existing in front of the vehicle may be further provided.
  • the generation unit may include the detection information acquired by the third acquisition unit in the packet signal. In this case, the occurrence of a vehicle accident can be further suppressed.
  • the occurrence of a vehicle accident can be suppressed even when the penetration rate of the device is low.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)
  • Mobile Radio Communication Systems (AREA)

Abstract

A first acquisition unit (64) acquires location information of a vehicle. A second acquisition unit (80) acquires detection information of an object which is present to the rear of the vehicle. A generating unit (66) generates a packet signal wherein are included the location information which is acquired by the first acquisition unit (64) and the detection information which is acquired by the second acquisition unit (80). A modulation/demodulation unit (54) and an RF unit (52) report the packet signal which is generated by the generating unit (66).

Description

Terminal device, control method, and program
 The present invention relates to communication technology, and more particularly to a terminal device that transmits and receives a signal including predetermined information.
 Conventionally, in the field of automobiles, a driving support system has been proposed that, in order to prevent the own vehicle from falling into a dangerous driving state, automatically outputs an alarm or performs an avoidance operation for avoiding danger. In the driving support system, in order to determine the control state for the alarm output or the avoidance operation, the running state of the own vehicle or the presence of an obstacle ahead is detected based on various information or captured images acquired by radio communication from equipment provided on the road side (see, for example, Patent Document 1).
 JP 2003-157482 A
 If the penetration rate of the roadside equipment that transmits various information or captured images, and of the in-vehicle equipment that receives such information or captured images from the roadside equipment, is low, the running state of the own vehicle or the presence of an obstacle ahead is not sufficiently detected. As a result, danger avoidance is not sufficiently achieved.
 The present invention has been made in view of such circumstances, and an object thereof is to provide a technique for suppressing the occurrence of a vehicle accident even when the penetration rate of the apparatus is low.
 In order to solve the above problem, a terminal device according to an aspect of the present invention is a terminal device that can be mounted on a vehicle, and includes: a first acquisition unit that acquires position information of the vehicle; a second acquisition unit that acquires detection information of an object existing behind the vehicle; a generation unit that generates a packet signal including the position information acquired by the first acquisition unit and the detection information acquired by the second acquisition unit; and a notification unit that broadcasts the packet signal generated by the generation unit.
 It should be noted that arbitrary combinations of the above components, and conversions of the expression of the present invention between a method, an apparatus, a system, a recording medium, a computer program, and the like, are also effective as aspects of the present invention.
 According to the present invention, the occurrence of a vehicle accident can be suppressed even when the penetration rate of the device is low.
 FIG. 1 is a diagram showing the configuration of a communication system according to an embodiment of the present invention. FIG. 2 is a diagram showing the configuration of the base station apparatus of FIG. 1. FIGS. 3(a)-(d) are diagrams showing the frame formats defined in the communication system of FIG. 1. FIG. 4 is a diagram showing another configuration of the communication system of FIG. 1. FIG. 5 is a diagram showing the configuration of the terminal device mounted in a vehicle of FIG. 1. FIGS. 6(a)-(b) are diagrams showing an outline of processing in the second acquisition unit of FIG. 5. FIGS. 7(a)-(b) are diagrams showing another outline of processing in the second acquisition unit of FIG. 5. FIG. 8 is a diagram showing the data structure of a table stored in the second acquisition unit of FIG. 5. FIG. 9 is a flowchart showing the reception procedure in the terminal device of FIG. 5. FIG. 10 is a diagram showing an outline of processing of the terminal device according to a modification.
 Before specifically describing the embodiments of the present invention, the underlying knowledge will be described. The embodiments of the present invention relate to a communication system that executes vehicle-to-vehicle communication between terminal devices mounted on vehicles, and also executes road-to-vehicle communication from a base station device installed at an intersection or the like to the terminal devices. Such a communication system is also called ITS (Intelligent Transport Systems). ITS is stipulated, for example, in the standard for the 700 MHz band intelligent transport systems (Association of Radio Industries and Businesses). Like a wireless LAN (Local Area Network) compliant with a standard such as IEEE 802.11, the communication system uses an access control function called CSMA/CA (Carrier Sense Multiple Access with Collision Avoidance). Therefore, the same radio channel is shared by a plurality of terminal devices. On the other hand, in ITS, it is necessary to transmit information to an unspecified number of terminal devices. In order to execute such transmission efficiently, the communication system broadcasts packet signals.
 That is, as vehicle-to-vehicle communication, a terminal device broadcasts a packet signal storing information such as the speed and position of the vehicle. Another terminal device receives the packet signal and recognizes the approach of the vehicle and the like based on that information. Here, in order to reduce interference between road-to-vehicle communication and vehicle-to-vehicle communication, the base station device repeatedly defines a frame including a plurality of subframes. For road-to-vehicle communication, the base station device selects one of the plurality of subframes and broadcasts a packet signal storing control information and the like during a period at the head of the selected subframe.
 The control information includes information on the period during which the base station device broadcasts packet signals (hereinafter referred to as the "road-vehicle transmission period"). The terminal device specifies the road-vehicle transmission period based on the control information, and broadcasts a packet signal by the CSMA method in a period other than the road-vehicle transmission period (hereinafter referred to as the "vehicle transmission period"). As a result, road-to-vehicle communication and vehicle-to-vehicle communication are time-division multiplexed. Note that a terminal device that cannot receive the control information from the base station device, that is, a terminal device that exists outside the area formed by the base station device, transmits packet signals by the CSMA method regardless of the frame configuration.
 Next, an outline of the present embodiment will be described. The effect of such a communication system in avoiding collision accidents depends on the penetration rate of the terminal devices, and while the penetration rate is low, a sufficient effect may not be obtained. In the present embodiment, the following processing is executed in order to suppress the occurrence of a vehicle accident even when the penetration rate is low. A vehicle equipped with a terminal device is provided with a sensor for detecting a rear object. An object is a vehicle, a pedestrian, or the like. In addition to the position information of the vehicle, the terminal device also includes information on the detected rear object (hereinafter referred to as "detection information") in the packet signal and broadcasts the packet signal. The terminal device that has received this packet signal recognizes not only the presence of the vehicle on which the transmitting terminal device is mounted but also the presence of the object existing behind that vehicle. That is, even if a terminal device is not mounted on the vehicle or the like that is the object, its presence is recognized.
 図1は、本発明の実施例に係る通信システム100の構成を示す。これは、ひとつの交差点を上方から見た場合に相当する。通信システム100は、基地局装置10、車両12と総称される第1車両12a、第2車両12b、第3車両12c、第4車両12d、第5車両12e、第6車両12f、第7車両12g、第8車両12h、ネットワーク202を含む。ここでは、第1車両12aのみに示しているが、各車両12には、端末装置14が搭載されている。また、エリア212が、基地局装置10の周囲に形成され、エリア外214が、エリア212の外側に形成されている。 FIG. 1 shows a configuration of a communication system 100 according to an embodiment of the present invention. This corresponds to a case where one intersection is viewed from above. The communication system 100 includes a base station device 10, a first vehicle 12a, a second vehicle 12b, a third vehicle 12c, a fourth vehicle 12d, a fifth vehicle 12e, a sixth vehicle 12f, and a seventh vehicle 12g, collectively referred to as a vehicle 12. , The eighth vehicle 12h, and the network 202. Here, only the first vehicle 12 a is shown, but each vehicle 12 is equipped with a terminal device 14. An area 212 is formed around the base station apparatus 10, and an outside area 214 is formed outside the area 212.
 図示のごとく、図面の水平方向、つまり左右の方向に向かう道路と、図面の垂直方向、つまり上下の方向に向かう道路とが中心部分で交差している。ここで、図面の上側が方角の「北」に相当し、左側が方角の「西」に相当し、下側が方角の「南」に相当し、右側が方角の「東」に相当する。また、ふたつの道路の交差部分が「交差点」である。第1車両12a、第2車両12bが、左から右へ向かって進んでおり、第3車両12c、第4車両12dが、右から左へ向かって進んでいる。また、第5車両12e、第6車両12fが、上から下へ向かって進んでおり、第7車両12g、第8車両12hが、下から上へ向かって進んでいる。 As shown in the figure, the road that goes in the horizontal direction of the drawing, that is, the left and right direction, intersects the vertical direction of the drawing, that is, the road that goes in the up and down direction, at the central portion. Here, the upper side of the drawing corresponds to the direction “north”, the left side corresponds to the direction “west”, the lower side corresponds to the direction “south”, and the right side corresponds to the direction “east”. The intersection of the two roads is an “intersection”. The first vehicle 12a and the second vehicle 12b are traveling from left to right, and the third vehicle 12c and the fourth vehicle 12d are traveling from right to left. Further, the fifth vehicle 12e and the sixth vehicle 12f are traveling from the top to the bottom, and the seventh vehicle 12g and the eighth vehicle 12h are traveling from the bottom to the top.
 通信システム100において、基地局装置10は、交差点に固定して設置される。基地局装置10は、端末装置間の通信を制御する。基地局装置10は、図示しないGPS(Global Positioning System)衛星から受信した信号、あるいは図示しない他の基地局装置10にて形成されたフレームをもとに、複数のサブフレームが含まれたフレームを繰り返し生成する。ここで、各サブフレームの先頭部分に路車送信期間が設定可能であるような規定がなされている。 In the communication system 100, the base station apparatus 10 is fixedly installed at an intersection. The base station device 10 controls communication between terminal devices. The base station device 10 receives a frame including a plurality of subframes based on a signal received from a GPS (Global Positioning System) satellite (not shown) or a frame formed by another base station device 10 (not shown). Generate repeatedly. Here, the road vehicle transmission period can be set at the head of each subframe.
 基地局装置10は、フレーム中の複数のサブフレームのうち、他の基地局装置10によって路車送信期間が設定されていないサブフレームを選択する。基地局装置10は、選択したサブフレームの先頭部分に路車送信期間を設定する。基地局装置10は、設定した路車送信期間においてパケット信号を報知する。路車送信期間において、複数のパケット信号が報知されることもある。また、パケット信号には、例えば、事故情報、渋滞情報、信号情報等が含まれる。なお、パケット信号には、路車送信期間が設定されたタイミングに関する情報およびフレームに関する制御情報も含まれる。 The base station apparatus 10 selects a subframe in which the road and vehicle transmission period is not set by another base station apparatus 10 from among a plurality of subframes in the frame. The base station apparatus 10 sets a road and vehicle transmission period at the beginning of the selected subframe. The base station apparatus 10 notifies the packet signal in the set road and vehicle transmission period. In the road and vehicle transmission period, a plurality of packet signals may be notified. The packet signal includes, for example, accident information, traffic jam information, signal information, and the like. Note that the packet signal also includes information related to the timing when the road and vehicle transmission period is set and control information related to the frame.
As described above, the terminal device 14 is mounted on a vehicle 12 and is movable; it may also be carried by a pedestrian. On receiving a packet signal from the base station apparatus 10, the terminal device 14 estimates that it is located in the area 212. While in the area 212, the terminal device 14 generates a frame based on the control information contained in the packet signal, in particular the information on the timing at which the road-to-vehicle transmission period is set and the information on the frame. As a result, the frames generated by the respective terminal devices 14 are synchronized with the frame generated by the base station apparatus 10. The terminal device 14 broadcasts packet signals in a vehicle-to-vehicle transmission period, which is a period different from the road-to-vehicle transmission period, and CSMA/CA is executed within the vehicle-to-vehicle transmission period. On the other hand, when the terminal device 14 estimates that it is in the out-of-area region 214 because it receives no packet signal from the base station apparatus 10, it broadcasts packet signals by executing CSMA/CA regardless of the frame structure.
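The in-area/out-of-area decision described above reduces to a small piece of selection logic. The following is a minimal Python sketch under stated assumptions: the timeout value, the function name, and the mode labels are hypothetical and are not taken from the specification, which only says that reception (or non-reception) of a base-station packet drives the choice.

```python
from typing import Optional

AREA_TIMEOUT_SEC = 1.0  # assumed validity window for the last road-to-vehicle packet


def select_transmission_mode(last_rsu_rx: Optional[float], now: float) -> str:
    """Return 'frame_synced' while inside area 212, 'plain_csma_ca' otherwise."""
    if last_rsu_rx is not None and now - last_rsu_rx < AREA_TIMEOUT_SEC:
        # A recent packet from the base station apparatus 10 was received, so the terminal
        # assumes it is in area 212 and transmits only in the vehicle-to-vehicle periods
        # of the frame it synchronized from that packet.
        return "frame_synced"
    # No base-station packet heard recently: assume the out-of-area region 214 and run
    # CSMA/CA without regard to the frame structure.
    return "plain_csma_ca"


print(select_transmission_mode(None, now=10.0))  # plain_csma_ca
print(select_transmission_mode(9.5, now=10.0))   # frame_synced
```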
FIG. 2 shows the configuration of the base station apparatus 10. The base station apparatus 10 includes an antenna 20, an RF unit 22, a modem unit 24, a processing unit 26, a control unit 28, and a network communication unit 30. The processing unit 26 includes a frame defining unit 32, a selection unit 34, and a generation unit 36.
In reception processing, the RF unit 22 receives, via the antenna 20, packet signals from terminal devices 14 or other base station apparatuses 10 (not shown). The RF unit 22 frequency-converts the received radio-frequency packet signal to generate a baseband packet signal and outputs it to the modem unit 24. In general, a baseband packet signal is formed by an in-phase component and a quadrature component and should therefore be shown with two signal lines, but only one signal line is shown here for clarity. The RF unit 22 also includes an LNA (Low Noise Amplifier), a mixer, an AGC, and an A/D converter.
In transmission processing, the RF unit 22 frequency-converts the baseband packet signal received from the modem unit 24 to generate a radio-frequency packet signal, and transmits the radio-frequency packet signal from the antenna 20 during the road-to-vehicle transmission period. The RF unit 22 also includes a PA (Power Amplifier), a mixer, and a D/A converter. For example, the 700 MHz band is used as the radio frequency.
In reception processing, the modem unit 24 demodulates the baseband packet signal from the RF unit 22 and outputs the demodulation result to the processing unit 26. In transmission processing, the modem unit 24 modulates data from the processing unit 26 and outputs the modulated result to the RF unit 22 as a baseband packet signal. Since the communication system 100 supports the OFDM (Orthogonal Frequency Division Multiplexing) modulation scheme, the modem unit 24 also performs an FFT (Fast Fourier Transform) in reception processing and an IFFT (Inverse Fast Fourier Transform) in transmission processing.
The frame defining unit 32 receives a signal from a GPS satellite (not shown) and acquires time information from the received signal; since a known technique may be used to acquire the time information, its description is omitted here. Based on the time information, the frame defining unit 32 generates a plurality of frames. For example, the frame defining unit 32 generates ten frames of "100 msec" by dividing a "1 sec" period into ten, with the timing indicated by the time information as a reference. By repeating this processing, the frames are defined so as to repeat. The frame defining unit 32 may instead detect control information from the demodulation result and generate frames based on the detected control information; such processing corresponds to generating frames synchronized with the timing of frames formed by another base station apparatus 10.
FIGS. 3A to 3D show frame formats defined in the communication system 100. FIG. 3A shows the structure of a frame. The frame is formed of N subframes, denoted first subframe through Nth subframe. In other words, the frame is formed by time-multiplexing a plurality of subframes that the terminal devices 14 can use for broadcasting. For example, when the frame length is 100 msec and N is 8, subframes of 12.5 msec are defined; N may be a value other than 8. FIGS. 3B to 3D are described later; the description now returns to FIG. 2.
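As an illustration of the timing arithmetic implied by FIG. 3A, the sketch below maps a time offset within the GPS-derived second onto a (frame, subframe) pair. It is a minimal Python example under the stated numbers (100 msec frames, N = 8); the function name and the choice to express time in milliseconds are assumptions.

```python
FRAME_MS = 100.0        # frame length from the example above
FRAMES_PER_SECOND = 10  # ten frames per 1-sec GPS period
N_SUBFRAMES = 8         # N may be another value; 8 gives 12.5 msec subframes


def frame_index(t_ms: float) -> tuple:
    """Map an offset (msec, relative to the GPS second boundary) to (frame, subframe)."""
    frame = int(t_ms // FRAME_MS) % FRAMES_PER_SECOND
    offset_in_frame = t_ms % FRAME_MS
    subframe = int(offset_in_frame // (FRAME_MS / N_SUBFRAMES))
    return frame, subframe


print(frame_index(0.0))    # (0, 0)
print(frame_index(137.5))  # (1, 3): second frame, fourth subframe
```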
The selection unit 34 selects, from among the plurality of subframes contained in the frame, a subframe in which the road-to-vehicle transmission period should be set. Specifically, the selection unit 34 receives the frame defined by the frame defining unit 32, receives an instruction regarding the subframe to be selected via an interface (not shown), and selects the subframe corresponding to the instruction. Alternatively, the selection unit 34 may select a subframe automatically. In that case, the selection unit 34 receives demodulation results from other base station apparatuses 10 or terminal devices 14 (not shown) via the RF unit 22 and the modem unit 24, extracts from them the demodulation results originating from other base station apparatuses 10, and, by identifying the subframes in which demodulation results were received, identifies the subframes in which none were received.
This corresponds to identifying subframes in which no road-to-vehicle transmission period has been set by another base station apparatus 10, that is, unused subframes. When a plurality of unused subframes exist, the selection unit 34 selects one of them at random. When no unused subframe exists, that is, when every subframe is in use, the selection unit 34 acquires the received power corresponding to each demodulation result and preferentially selects the subframe with the lowest received power.
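The selection rule just described (prefer an unused subframe chosen at random, otherwise reuse the subframe whose occupying signal is weakest) can be written compactly. The Python sketch below is illustrative only; the data structure passed in and the function name are assumptions, not part of the specification.

```python
import random


def select_subframe(n_subframes: int, observed_power_dbm: dict) -> int:
    """observed_power_dbm maps subframe index -> received power of another base station."""
    unused = [i for i in range(n_subframes) if i not in observed_power_dbm]
    if unused:
        # At least one subframe carries no other road-to-vehicle transmission period.
        return random.choice(unused)
    # Every subframe is occupied: reuse the one whose occupying signal is weakest.
    return min(observed_power_dbm, key=observed_power_dbm.get)


print(select_subframe(8, {0: -60.0, 1: -75.0}))            # some index in 2..7
print(select_subframe(3, {0: -60.0, 1: -75.0, 2: -58.0}))  # 1 (weakest at -75 dBm)
```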
FIG. 3B shows the structure of a frame generated by a first base station apparatus 10a (not shown). The first base station apparatus 10a sets a road-to-vehicle transmission period at the head of the first subframe and sets a vehicle-to-vehicle transmission period following the road-to-vehicle transmission period in the first subframe. The vehicle-to-vehicle transmission period is a period in which terminal devices 14 can broadcast packet signals. That is, it is defined that the first base station apparatus 10a can broadcast packet signals in the road-to-vehicle transmission period, which is the head period of the first subframe, and that the terminal devices 14 can broadcast packet signals in the vehicle-to-vehicle transmission periods of the frame other than the road-to-vehicle transmission period. Furthermore, the first base station apparatus 10a sets only vehicle-to-vehicle transmission periods in the second through Nth subframes.
FIG. 3C shows the structure of a frame generated by a second base station apparatus 10b (not shown). The second base station apparatus 10b sets a road-to-vehicle transmission period at the head of the second subframe, and sets vehicle-to-vehicle transmission periods in the portion of the second subframe following the road-to-vehicle transmission period, in the first subframe, and in the third through Nth subframes. FIG. 3D shows the structure of a frame generated by a third base station apparatus 10c (not shown). The third base station apparatus 10c sets a road-to-vehicle transmission period at the head of the third subframe, and sets vehicle-to-vehicle transmission periods in the portion of the third subframe following the road-to-vehicle transmission period, in the first and second subframes, and in the fourth through Nth subframes. In this way, the plurality of base station apparatuses 10 select mutually different subframes and set a road-to-vehicle transmission period at the head of the selected subframe. Returning to FIG. 2, the selection unit 34 outputs the number of the selected subframe to the generation unit 36.
The generation unit 36 receives the subframe number from the selection unit 34, sets the road-to-vehicle transmission period in the subframe with that number, and generates the packet signals to be broadcast during the road-to-vehicle transmission period; when a plurality of packet signals are to be transmitted in one road-to-vehicle transmission period, the generation unit 36 generates all of them. A packet signal is composed of control information and a payload. The control information includes, among other things, the number of the subframe in which the road-to-vehicle transmission period is set. The payload includes, for example, accident information, congestion information, and traffic-signal information; these data are acquired by the network communication unit 30 from the network 202 (not shown). The processing unit 26 causes the modem unit 24 and the RF unit 22 to broadcast the packet signals during the road-to-vehicle transmission period. The control unit 28 controls the processing of the base station apparatus 10 as a whole.
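For concreteness, the sketch below assembles a packet of the kind the generation unit 36 is described as producing: control information carrying the subframe number plus a payload of accident, congestion, and signal information. The field names and the JSON serialization are assumptions made for the example; the specification does not define a wire format.

```python
import json


def build_roadside_packet(subframe_number: int, accident: list,
                          congestion: list, signals: list) -> bytes:
    packet = {
        "control": {"rsu_subframe": subframe_number},  # timing-related control information
        "payload": {
            "accident_info": accident,
            "congestion_info": congestion,
            "signal_info": signals,
        },
    }
    return json.dumps(packet).encode("utf-8")


pkt = build_roadside_packet(2, accident=[], congestion=["route 1: slow"], signals=["NS: green"])
print(len(pkt), "bytes")
```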
In terms of hardware, this configuration can be realized by the CPU and memory of any computer and other LSIs, and in terms of software by a program loaded into memory; what is depicted here are the functional blocks realized by their cooperation. Those skilled in the art will therefore understand that these functional blocks can be realized in various forms by hardware alone or by a combination of hardware and software.
FIG. 4 shows another configuration of the communication system 100. Here, the base station apparatus 10 is omitted and an intersection is placed at the center; the arrangement of the vehicles 12 (first vehicle 12a through seventh vehicle 12g) also differs from that in FIG. 1. The first vehicle 12a, the third vehicle 12c, and the seventh vehicle 12g travel from left to right; the second vehicle 12b, the fourth vehicle 12d, and the fifth vehicle 12e travel from right to left; and the sixth vehicle 12f travels from top to bottom. The first vehicle 12a and the second vehicle 12b are waiting to turn right. For the first vehicle 12a, the fourth vehicle 12d and the fifth vehicle 12e are in its blind spot, and for the second vehicle 12b, the third vehicle 12c and the seventh vehicle 12g are in its blind spot.
Among these vehicles 12, the first vehicle 12a, the second vehicle 12b, and the sixth vehicle 12f are equipped with terminal devices 14. For convenience of explanation, the terminal devices 14 mounted on the first vehicle 12a, the second vehicle 12b, and the sixth vehicle 12f are referred to as the first terminal device 14a, the second terminal device 14b, and the sixth terminal device 14f, respectively. The first vehicle 12a is also equipped with a rear sensor and a front sensor, which form a first rear beam 90a and a first front beam 92a, respectively. The second vehicle 12b and the sixth vehicle 12f are configured in the same way as the first vehicle 12a, forming a second rear beam 90b, a second front beam 92b, a sixth rear beam 90f, and a sixth front beam 92f. The remaining vehicles 12 are not equipped with terminal devices 14.
The first rear beam 90a detects the third vehicle 12c and the seventh vehicle 12g, and detection information indicating this is acquired by the first terminal device 14a. The first terminal device 14a also acquires the position information of the first vehicle 12a and broadcasts a packet signal containing the position information and the detection information. The second terminal device 14b receives the packet signal from the first terminal device 14a and extracts the position information and the detection information from it. The second terminal device 14b recognizes the presence of the first vehicle 12a from the position information, and the presence of the third vehicle 12c and the seventh vehicle 12g from the detection information. As a result, the presence of the third vehicle 12c and the seventh vehicle 12g is made known even though they carry no terminal device 14. The same applies to packet signals from the second terminal device 14b to the first terminal device 14a.
The sixth front beam 92f detects the presence of a first pedestrian 16a, a second pedestrian 16b, and a third pedestrian 16c walking on the crosswalk, and detection information indicating this is acquired by the sixth terminal device 14f. The sixth terminal device 14f likewise broadcasts a packet signal containing position information and detection information, thereby making the presence of the pedestrians 16 known.
FIG. 5 shows the configuration of the terminal device 14 mounted on a vehicle 12; as described above, a pedestrian may also be included among the vehicles 12. The terminal device 14 includes an antenna 50, an RF unit 52, a modem unit 54, a processing unit 56, and a control unit 58. The processing unit 56 includes a timing specifying unit 60, a transfer determination unit 62, a first acquisition unit 64, a generation unit 66, a notification unit 70, a second acquisition unit 80, and a third acquisition unit 82. The timing specifying unit 60 includes an extraction unit 72 and a carrier sense unit 74. A rear sensor 84 is connected to the second acquisition unit 80, and a front sensor 86 is connected to the third acquisition unit 82. The antenna 50, the RF unit 52, and the modem unit 54 perform the same processing as the antenna 20, the RF unit 22, and the modem unit 24 in FIG. 2, so the description here focuses on the differences.
In reception processing, the modem unit 54 and the processing unit 56 receive packet signals from other terminal devices 14 or from the base station apparatus 10 (not shown). As described above, the modem unit 54 and the processing unit 56 receive packet signals from the base station apparatus 10 during the road-to-vehicle transmission period and from other terminal devices 14 during the vehicle-to-vehicle transmission period.
When the demodulation result from the modem unit 54 is a packet signal from the base station apparatus 10 (not shown), the extraction unit 72 specifies the timing of the subframe in which the road-to-vehicle transmission period is arranged, and estimates that the terminal device is located within the area 212 of FIG. 1. The extraction unit 72 generates a frame based on the subframe timing and the content of the message header of the packet signal; as a result, the generated frame is synchronized with the frame formed in the base station apparatus 10. When the source of the packet signal is another terminal device 14, the extraction unit 72 omits the synchronized frame generation. While within the area 212, the extraction unit 72 identifies the road-to-vehicle transmission periods in use and then identifies the remaining vehicle-to-vehicle transmission periods. The extraction unit 72 outputs the frame and subframe timing and information on the vehicle-to-vehicle transmission periods to the carrier sense unit 74.
On the other hand, when the extraction unit 72 has not received a packet signal from the base station apparatus 10, that is, when it has not generated a frame synchronized with the base station apparatus 10, it estimates that the terminal device is in the out-of-area region 214 of FIG. 1. In that case, the extraction unit 72 selects a timing unrelated to the frame structure and instructs the carrier sense unit 74 to perform carrier sense without regard to the frame structure.
The carrier sense unit 74 receives the frame and subframe timing and the information on the vehicle-to-vehicle transmission periods from the extraction unit 72, and determines the transmission timing by starting CSMA/CA within a vehicle-to-vehicle transmission period. This corresponds to setting a NAV (Network Allocation Vector) for the road-to-vehicle transmission period and performing carrier sense outside the period for which the NAV is set. When instructed by the extraction unit 72 to perform carrier sense unrelated to the frame structure, the carrier sense unit 74 instead determines the transmission timing by executing CSMA/CA without considering the frame structure. The carrier sense unit 74 notifies the modem unit 54 and the RF unit 52 of the determined transmission timing and causes them to broadcast the packet signal generated by the generation unit 66 described later.
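The NAV-like protection of the road-to-vehicle period can be pictured with the toy sketch below. It only chooses a transmission offset within one subframe; the road-to-vehicle period length, the random backoff standing in for CSMA/CA, and all names are assumptions, and the real procedure is considerably more involved.

```python
import random

SUBFRAME_MS = 12.5
ROADSIDE_PERIOD_MS = 3.0  # assumed length of the road-to-vehicle period


def pick_tx_offset(in_area: bool) -> float:
    """Return a transmission offset (msec) within the current subframe."""
    if in_area:
        # Defer past the protected road-to-vehicle period, then back off within the rest.
        return ROADSIDE_PERIOD_MS + random.uniform(0.0, SUBFRAME_MS - ROADSIDE_PERIOD_MS)
    # Out of area: the frame layout is ignored entirely.
    return random.uniform(0.0, SUBFRAME_MS)


print(round(pick_tx_offset(True), 2), round(pick_tx_offset(False), 2))
```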
The transfer determination unit 62 controls the transfer of control information. It extracts, from the control information, the information to be transferred and generates the information to be transferred based on the extracted information; the details of this processing are omitted here. The transfer determination unit 62 outputs the information to be transferred, that is, part of the control information, to the generation unit 66.
The first acquisition unit 64 includes a GPS receiver, a gyroscope, a vehicle speed sensor, and the like (not shown), and from the data they supply acquires the current position, traveling direction, traveling speed, and the like (hereinafter collectively referred to as "position information") of the vehicle 12 (not shown) on which the terminal device 14 is mounted. The current position is expressed by latitude and longitude; in other words, the position of the vehicle 12 is expressed as an absolute value. Since known techniques may be used for these acquisitions, their description is omitted here. The GPS receiver, gyroscope, vehicle speed sensor, and the like may also be external to the terminal device 14. The first acquisition unit 64 outputs the position information to the generation unit 66.
The rear sensor 84 is installed at the rear of the vehicle 12 and detects the presence of objects existing behind the vehicle 12. The rear sensor 84 is, for example, a millimeter-wave radar, and its detection range is shown as the first rear beam 90a in FIG. 4. The first rear beam 90a is drawn for clarity of explanation, and objects outside the first rear beam 90a may also be detected. The rear sensor 84 detects the position of an object by a known technique. The rear sensor 84 may instead be an image-capturing device that detects objects from captured images; in that case, the type of the object may also be detected by image processing, classified, for example, as pedestrian, two-wheeled vehicle, small or medium-sized vehicle, or large vehicle. The rear sensor 84 outputs information on the detected objects (hereinafter referred to as "detection information") to the second acquisition unit 80. When a plurality of objects are detected, all of them may be included in the detection information, or at least one of them, for example only one, may be included.
The second acquisition unit 80 acquires, from the rear sensor 84, the detection information of objects existing behind the vehicle 12. The processing of the second acquisition unit 80 is described here. FIGS. 6A and 6B outline the processing in the second acquisition unit 80. FIG. 6A shows the area formed behind the vehicle 12, that is, the area in which objects can be detected by the rear beam 90. As illustrated, the area is divided into a first area 120, a second area 122, a third area 124, and a fourth area 126. The second acquisition unit 80 specifies which of the first area 120, the second area 122, the third area 124, and the fourth area 126 contains the position of the object. FIG. 6B shows the data structure of a table stored in the second acquisition unit 80, which contains a content column 130 and a code column 132. The second acquisition unit 80 derives a code based on the specified area.
FIGS. 7A and 7B outline another process in the second acquisition unit 80. FIG. 7A shows the intervals formed behind the vehicle 12; these are obtained by discretely dividing the distance from the vehicle 12. As illustrated, the intervals are divided, in the direction away from the vehicle 12, into a first interval 140, a second interval 142, a third interval 144, a fourth interval 146, and a fifth interval 148. The second acquisition unit 80 specifies which of the first interval 140, the second interval 142, the third interval 144, the fourth interval 146, and the fifth interval 148 contains the position of the object. FIG. 7B shows the data structure of another table stored in the second acquisition unit 80, which contains a content column 150 and a code column 152. The second acquisition unit 80 derives a code based on the specified interval. Returning to FIG. 5, the second acquisition unit 80 includes the combination of the two derived codes in the detection information instead of the position of the object. That is, in the detection information, the position of the object is expressed as a value relative to the vehicle 12, and that position is expressed as code information.
FIG. 8 shows the data structure of yet another table stored in the second acquisition unit 80, which contains a content column 160 and a code column 162. The second acquisition unit 80 derives a code from the type of the object; here, an image-capturing device is used as the rear sensor 84. Returning to FIG. 5, the second acquisition unit 80 includes the derived code in the detection information instead of the type of the object. That is, in the detection information, the type of the object is expressed as code information.
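Taken together, FIGS. 6 to 8 amount to replacing a raw position and type with three small codes. The concrete table contents are not given in the text, so the dictionaries below are invented placeholders; only the overall shape of the transformation is meant to match the description.

```python
AREA_CODES = {"area1": 0b00, "area2": 0b01, "area3": 0b10, "area4": 0b11}
DISTANCE_CODES = {"interval1": 0, "interval2": 1, "interval3": 2,
                  "interval4": 3, "interval5": 4}
TYPE_CODES = {"pedestrian": 0, "two_wheeler": 1, "small_medium_car": 2, "large_car": 3}


def encode_detection(area: str, interval: str, obj_type: str) -> dict:
    """Replace raw relative position and type with compact code information."""
    return {
        "area_code": AREA_CODES[area],
        "distance_code": DISTANCE_CODES[interval],
        "type_code": TYPE_CODES[obj_type],
    }


print(encode_detection("area3", "interval2", "pedestrian"))
# {'area_code': 2, 'distance_code': 1, 'type_code': 0}
```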
The front sensor 86 is installed at the front of the vehicle 12 and detects the presence of objects existing in front of the vehicle 12. The third acquisition unit 82 acquires the detection information of objects existing in front of the vehicle 12. Since the third acquisition unit 82 and the front sensor 86 perform the same processing as the second acquisition unit 80 and the rear sensor 84, their description is omitted here. To distinguish detection information about rear objects from detection information about front objects, the codes are defined so as to differ from each other.
The generation unit 66 receives the position information from the first acquisition unit 64, the detection information from the second acquisition unit 80, and the detection information from the third acquisition unit 82, and also receives part of the control information from the transfer determination unit 62. The generation unit 66 generates a packet signal by storing the received part of the control information in the control information field and storing the position information and the detection information in the payload. As described above, the processing unit 56, the modem unit 54, and the RF unit 52 broadcast the packet signal generated by the generation unit 66.
The extraction unit 72 extracts the position information and detection information contained in a packet signal from another terminal device 14 and outputs them to the notification unit 70. The notification unit 70 acquires the position information and detection information from the extraction unit 72, detects the approach of the other vehicle 12 based on the position information of the other terminal device 14 acquired from the extraction unit 72 and the position information input from the first acquisition unit 64, and notifies the driver of the approach of the other vehicle 12 via a monitor or a speaker. The notification unit 70 also specifies the position and type of the object based on the various codes contained in the detection information; here, the content is recovered from the codes by performing the reverse of the processing of the second acquisition unit 80, based on FIGS. 6B, 7B, and 8. Since the position of the object is expressed relative to the other vehicle 12, the notification unit 70 specifies the position of the object by combining that relative position with the position of the other vehicle 12 derived as described above. Because the transmitting side has coded the information, the object position is not exact. The notification unit 70 further notifies the driver of the specified position and type of the object. The control unit 58 controls the operation of the terminal device 14.
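On the receiving side, the coded relative position is combined with the sending vehicle's absolute position to place the object approximately. The sketch below works in a flat local x/y frame (metres) rather than latitude/longitude, and the code-to-distance table is made up; both are simplifying assumptions for illustration.

```python
# Representative distances (metres) assumed for each distance code.
DISTANCE_M = {0: 5.0, 1: 15.0, 2: 30.0, 3: 50.0, 4: 75.0}


def approximate_object_position(sender_xy: tuple, heading_unit: tuple,
                                distance_code: int, behind: bool = True) -> tuple:
    """Place the object along the sender's axis; coded input makes the result approximate."""
    d = DISTANCE_M[distance_code]
    sign = -1.0 if behind else 1.0  # the rear sensor looks opposite to the heading
    return (sender_xy[0] + sign * d * heading_unit[0],
            sender_xy[1] + sign * d * heading_unit[1])


# Sender at (100, 200) heading east; object reported behind it with distance code 2.
print(approximate_object_position((100.0, 200.0), (1.0, 0.0), 2))  # (70.0, 200.0)
```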
The operation of the communication system 100 configured as above will now be described. FIG. 9 is a flowchart showing the reception procedure in the terminal device 14. The extraction unit 72 extracts the position information from the packet signal (S10), and the notification unit 70 specifies the position of the other vehicle 12 (S12). When the packet signal contains detection information (Y in S14), the extraction unit 72 extracts the detection information from the packet signal (S16), and the notification unit 70 specifies the position of the object (S18). When the packet signal contains no detection information (N in S14), the processing ends.
According to the embodiment of the present invention, the detection information of rearward objects is broadcast in addition to the position information, so the presence of a rearward object can be made known even if that object is not equipped with a terminal device. Because the presence of rearward objects is made known, the occurrence of vehicle accidents can be suppressed even when the penetration rate of terminal devices is low. Since the position of the vehicle is expressed as an absolute value while the position of the object is expressed as a value relative to the vehicle, the amount of information can be reduced. Since the position of the object is expressed as code information, the amount of information can be reduced, and since the type of the object is expressed as code information, the amount of information can likewise be reduced. Furthermore, since the detection information of forward objects is also broadcast, the occurrence of vehicle accidents can be further suppressed.
FIG. 10 outlines another process in the second acquisition unit 80 and the third acquisition unit 82. FIG. 10 shows, with the vehicle 12 at the center, absolute area positions referenced to the vehicle position; these are area positions obtained by discretely dividing the direction (azimuth) and distance from the vehicle 12. As illustrated, the area positions are divided and defined as codes according to the absolute direction and distance from the vehicle 12, such as north-side areas N1 to N15, east-side areas E1 to E9, south-side areas S1 to S15, and west-side areas W1 to W9. The heading of the vehicle 12 is determined from an electronic compass (not shown) mounted on the vehicle 12, from a GPS-based travel track, or the like.
The vehicle 12 detects objects within a detection range 90F using the front sensor 86 and objects within a detection range 90R using the rear sensor 84. The second acquisition unit 80 specifies which of the area positions (north-side areas N1 to N15, east-side areas E1 to E9, south-side areas S1 to S15, or west-side areas W1 to W9) contains an object detected by the rear sensor 84. The third acquisition unit 82 specifies which of those same area positions contains an object detected by the front sensor 86.
The second acquisition unit 80 and the third acquisition unit 82 include the derived code information in their respective detection information. That is, in this detection information, the detected position of the object is expressed as direction and distance information from the vehicle 12, and the position of the object is expressed as code information. Since the code information indicates an absolute area position referenced to the vehicle position, detection information for rear objects and detection information for front objects can be used in common without being distinguished.
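The sketch below shows one way such an absolute, compass-referenced label might be derived from the bearing and distance of a detected object. The exact cell layout of FIG. 10 (how many cells per direction and the band widths) is not specified in the text, so the band edges and the numbering scheme here are assumptions.

```python
BANDS_M = [10.0, 25.0, 50.0, 100.0]  # assumed distance band edges (metres)


def absolute_area_code(bearing_deg: float, distance_m: float) -> str:
    """bearing_deg is the absolute compass bearing from the vehicle to the object."""
    sectors = ["N", "E", "S", "W"]
    sector = sectors[int(((bearing_deg + 45.0) % 360.0) // 90.0)]
    band = next((i for i, edge in enumerate(BANDS_M) if distance_m <= edge), len(BANDS_M))
    return f"{sector}{band + 1}"


print(absolute_area_code(10.0, 8.0))    # N1
print(absolute_area_code(265.0, 60.0))  # W4
```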
The generation unit 66 receives the position information from the first acquisition unit 64, the detection information from the second acquisition unit 80, and the detection information from the third acquisition unit 82. The subsequent processing is the same as in the preceding embodiment.
According to this modification, the positions of objects detected around another vehicle are expressed as absolute area position information (code information) referenced to the position information of that vehicle, so the amount of information can be reduced and the mapping of the objects onto a map is made easier.
The present invention has been described above based on an embodiment. This embodiment is illustrative, and those skilled in the art will understand that various modifications are possible in combinations of its constituent elements and processing steps, and that such modifications also fall within the scope of the present invention.
According to the embodiment of the present invention, the detection information of rearward objects is acquired by the second acquisition unit 80 and the detection information of forward objects is acquired by the third acquisition unit 82. The invention is not limited to this; for example, the detection information of forward objects need not be acquired by the third acquisition unit 82, in which case the front sensor 86 and the third acquisition unit 82 may be omitted. According to this modification, the configuration can be simplified.
According to the embodiment of the present invention, the detection information contains the position and type of the object. The invention is not limited to this; for example, the type of the object need not be included, and only the position of the object may be included. According to this modification, the amount of information can be reduced.
According to the embodiment of the present invention, the detection information is coded. The invention is not limited to this; for example, the coding may be omitted. According to this modification, the position of the object is notified as it is, so an accurate position can be reported.
An outline of one aspect of the present invention is as follows. A terminal device according to an aspect of the present invention is a terminal device mountable on a vehicle, including: a first acquisition unit that acquires position information of the vehicle; a second acquisition unit that acquires detection information of an object existing behind the vehicle; a generation unit that generates a packet signal containing the position information acquired by the first acquisition unit and the detection information acquired by the second acquisition unit; and a broadcasting unit that broadcasts the packet signal generated by the generation unit.
According to this aspect, the detection information of a rearward object is broadcast in addition to the position information, so even if the rearward object is not equipped with a terminal device, notifying its presence can suppress the occurrence of vehicle accidents.
In the position information acquired by the first acquisition unit, the position of the vehicle may be expressed as an absolute value, and in the detection information acquired by the second acquisition unit, the position of the object may be expressed as a value relative to the vehicle. In this case, the amount of information can be reduced.
In the detection information acquired by the second acquisition unit, the position of the object may be expressed as code information. In this case, the amount of information can be reduced.
In the detection information acquired by the second acquisition unit, the type of the object may be expressed as code information. In this case, the amount of information can be reduced.
The terminal device may further include a third acquisition unit that acquires detection information of an object existing in front of the vehicle, and the generation unit may also include the detection information acquired by the third acquisition unit in the packet signal. In this case, the occurrence of vehicle accidents can be further suppressed.
According to the present invention, the occurrence of vehicle accidents can be suppressed even when the penetration rate of the device is low.
10 base station apparatus
12 vehicle
14 terminal device
16 pedestrian
20 antenna
22 RF unit
24 modem unit
26 processing unit
28 control unit
30 network communication unit
32 frame defining unit
34 selection unit
36 generation unit
50 antenna
52 RF unit
54 modem unit
56 processing unit
58 control unit
60 timing specifying unit
62 transfer determination unit
64 first acquisition unit
66 generation unit
70 notification unit
72 extraction unit
74 carrier sense unit
80 second acquisition unit
82 third acquisition unit
84 rear sensor
86 front sensor
100 communication system

Claims (7)

1. A terminal device mountable on a vehicle, comprising:
a first acquisition unit that acquires position information of the vehicle;
a second acquisition unit that acquires detection information of an object existing behind the vehicle;
a generation unit that generates a packet signal containing the position information acquired by the first acquisition unit and the detection information acquired by the second acquisition unit; and
a broadcasting unit that broadcasts the packet signal generated by the generation unit.
2. The terminal device according to claim 1, wherein, in the position information acquired by the first acquisition unit, the position of the vehicle is expressed as an absolute value, and, in the detection information acquired by the second acquisition unit, the position of the object is expressed as a value relative to the vehicle.
3. The terminal device according to claim 1 or 2, wherein, in the detection information acquired by the second acquisition unit, the position of the object is expressed as code information.
4. The terminal device according to claim 1, wherein, in the detection information acquired by the second acquisition unit, the type of the object is expressed as code information.
5. The terminal device according to any one of claims 1 to 4, further comprising a third acquisition unit that acquires detection information of an object existing in front of the vehicle, wherein the generation unit also includes the detection information acquired by the third acquisition unit in the packet signal.
6. A control method for controlling a terminal device mountable on a vehicle, comprising:
a first acquisition step of acquiring position information of the vehicle;
a second acquisition step of acquiring detection information of an object existing behind the vehicle;
a generation step of generating a packet signal containing the position information acquired in the first acquisition step and the detection information acquired in the second acquisition step; and
a broadcasting step of broadcasting the packet signal generated in the generation step.
7. A program for controlling a terminal device mountable on a vehicle, the program causing the terminal device to execute:
a first acquisition step of acquiring position information of the vehicle;
a second acquisition step of acquiring detection information of an object existing behind the vehicle;
a generation step of generating a packet signal containing the position information acquired in the first acquisition step and the detection information acquired in the second acquisition step; and
a broadcasting step of broadcasting the packet signal generated in the generation step.
PCT/JP2014/003763 2013-08-27 2014-07-16 Terminal device, control method, and program WO2015029315A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2013-175843 2013-08-27
JP2013175843A JP2016189038A (en) 2013-08-27 2013-08-27 Terminal apparatus

Publications (1)

Publication Number Publication Date
WO2015029315A1 true WO2015029315A1 (en) 2015-03-05

Family

ID=52585915

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/003763 WO2015029315A1 (en) 2013-08-27 2014-07-16 Terminal device, control method, and program

Country Status (2)

Country Link
JP (1) JP2016189038A (en)
WO (1) WO2015029315A1 (en)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010118731A (en) * 2008-11-11 2010-05-27 Advanced Telecommunication Research Institute International Wireless device and communication control method
JP2013092932A (en) * 2011-10-26 2013-05-16 Denso Corp Inter-vehicle communication system and inter-vehicle communication device

Also Published As

Publication number Publication date
JP2016189038A (en) 2016-11-04


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14839350

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14839350

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP