WO2018155127A1 - Display device, display method, control device, and vehicle - Google Patents

Display device, display method, control device, and vehicle

Info

Publication number
WO2018155127A1
Authority
WO
WIPO (PCT)
Prior art keywords
display device
movement amount
display
vibration
unit
Prior art date
Application number
PCT/JP2018/003500
Other languages
English (en)
Japanese (ja)
Inventor
堀井 省次
Original Assignee
京セラ株式会社
Priority date
Filing date
Publication date
Application filed by 京セラ株式会社
Publication of WO2018155127A1

Links

Images

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R11/00Arrangements for holding or mounting articles, not otherwise provided for
    • B60R11/02Arrangements for holding or mounting articles, not otherwise provided for for radio sets, television sets, telephones, or the like; Arrangement of controls thereof
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G5/38Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory with means for controlling the display position
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers

Definitions

  • The present invention relates to a display device.
  • There is a known technique for controlling an image so that the image looks stationary in the air when the main body shakes.
  • One embodiment is a display device that includes a display unit, a detection unit, a movement amount calculation unit, a storage unit, a movement amount prediction unit, and a display control unit.
  • The detection unit detects the vibration of the display device, and the movement amount calculation unit calculates the movement amount by which the display device has moved due to the vibration of the display device.
  • The storage unit stores, as a basic pattern, a vibration pattern defined by the time change of the movement amount of the display device when the vibration pattern satisfies a predetermined condition.
  • The movement amount prediction unit uses the basic pattern to predict the movement amount of the display device when a new vibration is applied to the display device.
  • The display control unit moves the display position of the image displayed on the display unit based on the movement amount predicted by the movement amount prediction unit.
  • <Appearance of display device> FIGS. 1 and 2 are a perspective view and a rear view, respectively, showing an example of the appearance of the display device 1.
  • the display device 1 includes a plate-like device case 10 that is substantially rectangular in plan view.
  • the device case 10 constitutes the exterior of the display device 1.
  • a display area 11 on which various information such as characters, symbols, and figures are displayed is located on the front surface 1a of the device case 10.
  • a touch panel 140 (FIG. 3) to be described later is located on the back side of the display area 11.
  • the user can input various information to the display device 1 by operating the display area 11 on the front surface of the display device 1 with a finger or the like.
  • the user can also input various types of information to the display device 1 by operating the display area 11 with a touch panel pen such as a stylus pen other than a finger.
  • the receiver hole 12 is located at the upper end of the front surface 1a of the device case 10.
  • a speaker hole 13 is located at the lower end of the front surface 1a.
  • A microphone hole 14 is located on the lower side surface 1c of the device case 10.
  • A lens 191 included in a first camera 190 (FIG. 3) described later is visible from the front surface 1a of the device case 10.
  • A lens 201 included in a second camera 200 (FIG. 3) to be described later is visible from the upper end of the back surface 1b of the device case 10.
  • the display device 1 includes an operation button group 18 having operation buttons 15, 16 and 17.
  • Each of the operation buttons 15 to 17 is a hardware button.
  • each of the operation buttons 15 to 17 is a push button.
  • at least one operation button included in the operation button group 18 may be a software button displayed in the display area 11.
  • the operation button group 18 may include a power button and a volume button.
  • FIG. 3 is a block diagram mainly showing an example of the electrical configuration of the display device 1.
  • the display device 1 includes a control unit 100, a wireless communication unit 110, a display unit 120, a touch panel 140, and an operation button group 18.
  • the display device 1 further includes a receiver 160, a speaker 170, a microphone 180, a first camera 190, and a second camera 200.
  • the display device 1 further includes an acceleration sensor 150 (detection unit) and a battery 210. These components included in the display device 1 are housed in a device case 10.
  • the control unit 100 can comprehensively manage the operation of the display device 1 by controlling other components of the display device 1.
  • the control unit 100 can also be said to be a control device.
  • the controller 100 includes at least one processor to provide control and processing capabilities to perform various functions, as described in further detail below.
  • The at least one processor may be implemented as a single integrated circuit (IC) or as a plurality of communicatively connected integrated circuits (ICs) and/or discrete circuits.
  • the at least one processor can be implemented according to various known techniques.
  • the processor includes one or more circuits or units configured to perform one or more data computation procedures or processes, for example, by executing instructions stored in associated memory.
  • The processor may be firmware (e.g., a discrete logic component) configured to perform one or more data computation procedures or processes.
  • the processor may include one or more processors, controllers, microprocessors, microcontrollers, application specific integrated circuits (ASICs), digital signal processors, programmable logic devices, field programmable gate arrays, or the like.
  • the control unit 100 includes a CPU (Central Processing Unit) 101, a DSP (Digital Signal Processor) 102, and a storage unit 103.
  • the storage unit 103 includes a non-transitory recording medium that can be read by the CPU 101 and the DSP 102, such as a ROM (Read Only Memory) and a RAM (Random Access Memory).
  • the ROM included in the storage unit 103 is, for example, a flash ROM (flash memory) that is a nonvolatile memory.
  • the storage unit 103 stores a plurality of control programs 103a for controlling the display device 1.
  • Various functions of the control unit 100 are realized by the CPU 101 and the DSP 102 executing the various control programs 103a in the storage unit 103.
  • control unit 100 may include a plurality of CPUs 101.
  • the control unit 100 may include a main CPU that performs relatively complicated processing and a sub CPU that performs relatively simple processing.
  • control unit 100 may not include the DSP 102 or may include a plurality of DSPs 102.
  • all the functions of the control unit 100 or a part of the functions of the control unit 100 may be realized by a hardware circuit that does not require software to realize the function.
  • the storage unit 103 may include a computer-readable non-transitory recording medium other than the ROM and RAM.
  • the storage unit 103 may include, for example, a small hard disk drive and an SSD (Solid State Drive).
  • the plurality of control programs 103a in the storage unit 103 include various applications (application programs).
  • the storage unit 103 stores, for example, a call application for making a voice call and a video call, a browser for displaying a website, and a mail application for creating, browsing, and transmitting / receiving an e-mail.
  • The storage unit 103 also stores a camera application for photographing a subject using the first camera 190 and the second camera 200, a recorded image display application for displaying still images and moving images recorded in the storage unit 103, and a music reproduction control application for performing reproduction control of music data stored in the storage unit 103.
  • At least one application in the storage unit 103 may be stored in advance in the storage unit 103.
  • the at least one application in the storage unit 103 may be one that the display device 1 has downloaded from another device and stored in the storage unit 103.
  • the wireless communication unit 110 has an antenna 111.
  • the wireless communication unit 110 can use the antenna 111 to perform wireless communication using a plurality of types of communication methods, for example. Wireless communication of the wireless communication unit 110 is controlled by the control unit 100. It can be said that the wireless communication unit 110 is a communication circuit.
  • the wireless communication unit 110 can wirelessly communicate with a base station of a mobile phone system.
  • the wireless communication unit 110 can communicate with a mobile phone and a web server other than the display device 1 through the base station and a network such as the Internet.
  • the display device 1 can perform data communication, voice call, video call, and the like with other mobile phones and the like.
  • the wireless communication unit 110 performs various processing such as amplification processing on the signal received by the antenna 111 and outputs the processed received signal to the control unit 100.
  • the control unit 100 performs various processes on the input received signal and acquires information included in the received signal.
  • the control unit 100 outputs a transmission signal including information to the wireless communication unit 110.
  • the wireless communication unit 110 performs various processing such as amplification processing on the input transmission signal, and wirelessly transmits the processed transmission signal from the antenna 111.
  • the display unit 120 includes a display area 11 located in front of the display device 1 and a display panel 130.
  • the display unit 120 can display various information in the display area 11.
  • the display panel 130 is, for example, a liquid crystal display panel or an organic EL panel.
  • the display panel 130 can display various types of information such as characters, symbols, and figures by being controlled by the control unit 100.
  • the display panel 130 faces the display area 11 in the device case 10. Information displayed on the display panel 130 is displayed in the display area 11.
  • the touch panel 140 can detect an operation with an operator such as a finger on the display area 11.
  • the touch panel 140 is, for example, a projected capacitive touch panel.
  • the touch panel 140 is located on the back side of the display area 11, for example.
  • The control unit 100 can specify the content of the operation performed on the display area 11 based on the electrical signal (output signal) from the touch panel 140, and can then perform processing according to the specified operation content.
  • the microphone 180 can convert a sound input from the outside of the display device 1 into an electrical sound signal and output it to the control unit 100. Sound from the outside of the display device 1 is taken into the display device 1 from the microphone hole 14 and input to the microphone 180.
  • the speaker 170 is, for example, a dynamic speaker.
  • the speaker 170 can convert an electrical sound signal from the control unit 100 into a sound and output the sound. Sound output from the speaker 170 is output from the speaker hole 13 to the outside. The user can hear the sound output from the speaker hole 13 even at a location away from the display device 1.
  • the receiver 160 can output a received sound.
  • the receiver 160 is a dynamic speaker, for example.
  • the receiver 160 can convert an electrical sound signal from the control unit 100 into a sound and output the sound.
  • the sound output from the receiver 160 is output from the receiver hole 12 to the outside.
  • the volume of the sound output from the receiver hole 12 is smaller than the volume of the sound output from the speaker hole 13.
  • the user can hear the sound output from the receiver hole 12 by bringing his ear close to the receiver hole 12.
  • a vibration element such as a piezoelectric vibration element that vibrates the front portion of the device case 10 may be provided. In this case, the sound is transmitted to the user by the vibration of the front portion.
  • the first camera 190 includes a lens 191 and an image sensor.
  • the second camera 200 includes a lens 201 and an image sensor. Each of the first camera 190 and the second camera 200 can photograph a subject based on control by the control unit 100, generate a still image or a moving image indicating the photographed subject, and output the still image or moving image to the control unit 100.
  • the lens 191 of the first camera 190 is visible from the front surface 1a of the device case 10. Accordingly, the first camera 190 can photograph a subject existing on the front side (display area 11 side) of the display device 1.
  • the first camera 190 is called an in camera.
  • the lens 201 of the second camera 200 is visible from the back surface 1b of the device case 10. Therefore, the second camera 200 can photograph a subject existing on the back side of the display device 1.
  • the second camera 200 is called an out camera.
  • the acceleration sensor 150 can detect the acceleration of the display device 1.
  • the acceleration sensor 150 is, for example, a three-axis acceleration sensor.
  • the acceleration sensor 150 can detect the acceleration of the display device 1 in the X-axis direction, the Y-axis direction, and the Z-axis direction.
  • the X-axis direction, the Y-axis direction, and the Z-axis direction of the acceleration sensor 150 are set in the longitudinal direction, the lateral direction, and the thickness direction of the display device 1, respectively.
  • The battery 210 can supply power to the display device 1.
  • the battery 210 is, for example, a rechargeable battery.
  • the power output from the battery 210 is supplied to various components such as the control unit 100 and the wireless communication unit 110 included in the display device 1.
  • the display device 1 may include a sensor other than the acceleration sensor 150.
  • the display device 1 may include at least one of a geomagnetic sensor, a gyro sensor, an atmospheric pressure sensor, a temperature sensor, a proximity sensor, and an illuminance sensor.
  • The technology of the display device 1 will be described with reference to FIGS. 1 to 3 and FIGS. 4 to 6.
  • FIGS. 4 and 5 are flowcharts for explaining an example of the operation of the display device 1 for improving visibility, and FIG. 6 is a diagram schematically showing the operation.
  • The control unit 100 judges whether or not a correction mode is ON; in the correction mode, the image displayed on the display unit 120 is moved in response to the movement of the display device 1 when the display device 1 moves in spatial position due to shaking or the like. Note that the correction mode is set to ON or OFF when the user performs an operation to set the correction mode on the menu screen of the display device 1, for example.
  • the control unit 100 confirms whether or not the correction mode of the display device 1 is ON in Step S401. If the correction mode is ON (Yes), the control unit 100 proceeds to Step S402. If the correction mode is not ON (No), the operation in step S401 is repeated.
  • In step S402, the control unit 100 calculates the movement amount of the display device 1, that is, the movement amount of the image, based on the output of the acceleration sensor 150.
  • The image movement amount calculation operation in step S402 will be described with reference to FIG. 5.
  • the control unit 100 turns on the acceleration sensor 150 when the correction mode is turned on (step S501).
  • the control unit 100 calculates an image movement amount that is a movement amount of the image displayed in the display area 11 of the display unit 120 based on the output of the acceleration sensor 150 (step S502).
  • the control unit 100 moves the image displayed in the display area 11 based on the calculated image movement amount (step S403).
  • The direction in which the image is moved is a direction in which the image appears to be fixed on the display screen. For example, when the display device 1 moves in the X-axis positive direction, the image is moved in the direction that cancels the movement, that is, in the X-axis negative direction.
  • After step S403 described above, the control unit 100 may display the image so that its display position is moved slightly back toward the position used when the correction mode is OFF (the standard display position). This reduces the possibility that the display position of the image becomes greatly separated from the standard display position.
  • In step S404, the control unit 100 determines whether or not the correction mode has been turned OFF. If the correction mode is OFF (Yes), the operation ends; if it is not OFF (No), the operations from step S402 onward are repeated. Note that after step S404 described above, the control unit 100 causes the display position of the image to return to the standard display position.
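Read as pseudocode, the loop of steps S401 to S404 described above can be sketched roughly as follows. This is only an illustrative sketch in Python; the device object and its methods (correction_mode_on, acceleration_sensor_on, image_movement_amount, shift_image) and the relaxation factor are hypothetical stand-ins, not an API taken from the patent.

```python
import time

def correction_loop(device, relax=0.05):
    # S401: wait until the correction mode is turned ON
    while not device.correction_mode_on():
        time.sleep(0.005)
    device.acceleration_sensor_on()
    offset = 0.0
    # S404: repeat S402-S403 until the correction mode is turned OFF
    while device.correction_mode_on():
        # S402: movement amount since the last iteration, from the sensor output (assumed helper)
        l = device.image_movement_amount()
        # S403: shift the image opposite to the device movement, drifting
        # slightly back toward the standard display position each iteration
        offset = (offset - l) * (1.0 - relax)
        device.shift_image(offset)
    # after the mode is turned OFF, return to the standard display position
    device.shift_image(0.0)
```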
  • An example of the operation of the display device 1 described with reference to FIGS. 4 and 5 will be described with reference to FIG. 6.
  • Directions are expressed using an XYZ orthogonal coordinate system, and the X-axis direction, the Y-axis direction, and the Z-axis direction correspond to the short-side direction, the long-side direction, and the thickness direction of the display device 1, respectively.
  • It is assumed that the display device 1 has moved by a movement amount L in the positive X-axis direction of the acceleration sensor 150 and that the screen of the display area 11 of the display device 1 faces the Z-axis direction of the acceleration sensor 150.
  • control unit 100 detects the acceleration of the moving display device 1 from the output value of the acceleration sensor 150, and calculates the movement amount L of the display device 1 based on the detected acceleration.
  • the movement amount L is obtained by the control unit 100 integrating the acceleration twice over time.
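As a rough illustration of this calculation, the sketch below integrates sampled acceleration twice over time to obtain a displacement along one axis. The 5 msec sampling interval matches the example given later in the text, while the trapezoidal integration scheme itself is an assumption made only for the example.

```python
def movement_amount(accel_samples, dt=0.005):
    """Estimate displacement [m] by integrating acceleration samples [m/s^2],
    taken every dt seconds, twice over time (illustrative sketch only)."""
    velocity = 0.0
    displacement = 0.0
    prev_a = accel_samples[0]
    prev_v = 0.0
    for a in accel_samples[1:]:
        velocity += 0.5 * (prev_a + a) * dt             # first integration: acceleration -> velocity
        displacement += 0.5 * (prev_v + velocity) * dt  # second integration: velocity -> displacement
        prev_a, prev_v = a, velocity
    return displacement
```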
  • The control unit 100 moves the image that was displayed in the display area 11 before the display device 1 moved, here the entire display screen 301 in the display area 11.
  • In FIG. 6, the display device 1 before movement is shown on the upper side, and the display device 1 after movement is shown on the lower side.
  • The display screen 301 is moved to the left (X-axis negative direction) in the display area 11 by the detected movement amount L.
  • To make the operation easy to understand, a heart-shaped image 3011 is indicated by a solid line on the display screen 301 before the display device 1 moves.
  • When the correction mode is ON and the display device 1 moves to the right (X-axis positive direction) in the drawing, the display screen 301 is moved to the left (X-axis negative direction) in the display area 11 by the detected movement amount L. As a result, the center line CL of the image 3011 does not move even after the display device 1 moves, the image 3011 appears to be stationary on the display screen 301, and the visibility of the display device 1 improves.
  • an area NR indicating a margin is generated at the right end of the display area 11.
  • In the area NR, an image different from the display screen 301 is displayed; for example, an image painted in a predetermined single color is displayed over the entire area NR.
  • When the correction mode is OFF and the display device 1 moves to the right (X-axis positive direction) by the movement amount L in the drawing, the display screen 301 also moves to the right by the movement amount L. As a result, the image 3011 also moves to the right by the movement amount L to the position of the image 3012 indicated by a broken line, and the visibility of the display device 1 decreases.
  • the user when the user is using the display device 1 while holding it in the vehicle, the user also shakes as the vehicle shakes. In this case, the user's head (eyes) and the hand holding the display device 1 do not move in synchronization, and if the correction mode is not ON, the image is shaken and visibility is lowered. However, if the correction mode is ON, the image appears to be fixed on the display screen 301 of the display device 1, and for example, reading of characters and the like is facilitated, and visibility is improved.
  • the image on the display screen of the display device 1 appears to be fixed, and thus visibility is improved.
  • The central portion CP of the display screen 301 before the movement, that is, the central portion of the display screen 301 in the X-axis direction and the Y-axis direction, may be taken as the origin, and the position of the image data may be changed so that the origin moves in the X-axis direction and the Y-axis direction.
  • the entire display screen 301 is moved. However, only a specific image on the display screen 301 may be moved.
  • the amount of movement of the display device 1 is calculated a plurality of times during one frame period defined by the frame rate.
  • An average value of a plurality of movement amounts can be taken as a movement amount in one frame period. For example, if the frame rate is 60 fps (frames per second), the length of the frame period is 1/60 second, which is about 17 msec.
  • If the acceleration sensor 150 measures the acceleration applied to the display device 1 every 5 msec, for example, three measurements are possible within one frame period.
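In other words, with a 60 fps frame rate and 5 msec sampling, about three movement-amount samples fall into each roughly 17 msec frame period, and their average becomes that frame's movement amount. The grouping below is a minimal sketch under those assumed figures.

```python
def per_frame_movement(samples, sample_interval=0.005, frame_rate=60):
    """Average the movement-amount samples that fall within each frame period
    (about three samples per 1/60 s frame at a 5 ms sampling interval)."""
    per_frame = max(1, int((1.0 / frame_rate) / sample_interval))
    frames = []
    for i in range(0, len(samples) - per_frame + 1, per_frame):
        window = samples[i:i + per_frame]
        frames.append(sum(window) / len(window))
    return frames
```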
  • the image displayed in the display area 11 at a predetermined frame rate may be a moving image or a still image.
  • In the above example, the movement amount of the display device 1 is calculated based on the output of the acceleration sensor.
  • Alternatively, the movement amount of the display device 1 may be calculated based on the movement amount of an image captured by a camera.
  • the control unit 100 identifies the reference image in the captured image captured by the first camera (out camera) 190.
  • As the reference image, for example, an image having the highest luminance in the captured image is detected and set as the reference image.
  • The method of setting the reference image is not limited to the highest-luminance image in the captured image; a person image, an image having a predetermined shape, or an image having a predetermined color in the captured image may also be set as the reference image.
  • an image having a predetermined image data value such as contrast may be used as the reference image.
  • The control unit 100 specifies a predetermined coordinate position of the reference image in the captured image as a first coordinate.
  • After the display device 1 moves, the control unit 100 identifies the reference image in the captured image captured by the out-camera 190 and identifies the predetermined coordinate position in that reference image as a second coordinate.
  • The control unit 100 obtains the movement amount of the reference image from the displacement from the first coordinate to the second coordinate in the captured image, and calculates the movement amount L of the display device 1 from it.
  • Although the example above calculates the movement amount L of the display device 1 using the captured image captured by the first camera (out camera) 190, a captured image captured by the second camera (in camera) 200 may be used to calculate the movement amount L of the display device 1.
  • the captured image is not limited to a moving image, and still images may be captured continuously to calculate the moving amount of the image.
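A minimal sketch of this camera-based alternative, assuming each captured frame is available as a grayscale 2-D array: the highest-luminance pixel is taken as the reference coordinate, and its displacement between two frames gives the movement in pixels. Converting pixels into the physical movement amount L would require the camera geometry, which is not specified here, so that step is left out.

```python
import numpy as np

def reference_coordinate(frame):
    """First/second coordinate: position of the highest-luminance pixel
    (one possible reference-image choice; a face, shape, or colour could also be used)."""
    return np.unravel_index(np.argmax(frame), frame.shape)  # (row, col)

def movement_in_pixels(frame_before, frame_after):
    """Displacement of the reference coordinate between two captured images."""
    r0, c0 = reference_coordinate(frame_before)
    r1, c1 = reference_coordinate(frame_after)
    return (int(r1) - int(r0), int(c1) - int(c0))
```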
  • When the vehicle rides over a step, the vehicle body vibrates; the frequency and damping rate of this vibration are determined by the vehicle weight, the buffer spring, and the damper.
  • the frequency and the damping rate are the same unless the characteristics of the vehicle weight, the buffer spring, and the damper are changed. Therefore, if the vibration pattern is stored, the movement amount of the display device 1 can be predicted based on the stored vibration pattern when the same vibration occurs next time.
  • When the correction mode is ON, the control unit 100 turns on the acceleration sensor 150, and the turned-on acceleration sensor 150 repeats the measurement of acceleration at a predetermined interval. Therefore, when the correction mode of the display device 1 in the vehicle is ON and the vehicle climbs over a step, the acceleration sensor 150 measures the acceleration applied to the display device 1 at the predetermined interval, for example every 5 msec, and outputs it to the control unit 100.
  • the control unit 100 detects the acceleration from the output value of the acceleration sensor 150 and calculates the amount of movement of the display device 1 by integrating the acceleration twice over time. By repeating this series of processing, it is possible to acquire a vibration pattern until the shaking of the vehicle body is settled.
  • FIG. 7 shows an example of the vibration pattern obtained in this way.
  • FIG. 7 is a waveform diagram illustrating a vibration pattern until the shaking of the vehicle body is settled.
  • the horizontal axis indicates time, and the vertical axis indicates the amount of movement of the display device 1.
  • The vibration repeats for several cycles from when it starts until it settles, gradually attenuating with time.
  • The vibration pattern in this example has a frequency of about 3 Hz, because it takes approximately 1 second for the vibration to settle and the vibration repeats for three cycles during that time.
  • In FIG. 7, two vibration patterns with different maximum peak values are shown. The maximum peak value (vibration magnitude) differs depending on the height of the step the vehicle rides over, but the vibration pattern itself is the same. Therefore, if vibration pattern data as shown in FIG. 7 is stored, the movement amount of the display device 1 can be predicted based on the stored vibration pattern the next time a vibration with the same vibration pattern occurs.
  • FIG. 8 is a diagram conceptually illustrating a method of predicting the movement amount of the display device 1.
  • FIG. 8 shows a vibration pattern similar to that in FIG. 7, but this is a stored vibration pattern, and is hereinafter referred to as a basic pattern.
  • the acceleration sensor 150 measures the acceleration applied to the display device 1 every 5 msec and outputs it to the control unit 100 when the vehicle climbs on the step again and the vehicle body starts to shake in the vertical direction. Then, the control unit 100 calculates an amount of movement of the display device 1 based on the acceleration and starts an operation of acquiring a new vibration pattern.
  • this new vibration pattern is referred to as a measuring pattern.
  • the control unit 100 compares the pattern under measurement and the basic pattern, and if the same movement amount as the basic pattern is obtained at the same timing as the basic pattern, it is determined that the vibration having the same vibration pattern as the basic pattern has occurred.
  • That is, when the control unit 100 calculates the same movement amount at the same timing in the pattern under measurement, it determines that a vibration having the same vibration pattern as the basic pattern has occurred. Note that "same" is not limited to exactly the same value, but includes values that differ within a range of several percent.
  • For example, if the frame period of the liquid crystal display device is 1/60 second, it is predicted from the basic pattern that, 1/60 second after the point P1 at the peak of the first waveform, the movement amount will be the movement amount L2 at the point P2. By using the movement amount L2 as the predicted movement amount and moving the display screen 301 after 1/60 second based on it, the time lag from the shaking of the display device 1 to the movement of the display screen 301 can be reduced.
  • Without prediction, the control unit 100 calculates the movement amount of the display device 1 by detecting the acceleration of the shake of the display device 1, and therefore a time lag occurs between the shake of the display device 1 and the movement of the display screen 301.
  • When the movement amount L2 after 1/60 second is predicted using the basic pattern as described above, the display screen 301 can be moved with little delay after the display device 1 shakes.
  • As a result, the discomfort that the user of the display device 1 feels due to the time lag can be reduced.
  • To detect the peak, the acceleration may be measured three times including the peak point. That is, if the acceleration is measured three times and the change over time of the movement amount at the three points is known, it can be seen whether the movement amount is changing toward the peak, changing away from the peak, or includes the peak. If the acceleration sensor 150 measures the acceleration applied to the display device 1 every 5 msec, three measurements can be performed within 1/60 second, and the peak point of the pattern under measurement can be specified.
  • In the above example, the basic pattern and the pattern under measurement are compared using the peak of the first waveform of the basic pattern.
  • However, the comparison target is not limited to this, and any vibration pattern with a short vibration period may be used.
  • For example, the peak of the second waveform or the peak of the third waveform of the basic pattern may be used, and the comparison target is not limited to a peak of the waveform.
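A rough sketch of this matching and prediction: three consecutive movement amounts are enough to tell whether a peak has been passed, "same" is interpreted as agreement within a few percent, and the predicted movement amount is read from the basic pattern one frame period (1/60 second) ahead of the matched point. The tolerance value and the indexing scheme are assumptions made only for illustration.

```python
def contains_peak(x0, x1, x2):
    """True if the middle of three consecutive movement amounts is a local extremum."""
    return (x1 >= x0 and x1 >= x2) or (x1 <= x0 and x1 <= x2)

def matches_basic_pattern(measured, basic, tolerance=0.05):
    """Compare the movement amounts measured so far with the start of the basic
    pattern; 'same' is taken as agreement within a few percent."""
    ref = basic[:len(measured)]
    return all(abs(m - b) <= tolerance * max(abs(b), 1e-9)
               for m, b in zip(measured, ref))

def predicted_movement(basic, matched_index, sample_interval=0.005, frame_period=1 / 60):
    """Movement amount expected one frame period (about three samples) after
    the matched point, read directly from the stored basic pattern."""
    ahead = int(round(frame_period / sample_interval))
    return basic[min(matched_index + ahead, len(basic) - 1)]
```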
  • FIG. 9 is a diagram illustrating an example of a part of the functional blocks formed when the CPU 101 and the DSP 102 execute the control program 103a in the storage unit 103.
  • the control unit 100 includes a movement amount calculation unit 300, an attenuation waveform determination unit 400, a basic pattern determination unit 500, a movement amount prediction unit 600, and a display control unit 700 as functional blocks.
  • At least one of the movement amount calculation unit 300, the attenuation waveform determination unit 400, the basic pattern determination unit 500, the movement amount prediction unit 600, and the display control unit 700 may be realized by a hardware circuit that does not require software to execute its function.
  • the movement amount calculation unit 300 detects the acceleration from the output value of the acceleration sensor 150 and calculates the movement amount of the display device 1 by integrating the acceleration twice over time.
  • The attenuation waveform determination unit 400 determines whether or not an attenuation waveform, that is, a waveform whose amplitude attenuates over time, is obtained when the movement amounts of the display device 1 calculated by the movement amount calculation unit 300 are arranged in time series.
  • When such an attenuation waveform is obtained, the basic pattern determination unit 500 determines the waveform as a basic pattern and stores it in, for example, the storage unit 103.
  • the movement amount predicting unit 600 compares the basic pattern with a new measurement pattern being measured, and determines that the vibration having the same vibration pattern as the basic pattern has occurred when the same movement amount is obtained at the same timing. Then, the movement amount of the display device 1 occurring in the future is predicted using the basic pattern.
  • the display control unit 700 performs display control so as to move the display screen 301 based on the movement amount of the display device 1 generated in the future predicted by the movement amount prediction unit 600.
  • the control unit 100 turns on the acceleration sensor 150, and causes the acceleration sensor 150 that has turned on to repeat measurement of acceleration at a predetermined interval (step S1).
  • the movement amount calculation unit 300 of the control unit 100 detects the acceleration from the output value of the acceleration sensor 150, and calculates the movement amount of the display device 1 by integrating the acceleration twice over time (step S2).
  • In step S3, it is determined whether or not an attenuation waveform is obtained when the movement amounts of the display device 1, obtained by repeating the processing of steps S1 and S2 over a predetermined time, are arranged in time series.
  • If a periodic waveform continues instead of an attenuation waveform, it is determined that no attenuation waveform is obtained, on the assumption that the vehicle is traveling on a road surface such as a wavy road, and the processing from step S1 onward is repeated.
  • If an attenuation waveform is obtained, the basic pattern determination unit 500 determines the waveform as a basic pattern (step S4). Note that if a previously determined basic pattern exists and differs from the newly determined one, the basic pattern determination unit 500 replaces it with the newly determined basic pattern.
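One simple way to realize steps S3 and S4 is to extract the local extrema of the time series of movement amounts and require their magnitudes to decay before storing the series as the basic pattern. What exactly counts as an "attenuation waveform" is not spelled out in the text, so the decay test below is an assumption for illustration only.

```python
def peak_magnitudes(movements):
    """Absolute values of the local extrema in a time series of movement amounts."""
    peaks = []
    for prev, cur, nxt in zip(movements, movements[1:], movements[2:]):
        if (cur > prev and cur > nxt) or (cur < prev and cur < nxt):
            peaks.append(abs(cur))
    return peaks

def is_attenuation_waveform(movements):
    """Step S3 (sketch): the oscillation peaks must decrease over time."""
    peaks = peak_magnitudes(movements)
    return len(peaks) >= 2 and all(b < a for a, b in zip(peaks, peaks[1:]))

def update_basic_pattern(storage, movements):
    """Step S4 (sketch): store the damped waveform as the basic pattern,
    replacing a previously stored pattern if it differs."""
    if is_attenuation_waveform(movements) and storage.get("basic_pattern") != list(movements):
        storage["basic_pattern"] = list(movements)
```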
  • the acceleration sensor 150 measures the acceleration applied to the display device 1 every 5 msec and outputs the acceleration to the control unit 100.
  • The movement amount calculation unit 300 of the control unit 100 then calculates the movement amount of the display device 1 based on this acceleration, and an operation of acquiring a pattern under measurement, which is a new vibration pattern, is started (step S5).
  • The movement amount prediction unit 600 of the control unit 100 compares the basic pattern with the new pattern under measurement, and if the same movement amount is obtained at the same timing, it determines that a vibration with the same vibration pattern as the basic pattern has occurred (step S6). At this time, the pattern under measurement has not yet been acquired as a complete pattern; the movement amount of the display device 1, calculated sequentially each time an acceleration is detected, is compared with the corresponding movement amount in the basic pattern. For convenience, however, the above description refers to this as comparing the basic pattern with the pattern under measurement.
  • The vibration to be determined here is a vibration with a large amplitude, such as when the vehicle climbs over a step, so if a movement amount similar to the peak of the first waveform of the basic pattern is obtained, the vibration can be considered to be the same vibration. If it is determined that a vibration with a vibration pattern different from the basic pattern has occurred, the operations from step S3 onward are repeated.
  • If the movement amount prediction unit 600 determines that a vibration having the same vibration pattern as the basic pattern has occurred, it predicts the movement amount of the display device 1 that will occur in the future using the basic pattern (step S7).
  • For example, the movement amount 1/60 second after the point at the peak of the first waveform of the basic pattern is predicted from the basic pattern.
  • The display control unit 700 of the control unit 100 performs display control so that the display screen 301 is moved by the predicted movement amount obtained by the movement amount prediction unit 600, in the direction that cancels the movement of the display device 1 (step S8).
  • This display control may be control that merely adds display-position data, based on the predicted movement amount from the movement amount prediction unit 600, to the image data of the display screen 301 before movement, or control that newly generates image data of the moved display screen 301 based on the predicted movement amount.
  • the former control is suitable for moving a text image and a still image in which the temporal change of image data is relatively small, and the latter control is suitable for moving a moving image in which the temporal change of image data is relatively large.
  • control unit 100 may include a plurality of CPUs.
  • control unit 100 may include a main CPU that performs relatively complicated processing and a sub CPU that performs relatively simple processing.
  • FIG. 11 is a block diagram schematically illustrating an example of a configuration in which the control unit 100 includes a main CPU 101m and a sub CPU 101s.
  • In the configuration of FIG. 11, the main CPU 101m transmits the image data of the display screen 301 to the display panel 130, and the sub CPU 101s executes the above-described processing of steps S1 to S8.
  • The display screen 301 is then moved by changing the display position of the image data based on the predicted movement amount obtained by the movement amount prediction unit 600.
  • the sub CPU 101s need only have a function of executing the processing of steps S1 to S8, and the device configuration is simplified.
  • FIG. 12 is also a block diagram schematically illustrating an example of a configuration in which the control unit 100 includes the main CPU 101m and the sub CPU 101s.
  • In the configuration of FIG. 12, the main CPU 101m does not transmit the image data of the display screen 301 directly to the display panel 130; instead, the image data of the display screen 301 is transmitted via the sub CPU 101s.
  • The sub CPU 101s has a function of creating the image data of the display screen 301, and changes the display position of the image data based on the predicted movement amount obtained by executing the processing of steps S1 to S8 described above.
  • The image data with the changed display position is then transmitted to the display panel 130.
  • the sub CPU 101s generates image data, so that the processing amount of the main CPU 101m can be reduced.
  • the main CPU 101m and the sub CPU 101s can perform parallel processing, and the time from when the acceleration is detected by the acceleration sensor 150 until the movement amount of the display screen 301 is calculated can be shortened. Further, when the processing of steps S1 to S8 is executed by the main CPU, a waiting time for other processing by the main CPU may occur. However, when the sub CPU 101s is used, the possibility is excluded. Thus, the time from when the acceleration is detected to when the movement amount of the display screen 301 is calculated can be shortened.
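The configuration of FIG. 12 can be pictured as a small pipeline in which the sub CPU creates the frame, applies the predicted display-position offset, and hands the result to the display panel. The class and method names below are illustrative only and do not come from the patent.

```python
class SubCpuPipeline:
    """Illustrative sketch of FIG. 12: the sub CPU generates the image data and
    shifts it by the movement amount predicted from the basic pattern (steps S1-S8)."""

    def __init__(self, predictor, panel):
        self.predictor = predictor   # assumed to expose predicted_offset()
        self.panel = panel           # assumed to expose show(image_data, offset)

    def render_frame(self, create_image_data):
        image_data = create_image_data()             # sub CPU creates the frame
        offset = self.predictor.predicted_offset()   # movement predicted 1/60 s ahead
        self.panel.show(image_data, -offset)         # shift opposite to the device movement
```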
  • The display device is not limited to a portable device and may be used as a display device fixed in a vehicle such as a train or an automobile. In that case, the embodiment is a vehicle including the display device.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • Mechanical Engineering (AREA)
  • Signal Processing (AREA)
  • Controls And Circuits For Display Device (AREA)
  • User Interface Of Digital Computer (AREA)
  • Telephone Function (AREA)
  • Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)
  • Digital Computer Display Output (AREA)

Abstract

The invention relates to a display device comprising: a display unit; a detection unit that detects a vibration of the display device; a movement amount calculation unit that calculates a movement amount, said movement amount being the amount by which the display device has moved due to the vibration of the display device; a storage unit that stores a vibration pattern as a basic pattern if the vibration pattern satisfies a predetermined condition, said vibration pattern being defined by the temporal variation of the movement amount of the display device; a movement amount prediction unit that, if a new vibration is applied to the display device, predicts the movement amount of the display device using the basic pattern; and a display control unit that, based on the movement amount predicted by the movement amount prediction unit, moves the display position of an image displayed by the display unit and displays the result.
PCT/JP2018/003500 2017-02-21 2018-02-02 Display device, display method, control device, and vehicle WO2018155127A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-029775 2017-02-21
JP2017029775A JP2018136400A (ja) 2017-02-21 2017-02-21 表示装置、表示方法、制御装置および車両

Publications (1)

Publication Number Publication Date
WO2018155127A1 true WO2018155127A1 (fr) 2018-08-30

Family

ID=63252697

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/003500 WO2018155127A1 (fr) 2017-02-21 2018-02-02 Display device, display method, control device, and vehicle

Country Status (2)

Country Link
JP (1) JP2018136400A (fr)
WO (1) WO2018155127A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020075825A1 (fr) * 2018-10-12 2020-04-16 洋紀 山本 Dispositif d'estimation de mouvement, instrument électronique, programme de commande et procédé d'estimation de mouvement

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002123242A (ja) * 2000-10-17 2002-04-26 Ricoh Co Ltd 画像表示装置
JP2006113272A (ja) * 2004-10-14 2006-04-27 Matsushita Electric Ind Co Ltd 映像投射装置
JP2008139600A (ja) * 2006-12-01 2008-06-19 Toshiba Corp 表示装置
JP2011257503A (ja) * 2010-06-07 2011-12-22 Sony Corp 画像安定化装置、画像安定化方法、及びプログラム
JP2012019452A (ja) * 2010-07-09 2012-01-26 Mitsubishi Electric Corp 画像処理装置及び画像処理方法
JP2013010223A (ja) * 2011-06-28 2013-01-17 Ricoh Co Ltd 画像形成装置

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013140223A (ja) * 2011-12-28 2013-07-18 Sharp Corp 情報表示装置、情報表示装置の制御方法、制御プログラム、制御プログラムを記録したコンピュータ読み取り可能な記録媒体
JP6316607B2 (ja) * 2014-01-30 2018-04-25 京セラ株式会社 表示装置及び表示方法
JP2015166816A (ja) * 2014-03-04 2015-09-24 富士通株式会社 表示装置,表示制御プログラム及び表示制御方法


Also Published As

Publication number Publication date
JP2018136400A (ja) 2018-08-30


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18757348

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18757348

Country of ref document: EP

Kind code of ref document: A1