WO2019130696A1 - Information processing apparatus, information processing method, and information processing system - Google Patents
Information processing apparatus, information processing method, and information processing system
- Publication number
- WO2019130696A1 (PCT/JP2018/035490)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- input device
- information
- light emission
- information processing
- processing apparatus
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03542—Light pens for emitting or receiving light
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03545—Pens or stylus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/038—Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
- G06F3/0383—Signal control means within the pointing device
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/038—Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
- G06F3/0386—Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry for light pen
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/038—Indexing scheme relating to G06F3/038
- G06F2203/0382—Plural input, i.e. interface arrangements in which a plurality of input device of the same type are in communication with a PC
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/038—Indexing scheme relating to G06F3/038
- G06F2203/0384—Wireless input, i.e. hardware and software details of wireless interface arrangements for pointing devices
Definitions
- the present disclosure relates to an information processing device, an information processing method, and an information processing system.
- IR LED: infrared Light Emitting Diode
- a light emission control unit configured to output a first light emission instruction to the first input device when the first identification information is received from the first input device;
- a processing unit configured to associate the first light emission position information with the first identification information when the first light emission position information, indicating the position where light is emitted by a first light emitting unit of the first input device, is acquired. An information processing apparatus comprising these units is provided.
- when the first identification information is received from the first input device, control is performed such that a first light emission instruction is output to the first input device; and
- when the first light emission position information, indicating the position where light is emitted by the first light emitting unit of the first input device, is acquired, the first light emission position information is associated with the first identification information. An information processing method comprising these steps is provided.
- the information processing system includes an input device and an information processing device
- the input device includes a control unit that controls transmission of identification information
- the control unit of the input device controls the light emitting unit to emit light when a light emission instruction is received; the information processing apparatus controls the light emission instruction to be output when the identification information is received; and
- An information processing system comprising a processing unit that links the light emission position information with the identification information when light emission position information indicating the light emission position of the light emitting unit is acquired.
- FIG. 7 is a diagram showing a process flow of identifying a pen according to an embodiment of the present disclosure.
- FIG. 6 shows an example of a visible light image and an infrared light image of a screen written by a pen, respectively.
- A schematic configuration diagram of an information processing system according to an embodiment of the present disclosure; a diagram showing a functional configuration example of an input device; a diagram showing a functional configuration example of the information processing system according to the embodiment of the present disclosure; a state transition diagram at the time of a pairing operation by the information processing system according to the embodiment of the present disclosure; a diagram showing an example of the queue managed by the communication management unit; and a diagram showing an image of the queue.
- a plurality of components having substantially the same functional configuration may be distinguished by attaching different numerals after the same reference numerals. However, when it is not necessary to distinguish each of a plurality of components having substantially the same functional configuration, only the same reference numerals will be given.
- a plurality of components having the same or similar functional configuration may be distinguished by attaching different alphabetic characters after the same reference symbol. However, when it is not necessary to distinguish each of the plurality of components having the same or similar functional configuration, only the same reference symbol is attached.
- a communication function (for example, by Bluetooth (registered trademark); hereinafter also referred to as "BT")
- FIG. 1 is a diagram showing a process flow of identifying a pen according to an embodiment of the present disclosure.
- There are first and second pens (time T0).
- Identification information is stored in each of the two pens.
- BT ID is used as an example of the identification information.
- the BT ID of the first pen is “AAA”
- the BT ID of the second pen is “BBB”.
- When writing is started with the first pen (time T1), the pen tip switch of the first pen is turned on, and information from the first pen (including the BT ID “AAA”) is sent to the system by the communication function (for example, BT).
- The system obtains the BT ID “AAA” of the first pen from the sent information, and sends a request to turn on the IR LED to the first pen by communication using the BT ID “AAA”.
- When the first pen receives the request to turn on the IR LED, the IR LED is turned on, and the IR light emitted at the pen tip of the first pen is detected in the camera image (time T2).
- The system links the bright spot (light emission position) of the IR LED detected in the camera image with the information (including the BT ID “AAA”) received from the first pen by communication, and stores the pair as the information of the first pen. From then on, the system can handle, in association with one another, the bright spot of the IR LED detected by the camera, the information received from the first pen (including the BT ID “AAA”, i.e., the identity of the first pen's hardware), and the pen functions managed in the system's application (the color, thickness, blur, etc. of the locus).
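The T0–T2 pairing flow above can be sketched as follows. This is a minimal illustration; `PairingSystem`, `on_bt_event`, and `on_bright_spot_detected` are hypothetical names, not from the disclosure.

```python
# Hypothetical sketch of the pairing flow (times T0-T2).
class PairingSystem:
    def __init__(self):
        # Maps a pen's BT ID to the light emission position linked to it.
        self.pen_table = {}

    def on_bt_event(self, bt_id, send_ir_on_request):
        # Time T1: the pen tip switch turns on and the pen sends its BT ID;
        # the system replies with a request to turn the IR LED on.
        send_ir_on_request(bt_id)

    def on_bright_spot_detected(self, bt_id, position):
        # Time T2: the camera detects the IR bright spot; the system links
        # the detected position with the BT ID received over communication.
        self.pen_table[bt_id] = position
        return position

system = PairingSystem()
sent = []
system.on_bt_event("AAA", lambda bt_id: sent.append(bt_id))
pos = system.on_bright_spot_detected("AAA", (120, 80))
```

After `on_bright_spot_detected` returns, the system can look up the pen's application-level attributes (color, thickness, blur) by its BT ID.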
- FIG. 2 shows an example of a visible light image and an infrared light image of a screen written by a pen, respectively.
- The visible light image G10 is an image captured by a visible light camera; it shows the screen Sc, the user U10, and the input device 200 used by the user U10 for writing on the screen Sc.
- The infrared light image g10 is an image captured by an infrared light camera; it shows a bright spot p10 of the IR light emitted by the input device 200.
- The state of the environment can be grasped from the image captured by the camera.
- the information processing apparatus identifies a pen emitting IR light by capturing the bright spot p10 from the infrared light image g10.
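Capturing the bright spot p10 from an infrared frame can be sketched with a simple intensity threshold; the threshold value and function name below are assumptions for illustration, and a real implementation would likely use a blob detector.

```python
import numpy as np

def detect_bright_spot(ir_image, threshold=200):
    """Return the (row, col) of the brightest pixel above a threshold,
    or None if no pixel qualifies. The threshold is an assumed value."""
    if ir_image.max() < threshold:
        return None
    idx = np.unravel_index(np.argmax(ir_image), ir_image.shape)
    return (int(idx[0]), int(idx[1]))

# A synthetic 8-bit infrared frame with one bright spot at row 5, col 7.
frame = np.zeros((10, 10), dtype=np.uint8)
frame[5, 7] = 255
spot = detect_bright_spot(frame)
```

A frame with no pixel above the threshold yields `None`, which corresponds to the "light emission position information not acquired" case discussed later.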
- FIG. 3 is a schematic configuration diagram of the information processing system 1 according to the embodiment of the present disclosure.
- the information processing system 1 includes an information processing apparatus 100, an input device 200, a camera 300, and a projector 410 as an example of an output device.
- In FIG. 3, the camera 300 and the projector 410 are installed on the floor surface, but the positions where the camera 300 and the projector 410 are installed are not limited (they may be installed above the floor surface).
- FIG. 4 is a diagram showing an example of a functional configuration of the input device 200. As shown in FIG. 4, in the embodiment of the present disclosure, it is mainly assumed that a pen (pen-type input device) is used as an example of the input device 200. However, the input device 200 according to the embodiment of the present disclosure is not limited to a pen (pen-type input device).
- The input device 200 includes an IR LED 210 as an example of a light emitting unit used for handwriting detection, a pen tip switch 220 as an example of a detection unit for detecting an input state, a CPU 230 that controls the overall operation of the input device 200, and a communication module 240 as an example of a communication unit for communicating with the information processing apparatus 100.
- FIG. 5 is a diagram illustrating an example of a functional configuration of the information processing system 1 according to the embodiment of the present disclosure.
- the information processing system 1 includes an information processing apparatus 100, an input device 200, a camera 300, and an output device 400.
- the camera 300 has a function of detecting the light emitting position (IR bright spot) of the input device 200 (pen) by the IR LED.
- the camera 300 includes an infrared camera.
- the IR bright spot detected by the camera 300 is linked to the BT ID of the input device 200 (pen).
- the IR bright spots detected by the camera 300 can also be used as the locus (handwriting) of the input device 200 (pen).
- the information processing apparatus 100 includes an I / F unit 110.
- the I / F unit 110 can function as a communication module that communicates with the input device 200.
- The communication module receives on/off notifications of the pen tip switch, transmits IR LED on/off notifications, and so on, when linking the IR bright spot detected by the camera 300 with the BT ID of the input device 200 (pen) (hereinafter also simply referred to as "pairing").
- the communication module performs wireless communication with the input device 200 by BT.
- the communication module may perform wired communication with the input device 200.
- the information processing apparatus 100 further includes an ID management unit 121, a communication management unit 122, a light emission recognition unit 123, a coordinate conversion unit 124, and a control unit 130.
- The ID management unit 121, the communication management unit 122, the light emission recognition unit 123, the coordinate conversion unit 124, and the control unit 130 may be configured by a processing device such as a CPU (Central Processing Unit).
- When the control unit is configured by a processing device such as a CPU, the processing device may be configured by an electronic circuit.
- The ID management unit 121 manages the unique ID (BT ID) of the input device 200 (pen) acquired from the input device 200 by the communication control unit 132 via the I/F unit 110. That is, the ID management unit 121 manages the unique IDs (BT IDs) of the input devices 200 (pens) recognized so far.
- The communication management unit 122 manages the communication device information of each input device 200 (pen) (for example, the BT ID "AAA" of the first pen, the BT ID "BBB" of the second pen, etc.).
- the light emission recognition unit 123 recognizes the bright spot of the IR LED from the image (for example, an infrared light image) captured by the camera 300.
- the coordinate conversion unit 124 converts the bright spot recognized by the light emission recognition unit 123 from the camera coordinate system to the screen coordinate system.
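The camera-to-screen conversion performed by the coordinate conversion unit 124 is commonly done with a planar homography. A minimal sketch, assuming a 3x3 homography matrix obtained by prior calibration (the matrix values below are a toy example, not from the disclosure):

```python
import numpy as np

def camera_to_screen(point, homography):
    """Map a camera-coordinate point to screen coordinates using a 3x3
    homography matrix (assumed to come from prior calibration)."""
    x, y = point
    v = homography @ np.array([x, y, 1.0])
    return (v[0] / v[2], v[1] / v[2])  # perspective divide

# Toy calibration: identity rotation/scale plus a translation of (10, 20).
H = np.array([[1.0, 0.0, 10.0],
              [0.0, 1.0, 20.0],
              [0.0, 0.0, 1.0]])
screen_pt = camera_to_screen((5.0, 5.0), H)
```

With this toy matrix, the camera point (5, 5) maps to the screen point (15, 25).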
- The storage unit 140 stores information in which the BT ID of each input device 200 (pen) and the IR bright spot (information indicating the light emission position of the IR LED) are linked.
- the control unit 130 includes a light emission control unit 131, a communication control unit 132, and a processing unit 133.
- The light emission control unit 131 controls the light emission of the IR LED (controls transmission of IR LED on/off notifications) when associating the IR bright spot detected by the camera 300 with the BT ID of the input device 200 (pen).
- the communication control unit 132 controls the reception of the on / off notification of the pen tip switch when linking the bright spot of the IR detected by the camera 300 with the BT ID of the input device 200 (pen).
- The processing unit 133 associates the IR bright spot detected by the camera 300 with the BT ID of the input device 200 (pen), and stores the association in the storage unit 140.
- The input device 200 is a device used for input by the user. As described above, in the embodiment of the present disclosure, it is mainly assumed that a pen-shaped device having an LED 210 (IR LED) mounted at its tip is used. For example, the input device 200 has a mechanism in which the pen tip switch 220 is turned on and the LED 210 emits light when the user presses the pen against the screen. The light emission position (bright spot) of the LED 210 is detected by the camera 300 and sent to the information processing apparatus 100.
- the output device 400 has a function of outputting an image (for example, a trajectory of the input device 200).
- In the embodiment, the projector 410, which can project an image onto the screen on which the user performs input, is used as the output device 400.
- The output device 400 is not limited to the projector 410; the TV 420, the tablet 430, the smartphone 440, or a PC (Personal Computer) 450 may be used.
- FIG. 6 is a state transition diagram at the time of pairing operation by the information processing system 1 according to the embodiment of the present disclosure.
- When the power of the information processing system 1 is turned on from the initial state in which its power is off (state S0), the BT connection between the information processing apparatus 100 and the input device 200 (pen) is established. At this point, the IR LED remains off, and the BT ID (hereinafter also referred to as the "pen ID") of the input device 200 remains indefinite (state S1).
- When the pen tip switch is turned on, the CPU 230 controls the BT Event (a pen tip switch notification including the pen ID) to be transmitted to the information processing apparatus 100 via the communication module 240 (state S2).
- In the information processing apparatus 100, when a BT Event (including a pen ID) is received by the I/F unit 110, the light emission control unit 131 controls a light emission instruction to be output to the input device 200 (pen).
- In the input device 200 (pen), when the light emission instruction is input via the communication module 240, the CPU 230 causes the IR LED 210 to emit light. As a result, the IR LED is turned on, but the pen ID of the input device 200 still remains indefinite (state S3).
- the light emission recognition unit 123 attempts to acquire light emission position information indicating a position where light is emitted by the IR LED 210 of the input device 200 (pen). For example, it is attempted to obtain light emission position information from an image captured by the camera 300.
- the processing unit 133 associates the light emitting position information with the pen ID.
- the processing unit 133 does not associate the light emission position information with the pen ID.
- the processing unit 133 associates the light emission position information with the pen ID when the light emission position information is acquired within the first time from the output of the light emission instruction (IR Detect).
- Thereby, the IR LED remains on, and the pen ID of the input device 200 is determined ("1" in the example shown in FIG. 6) (state S4).
- Once the light emission position information and the pen ID are linked, the light emission position information (the locus of the pen) recognized sequentially is associated with the pen ID, and the projector 410 projects an image (handwriting) onto the light emission position. As a result, drawing is performed (state S7).
- When the time during which the light emission position information is not acquired continues for more than the fourth time after the light emission position information was acquired, the processing unit 133 controls a light emission stop instruction to be output to the input device 200 (pen).
- In the input device 200 (pen), when the light emission stop instruction is input through the communication module 240, the CPU 230 stops the light emission of the IR LED 210. This causes a state transition from state S7 to state S5.
- The processing unit 133 does not associate the light emission position information with the pen ID when the light emission position information is not acquired within the first time from the output of the light emission instruction (Time out). In this case, the IR LED remains on, and the pen ID of the input device 200 remains indefinite (state S5). Then, when the light emission position information is still not acquired, the light emission control unit 131 controls a light emission stop instruction to be output to the input device 200 (pen). In the input device 200 (pen), when the light emission stop instruction is input through the communication module 240, the CPU 230 stops the light emission of the IR LED 210. As a result, the IR LED is turned off, and the pen ID of the input device 200 remains indefinite (state S6, state S1).
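The transitions of FIG. 6 can be condensed into a small state machine. This is a simplified sketch: only the pairing-relevant transitions are modeled, and the method names are illustrative, not from the disclosure.

```python
# Simplified sketch of the pairing state machine (state labels follow FIG. 6).
class PenPairingStateMachine:
    def __init__(self):
        self.state = "S1"  # IR LED off, pen ID indefinite

    def on_pen_down(self):
        # BT Event with pen ID received; light emission instruction output.
        self.state = "S3"  # IR LED on, pen ID still indefinite

    def on_ir_detected(self):
        # Bright spot acquired within the first time from the instruction.
        if self.state == "S3":
            self.state = "S4"  # pen ID determined (pairing complete)

    def on_timeout(self):
        # No bright spot within the first time; light emission is stopped.
        if self.state == "S3":
            self.state = "S6"  # IR LED off, pen ID indefinite

sm = PenPairingStateMachine()
sm.on_pen_down()
sm.on_ir_detected()
paired_state = sm.state
```

The timeout path (`on_timeout`) returns the pen to the indefinite state, matching the S5/S6 transitions described above.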
- There may be a plurality of input devices 200 (pens). In that case, a plurality of BT Events (On notifications of the pen tip switch, each including a pen ID) may be received. The plurality of BT Events are queued by the communication management unit 122 and processed sequentially.
- When second identification information is received from a second input device different from the first input device within a second time after the first identification information is received from the first input device, the light emission control unit 131 controls a second light emission instruction to be output to the second input device after the second time has elapsed since the reception of the first identification information.
- Instead of the elapse of the second time since the reception of the first identification information, the earlier of the time when the first light emission position information is acquired and the time when the first time has elapsed from the output of the first light emission instruction may be used.
- When second light emission position information, indicating the position at which light is emitted by the second light emitting unit of the second input device (the IR LED 210 of the second input device), is acquired within a third time from the output of the second light emission instruction, the processing unit 133 links the second light emission position information with the second identification information (the pen ID of the second input device).
- the third time may be the same as the first time described above, or may be different from the first time.
- FIG. 7 is a diagram showing an example of queues managed by the communication management unit 122.
- BT Event: On notification of the pen tip switch
- BT Event: Off notification of the pen tip switch
- the processing for such an event is “done”.
- the processing for this event is “in process”.
- FIG. 8 is a diagram showing an image of a queue that is sequentially executed. Referring to FIG. 8, the process of the pen ID “AAA” corresponding to the first queue is performed first, and then the process of the pen ID “BBB” corresponding to the second queue is performed. Further, referring to FIG. 7, the processing for the third and subsequent queues is "standby".
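The sequential processing of queued BT Events can be sketched as follows. The status labels follow FIG. 7 ("standby", "in process", "done"); the dictionary fields are illustrative.

```python
from collections import deque

# Queue of pen tip switch On notifications, in the order they arrived.
events = deque([
    {"pen_id": "AAA", "event": "pen tip switch On", "status": "standby"},
    {"pen_id": "BBB", "event": "pen tip switch On", "status": "standby"},
])

done = []
while events:
    entry = events.popleft()
    entry["status"] = "in process"
    # ... here a light emission instruction would be output and the system
    #     would wait for IR detection or a timeout before continuing ...
    entry["status"] = "done"
    done.append(entry)
```

The first queue entry (pen ID "AAA") is processed to completion before the second (pen ID "BBB") starts, and any later entries remain in "standby" meanwhile.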
- the light emitting unit is not limited to such an example.
- the light emitting unit may be a visible light LED.
- the visible light LED may be a single color LED (red, blue, etc.) or a full color LED.
- the detection unit is not limited to such an example.
- the detection unit may include a pressure sensor, an electromagnetic induction sensor, or a capacitance sensor.
- When a pressure sensor is used as the detection unit, not only the handwriting of the pen but also the pen pressure may be acquired.
- The case where the ID (pen ID) of the input device 200 is communicated by Bluetooth (registered trademark) has mainly been described.
- the ID (pen ID) of the input device 200 may be communicated by another communication method.
- the ID (pen ID) of the input device 200 may be communicated using Wi-Fi (registered trademark), Zigbee (registered trademark), infrared communication, or ultrasonic communication.
- a camera is used as an imaging device.
- an apparatus other than a camera may be used as an imaging device.
- For example, a two-dimensional PSD (Position Sensitive Device) may be used.
- Extension functions applicable to the present technology will be described.
- The input device 200 (pen) can receive, from the information processing apparatus 100, the state information of the input device 200 (pen) managed by the information processing apparatus 100, and present the state information to the user on the input device 200 (pen) side.
- The light emission control unit 131 of the information processing apparatus 100 may function as a presentation control unit that controls presentation according to the state information of the input device 200 (pen) to be executed by a presentation unit (not shown) of the input device 200 (pen).
- In the following, it is assumed that the presentation unit (not shown) of the input device 200 (pen) is an LED for state presentation different from the IR LED 210; however, the presentation unit is not limited to the LED and may be, for example, a speaker.
- the state information is not particularly limited, but may be information indicating a pairing state.
- The state information may indicate at least one of: before the output of the light emission instruction (that is, the indefinite state (S1)); after the output of the light emission instruction and before the acquisition of the light emission position information (that is, during detection of the bright spot (S3)); and after the light emission position information and the BT ID have been linked (that is, the pairing complete state (S4)).
- The state information may also include the case where the time during which the light emission position information is not acquired continues beyond the second time (that is, the lost state of the input device (S6)).
- presentation according to the state information may be performed in any way.
- the presentation according to the state information may be executed by the blinking pattern of the LED for state presentation.
- the presentation according to the status information may be performed by the color of the LED for status presentation.
- FIG. 9 is a diagram showing an example of presentation of the pairing state.
- FIG. 9 shows the detection of the bright spot (S3), the pairing complete state (S4), the lost state of the input device (S6), and the indefinite state (S1). The upper part of FIG. 9 shows the case where the blinking pattern of the state presentation LED is changed according to the pairing state. The lower part of FIG. 9 shows the case where the color of the state presentation LED is changed according to the pairing state. The blinking pattern and the color may be used alone or in combination.
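A minimal sketch of such a state-to-presentation mapping follows. The concrete blink patterns and colors below are assumptions for illustration, not values from the disclosure.

```python
# Illustrative mapping from pairing state to a presentation-LED pattern,
# following the states shown in FIG. 9 (patterns/colors are assumed).
STATE_PRESENTATION = {
    "detecting_bright_spot": {"blink": "fast",  "color": "yellow"},  # S3
    "pairing_complete":      {"blink": "solid", "color": "green"},   # S4
    "device_lost":           {"blink": "slow",  "color": "red"},     # S6
    "indefinite":            {"blink": "off",   "color": "off"},     # S1
}

def present_state(state):
    # Unknown states fall back to the indefinite presentation.
    return STATE_PRESENTATION.get(state, STATE_PRESENTATION["indefinite"])

led = present_state("pairing_complete")
```

Blink pattern and color can be driven independently or together, as the figure describes.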
- In an application for drawing using the input device 200 (pen), the user may be able to select information on the handwriting (for example, the color, thickness, blur, and the like of the handwriting).
- the processing unit 133 may associate the light emitting position information, the BT ID, and the information on the handwriting of the input device 200 (pen).
- The state information may be information on the handwriting, and the presentation according to the state information may be performed by the blinking pattern of the state presentation LED or by its color.
- FIG. 10 and FIG. 11 are diagrams showing an example of presentation of information regarding handwriting.
- FIGS. 10 and 11 show a case where presentation according to information on handwriting is performed by the color of the LED for status presentation.
- As shown in states S11 to S14, states S21 to S27, and states S31 to S32, when a color selection is executed on the palette with the input device 200 (pen), the color information is transmitted by the BT communication function, and the state presentation LED can be made to emit light based on the received color information.
- The present technology is also applicable to devices other than pens. That is, if a controller-type device having both a spatial coordinate detection function and a communication function is available in an arbitrary space, the device can be treated as an input device like a pen by pairing the three-dimensional coordinates of the device with an arbitrary communication connection.
- FIGS. 12 to 17 are diagrams showing application examples of the present technology.
- a user U10 and a user U20 are playing a game.
- the user U10 wears the display D10 and has a gun J10 as an example of an AR / VR controller.
- the user U20 also wears the display D20 and has a gun J20 as an example of an AR / VR controller.
- Each of the gun J10 and the gun J20 has an IR LED and a communication function, like the input device 200 (pen) described above, and by providing a camera for detecting the IR LEDs in the environment, the gun J10 and the gun J20 can be identified.
- a car M10 and a car M20 are traveling on the road.
- Each of the car M10 and the car M20 has an IR LED and a communication function, like the input device 200 (pen) described above, and by providing in the environment a camera for detecting the IR LEDs (for example, a camera overlooking the road on which the cars travel), the car M10 and the car M20 can be identified. If a front projection projector (or AR glasses or the like) is mounted on each of the car M10 and the car M20, effects such as an attack effect or a power-up effect on the other car can be produced by the image displayed by the front projection projector.
- a car M30 and a car M40 are traveling on the road.
- Each of the car M30 and the car M40 has an IR LED and a communication function, like the input device 200 (pen) described above, and by providing in the environment a camera for detecting the IR LEDs (for example, a camera overlooking the road on which the cars travel), the car M30 and the car M40 can be identified. If a front projection projector (or AR glasses or the like) is mounted on each of the car M30 and the car M40, a car image can be displayed by the front projection projector, and effects such as an attack effect or a power-up effect can be produced.
- the user is bouldering.
- depending on effects associated with the course background or characters, some holds may be rendered unusable.
- the information processing apparatus 100 recognizes that a hold is being used via the band-type controllers B10 to B40 (controllers paired as described above) attached to the user's limbs, and may present tactile feedback or the like, such as a damage expression, to the user.
- the information processing apparatus 100 identifies a plurality of users by the band-type controllers B51 to B54 and B61 to B64 (controllers paired as described above) attached to each user's limbs, and the place where each user climbs may be identified, as in a position game.
- which user's ID a given combination of both-hand and both-foot IDs belongs to is registered in advance. For example, it is assumed that a GUI is prepared for selecting four points from among the paired IDs and registering them as one user.
- a plurality of individuals in a group R10 (individuals R11, R12, R13, ...) act as a group.
- each individual may be a robot, another device such as a drone, or an organism such as a human being as shown in FIG.
- by providing each of the individuals constituting the group R10 with an IR LED and a communication function, like the input device 200 (pen) described above, and providing a camera for detecting the IR LEDs in the environment, the individuals constituting the group R10 can be identified.
- FIG. 18 is a block diagram showing an example of the hardware configuration of the information processing apparatus 100 according to an embodiment of the present disclosure.
- the information processing apparatus 100 includes a central processing unit (CPU) 801, a read only memory (ROM) 803, and a random access memory (RAM) 805.
- the information processing apparatus 100 may also include a host bus 807, a bridge 809, an external bus 811, an interface 813, a storage device 819, a drive 821, a connection port 823, and a communication device 825.
- the information processing apparatus 100 may have a processing circuit called a digital signal processor (DSP) or an application specific integrated circuit (ASIC) instead of or in addition to the CPU 801.
- DSP digital signal processor
- ASIC application specific integrated circuit
- the CPU 801 functions as an arithmetic processing unit and a control unit, and controls all or part of the operation of the information processing apparatus 100 in accordance with various programs recorded in the ROM 803, the RAM 805, the storage device 819, or the removable recording medium 827.
- the ROM 803 stores programs used by the CPU 801, calculation parameters, and the like.
- the RAM 805 temporarily stores programs used in the execution of the CPU 801, parameters that appropriately change in the execution, and the like.
- the CPU 801, the ROM 803, and the RAM 805 are mutually connected by a host bus 807 configured by an internal bus such as a CPU bus. Further, the host bus 807 is connected to an external bus 811 such as a peripheral component interconnect / interface (PCI) bus via a bridge 809.
- PCI peripheral component interconnect / interface
- the storage device 819 is a device for data storage configured as an example of a storage unit of the information processing device 100.
- the storage device 819 includes, for example, a magnetic storage unit device such as a hard disk drive (HDD), a semiconductor storage device, an optical storage device, or a magneto-optical storage device.
- the storage device 819 stores programs executed by the CPU 801, various data, various data acquired from the outside, and the like.
- the drive 821 is a reader / writer for a removable recording medium 827 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, and is built in or externally attached to the information processing apparatus 100.
- the drive 821 reads the information recorded in the mounted removable recording medium 827 and outputs the information to the RAM 805.
- the drive 821 also writes records to the attached removable recording medium 827.
- the connection port 823 is a port for directly connecting a device to the information processing apparatus 100.
- the connection port 823 may be, for example, a Universal Serial Bus (USB) port, an IEEE 1394 port, a Small Computer System Interface (SCSI) port, or the like.
- the connection port 823 may be an RS-232C port, an optical audio terminal, a high-definition multimedia interface (HDMI (registered trademark)) port, or the like.
- HDMI registered trademark
- the communication device 825 is, for example, a communication interface configured of a communication device or the like for connecting to the network 931.
- the communication device 825 may be, for example, a communication card for a wired or wireless Local Area Network (LAN), Bluetooth (registered trademark), or WUSB (Wireless USB).
- the communication device 825 may be a router for optical communication, a router for asymmetric digital subscriber line (ADSL), or a modem for various communications.
- the communication device 825 transmits and receives signals and the like to and from the Internet or another communication device using a predetermined protocol such as TCP / IP.
- a network 931 connected to the communication device 825 is a network connected by wire or wireless, and is, for example, the Internet, a home LAN, infrared communication, radio wave communication, satellite communication, or the like.
- according to an embodiment of the present disclosure, there is provided an information processing apparatus including: a light emission control unit that performs control such that, when first identification information is received from a first input device, a first light emission instruction is output to the first input device; and a processing unit that associates the first light emission position information with the first identification information when first light emission position information indicating a position at which light was emitted by a first light emitting unit of the first input device is acquired. According to such a configuration, a technology capable of identifying an input device while reducing costs is provided.
- a light emission control unit configured to control a first light emission instruction to be output to the first input device when the first identification information is received from the first input device;
- and a processing unit that associates the first light emission position information with the first identification information when first light emission position information indicating a position at which light was emitted by a first light emitting unit of the first input device is acquired; an information processing apparatus comprising the above. (2) The processing unit associates the first light emission position information with the first identification information when the first light emission position information is acquired within a first time from the output of the first light emission instruction. The information processing apparatus according to (1).
- the light emission control unit performs control such that a light emission stop instruction is output to the first input device when the first light emission position information is not acquired within the first time from the output of the first light emission instruction. The information processing apparatus according to (2).
- the light emission control unit performs control such that, when second identification information is received from a second input device different from the first input device within a second time after the first identification information is received, a second light emission instruction is output to the second input device after the second time has elapsed since the reception of the first identification information.
- the information processing apparatus according to (3).
- the processing unit associates the second light emission position information with the second identification information when second light emission position information indicating a position at which light was emitted by a second light emitting unit of the second input device is acquired within a third time from the output of the second light emission instruction.
- the information processing apparatus according to (4).
- the processing unit performs control such that a light emission stop instruction is output to the first input device when, after the acquisition of the first light emission position information, the period during which the first light emission position information is not acquired continues beyond a fourth time. The information processing apparatus according to any one of (1) to (5).
- the information processing apparatus includes a presentation control unit configured to perform a presentation according to state information of the first input device by a presentation unit of the first input device.
- the information processing apparatus according to any one of the above (1) to (6).
- the state information includes at least one of: a state before the output of the first light emission instruction; a state after the output of the first light emission instruction and before the acquisition of the first light emission position information; and a state in which the first light emission position information and the first identification information are associated.
- the information processing apparatus according to (7). (9) The presentation is performed by a blinking pattern of an LED or a color of the LED. The information processing apparatus according to (7) or (8).
- the information processing apparatus includes a presentation control unit configured to perform a presentation according to information on handwriting of the first input device by a presentation unit of the first input device.
- the information processing apparatus according to any one of the above (1) to (6).
- the processing unit links the first light emission position information with the first identification information when the first light emission position information is acquired based on a captured image.
- the information processing apparatus according to any one of the above (1) to (10).
- the first input device is a pen-type input device.
- the processing unit associates the first light emission position information, the first identification information, and information on the handwriting of the first input device when the first light emission position information is acquired. The information processing apparatus according to any one of (1) to (12).
- the first light emitting unit is an infrared light LED or a visible light LED.
- the information processing apparatus according to any one of the above (1) to (13).
- the first identification information is transmitted from the first input device when a predetermined operation is detected by the detection unit of the first input device.
- the information processing apparatus according to any one of (1) to (14).
- the detection unit includes a switch, a pressure sensor, an electromagnetic induction sensor, or a capacitance sensor.
- the first identification information is received from the first input device using Bluetooth (registered trademark), Wi-Fi (registered trademark), Zigbee (registered trademark), infrared communication, or ultrasonic communication. The information processing apparatus according to any one of (1) to (16).
- (18) An information processing method including: performing control such that, when first identification information is received from a first input device, a first light emission instruction is output to the first input device; and associating, by a processor, the first light emission position information with the first identification information when first light emission position information indicating a position at which light was emitted by a first light emitting unit of the first input device is acquired. (19) An information processing system having an input device and an information processing apparatus, wherein the input device includes a control unit that controls transmission of identification information, and the control unit controls a light emitting unit to emit light when a light emission instruction is received.
- the information processing apparatus includes a light emission control unit that performs control such that the light emission instruction is output when the identification information is received,
- and a processing unit that associates light emission position information indicating a position at which light was emitted by the light emitting unit with the identification information when the light emission position information is acquired;
- an information processing system comprising the above. (20)
- the information processing system includes an imaging device.
- the processing unit associates the light emitting position information with the identification information when the light emitting position information is acquired based on an image captured by the imaging device.
- the information processing system according to (19).
- REFERENCE SIGNS LIST 1 information processing system 100 information processing apparatus 110 I / F unit 121 ID management unit 122 communication management unit 123 light emission recognition unit 124 coordinate conversion unit 130 control unit 131 light emission control unit 132 communication control unit 133 processing unit 140 storage unit 210 LED 220 pen tip switch 230 CPU 240 communication module 300 camera 400 output device 410 projector 430 tablet 440 smartphone
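The serialized instruct-then-detect pairing flow enumerated in items (1) to (6) above can be sketched in a few lines. This is a minimal illustration, not the patent's implementation; `PairingCoordinator`, `on_id_received`, and the callback signatures are assumed names invented for the sketch:

```python
import time

class PairingCoordinator:
    """Sketch of the claimed pairing flow: on receiving a device's
    identification information, issue a light emission instruction; if the
    emission position is detected within a timeout (the "first time"), bind
    the position to the ID; otherwise instruct the device to stop emitting."""

    def __init__(self, emission_timeout=1.0):
        self.emission_timeout = emission_timeout
        self.pairings = {}  # device_id -> (x, y) detected emission position

    def on_id_received(self, device_id, send_instruction, detect_position):
        # Item (4) implies instructions are serialized: a second device's
        # instruction waits until the first device's window has elapsed.
        send_instruction(device_id, "emit")
        deadline = time.monotonic() + self.emission_timeout
        while time.monotonic() < deadline:
            pos = detect_position()  # e.g. brightest IR blob in a camera frame
            if pos is not None:
                self.pairings[device_id] = pos  # item (1): bind position to ID
                return pos
            time.sleep(0.005)
        send_instruction(device_id, "stop")  # item (3): stop on timeout
        return None
```

A second device's ID arriving during the first device's detection window would simply be queued and handed to `on_id_received` after the first call returns, which matches the waiting behavior item (4) describes.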
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Position Input By Displaying (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
0. Overview
1. Details of the embodiment
1.1. System configuration example
1.2. Details of the pairing operation
1.3. Various modified examples
1.4. Effects
1.5. Examples of extended functions
1.6. Application examples
1.7. Hardware configuration example
2. Conclusion
In recent years, interactive projectors sold by various vendors have supported input with digital pens. However, in situations where multiple pens are used at the same time, the pen identification methods differ from vendor to vendor. For example, by individually recognizing multiple pens, information on each pen's handwriting (for example, pencil-like, brush-like, or paintbrush-like strokes, stroke thickness, and so on) can be reflected in that pen's handwriting, the handwriting color can be changed, or writable security levels can be tied to the pen hardware, such as allowing writing with a teacher's pen while prohibiting writing with a student's pen.
[1.1. System configuration example]
Next, a configuration example of the information processing system 1 according to an embodiment of the present disclosure will be described with reference to FIG. 3. FIG. 3 is a schematic configuration diagram of the information processing system 1 according to the embodiment of the present disclosure. As shown in FIG. 3, the information processing system 1 includes an information processing apparatus 100, an input device 200, a camera 300, and a projector 410 as an example of an output device. In the example shown in FIG. 3, the camera 300 and the projector 410 are installed on the floor surface, but the positions at which the camera 300 and the projector 410 are installed are not limited (they may be installed above the floor surface).
Next, details of the pairing operation performed by the information processing system 1 according to the embodiment of the present disclosure will be described.
Here, various modified examples applicable to the present technology will be described.
Here, the effects achieved by the present technology will be described.
Here, examples of extended functions applicable to the present technology will be described. As described above, it is mainly assumed that bidirectional communication is possible between the input device 200 and the information processing apparatus 100. In that case, by feeding back the state information of the input device 200 (pen) managed by the information processing apparatus 100 to the input device 200 (pen), the input device 200 (pen) can present its own state information to the user.
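As a concrete illustration of this state feedback, the pen could map each pairing state to an LED presentation (items (7) to (9) above allow a blinking pattern or a color). The state names, colors, and blink rates below are assumptions made for the sketch, not values from the patent:

```python
# Hypothetical mapping from pairing state to the pen's LED presentation.
# The states mirror those listed in item (8); the settings are invented.
STATE_PRESENTATION = {
    "before_instruction": {"color": "red", "blink_hz": 0.0},   # not yet told to emit
    "awaiting_position": {"color": "amber", "blink_hz": 2.0},  # told to emit, not yet located
    "paired": {"color": "green", "blink_hz": 0.0},             # position bound to ID
}

def presentation_for(state):
    """Return the LED setting the pen should show for a given pairing state."""
    if state not in STATE_PRESENTATION:
        raise ValueError(f"unknown pairing state: {state}")
    return STATE_PRESENTATION[state]
```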
Here, application examples of the present technology will be described. The present technology is also applicable to devices other than pens. That is, if a controller-type device having both a spatial-coordinate detection function and a communication function is available in an arbitrary space, the device can be handled as an input device in the same way as the pen described above by pairing the three-dimensional coordinates of the device with an arbitrary communication connection.
Next, the hardware configuration of the information processing apparatus 100 according to the embodiment of the present disclosure will be described with reference to FIG. 18. FIG. 18 is a block diagram showing an example of the hardware configuration of the information processing apparatus 100 according to the embodiment of the present disclosure.
As described above, according to the embodiment of the present disclosure, there is provided an information processing apparatus including: a light emission control unit that performs control such that, when first identification information is received from a first input device, a first light emission instruction is output to the first input device; and a processing unit that associates the first light emission position information with the first identification information when first light emission position information indicating a position at which light was emitted by a first light emitting unit of the first input device is acquired. Such a configuration provides a technology capable of identifying an input device while reducing cost.
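The association performed by the processing unit summarized above can be pictured as building a small record per pen once its emission position is acquired; item (13) additionally ties in handwriting information. The field and function names here are illustrative assumptions, not taken from the patent:

```python
from dataclasses import dataclass
from typing import Dict, Tuple

@dataclass
class PenBinding:
    """Illustrative record of what the processing unit associates once an
    emission position is acquired: the position, the device's identification
    information, and optionally handwriting attributes (items (1) and (13))."""
    device_id: str
    position: Tuple[int, int]        # where the first light emitting unit was seen
    stroke_style: Dict[str, object]  # e.g. {"width": 2, "color": "blue"}

def bind(device_id, position, stroke_style=None):
    # Only called after the emission position has actually been acquired.
    return PenBinding(device_id, position, stroke_style or {})
```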
(1)
A light emission control unit that performs control such that, when first identification information is received from a first input device, a first light emission instruction is output to the first input device; and
a processing unit that associates the first light emission position information with the first identification information when first light emission position information indicating a position at which light was emitted by a first light emitting unit of the first input device is acquired;
an information processing apparatus comprising the above.
(2)
The processing unit associates the first light emission position information with the first identification information when the first light emission position information is acquired within a first time from the output of the first light emission instruction.
The information processing apparatus according to (1).
(3)
The light emission control unit performs control such that a light emission stop instruction is output to the first input device when the first light emission position information is not acquired within the first time from the output of the first light emission instruction.
The information processing apparatus according to (2).
(4)
The light emission control unit performs control such that, when second identification information is received from a second input device different from the first input device within a second time after the first identification information is received, a second light emission instruction is output to the second input device after the second time has elapsed since the reception of the first identification information.
The information processing apparatus according to (3).
(5)
The processing unit associates the second light emission position information with the second identification information when second light emission position information indicating a position at which light was emitted by a second light emitting unit of the second input device is acquired within a third time from the output of the second light emission instruction.
The information processing apparatus according to (4).
(6)
The processing unit performs control such that a light emission stop instruction is output to the first input device when, after the acquisition of the first light emission position information, the period during which the first light emission position information is not acquired continues beyond a fourth time.
The information processing apparatus according to any one of (1) to (5).
(7)
The information processing apparatus includes a presentation control unit that performs control such that a presentation corresponding to state information of the first input device is executed by a presentation unit of the first input device.
The information processing apparatus according to any one of (1) to (6).
(8)
The state information includes at least one of: a state before the output of the first light emission instruction; a state after the output of the first light emission instruction and before the acquisition of the first light emission position information; and a state in which the first light emission position information and the first identification information are associated.
The information processing apparatus according to (7).
(9)
The presentation is performed by a blinking pattern of an LED or a color of the LED.
The information processing apparatus according to (7) or (8).
(10)
The information processing apparatus includes a presentation control unit that performs control such that a presentation corresponding to information on handwriting of the first input device is executed by a presentation unit of the first input device.
The information processing apparatus according to any one of (1) to (6).
(11)
The processing unit associates the first light emission position information with the first identification information when the first light emission position information is acquired based on a captured image.
The information processing apparatus according to any one of (1) to (10).
(12)
The first input device is a pen-type input device.
The information processing apparatus according to any one of (1) to (11).
(13)
The processing unit associates the first light emission position information, the first identification information, and information on the handwriting of the first input device when the first light emission position information is acquired.
The information processing apparatus according to any one of (1) to (12).
(14)
The first light emitting unit is an infrared LED or a visible light LED.
The information processing apparatus according to any one of (1) to (13).
(15)
The first identification information is transmitted from the first input device when a predetermined operation is detected by a detection unit of the first input device.
The information processing apparatus according to any one of (1) to (14).
(16)
The detection unit includes a switch, a pressure-sensitive sensor, an electromagnetic induction sensor, or a capacitance sensor.
The information processing apparatus according to (15).
(17)
The first identification information is received from the first input device using Bluetooth (registered trademark), Wi-Fi (registered trademark), Zigbee (registered trademark), infrared communication, or ultrasonic communication.
The information processing apparatus according to any one of (1) to (16).
(18)
Performing control such that, when first identification information is received from a first input device, a first light emission instruction is output to the first input device; and
associating, by a processor, the first light emission position information with the first identification information when first light emission position information indicating a position at which light was emitted by a first light emitting unit of the first input device is acquired;
an information processing method including the above.
(19)
An information processing system having an input device and an information processing apparatus, wherein
the input device includes
a control unit that controls transmission of identification information, and
the control unit controls a light emitting unit to emit light when a light emission instruction is received; and
the information processing apparatus includes
a light emission control unit that performs control such that the light emission instruction is output when the identification information is received, and
a processing unit that associates light emission position information indicating a position at which light was emitted by the light emitting unit with the identification information when the light emission position information is acquired.
(20)
The information processing system includes an imaging device, and
the processing unit associates the light emission position information with the identification information when the light emission position information is acquired based on an image captured by the imaging device.
The information processing system according to (19).
100 Information processing apparatus
110 I/F unit
121 ID management unit
122 Communication management unit
123 Light emission recognition unit
124 Coordinate conversion unit
130 Control unit
131 Light emission control unit
132 Communication control unit
133 Processing unit
140 Storage unit
210 LED
220 Pen tip switch
230 CPU
240 Communication module
300 Camera
400 Output device
410 Projector
430 Tablet
440 Smartphone
Claims (20)
- 1. An information processing apparatus comprising: a light emission control unit that performs control such that, when first identification information is received from a first input device, a first light emission instruction is output to the first input device; and a processing unit that associates the first light emission position information with the first identification information when first light emission position information indicating a position at which light was emitted by a first light emitting unit of the first input device is acquired.
- 2. The information processing apparatus according to claim 1, wherein the processing unit associates the first light emission position information with the first identification information when the first light emission position information is acquired within a first time from the output of the first light emission instruction.
- 3. The information processing apparatus according to claim 2, wherein the light emission control unit performs control such that a light emission stop instruction is output to the first input device when the first light emission position information is not acquired within the first time from the output of the first light emission instruction.
- 4. The information processing apparatus according to claim 3, wherein the light emission control unit performs control such that, when second identification information is received from a second input device different from the first input device within a second time after the first identification information is received, a second light emission instruction is output to the second input device after the second time has elapsed since the reception of the first identification information.
- 5. The information processing apparatus according to claim 4, wherein the processing unit associates the second light emission position information with the second identification information when second light emission position information indicating a position at which light was emitted by a second light emitting unit of the second input device is acquired within a third time from the output of the second light emission instruction.
- 6. The information processing apparatus according to claim 1, wherein the processing unit performs control such that a light emission stop instruction is output to the first input device when, after the acquisition of the first light emission position information, the period during which the first light emission position information is not acquired continues beyond a fourth time.
- 7. The information processing apparatus according to claim 1, further comprising a presentation control unit that performs control such that a presentation corresponding to state information of the first input device is executed by a presentation unit of the first input device.
- 8. The information processing apparatus according to claim 7, wherein the state information includes at least one of: a state before the output of the first light emission instruction; a state after the output of the first light emission instruction and before the acquisition of the first light emission position information; and a state in which the first light emission position information and the first identification information are associated.
- 9. The information processing apparatus according to claim 7, wherein the presentation is performed by a blinking pattern of an LED or a color of the LED.
- 10. The information processing apparatus according to claim 1, further comprising a presentation control unit that performs control such that a presentation corresponding to information on handwriting of the first input device is executed by a presentation unit of the first input device.
- 11. The information processing apparatus according to claim 1, wherein the processing unit associates the first light emission position information with the first identification information when the first light emission position information is acquired based on a captured image.
- 12. The information processing apparatus according to claim 1, wherein the first input device is a pen-type input device.
- 13. The information processing apparatus according to claim 1, wherein the processing unit associates the first light emission position information, the first identification information, and information on the handwriting of the first input device when the first light emission position information is acquired.
- 14. The information processing apparatus according to claim 1, wherein the first light emitting unit is an infrared LED or a visible light LED.
- 15. The information processing apparatus according to claim 1, wherein the first identification information is transmitted from the first input device when a predetermined operation is detected by a detection unit of the first input device.
- 16. The information processing apparatus according to claim 15, wherein the detection unit includes a switch, a pressure-sensitive sensor, an electromagnetic induction sensor, or a capacitance sensor.
- 17. The information processing apparatus according to claim 1, wherein the first identification information is received from the first input device using Bluetooth (registered trademark), Wi-Fi (registered trademark), Zigbee (registered trademark), infrared communication, or ultrasonic communication.
- 18. An information processing method comprising: performing control such that, when first identification information is received from a first input device, a first light emission instruction is output to the first input device; and associating, by a processor, the first light emission position information with the first identification information when first light emission position information indicating a position at which light was emitted by a first light emitting unit of the first input device is acquired.
- 19. An information processing system comprising an input device and an information processing apparatus, wherein the input device includes a control unit that controls transmission of identification information, the control unit controls a light emitting unit to emit light when a light emission instruction is received, and the information processing apparatus includes: a light emission control unit that performs control such that the light emission instruction is output when the identification information is received; and a processing unit that associates light emission position information indicating a position at which light was emitted by the light emitting unit with the identification information when the light emission position information is acquired.
- 20. The information processing system according to claim 19, wherein the information processing system includes an imaging device, and the processing unit associates the light emission position information with the identification information when the light emission position information is acquired based on an image captured by the imaging device.
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/956,161 US11402932B2 (en) | 2017-12-27 | 2018-09-25 | Information processing device, information processing method, and information processing system |
CN201880082579.7A CN111512275B (zh) | 2017-12-27 | 2018-09-25 | 信息处理装置、信息处理方法和信息处理系统 |
EP18897540.3A EP3734422B1 (en) | 2017-12-27 | 2018-09-25 | Information processing device, information processing method, and information processing system |
CA3085906A CA3085906A1 (en) | 2017-12-27 | 2018-09-25 | Information processing device, information processing method, and information processing system |
JP2019562759A JP7230829B2 (ja) | 2017-12-27 | 2018-09-25 | 情報処理装置、情報処理方法および情報処理システム |
KR1020207016481A KR102511791B1 (ko) | 2017-12-27 | 2018-09-25 | 정보 처리 장치, 정보 처리 방법 및 정보 처리 시스템 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017250391 | 2017-12-27 | ||
JP2017-250391 | 2017-12-27 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2019130696A1 true WO2019130696A1 (ja) | 2019-07-04 |
Family
ID=67063454
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2018/035490 WO2019130696A1 (ja) | 2017-12-27 | 2018-09-25 | 情報処理装置、情報処理方法および情報処理システム |
Country Status (7)
Country | Link |
---|---|
US (1) | US11402932B2 (ja) |
EP (1) | EP3734422B1 (ja) |
JP (1) | JP7230829B2 (ja) |
KR (1) | KR102511791B1 (ja) |
CN (1) | CN111512275B (ja) |
CA (1) | CA3085906A1 (ja) |
WO (1) | WO2019130696A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11836300B2 (en) | 2020-01-09 | 2023-12-05 | Sony Group Corporation | Information processing apparatus and information processing method |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2023142263A (ja) * | 2022-03-24 | 2023-10-05 | 富士フイルム株式会社 | 指示装置、画像生成装置、描画システム、通信方法、及び通信プログラム |
JP2024043321A (ja) * | 2022-09-16 | 2024-03-29 | 株式会社東芝 | 軌跡入力システム |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006189706A (ja) * | 2005-01-07 | 2006-07-20 | Fujinon Corp | ライトペン |
JP2008077168A (ja) * | 2006-09-19 | 2008-04-03 | Fuji Xerox Co Ltd | 筆記情報処理システム、筆記情報生成装置およびプログラム |
JP2014230034A (ja) * | 2013-05-21 | 2014-12-08 | シャープ株式会社 | 電子情報機器およびその操作権制限方法 |
JP2016018455A (ja) | 2014-07-09 | 2016-02-01 | キヤノン株式会社 | 座標入力装置及びその制御方法、コンピュータプログラム |
JP2016066521A (ja) * | 2014-09-25 | 2016-04-28 | 国立研究開発法人産業技術総合研究所 | イベント用発光装置及び該装置を用いた情報処理システム |
JP2017098268A (ja) * | 2017-01-30 | 2017-06-01 | 株式会社東芝 | 同定装置 |
JP2018132799A (ja) * | 2017-02-13 | 2018-08-23 | Necディスプレイソリューションズ株式会社 | 電子黒板システム、電子ペン、表示装置及び電子ペン位置検出方法 |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101441539B (zh) | 2008-12-30 | 2013-06-12 | 华为终端有限公司 | 电子白板系统、输入装置、处理装置及处理方法 |
JP2011239319A (ja) * | 2010-05-13 | 2011-11-24 | Panasonic Corp | 遠隔指示送受信システム |
TW201349029A (zh) * | 2012-05-21 | 2013-12-01 | Everest Display Inc | 具光點辨識之互動投影系統以及控制方法 |
GB2508840B (en) | 2012-12-12 | 2015-10-21 | Modular Software Ltd T A Reflective Thinking | Method and apparatus for tracking the movement of a plurality of pointer devices within a scene |
JP2017117312A (ja) * | 2015-12-25 | 2017-06-29 | 株式会社リコー | 情報処理装置、情報入力システム、情報処理方法およびプログラム |
KR102451687B1 (ko) * | 2016-02-19 | 2022-10-07 | 삼성전자주식회사 | 디바이스 대 디바이스 방식을 지원하는 통신 시스템에서 위치 검출 장치 및 방법 |
US11258880B2 (en) * | 2019-06-24 | 2022-02-22 | Amazon Technologies, Inc. | Wearable device for controlling endpoint devices |
-
2018
- 2018-09-25 JP JP2019562759A patent/JP7230829B2/ja active Active
- 2018-09-25 WO PCT/JP2018/035490 patent/WO2019130696A1/ja unknown
- 2018-09-25 KR KR1020207016481A patent/KR102511791B1/ko active IP Right Grant
- 2018-09-25 US US16/956,161 patent/US11402932B2/en active Active
- 2018-09-25 CA CA3085906A patent/CA3085906A1/en active Pending
- 2018-09-25 EP EP18897540.3A patent/EP3734422B1/en active Active
- 2018-09-25 CN CN201880082579.7A patent/CN111512275B/zh active Active
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006189706A (ja) * | 2005-01-07 | 2006-07-20 | Fujinon Corp | ライトペン |
JP2008077168A (ja) * | 2006-09-19 | 2008-04-03 | Fuji Xerox Co Ltd | 筆記情報処理システム、筆記情報生成装置およびプログラム |
JP2014230034A (ja) * | 2013-05-21 | 2014-12-08 | シャープ株式会社 | 電子情報機器およびその操作権制限方法 |
JP2016018455A (ja) | 2014-07-09 | 2016-02-01 | キヤノン株式会社 | 座標入力装置及びその制御方法、コンピュータプログラム |
JP2016066521A (ja) * | 2014-09-25 | 2016-04-28 | 国立研究開発法人産業技術総合研究所 | イベント用発光装置及び該装置を用いた情報処理システム |
JP2017098268A (ja) * | 2017-01-30 | 2017-06-01 | 株式会社東芝 | 同定装置 |
JP2018132799A (ja) * | 2017-02-13 | 2018-08-23 | Necディスプレイソリューションズ株式会社 | 電子黒板システム、電子ペン、表示装置及び電子ペン位置検出方法 |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11836300B2 (en) | 2020-01-09 | 2023-12-05 | Sony Group Corporation | Information processing apparatus and information processing method |
Also Published As
Publication number | Publication date |
---|---|
JPWO2019130696A1 (ja) | 2020-12-17 |
EP3734422B1 (en) | 2023-11-01 |
US20210117015A1 (en) | 2021-04-22 |
EP3734422A1 (en) | 2020-11-04 |
CN111512275A (zh) | 2020-08-07 |
KR102511791B1 (ko) | 2023-03-21 |
EP3734422A4 (en) | 2021-01-27 |
US11402932B2 (en) | 2022-08-02 |
KR20200100056A (ko) | 2020-08-25 |
CA3085906A1 (en) | 2019-07-04 |
JP7230829B2 (ja) | 2023-03-01 |
CN111512275B (zh) | 2024-05-31 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107003739B (zh) | 对接*** | |
WO2019130696A1 (ja) | 情報処理装置、情報処理方法および情報処理システム | |
JP6968154B2 (ja) | 制御システムならびに制御処理方法および装置 | |
US8380246B2 (en) | Connecting mobile devices via interactive input medium | |
JP5826408B2 (ja) | ジェスチャー・コントロールのための方法、ジェスチャー・サーバ・デバイス、およびセンサ入力デバイス | |
US10096165B2 (en) | Technologies for virtual camera scene generation using physical object sensing | |
CN105850148A (zh) | 视频传输和显示*** | |
WO2014073346A1 (ja) | 情報処理装置、情報処理方法およびコンピュータ読み取り可能な記録媒体 | |
TWI557646B (zh) | 電子白板系統、電子書寫筆及電子書寫方法 | |
JP6000929B2 (ja) | 情報処理装置 | |
CN106537280A (zh) | 交互式镜子 | |
EP3103527B1 (en) | Information processing device and assignment method for input device | |
US11019162B2 (en) | System and method for provisioning a user interface for sharing | |
TW201349029A (zh) | 具光點辨識之互動投影系統以及控制方法 | |
CN106462251A (zh) | 显示控制设备、显示控制方法以及程序 | |
WO2018109876A1 (ja) | 表示装置、電子黒板システム及びユーザーインターフェース設定方法 | |
US10262278B2 (en) | Systems and methods for identification and interaction with electronic devices using an augmented reality device | |
KR102169626B1 (ko) | 입력 장치의 판별 기능을 갖춘 컴퓨터 입력 시스템 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 18897540 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2019562759 Country of ref document: JP Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 3085906 Country of ref document: CA |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2018897540 Country of ref document: EP Effective date: 20200727 |