WO2016072132A1 - Information processing device, information processing system, real object system, and information processing method - Google Patents

Information processing device, information processing system, real object system, and information processing method Download PDF

Info

Publication number
WO2016072132A1
Authority
WO
WIPO (PCT)
Prior art keywords
real object
information processing
real
processing apparatus
user
Prior art date
Application number
PCT/JP2015/074169
Other languages
French (fr)
Japanese (ja)
Inventor
平田 真一
アレクシー アンドレ
圭司 外川
直紀 沼口
洋 大澤
山岸 建
永塚 仁夫
Original Assignee
Sony Corporation
Sony Interactive Entertainment Inc.
Priority date
Filing date
Publication date
Application filed by Sony Corporation and Sony Interactive Entertainment Inc.
Publication of WO2016072132A1 publication Critical patent/WO2016072132A1/en

Links

Images

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F9/00 Games not otherwise provided for
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63H TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H30/00 Remote-control arrangements specially adapted for toys, e.g. for toy vehicles
    • A63H30/02 Electrical arrangements
    • A63H30/04 Electrical arrangements using wireless transmission
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions

Definitions

  • the present invention relates to information processing technology using an object in real space.
  • the above-mentioned conventional technology readily replaces simple things in the real world with varied and attractive things in a virtual world, or enables communication with equipment, by relying on image display on a display device as the main output means.
  • when the degree of dependence on image representation is reduced, however, the ways real-world objects can be used and the room for computers to exhibit their capabilities become limited, making such modes difficult to diversify.
  • Information processing technology using objects in the real world is very effective in that it is easy for the user to understand intuitively and easily obtain a sense of reality. Therefore, there is a need for a technique that can realize the same aspect with an object in the real world, even if it is less dependent on image representation.
  • the present invention has been made in view of such problems, and an object thereof is to provide an information processing technique that can realize various forms using real-world objects.
  • An aspect of the present invention relates to an information processing apparatus.
  • the information processing apparatus includes a state specifying unit that sequentially acquires image frames of a moving image captured by the imaging apparatus and detects, at predetermined time intervals, state information of real objects existing in the subject space by detecting their images in the image frames;
  • an information processing unit that, based on the state information detected by the state specifying unit, determines at predetermined time intervals and according to a predetermined rule the operation content of the real object to be controlled among the real objects;
  • and a real object control unit that controls the controlled real object so that it operates with the content determined by the information processing unit.
  • This information processing system includes an information processing device and a real object that moves under the control of the information processing device. The information processing device includes a state specifying unit that sequentially acquires image frames of a moving image captured by the imaging device and detects, at predetermined time intervals, state information of real objects existing in the subject space;
  • an information processing unit that, based on the state information detected by the state specifying unit, determines at predetermined time intervals and according to a predetermined rule the operation content of the real object to be controlled among the real objects;
  • and a real object control unit that controls the controlled real object so that it operates with the content determined by the information processing unit.
  • Still another aspect of the present invention relates to a real object system.
  • This real object system includes a plurality of real objects, each having a communication unit that receives a control signal from an information processing device and a drive unit that operates an actuator according to the received control signal, so that each real object operates based on the control signal.
  • one of the real objects operates based on a control signal reflecting a user operation performed via an input device connected to the information processing device, while another operates based on a control signal reflecting a movement determined by the information processing device.
  • Still another aspect of the present invention relates to an information processing method.
  • This information processing method includes sequentially acquiring image frames of a moving image captured by the imaging device; detecting, at predetermined time intervals, state information of real objects existing in the subject space by detecting their images in the image frames; determining, based on the detected state information and according to a predetermined rule, the operation content of the real object to be controlled; and controlling that real object so that it operates with the determined content.
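The cycle described by this method (acquire a frame, detect object states, decide the controlled object's motion by rule, transmit the control signal) can be sketched as a single function. The callables `detect`, `decide`, and `send` are hypothetical stand-ins for the state specifying unit, the information processing unit, and the real object control unit; none of these names appear in the patent.

```python
def run_cycle(frame, detect, decide, send):
    """One iteration of the method: image frame -> state information ->
    operation content -> control signal. All three callables are
    illustrative assumptions, not part of the original description."""
    state = detect(frame)    # detect real-object state from the frame
    action = decide(state)   # apply the predetermined rule to the state
    send(action)             # transmit the resulting control signal
    return state, action
```

In a real system this cycle would repeat at the predetermined time interval; here it is shown as one step so each stage is visible.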
  • as for the markers provided on the real objects of the present embodiment, one drawing shows, in a top view of a real object, a case where the light emission of a marker is used to distinguish the front and rear of the real object.
  • FIG. 1 shows a configuration example of an information processing system to which this embodiment can be applied.
  • the information processing system 1 includes the real objects 120a and 120b placed on the play field 20, the imaging device 12 that captures the space on the play field 20, and the information processing device 10 that moves at least one of the real objects 120a and 120b by performing predetermined information processing.
  • the system further includes an input device 14 that accepts user operations, a microphone 16 that acquires ambient sound, a display device 18 that displays images, and a speaker 19 that outputs sound, each connected to the information processing device 10. Note that the input device 14, the microphone 16, the display device 18, and the speaker 19 may be omitted from the information processing system 1 depending on the mode of implementation.
  • the play field 20 is a plane that defines an area serving as a reference for the information processing apparatus 10 to recognize the real objects 120a and 120b and specify the position coordinates thereof.
  • the play field 20 is not limited in material or shape as long as such a planar area can be defined, and may be paper, a board, cloth, a desktop, a game board, or the like. Alternatively, it may be an image or the like projected onto a desk or floor by a projector included as the display device 18.
  • the shape of the real objects 120a and 120b is not limited as long as they are objects existing in real space. That is, they may have a simple shape as illustrated, or a more complex one such as a doll, a miniature of a real-world object such as a minicar, a part thereof, or a game piece. Further, the size, material, color, number, and so on of the real objects 120a and 120b are not limited. Furthermore, each may be a structure that the user can assemble and disassemble, or a finished product. At least one of the real objects 120a and 120b establishes communication with the information processing apparatus 10 and includes an actuator driven by the transmitted control signal.
  • each of the real objects 120a and 120b includes wheels 122a and 122b, and is configured to self-propel by a motor rotating the axle according to a control signal from the information processing apparatus 10.
  • the information processing apparatus 10 moves the real object 120b based on a user operation via the input device 14, while the real object 120a is moved according to the position or movement of the real object 120b. Move.
  • the information processing apparatus 10 basically moves any real object (for example, the real object 120a) according to a predetermined rule corresponding to the state on the play field 20.
  • the real object for which the information processing apparatus 10 determines the movement is particularly referred to as “real object controlled by the information processing apparatus 10”, and is distinguished from the real object operated by the user via the input device 14.
  • the real object (for example, the real object 120b) other than the real object controlled by the information processing apparatus 10 may be any object that can be freely placed and moved by the user, and may not include an actuator inside. Even if an actuator is provided, the operation means is not limited to the input device 14 and may be directly operated using a dedicated controller or the like.
  • the object driven by the actuator is not limited to the wheel.
  • a gripper or an arm may be attached to a real object and the movable part thereof may be moved, or any of mechanisms that are controlled by a general robot or toy may be employed.
  • a light emitting element, a display, a speaker, a vibrator, and the like may be incorporated to operate them.
  • a plurality of mechanisms may be operated simultaneously. In any case, these mechanisms are controlled by the information processing apparatus 10.
  • the imaging device 12 is a general digital video camera having an image sensor such as a CCD (Charge-Coupled Device) or CMOS (Complementary Metal-Oxide-Semiconductor), and captures a moving image of the space on the play field 20 where the real objects 120a and 120b are placed. Alternatively, a camera that detects invisible light such as near-infrared light may be used, or such a camera may be combined with a general camera that detects visible light.
  • the frame data of the moving image is sequentially transmitted to the information processing apparatus 10 together with the photographing, and is used to acquire the position coordinates of the real objects 120a and 120b on the plane formed by the play field 20. Therefore, the imaging device 12 is preferably arranged so as to overlook the play field 20.
  • the position and angle of the imaging device 12 are not particularly limited.
  • For example, when the imaging device 12 is a stereo camera, the distance to each real object may be obtained from the parallax between the stereo images, the position coordinates in the camera coordinate system acquired using that distance, and the result converted into position coordinates in a world coordinate system having the play field 20 as its horizontal plane.
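The conversion just described, from a pixel plus a depth measurement into world coordinates on the play field, can be sketched under the assumption of a pinhole camera model with known intrinsics (`fx`, `fy`, `cx`, `cy`) and a known camera pose (`R`, `t`). All parameter names and values are illustrative assumptions, not values from the patent.

```python
import numpy as np

def camera_to_world(u, v, depth, fx, fy, cx, cy, R, t):
    """Back-project pixel (u, v) at a measured depth into camera
    coordinates with a pinhole model, then apply the camera pose
    (rotation R, translation t) to obtain coordinates in a world frame
    whose horizontal plane is the play field."""
    xc = (u - cx) * depth / fx          # pinhole back-projection, x
    yc = (v - cy) * depth / fy          # pinhole back-projection, y
    p_cam = np.array([xc, yc, depth])   # point in camera coordinates
    return R @ p_cam + t                # rigid transform into the world frame
```

With an identity rotation and zero translation, a point seen at the principal point simply sits `depth` units in front of the camera, which gives a quick sanity check on the math.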
  • the technology for acquiring the position of the subject in the depth direction using a stereo camera is widely known.
  • a viewpoint moving camera may be used instead of the stereo camera.
  • a device that irradiates reference light such as near infrared rays and detects the reflected light may be provided, and the positions of the real objects 120a and 120b may be specified by an existing method such as TOF (Time of Flight). Further, the positions of the real objects 120a and 120b may be specified by detecting the contact position using the upper surface of the play field 20 as a touch pad.
  • When TOF or a touch pad is used, the real objects 120a and 120b are distinguished by integrating the result with the color information of each image in the captured image. Further, once detected, the real objects 120a and 120b can be tracked using existing visual tracking technology, making subsequent position coordinate acquisition more efficient.
  • the information processing apparatus 10 acquires the positions and movements of the real objects on the play field 20, chiefly those other than the real object 120a that it controls, determines the movement of the real object 120a based on them, and operates the real object 120a by transmitting a control signal according to the determination result.
  • the information processing apparatus 10 may be a game device or a personal computer, and may implement an information processing function by loading a necessary application program. Further, as will be described later, the information processing apparatus 10 may establish communication with another information processing apparatus or server via a network and send and receive necessary information.
  • the movements of the real objects 120a and 120b are obtained by tracking the time change of their position coordinates using the moving image data captured by the imaging device 12 as described above.
  • Alternatively, for the real object 120b operated by the user, the time change of its position can be specified from the content of the user's subsequent operations on the input device 14.
  • the movements of the real objects 120a and 120b may be acquired by means other than the captured image, or accuracy may be improved by integrating information acquired by a plurality of means.
  • the input device 14 receives user operations such as start / end of processing and driving of the real object 120b, and inputs a signal representing the operation content to the information processing device 10.
  • the input device 14 may be any common input device such as a game controller, a keyboard, a mouse, a joystick, or a touch pad, or any combination thereof.
  • the microphone 16 acquires ambient sound, converts it into an electrical signal, and inputs it to the information processing apparatus 10.
  • the information processing apparatus 10 mainly recognizes the user's voice out of the voice signal acquired from the microphone 16 and reflects it in the movement of the real object 120a controlled by the information processing apparatus 10 itself.
  • the display device 18 displays an image generated by the information processing device 10.
  • In the illustrated example, the display device 18 is a projector, and an image is projected onto the play field 20.
  • the display device 18 may be a display such as a general television monitor, or both the projector and the display may be included in the configuration.
  • the play field 20 may be a display.
  • the speaker 19 may be a general speaker, a sounding device such as a buzzer, or a combination thereof, and outputs a predetermined sound or voice as a sound according to a request from the information processing device 10. Connection between the information processing apparatus 10 and other apparatuses may be wired or wireless, and may be via various networks. Alternatively, any two or more of the information processing device 10, the imaging device 12, the input device 14, the microphone 16, the display device 18, and the speaker 19 may be combined or integrally provided. Further, an external device such as a speaker may be connected to the information processing apparatus 10.
  • FIG. 2 shows an internal circuit configuration of the information processing apparatus 10.
  • the information processing apparatus 10 includes a CPU (Central Processing Unit) 22, a GPU (Graphics Processing Unit) 24, and a main memory 26. These are connected to each other via a bus 30.
  • An input / output interface 28 is further connected to the bus 30.
  • Connected to the input/output interface 28 are: a communication unit 32 including a peripheral device interface such as USB or IEEE 1394 and a wired or wireless LAN network interface; a storage unit 34 such as a hard disk drive or nonvolatile memory; an output unit 36 that outputs data to output devices such as the display device 18 and the speaker 19; an input unit 38 that inputs data from the imaging device 12, the input device 14, and the microphone 16; and a recording medium drive unit 40 that drives a removable recording medium such as a magnetic disk, optical disk, or semiconductor memory.
  • the CPU 22 controls processing and signal transmission in the components inside the information processing apparatus 10 by executing the operating system stored in the storage unit 34.
  • the CPU 22 also executes various programs read from the removable recording medium and loaded into the main memory 26 or downloaded via the communication unit 32.
  • the GPU 24 has a function of a geometry engine and a function of a rendering processor.
  • the GPU 24 performs a drawing process according to a drawing command from the CPU 22 and outputs it to the display device 18 as appropriate.
  • the main memory 26 is composed of RAM (Random Access Memory) and stores programs and data necessary for processing.
  • the communication unit 32 establishes communication with the real objects 120a and 120b including the actuator and transmits a control signal. In a mode in which sound is output or an image is displayed on the real object 120a or the like, those data are also transmitted. As will be described later, in a mode in which a sensor is provided on the real object 120a or the like, the communication unit 32 may receive a measurement value from the sensor from the real object 120a or the like. Further, the communication unit 32 may establish communication with the network as necessary, and send and receive necessary files and data to and from an external server or information processing apparatus.
  • FIG. 3 shows the configuration of the real objects 120a and 120b and the information processing apparatus 10 in detail.
  • each element described as a functional block for performing various processes can be configured, in terms of hardware, by a CPU (Central Processing Unit), memory, microprocessors, other LSIs, actuators, sensors, and the like, and in terms of software is realized by a program loaded into memory. Therefore, those skilled in the art will understand that these functional blocks can be realized in various forms by hardware only, software only, or a combination thereof, and are not limited to any one of them.
  • In the following, the real object controlled by the information processing apparatus 10 is the real object 120a, and the real object operated by the user via the input device 14 is the real object 120b.
  • the real object 120b may not be provided.
  • a real object that the user simply moves or places by hand may be a flat card, a doll without an internal structure, a game piece, a block, or an ornament. Even in this case, each image can be identified with high accuracy in the captured image by varying the exterior, such as with different colors, patterns, or shapes, or by printing a two-dimensional barcode on it.
  • the real objects 120a and 120b include drive units 106a and 106b that operate according to a control signal from the information processing apparatus 10, and communication units 108a and 108b that receive necessary control signals and data from the information processing apparatus 10, respectively.
  • the control signal received by the real object 120a is determined by the information processing apparatus 10, and the control signal received by the real object 120b reflects a user operation via the input device 14.
  • the drive units 106a and 106b include an actuator driven by a control signal from the information processing apparatus 10. As shown in FIG. 1, when the real objects 120a and 120b self-propel on the wheels 122a and 122b, the actuator rotates the axle or changes the steering angle.
  • the driving units 106a and 106b may include an actuator that generates a motion other than the wheel, a light emitting element, a display, a speaker, a vibrator, and the like and a mechanism that operates the actuator. These mechanisms are also operated by a control signal from the information processing apparatus 10 by utilizing an existing technology.
  • the communication units 108a and 108b receive the control signal transmitted from the information processing apparatus 10, and notify the respective driving units 106a and 106b.
  • the communication units 108a and 108b hold the individual identification information of their own real objects 120a and 120b in internal memory, and determine, based on the individual identification information transmitted together with a control signal, whether a control signal transmitted from the information processing apparatus 10 is addressed to their own real object.
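The ID check performed by the communication units can be sketched as follows. The packet layout (`id` and `command` fields) and the `drive` callback are assumptions made for illustration; the patent does not specify a wire format.

```python
def handle_packet(packet, my_id, drive):
    """Accept a control packet only when the individual identification
    transmitted with it matches this real object's own ID held in
    internal memory; otherwise ignore the packet."""
    if packet.get("id") != my_id:
        return False              # addressed to a different real object
    drive(packet["command"])      # hand the command to the drive unit
    return True
```

This is how a shared broadcast channel can carry commands for several real objects while each one acts only on its own.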
  • the real objects 120a and 120b may further include a sensor (not shown) that measures its own state.
  • By having each real object measure its own state, the positional relationship, changes in the shape of a real object, and the like may be determined more accurately.
  • the communication units 108a and 108b transmit the measurement value obtained by the sensor to the information processing apparatus 10 together with its individual identification information.
  • a rotary encoder and a steering angle sensor may be provided on the wheels 122a and 122b to specify the actual movement amount and movement direction.
  • a position sensor that acquires the absolute position of the real objects 120a and 120b, and a motion sensor such as an acceleration sensor, a gyro sensor, or a geomagnetic sensor may be provided.
  • Since the real objects 120a and 120b are basically moved by control signals from the information processing apparatus 10, acquiring actual measured values in this way enables feedback control that corrects any error.
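A minimal sketch of such feedback correction, assuming a simple proportional scheme: the commanded wheel speed is nudged by a fraction of the error between the target and the encoder-measured value. The gain and the speed units are arbitrary assumptions; the patent does not specify a control law.

```python
def corrected_speed(target, measured, gain=0.5):
    """Proportional feedback on wheel speed: adjust the command by a
    fraction of the discrepancy revealed by the rotary encoder."""
    error = target - measured      # error between command and measurement
    return target + gain * error   # corrected command for the next step
```

When the measurement matches the target the command passes through unchanged; when the wheel lags, the command is raised in proportion.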
  • a joint that can be bent and stretched by the user may be provided on the real objects 120a and 120b, and a potentiometer for specifying the angle may be introduced.
  • the information processing apparatus 10 includes a communication unit 50 that transmits control signals to the real objects 120a and 120b, a state specifying unit 52 that specifies the state on the play field 20 including the positional relationship of the real objects 120a and 120b, a real object information storage unit 54 that stores information on the real objects, an information processing unit 62 that performs information processing based on the specified state, and a real object control unit 60 that generates control signals as a result of user operations or of processing in the information processing unit 62.
  • the state specifying unit 52 sequentially acquires image frames of moving images from the imaging device 12 and analyzes them to specify the positions of the real objects 120a and 120b at predetermined time intervals.
  • Various techniques are widely known as techniques for detecting and tracking an image of an object by image analysis, and any of them may be adopted in the present embodiment.
  • A general method can likewise be used to identify a position in three-dimensional space by integrating image information from the captured image with position information in the depth direction obtained by stereo imaging, TOF, or the like.
  • Furthermore, the state specifying unit 52 may use the measured values transmitted from the real objects 120a and 120b to specify their movement, position, shape, posture, and the like in more detail.
  • the communication unit 50 receives the measurement values transmitted from the real objects 120a and 120b and supplies them to the state specifying unit 52.
  • For example, the real objects 120a and 120b may be provided with markers that emit light in different specific colors, or with two-dimensional barcodes.
  • the real objects 120a and 120b may be distinguished by other appearance features such as color, shape, pattern, size, and the like.
  • the real object information storage unit 54 stores information associating individual identification information of the real objects 120a and 120b with features on their appearance.
  • the state specifying unit 52 refers to the information and specifies individual identification information corresponding to the appearance feature of the image detected from the captured image, thereby distinguishing the real objects 120a and 120b at each position.
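The lookup the state specifying unit 52 performs against the real object information storage unit 54 amounts to mapping a detected appearance feature back to individual identification information. This sketch assumes a plain dictionary as the stored association; the actual storage format is not specified in the patent.

```python
def identify(feature, registry):
    """Return the individual identification info whose registered
    appearance feature (e.g. marker color) matches the detected one,
    or None when no registered real object matches."""
    for obj_id, registered in registry.items():
        if registered == feature:
            return obj_id
    return None
```

With such a mapping, each image detected in the frame can be tied to a specific real object before its position coordinates are passed on to the information processing unit 62.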
  • the individual identification information and the position coordinates of the real objects 120a and 120b are sequentially supplied to the information processing unit 62.
  • the information processing unit 62 performs the required information processing based on the positional relationship of the real objects 120a and 120b, user operations on the input device 14, the audio signal acquired by the microphone 16, and so on. For example, when realizing a game in which the real object 120b operated by the user competes with the real object 120a controlled by the information processing apparatus 10, the information processing unit 62 advances the game while determining the movement of the real object 120a, calculating the score, judging wins and losses, generating the display image, and determining the output sound. The game program, the rules for determining movement, the data necessary for image generation, and the like are stored in the scenario storage unit 56.
  • When using the audio signal acquired by the microphone 16, the information processing unit 62 is provided with a voice recognition function based on general techniques in order to detect predetermined keywords in the signal.
  • the information processing unit 62 outputs generated display image data to the display device 18, and decodes and outputs generated audio data to the speaker 19.
  • When the display device 18 is a projector that projects an image onto the play field 20, the image on the play field 20 can be changed according to the progress of the game, the positions of the real objects 120a and 120b, and so on. The image may include information such as the score.
  • the real object control unit 60 generates a control signal so that the real object 120a moves with the movement determined by the information processing unit 62. Further, it interprets the contents of the user operation on the input device 14 and generates a control signal for the real object 120b that is the operation target of the user. These control signals are associated with the individual identification information of each of the real objects 120a and 120b supplied from the information processing unit 62.
  • the communication unit 50 establishes communication with the real objects 120a and 120b, and transmits the control signal generated by the real object control unit 60. Specifically, the signal to be transmitted varies depending on the control method, and a technique generally used in the field of robot engineering or the like may be appropriately employed.
  • FIG. 4 shows an external configuration example of the real object 120a.
  • the real object 120a of this example has wheels 122a at the lower part of a rectangular parallelepiped main body, as illustrated.
  • a marker 126 that emits light of a predetermined color is provided at the top.
  • the state specifying unit 52 of the information processing apparatus 10 acquires the position coordinates of the real object 120a by detecting the image of the marker 126 in the captured image. That is, the center 129 of the marker 126 is set as the tracking point of the real object 120a.
  • the marker 126 is disposed at a position close to one side of the rectangle that forms the upper surface 124a of the real object 120a.
  • the side surface close to the marker 126 can be defined as the front of the real object, and recognition of forward and backward movement can be shared with the user.
  • the marker 126 is mounted with, for example, an infrared light emitting diode or a white light emitting diode.
  • the color may be switched by attaching a color filter cap to the surface of the white light emitting diode.
  • the marker 126 may incorporate a wavelength conversion element such as an up-conversion phosphor.
  • the wheels 122a are each provided with a motor 128 that rotates the axle.
  • the motor 128 corresponds to the drive unit 106a shown in FIG. 3 and is driven by a control signal transmitted from the information processing apparatus 10 to rotate the wheel 122a at the requested rotation speed and direction.
  • the wheels 122a may be provided with a mechanism for changing the steering angle by an actuator. This also produces movements such as changing the moving direction of the real object 120a or turning on the spot.
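One common way to realize both straight travel and turning on the spot with two independently driven wheels is differential drive; the sketch below assumes that scheme, which is an assumption on my part, not necessarily the mechanism of the embodiment. `v` is the desired forward speed, `omega` the desired turning rate, and `track_width` the distance between the wheels; units are arbitrary.

```python
def wheel_speeds(v, omega, track_width):
    """Differential-drive mixing: equal wheel speeds move straight,
    opposite speeds turn the body on the spot."""
    left = v - omega * track_width / 2.0
    right = v + omega * track_width / 2.0
    return left, right
```

Setting `v = 0` with a nonzero `omega` yields equal and opposite wheel speeds, i.e. the in-place turn attributed to the triangular real object 130c.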
  • the real object 120a may be provided with a switch for releasing the rotation of the wheel 122a from the motor 128 so that the user can run the real object 120a by hand.
  • A single real object may be assembled using the real object 120a shown in FIG. 4 as the minimum unit.
  • FIG. 5 shows a connection example of a plurality of real objects in a bottom view.
  • the real object 130a shown on the left side of the figure has a configuration in which two real objects 120a of the smallest unit are connected to the front and rear. In this case, the real object 130a travels in the longitudinal direction as indicated by the arrow.
  • the coupling is realized by, for example, a plate capable of fitting two real objects 120a, a cover covering the whole, a binding band, and the like.
  • Each real object 120a may be provided with a concavo-convex connection portion, a hook-and-loop fastener, a magnet, and the like to be connected. The same applies to other examples.
  • the real object 130b shown in the upper right of FIG. 5 has a configuration in which two minimum real objects 120a are connected to the left and right. In this case, as indicated by the arrow, the real object 130b travels in a direction perpendicular to the longitudinal direction.
  • the real object 130c shown in the lower right of the drawing has a configuration in which the smallest real object 120a is connected to each other so as to form a triangle inside. In this case, the real object 130c turns at the same place as shown by the arrow.
  • FIG. 6 is a top view showing a modification of the number and positions of markers provided on a real object.
  • the real object 120c shown on the left side of the figure includes two markers 126a and 126b.
  • the state specifying unit 52 of the information processing apparatus 10 acquires, as the tracking point of the real object 120c, the position of the midpoint 132 of the line segment connecting the markers 126a and 126b in the captured image. As in the example shown in FIG. 4, the front and rear can be distinguished by shifting the midpoint 132 from the center of the upper surface of the real object 120c.
  • A boom 134 may be provided on the real object 120c with the markers 126a and 126b arranged at both ends, so that their positions can be adjusted to remain within the field of view of the imaging device 12 regardless of the surrounding state.
  • the real object 120d shown on the right side of the figure includes three markers 126c, 126d, and 126e.
  • the state specifying unit 52 detects the images of the markers 126c, 126d, and 126e in the photographed image, and acquires the position of the center of gravity 136 of the triangle having the vertexes as the tracking points of the real object 120d.
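Both tracking-point rules above (the midpoint of two markers, and the centroid of the triangle formed by three) are the arithmetic mean of the detected marker positions, so one small function covers both cases. The coordinate representation is an assumption for illustration.

```python
def tracking_point(markers):
    """Tracking point from detected marker positions: the midpoint of
    two markers, or the centroid of the triangle formed by three."""
    n = len(markers)
    x = sum(m[0] for m in markers) / n
    y = sum(m[1] for m in markers) / n
    return (x, y)
```

The same function also degenerates correctly to a single marker, matching the single-marker case of FIG. 4.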
  • the front and rear can be distinguished by changing the number of markers before and after the real object 120d.
  • FIG. 7 shows, as another example of the marker provided on the real object, a top view of the real object when the light emission of the marker is used to distinguish the front and back of the real object.
  • the real object 120e includes markers 126f and 126g at both ends of a boom 134, as in the real object 120c of FIG. 6.
  • the boom 134 may be disposed at an arbitrary position such as the center of the upper surface of the real object 120e as illustrated.
  • the positions of the markers 126f and 126g may be freely arranged by the user.
  • the boom 134 may be omitted, and the markers 126f and 126g may be installed directly on the body of the real object 120e.
  • the plurality of markers 126f and 126g are arranged on a straight line from which the front-rear direction can be defined, such as a straight line connecting the left and right sides of the upper surface of the real object 120e.
  • because the left and right of the markers are determined naturally, the information processing apparatus 10 lights the left marker 126f according to that information.
  • the wheels are rotated in the two directions, forward and reverse, with one of the markers 126f and 126g lit, and the rotation direction in which the lit marker is on the left is identified from the movement of the marker in the captured image and determined to be the forward direction.
  • the state specifying unit 52 of the information processing apparatus 10 records the information thus determined in the real object information storage unit 54 in association with the individual identification information of the real object 120e, and uses it for subsequent real object control. As long as the user knows the above rule, there is no discrepancy with the control by the information processing apparatus 10.
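The rule above, that forward is the wheel-rotation direction in which the lit marker ends up on the left, can be sketched with a 2D cross product. The function names, the vector representation, and the trial format are assumptions for illustration, not interfaces from the specification.

```python
def lit_marker_on_left(travel_vec, other_to_lit_vec):
    """True when the lit marker lies to the left of the travel direction.

    Uses the sign of the z-component of the 2D cross product, with the
    usual mathematical orientation (y-axis pointing up).
    """
    dx, dy = travel_vec
    vx, vy = other_to_lit_vec
    return dx * vy - dy * vx > 0

def calibrate_forward(trials):
    """trials maps a wheel-rotation label to an observed pair
    (travel_vec, other_to_lit_vec); the label satisfying the
    lit-marker-on-the-left rule is treated as 'forward'."""
    for label, (travel, rel) in trials.items():
        if lit_marker_on_left(travel, rel):
            return label
    return None
```

Once calibrated, the chosen label would be recorded against the real object's individual identification information, as L215 describes.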
  • as with the real object 120c of FIG. 6, the tracking point is the midpoint 132 of the line segment connecting the markers 126f and 126g.
  • the markers 126f and 126g may be distinguished by being lit or flashing, by emitting visible or invisible light, or by differing emission colors. Note that the arrangements and numbers of markers shown in FIGS. 6 and 7 are examples and may be determined appropriately according to the size and shape of the real object. In addition, as described above, when a real object can be recognized by appearance features other than a marker, the marker need not be provided.
  • an intended world is constructed by regarding the real objects as a variety of objects, so that pretend play and games can be realized. That is, it is not necessary to change the appearance of a real object according to the world to be constructed. Even in such a case, for example, by lighting one real object's marker in red and the other real object's marker in white, the user can intuitively grasp the former as an ambulance and the latter as an ordinary car. At this time, if an effect such as blinking the red light is added, the user can experience an even greater sense of reality.
  • the user himself / herself can distinguish between the real object controlled by the information processing apparatus 10 and the real object operated by the user.
  • the information processing apparatus 10 may clearly indicate in advance which color of the real object is the target operated by the user.
  • the information processing apparatus 10 turns on the markers in their respective colors and moves one of the real objects slightly. If there is a rule that the moved real object is the one controlled by the information processing apparatus 10 (or the one operated by the user), the user can know which color of real object he or she operates and can start playing smoothly.
  • FIG. 8 is a flowchart illustrating a processing procedure in which the information processing apparatus 10 controls a real object according to the state on the play field 20. This flowchart is started when a user inputs a processing start request via the input device 14 or the like in a state where the real objects 120a and 120b are placed on the play field 20. Note that when the user operates the real object 120b via the input device 14, the processing for that is performed at an independent timing and is not included in the illustrated flowchart.
  • the state specifying unit 52 of the information processing device 10 requests the imaging device 12 to start shooting (S10).
  • when the imaging device 12 starts capturing a moving image in response, image frames are acquired at a predetermined frame rate and transmitted to the information processing device 10.
  • the period of the time step t may be the same as the imaging period of the image frame in the imaging device 12, or may be longer than that. That is, all image frames of a moving image may be processed, or processing may be performed while thinning out image frames every predetermined number.
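The thinning described here can be sketched as a generator that keeps every n-th image frame; the function and parameter names are hypothetical.

```python
def thin_frames(frames, every_n=1):
    """Yield every n-th image frame.

    every_n=1 processes all frames of the moving image; larger values
    make the period of the time step t longer than the imaging period.
    """
    for i, frame in enumerate(frames):
        if i % every_n == 0:
            yield frame
```

With `every_n=3`, for instance, only one of every three captured frames would be passed on for state analysis.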
  • the state specifying unit 52 analyzes each image frame and acquires state information such as the position coordinates of the real objects 120a and 120b on the plane of the play field 20 (S16).
  • the position coordinates on the play field 20 are obtained by detecting an image of each real object marker from the image frame and appropriately performing coordinate conversion based on the detected image.
  • the information acquired here may vary depending on the embodiment. As described above, the value measured by the sensor provided on the real object, the content of the user operation via the input device 14, the audio information acquired by the microphone 16 and the like may be further used. These pieces of information are reflected in subsequent information processing as needed.
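The coordinate conversion mentioned for S16 might, in the simplest case of a camera mounted squarely above the play field, be a linear mapping from detected pixel coordinates to play-field coordinates; a real setup would typically need a full projective (homography) transform. The calibration parameters below are hypothetical.

```python
def image_to_field(pixel, field_origin_px, px_per_unit):
    """Map a detected marker pixel to (x, y) on the play field plane.

    Assumes a purely scaled and translated overhead view (a simplifying
    assumption): field_origin_px is the pixel at the field origin, and
    px_per_unit is the number of pixels per field length unit.
    """
    px, py = pixel
    ox, oy = field_origin_px
    return ((px - ox) / px_per_unit, (py - oy) / px_per_unit)
```

A tilted camera would instead require estimating a homography from known reference points on the play field.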
  • rather than fixing a movement path in advance, if a movement in a direction that improves the situation is determined for each time step on the play field 20, a real object that responds flexibly to changes in its surroundings can be realized. Further, even in a complicated system with many real objects, the motion can be determined relatively easily by selecting the best direction step by step through statistical processing.
  • the information processing unit 62 may determine not only the moving direction of a real object, but also the amount and direction of change of a movable part such as a joint, the start of lighting of a light emitting element, an image to be displayed on a display built into the real object, or a sound to be generated from a speaker built into the real object. Further, an image to be newly displayed by the display device 18 such as a projector, or a sound effect to be generated from the speaker 19, may be determined. The real object control unit 60 then generates a control signal based on the determination result, adds image data and audio data as necessary, and transmits the control signal from the communication unit 50, thereby controlling the real object 120a (S20). At this time, the information processing unit 62 may output display image and audio data to the display device 18 and the speaker 19 as appropriate.
  • an image or light may be projected onto a real object by a projector connected to the information processing apparatus 10, or a sound that matches the movement may be generated by the speaker 19. Both may be performed simultaneously, or only one of them. In either case, it is possible to make it appear as if the real object itself is emitting light or speaking, realizing rich expression while suppressing the disadvantages described above.
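The per-time-step procedure of FIG. 8 (S10 to S20) can be sketched as a loop over acquired frames. The three callbacks are hypothetical interfaces standing in for the state specifying unit 52, the information processing unit 62, and the real object control unit 60; the specification does not define these signatures.

```python
def control_loop(frames, specify_state, decide, send_control):
    """One iteration per time step t.

    specify_state(frame) -> state: analyze the frame (S16).
    decide(t, state) -> command:  determine the next movement.
    send_control(command):        transmit the control signal (S20).
    """
    issued = []
    for t, frame in enumerate(frames):
        state = specify_state(frame)   # positions of real objects 120a/120b
        command = decide(t, state)     # e.g. the next moving direction
        send_control(command)          # wireless control signal
        issued.append(command)
    return issued
```

User operations via the input device 14 would run at an independent timing, as noted above, and are deliberately outside this loop.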
  • FIG. 9 shows a realization example of “competition / competition” among the aspects that can be realized in the present embodiment.
  • a real object 120a controlled by the information processing apparatus 10 and a real object 120b operated by the user are placed.
  • the play field 20 includes miniatures such as trees 140 and houses 142a and 142b. These are real objects that do not have a drive mechanism and can be placed freely by the user.
  • a river 144, a bridge 146, and sand 148 are displayed. These may be projected by a projector or drawn on paper or the like.
  • the information processing apparatus 10 acquires information on the position and range of those areas on the play field 20.
  • the information may be stored in the scenario storage unit 56. Even when the user draws on the spot, the information can be acquired by detecting from the captured image.
  • the user moves the real object 120b to play against or compete with the real object 120a controlled by the information processing apparatus 10.
  • Various rules can be considered in this case.
  • for example, the real objects may race to a predetermined goal such as the tree 140, or may compete by hitting each other directly.
  • the score may be lowered when hitting a placed real object such as the tree 140 or the houses 142a and 142b.
  • a laser irradiation means (not shown) may be provided on the real objects 120a and 120b so that each can be regarded as a gun, and a score may be given when a shot fired while aiming at the opponent's real object hits it.
  • the real objects 120a and 120b may have a shape such as a tank, for example. Further, when a traveling sound of a tank or the like is generated from the speaker 19 or the like in accordance with the movement of the real objects 120a and 120b, a more realistic feeling is given.
  • the information processing apparatus 10 restricts the movement of the real objects 120a and 120b in accordance with the river 144 and the sand 148 displayed on the play field 20.
  • if the user moves his or her real object 120b toward a part of the river 144 that the bridge 146 does not span, the real object is forced to stop at the edge of the river 144. That is, even if the input device 14 directs it toward the river 144, the information processing unit 62 controls the real object 120b so that its speed becomes zero at the river's edge. As a result, the user resumes the movement by backing up or changing the direction of the real object 120b.
  • in the area of the sand 148, the moving speed is reduced or direction changes become sluggish.
  • the information processing unit 62 controls the real object 120b by reducing the moving speed requested by the user to the input device 14 by a predetermined rate or reducing the response of the change in the steering angle.
  • the speed and direction are calculated so that the same restriction is imposed on the real object 120a controlled by the information processing apparatus 10.
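These terrain restrictions can be sketched as per-region speed factors applied identically to the user-requested speed and to the speed computed for the apparatus-controlled real object. The factor values are hypothetical: 0 stops a real object at the river's edge, 0.5 halves its speed on sand.

```python
# Hypothetical motion-limit amounts per region (cf. data 112 in FIG. 10).
SPEED_FACTOR = {"river": 0.0, "sand": 0.5}

def limited_speed(requested, region):
    """Scale a requested speed by the limit factor of the region the
    real object currently occupies; unlisted regions are unrestricted."""
    return requested * SPEED_FACTOR.get(region, 1.0)
```

Because the same function is applied to both real objects 120a and 120b, the restriction stays fair between the user and the information processing apparatus 10.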
  • the real objects 120a and 120b may be prevented from going out of the play field 20.
  • conversely, a region in which speed or direction change is accelerated may be provided.
  • a team battle may be made by placing a plurality of real objects 120a and 120b. For example, a plurality of users may each bring their own real objects 120b to form a team and play against the same number of real objects 120a controlled by the information processing apparatus 10.
  • FIG. 10 shows an example of data prepared in the scenario storage unit 56 in order to realize the form shown in FIG.
  • data of the image 110 to be projected is prepared in the scenario storage unit 56.
  • a river 144, a bridge 146, and a sand 148 are drawn on the image 110.
  • the information processing unit 62 reads the data of the image 110 from the scenario storage unit 56 at the start of the battle, and outputs the data to the projector that is the display device 18 for projection.
  • a deformation rule for the image 110 may also be stored in the scenario storage unit 56 so that the projected image changes over time. For example, by displaying an image in which the water of the river 144 overflows over time, or in which the bridge 146 collapses at the timing when the real object 120b crosses it, the real objects and the image are fused and a more thrilling battle can be realized.
  • the area of the river 144 and the sand 148 is acquired as described above, and the data 112 mapped to the plane corresponding to the play field 20 is prepared.
  • a motion limit amount in each region is set.
  • in the data 112, the dotted-line areas indicate the positions of the real objects acquired at each time step.
  • This data is obtained by converting and mapping the position of the image detected from the captured image by the state specifying unit 52 into the position coordinates on the plane of the play field 20 (xy plane in the drawing).
  • in the vicinity of each real object's area, the identification information "#T0" and "#T1" of the trees 140, the identification information "#H0" and "#H1" of the houses 142a and 142b, and the identification information "#C0" and "#C1" of the real objects 120a and 120b are also shown.
  • the state specifying unit 52 creates, for each time step, data associating the individual identification information of each real object with its position, and supplies it to the information processing unit 62.
  • the positions of the real objects 120a and 120b are changed by a user operation or control by the information processing apparatus 10.
  • the information processing unit 62 monitors whether or not they enter the area where the speed limit is imposed, and limits the speed by a set amount.
  • the position of the real object 120a controlled by the information processing apparatus 10 (identification information "#C0") is compared with the position of the real object 120b operated by the user (identification information "#C1"), and, for example, the optimum moving direction of the real object 120a at that time, such as one toward a position it can easily reach, is determined and the object is controlled accordingly.
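The step-by-step direction selection for the apparatus-controlled real object "#C0" relative to the user's "#C1" can be sketched as choosing, among candidate moves, the one that most reduces the distance while avoiding restricted positions. The grid layout and all names are assumptions for illustration.

```python
import math

def best_direction(own, target, candidates, blocked):
    """Return the candidate step (dx, dy) that minimizes the distance
    from own ("#C0") to target ("#C1") without entering a blocked
    position, or None if every candidate is blocked."""
    best, best_dist = None, math.inf
    for dx, dy in candidates:
        nxt = (own[0] + dx, own[1] + dy)
        if nxt in blocked:
            continue  # e.g. a river cell or an area outside the play field
        dist = math.hypot(nxt[0] - target[0], nxt[1] - target[1])
        if dist < best_dist:
            best, best_dist = (dx, dy), dist
    return best
```

Repeating this choice at every time step yields the kind of flexible, step-by-step direction selection described above, without planning a whole path in advance.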
  • the handling of the data related to the position information is the same in the forms described later.
  • FIG. 11 shows an implementation example of “cooperation / assistance” among the modes that can be realized in the present embodiment.
  • An annular road 150 and a pedestrian crossing 156 are displayed in the play field 20.
  • a real object 120a controlled by the information processing apparatus 10 and a real object 120b operated by the user are placed.
  • miniatures such as trees 152a and 152b and a signal 158 and dolls 154a and 154b are placed in the play field 20.
  • the trees 152a and 152b and the dolls 154a and 154b are real objects that do not have a driving mechanism, and can be freely placed by the user or moved by hand.
  • the signal 158 includes blue, yellow, and red light emitting diodes, and is configured so that the user can switch the light emission color via the input device 14 or the information processing device 10. In any case, since the actual control is performed by the information processing apparatus 10, the information processing apparatus 10 can naturally recognize the switching of the emission color.
  • the real objects 120a and 120b travel on the road 150 counterclockwise.
  • the user causes the real object 120b that he or she operates to travel ahead of the real object 120a controlled by the information processing apparatus 10, and removes from the road 150 objects that obstruct the traveling of the real object 120a.
  • in this way, the user "cooperates" so that the real object 120a can travel quickly.
  • in the illustrated example, the tree 152b has fallen and blocks the road 150, so the user pushes the tree 152b off the road 150 with the real object 120b. When the real object 120a controlled by the information processing apparatus 10 catches up with the real object 120b, it stops there until the tree 152b is removed.
  • the real object 120a controlled by the information processing apparatus 10 can be regarded as an ambulance and the real object 120b operated by the user as a private car, producing a situation such as citizens cooperating so that the ambulance can travel quickly.
  • each real object may be shaped like the corresponding vehicle. Further, a driving sound or a brake sound may be generated from the speaker 19 or the like in accordance with the movement of each real object.
  • the above-described "disturbance" mode may be realized by having the real object 120b operated by the user interfere so that the opposing team's real object 120a cannot pass.
  • by moving the real object 120a away from the disturbing real object 120b in a direction that minimizes the amount of retreat, the information processing apparatus 10 can realize a movement that advances while avoiding the interference.
  • a third real object, controlled by the information processing apparatus 10 or operated by another user, may be prepared, and a game may be played in which the real object 120b operated by the user obstructs the real object 120a controlled by the information processing apparatus 10 as it rushes at the third real object.
  • for example, if the third real object is an ambulance, the real object 120a controlled by the information processing apparatus 10 is a tank, and the real object 120b operated by the user is a private car, a situation can be produced in which the private car protects the ambulance from attack by the tank.
  • the child moves the dolls 154a and 154b by hand.
  • while the signal 158 is red, the real object 120a controlled by the information processing apparatus 10 continues to travel on the road 150; when the signal 158 turns blue, it stops before the pedestrian crossing 156 so that the child can move the doll 154b or the like across the pedestrian crossing 156.
  • if a doll tries to cross while the signal 158 is red, or to cross the road where there is no pedestrian crossing, the real object 120a may stop suddenly just before it, or a brake sound may be generated from the speaker 19, letting the child recognize that this is a dangerous act.
  • the target to which the real object 120a controlled by the information processing apparatus 10 reacts may be an object that can be held and moved by hand instead of the real object 120b having the same configuration.
  • Such a mode is particularly effective for a user who is difficult to operate with the input device 14 such as an infant.
  • the tree 152b may be removed by hand instead of with the real object 120b.
  • in the "education" mode, the user may operate the real object 120b instead of the dolls 154a and 154b and, by generating interactions with the real object 120a controlled by the information processing apparatus 10 while traveling on a more complicated road, be made to learn the traffic rules for automobiles.
  • the real object 120a controlled by the information processing apparatus 10 may be stopped in front of the dolls 154a and 154b on the roadside.
  • the user places a doll on the stopped real object 120a, which then departs and stops again in front of a house miniature placed elsewhere. This makes taxi play possible. With a rule that the real object 120a stops in front of bus-stop miniatures, bus play is possible by having dolls get on and off.
  • FIG. 12 shows an implementation example of “act” among the modes that can be realized in the present embodiment.
  • a plurality of real objects 120a controlled by the information processing apparatus 10 are placed in the play field 20, and the user appreciates playing a mass game with a unified movement.
  • since the position of each real object 120a is acquired from the captured image, even if the user places the real objects at arbitrary positions, they can return to the set formation and start the mass game.
  • likewise, even if the user moves a real object during the performance, it can return to the specified position and the mass game can continue.
  • data in which the time change of the position coordinates of each real object 120a is set is stored in the scenario storage unit 56.
  • since the initial formation and intermediate positions are defined, a real object 120a that deviates from them can be returned to the defined position.
  • the information processing unit 62 determines an optimal moving direction at each time step while referring to the setting of the position coordinates so as not to hinder the movement of the other real object 120a in the returning process.
  • the information processing apparatus 10 may add effects through the movement of light, such as applying a spotlight using a projector or a lighting device (not shown), or may play music from the real object 120a itself or the speaker 19.
  • the contents of the mass game and the music may be switched depending on the combination of the shape of the placed real object 120a and the color of the marker.
  • some change may occur depending on the user's behavior. For example, applause or cheering by the user at the end of the mass game may be treated as an encore request, and the mass game may be started again. In this case, the information processing unit 62 determines that an encore has been requested when the magnitude of the audio signal acquired by the microphone 16 exceeds a threshold value.
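The decision just described can be sketched as a peak-magnitude threshold on the audio signal from the microphone 16; the threshold value and function name are hypothetical.

```python
def encore_requested(samples, threshold=0.5):
    """True when applause or cheering pushes the peak signal magnitude
    past the threshold, which the information processing unit 62 would
    treat as a request to restart the mass game.

    samples: iterable of audio sample values (e.g. normalized to [-1, 1]).
    """
    return max(abs(s) for s in samples) > threshold
```

A production system would likely smooth over a window rather than take a raw peak, but the comparison against a threshold is the essential step.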
  • a mass game may proceed in accordance with the user's command, or the rhythm, tempo, and tune may be switched.
  • when the user presses a predetermined button of the input device 14 to a desired beat, the real object 120a is moved with a movement matching that beat.
  • alternatively, the user's voice counting the beat may be acquired by the microphone 16.
  • the imaging device 12 may also photograph the user actually conducting by hand or waving a marker imitating a baton, and the timing may be matched to it.
  • one of the plurality of real objects 120a in this example may be a real object operated by the user or a real object moved by the user's hand.
  • the information processing apparatus 10 controls the other real object 120a so as to move in accordance with the real object.
  • the other real objects 120a may move with the same movement as the real object moved by the user, or may move in a line following it. These correspond to the "imitation" forms described above.
  • alternatively, the other real objects may surround the real object moved by the user, or align themselves so that it is always at the center, following it as it moves.
  • the distance and positional relationship with the real object moved by the user, the distance and positional relationship between the other real objects 120a, and their temporal changes are set in advance and stored in the scenario storage unit 56. If the next moving direction is calculated for every time step for all the real objects 120a to be controlled, a movement that matches the real object that the user moves can be realized.
  • it is also possible to realize a game that challenges the user to perform a mass game according to a model, by mixing a real object operated by the user among a plurality of real objects 120a controlled by the information processing apparatus 10.
  • the information processing device 10 plays music from the speaker 19 and causes the display device 18 to display a moving image representing a model movement.
  • the user places the real object that he or she operates in one position of the initial formation, and operates it so as to follow the model.
  • the information processing unit 62 controls other real objects and detects an event in which the real object operated by the user deviates from the model based on the captured image. Then, a final score is calculated by deducting points according to the number of deviations and displayed on the display device 18 at the end of the mass game.
  • the information processing apparatus 10 may assist the operation. Specifically, if the actual movement of the real object 120b differs for some reason from the movement requested via the input device 14, fine adjustments are made to the control signal reflecting the operation content of the input device 14 so that the actual movement comes closer to the request. For example, when an excessive load is applied to the real object 120b, it may meander even though the user's operation directs it straight ahead. In such a case, the information processing unit 62 adjusts the rudder angle in a direction that suppresses the meandering so that the real object 120b can actually go straight.
  • the actual movement, determined from the change in the position of the real object in the photographed image or from measurement values of the sensors, is compared with the user's operation content input from the input device 14 to detect the difference between the request and the actual movement. The control signal is then adjusted so that this difference becomes small.
  • This form corresponds to the “auxiliary” described above.
  • the assist target is not limited to the traveling of the real object; the same applies to the movements of grippers and arms. Further, such adjustment can be performed at any time, whichever of the other modes is being implemented.
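The "auxiliary" adjustment above can be sketched as a proportional correction that shrinks the gap between the requested and observed motion. The gain value is a hypothetical tuning parameter, not something the specification prescribes.

```python
def assisted_command(requested, observed, gain=0.5):
    """Adjust a control value (e.g. a rudder angle) so that the
    difference between the user's request and the actual movement
    becomes small, as when suppressing meandering under load."""
    error = requested - observed
    return requested + gain * error
```

For example, if the user requests a straight heading (0 degrees) but the real object 120b drifts to -10 degrees, the command is nudged in the opposite direction to counter the drift.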
  • FIG. 13 shows an example of a mode realized when a function for grasping an object is added.
  • a real object 120a controlled by the information processing apparatus 10, a real object 120b operated by a user, and a plurality of blocks 162 are placed.
  • the block 162 may be a lump such as a synthetic resin having no mechanism inside.
  • each of the real objects 120a and 120b includes a gripper 160a or 160b for gripping the block 162.
  • the user operates the real object 120b via the input device 14 to bring it close to one of the blocks 162 and opens and closes the gripper 160b so as to grasp it. Then, the grasped block 162 is carried into the user's own area 164b displayed on the play field 20. Similarly, the information processing apparatus 10 controls the real object 120a to move blocks 162 to the other area 164a. The winner is the one who brings more blocks 162 into his or her own area. Thereby, the above-described "match" and "disturbance" modes can be realized.
  • the grippers 160a and 160b sandwich the block 162 from the left and right by opening and closing; in addition, their base portions may be movable in the vertical direction so that the block 162 can be lifted.
  • if a gripper is provided at the tip of an arm whose joint angles can be controlled, more complicated operations, such as lifting the block 162 to a high position or turning it upside down, may be performed.
  • a forklift mechanism for inserting and lifting a claw between the block 162 and the floor may be provided.
  • the shape of the block 162 may also be optimized as appropriate to facilitate carrying.
  • FIG. 14 shows another example of a mode realized when a function of grasping an object is added. This example realizes the above-mentioned form of “cooperation / assistance”.
  • a real object 120a having a gripper and a plurality of blocks 170 having various colors, which are controlled by the information processing apparatus 10, are placed.
  • the user 8 assembles the block 170 in the work area 172 provided in the play field 20. As shown in the figure, they may be assembled three-dimensionally or arranged in a plane. In the former case, it may be simply placed like a building block, or the blocks 170 may be assembled as a mutually connectable structure.
  • the user 8 specifies the color of the next necessary block by voice during assembly.
  • in the illustrated example, "RED!" is designated.
  • the real object 120a searches for the block of the designated color among the blocks 170 in the area other than the work area 172 on the play field 20, and carries it close to the user. If the work area 172 is fixed, it can be estimated that the user 8 is in the vicinity thereof.
  • the information processing unit 62 recognizes the designated color based on the audio signal acquired by the microphone 16 and controls the block 170 of that color to be grasped by the real object 120a and carried to the estimated position.
  • the position of the user 8 may instead be detected by providing a rule that the user 8 stays within the field of view of the imaging device 12. In this case, even if the work area 172 is not clearly defined, carrying blocks 170 placed far from the detected position of the user 8 avoids carrying away blocks that are being assembled.
  • the designation of a block by voice is not limited to color; any attribute such as size or shape, or a combination thereof, may be used. In this form, the user can assemble the blocks as he or she likes while the real object 120a carries the requested blocks, so the work proceeds efficiently. Since the hands are occupied with assembly, realizing block designation by voice means that the designation itself does not reduce work efficiency.
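Selecting which block to carry after a voice designation can be sketched as filtering the detected blocks by the recognized attribute and excluding anything inside the work area 172. The data layout (color/position tuples, rectangular work area) is an assumption for illustration.

```python
def block_to_fetch(wanted_color, blocks, work_area):
    """Pick a block of the designated color outside the work area 172.

    blocks:    list of (color, (x, y)) detected on the play field 20.
    work_area: rectangle (x0, y0, x1, y1) where the user 8 assembles.
    Returns the position of a suitable block, or None if none remains.
    """
    x0, y0, x1, y1 = work_area
    for color, (x, y) in blocks:
        inside = x0 <= x <= x1 and y0 <= y <= y1
        if color == wanted_color and not inside:
            return (x, y)
    return None
```

The chosen position would then be the target the real object 120a drives to, grasps, and carries toward the user's estimated location.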
  • a model selected from a completed model prepared in advance may be assembled.
  • the user selects an object to be assembled from the models displayed on the paper or the display device, and designates it using the input device 14.
  • the scenario storage unit 56 of the information processing apparatus 10 stores an assembly order and block identification information used for each model. Accordingly, it is possible to realize a mode in which the real object 120a selects and carries the next necessary block by determination by the information processing apparatus 10 without designating the block by the user. At this time, information relating to how the carried blocks are connected may be displayed on the display device 18 or projected onto the play field 20.
  • the real object 120a may be configured to sort the plurality of blocks 170 according to a predetermined standard such as color, shape, and size.
  • the information processing unit 62 first specifies the number of types of blocks 170 on the play field 20, for example the number of colors, based on the captured image, and sets the corresponding number of areas on the play field 20. Then, by repeating the control of carrying a block 170 grasped by the real object 120a to the area corresponding to its attribute, such as color, a group of blocks 170 is formed for each type. This makes it much easier to find a desired block 170, especially when there are a large number of blocks 170.
  • the sorting operation may be performed in parallel while the user 8 assembles the blocks 170, or not only during assembly but also, for example, during the play described later. Further, the real object 120a may tidy up by collecting scattered blocks 170 in one place regardless of attribute.
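The sorting behaviour can be sketched as counting the colors present, assigning one area per color, and planning a carry order for each block. The area coordinates, data layout, and function names are hypothetical.

```python
def plan_sorting(blocks, candidate_areas):
    """Plan sorting carries for the real object 120a.

    blocks:          list of (color, (x, y)) detected on the play field.
    candidate_areas: available destination areas, one used per color.
    Returns (block_position, destination_area) pairs, one per block.
    """
    # Specify the number of types (here: colors) actually present.
    colors = sorted({color for color, _ in blocks})
    if len(colors) > len(candidate_areas):
        raise ValueError("not enough areas for the colors present")
    area_of = dict(zip(colors, candidate_areas))  # one area per color
    return [(pos, area_of[color]) for color, pos in blocks]
```

Executing the resulting carry orders one by one reproduces the repeated grasp-and-carry control described above.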
  • a work area for the real object 120a may be provided, and the block 170 may be assembled in parallel with the real object 120a.
  • for example, the user and the real object 120a may separately assemble the same model selected by the user. If the user 8 assembles while using the work of the real object 120a as a model, it also serves as a demonstration of the assembly procedure.
  • the completed form may be displayed on the display device 18 and simultaneously assembled to compete for the speed to completion. You may assemble one thing in cooperation in one work area.
  • the blocks 170 need only be placed at the position when arranged in a plane, but when the three-dimensional assembly is performed, the above-described arm or crane is provided on the real object 120a. Thus, the block 170 can be lifted to a high position.
  • FIG. 15 is a diagram for explaining an example of a block connection method in a mode in which a real object assembles a block. This figure is a side view of the process of connecting the blocks by fitting the cylindrical concave portions 176 provided in the block 170b into the columnar convex portions 174 provided in the block 170a.
  • the diameter r2 of the concave portion 176 of the block 170b is set to be larger than the diameter r1 of the convex portion 174 of the block 170a.
  • after the fitting, as in state (c), the information processing unit 62 causes the concave portion 176 of the block 170b to tighten around the convex portion 174 by means of an internal mechanism.
  • the force required of the real object 120a in the fitting operation can thus be equivalent to lifting the block 170b and placing it on the block 170a, while the internal mechanism strengthens the connection, so the assembled object can be prevented from collapsing even when tilted.
  • the concave portion 176 has a clamp structure in which at least a part of the inner wall is narrowed by, for example, rotation of a screw.
  • The information processing unit 62 transmits to the block 170b a control signal for operating the actuator that rotates the screw; tightening the screw clamps the convex portion 174 of the block 170a.
  • To remove the connection, the screw may be loosened to shift from state (c) back to state (b).
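The tighten-and-loosen sequence described above can be sketched as a small state machine. This is a minimal illustrative sketch; the state names and method names are assumptions for illustration, not taken from the patent:

```python
# Hypothetical sketch of the clamp control described above.

class ClampBlock:
    """Models the three states of FIG. 15: 'apart', loosely 'fitted'
    (the recess diameter r2 exceeds the boss diameter r1, so fitting
    takes no extra force), and 'clamped'."""

    def __init__(self):
        self.state = "apart"

    def fit(self):
        # Placing block 170b on block 170a; the oversized recess means
        # this needs no more force than lifting and placing the block.
        if self.state == "apart":
            self.state = "fitted"

    def tighten(self):
        # A control signal drives the screw actuator; the inner wall of
        # the recess narrows and clamps the boss of block 170a.
        if self.state == "fitted":
            self.state = "clamped"

    def loosen(self):
        # Loosening the screw returns to the loosely fitted state,
        # after which the blocks can simply be lifted apart.
        if self.state == "clamped":
            self.state = "fitted"
```

The point of the design is that the expensive part of the joint (holding force) is supplied by the block's internal mechanism, not by the arm of the real object 120a.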
  • The connection method is not limited to this; an air suction mechanism may be provided in the connection portion, and the surfaces may be joined by forming a vacuum at the connection surface under the control of the information processing apparatus 10.
  • Connection may also be made using an adhesive surface such as a hook-and-loop fastener; for removal, an eject pin may be protruded under the control of the information processing apparatus 10 to peel the surfaces apart.
  • Two gripper arms may be provided on the real object 120a so that it does not lose stability due to the reaction force at the time of connection: if both blocks 170a and 170b to be connected are held while being joined, little reaction force arises. Alternatively, if the block 170a in contact with the play field 20 can be connected to it, the connection-destination block 170a is fixed when the block 170b is joined, and the real object 120a is less likely to lose stability.
  • the mode of using the voice command in the environment as shown in FIG. 14 can be used not only for assembling the block but also for playing.
  • the block 170 is likened to fruits and vegetables such as bananas, apples and melons depending on the color.
  • the real object 120a carries the yellow block 170 to the vicinity of the user 8.
  • a sound such as “Yes, please” is generated from the real object 120a or the speaker 19 at the same time, thereby making it possible to produce a situation such as shopping at a fruit and vegetable store.
  • the scenario storage unit 56 stores the color of the block 170 in association with keywords such as “banana”, “apple”, and “melon”.
  • the information processing unit 62 detects a keyword from the audio signal acquired by the microphone 16, the information processing unit 62 controls the real object 120a so as to carry the block 170 of the corresponding color.
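The keyword-to-color lookup described above can be sketched as follows. This is an illustrative sketch only: the dictionary contents and function name are assumptions, and real keyword detection would operate on the output of a speech recognizer fed by the microphone 16:

```python
# Hypothetical sketch of the scenario data linking keywords to block
# colors, as stored in the scenario storage unit 56.
SCENARIO = {"banana": "yellow", "apple": "red", "melon": "green"}

def block_color_for(utterance):
    """Return the block color for the first scenario keyword found in
    the recognized utterance, or None if no keyword matches."""
    for keyword, color in SCENARIO.items():
        if keyword in utterance.lower():
            return color
    return None
```

When a keyword is found, the information processing unit 62 would then control the real object 120a to carry a block of the returned color toward the user.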
  • the block 170 may be shaped like an actual fruit or vegetable.
  • the blocks may be associated with rough classifications such as fruits and vegetables, meat, fish, etc. and produced as if shopping at different stores such as fruit and vegetable stores, butchers, and fish stores.
  • the area of each store may be formed on the play field 20 by separating the blocks according to the color of the real object 120a as described above. Further, by placing a block that the user 8 regards as money on the play field 20, the real object 120 a may take it away and make it more like shopping.
  • A block provided by the user 8 may be regarded as laundry, and the real object 120a may take it away in response to a voice such as "Please take this to the cleaners".
  • In this case, a keyword such as "take this" is stored in the scenario storage unit 56 in association with a movement rule of grabbing blocks placed in the vicinity of the user 8 and gathering them in a predetermined area on the play field 20.
  • a block may be regarded as an animal or a character, or a figure having such a shape may be used instead of the block 170.
  • FIG. 16 shows still another example of a mode realized when a function of grasping an object is added.
  • This example realizes the above-described "match" form
  • the play field 20 is a board game board.
  • On it, a real object 120a having a gripper, which is controlled by the information processing apparatus 10, and pieces 180 and 182 used for the board game are placed.
  • the pieces 180 and 182 may be a lump of synthetic resin or the like having no mechanism inside.
  • the shapes of the pieces 180 and 182 and the grids of the board are not limited to those shown in the drawings, and may be determined as appropriate depending on the game.
  • the game played here may be a general game such as chess, shogi, and reversi.
  • The real object 120a moves its piece 180 using the gripper, and the user 8 moves his or her piece 182 by hand.
  • The information processing unit 62 determines which piece 180 the real object 120a should move and its destination according to the game situation, including the position of the piece 182 moved by the user.
  • Such a strategy itself may be the same processing as a program in a conventional computer game.
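As an illustration of how such a strategy can reuse conventional game-program logic, a greedy move chooser for a reversi-like game (picking the placement that flips the most opponent pieces) might look like this. The board representation (1 for the controlled side, -1 for the opponent, 0 empty) is an assumption for illustration:

```python
# Hypothetical sketch: greedy move selection for a reversi-like game.
DIRS = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
        (0, 1), (1, -1), (1, 0), (1, 1)]

def flips_for(board, row, col, player):
    """Count opponent pieces flipped by placing `player` at (row, col)."""
    if board[row][col] != 0:
        return 0
    n = len(board)
    total = 0
    for dr, dc in DIRS:
        r, c, run = row + dr, col + dc, 0
        while 0 <= r < n and 0 <= c < n and board[r][c] == -player:
            run += 1
            r += dr
            c += dc
        if run and 0 <= r < n and 0 <= c < n and board[r][c] == player:
            total += run
    return total

def choose_move(board, player):
    """Return the (row, col) maximizing immediate flips, or None."""
    best, best_flips = None, 0
    for r in range(len(board)):
        for c in range(len(board)):
            f = flips_for(board, r, c, player)
            if f > best_flips:
                best, best_flips = (r, c), f
    return best
```

The chosen square would then be passed to the gripper control so that the real object 120a physically places its piece there.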
  • The real object 120a may instead play an auxiliary role while two users compete against each other. For example, in chess or shogi, when one user captures an opponent's piece, the real object carries the piece to that user; in the case of reversi, it flips pieces over.
  • In this case, the scenario storage unit 56 stores the rules of each game, in particular the rules by which each move changes other pieces. This aspect is not limited to board games and can similarly be realized in card games and sugoroku games.
  • a battle game may be realized in which the pieces 180 and 182 are character figures and attacked by hitting the opponent figure.
  • By grasping or pushing its own figure, the real object 120a dodges blows from, or strikes at, the figure of the user 8.
  • A real object having a gripper is also placed on the user side, and the user operates it to fight back.
  • Further, a third real object controlled by the information processing apparatus 10 may be introduced to bring the weapon to be held by a figure from the place where weapons are kept. At this time, the user may be able to designate the weapon by voice, as in the case shown in FIG. 14.
  • FIG. 17 shows an example of a mode realized when a function for holding a pen is added to a real object.
  • a real object 120a controlled by the information processing apparatus 10 and a real object 120b operated by the user are placed.
  • Each of the real objects 120a and 120b includes a holding mechanism 190a, 190b that can hold a pen pointing downward.
  • the holding mechanisms 190 a and 190 b are configured to be able to change the pen height so that the pen tip can be brought into contact with or separated from the play field 20 by a control signal from the information processing apparatus 10. Then, when the real objects 120a and 120b move while the pen tip is in contact, a line drawing corresponding to the movement is created on the play field 20.
  • When the user draws a line drawing by operating the real object 120b, the real object 120a controlled by the information processing apparatus 10 draws so as to complement it and complete the picture.
  • For example, the information processing apparatus 10 may decorate the user's original line drawing according to a predetermined rule, or a completed sample picture may be displayed on the display device 18 and both may draw toward that goal.
  • a rule regarding how to decorate the user's line drawing is stored in the scenario storage unit 56.
  • In the latter case, the completed form of the picture is stored in the scenario storage unit 56, and by comparing it with the drawing progress at each time step, the real object 120a is made, under the control of the information processing apparatus 10, to draw the missing portions.
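The comparison of stored completed picture against observed progress can be sketched as a simple set difference. This is an illustrative sketch under an assumed representation: here a picture is a list of line segments, each a pair of endpoint coordinates; a real system would extract the drawn strokes from the captured image first:

```python
# Hypothetical sketch of the "find what is still missing" comparison.

def missing_segments(completed, drawn):
    """Return the segments of the stored completed picture that have
    not yet been observed on the play field, in drawing order."""
    return [seg for seg in completed if seg not in drawn]
```

At each time step the information processing unit 62 could assign the first missing segment to the real object 120a as its next pen stroke.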
  • the play field 20 may be a replaceable paper or a surface-treated plate that can be redrawn by erasing line drawings.
  • the above is a form of “cooperation / assistance” using a pen.
  • A form of "acting" may also be realized in which only the real object 120a controlled by the information processing apparatus 10 draws.
  • the user designates an object to be drawn by voice or an operation via the input device 14.
  • the information processing unit 62 reads out a picture associated with the designated object from the scenario storage unit 56 and draws the picture as it is, thereby realizing an interaction with the user.
  • the user may give points to the picture drawn by the real object 120a or correct the picture. At this time, the user may directly write the score and correction to the play field 20 with a pen held in his hand.
  • The information processing apparatus 10 acquires the score and the corrected portions from the photographed image and learns from them. For example, a picture whose score falls below a threshold is not drawn again, as it is not to the user's taste, while corrections made by the user are reproduced the next time the picture is drawn. To this end, the information processing unit 62 associates the user's score and the corrected picture data with the completed picture data stored in the scenario storage unit 56.
  • Instead of drawing with an actual pen, a projector may be used to project an image in which lines are drawn along the paths the real objects 120a and 120b have passed. This eliminates the need to replace paper or erase lines, and the resulting picture can easily be stored electronically.
  • FIG. 18 shows an example of a mode in which an interaction is generated between the real object and the user's body.
  • a real object 120a controlled by the information processing apparatus 10 is placed. Then, the real object 120 a self-runs so as to bounce off three sides of the four sides of the edge of the play field 20. The user protects the remaining side with his / her hand 196. That is, the real object 120a moves like a ball of a pinball, but by hitting it back with the hand 196, a "matching / competition" form is realized.
  • The information processing unit 62 physically calculates the movement and rebound of a ball on the assumption that the play field 20 is inclined at a predetermined angle, and moves the real object 120a accordingly.
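The per-time-step ball calculation described above can be sketched as follows. The field size, tilt-induced acceleration, and coefficient of restitution are illustrative assumptions; the bottom edge is left open, corresponding to the side the user defends with the hand 196:

```python
# Hypothetical sketch of one physics time step for the pinball-like mode.
FIELD_W, FIELD_H = 1.0, 1.0
GRAVITY = (0.0, -0.49)  # assumed acceleration component of a tilted field

def step(pos, vel, dt=0.02, restitution=0.9):
    """Advance position and velocity one time step, reflecting off the
    left, right, and top walls; the bottom edge stays open."""
    vx = vel[0] + GRAVITY[0] * dt
    vy = vel[1] + GRAVITY[1] * dt
    x = pos[0] + vx * dt
    y = pos[1] + vy * dt
    if x < 0.0:                      # left wall
        x, vx = -x, -vx * restitution
    elif x > FIELD_W:                # right wall
        x, vx = 2 * FIELD_W - x, -vx * restitution
    if y > FIELD_H:                  # top wall
        y, vy = 2 * FIELD_H - y, -vy * restitution
    return (x, y), (vx, vy)
```

The real object 120a is then driven to the position computed for each step, so that it behaves like a rolling, bouncing ball.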
  • the state specifying unit 52 detects the image of the hand 196 from the captured image and tracks it with an existing method.
  • The information processing unit 62 can also reproduce with the real object 120a the manner in which a ball would rebound off the moving hand. Even if the user does not actually hit the real object 120a back, if the hand 196 reaches it before it crosses the side, the hit is judged successful and the real object 120a is controlled to return toward the opposite side; if the hand does not reach it, the real object 120a goes out of the play field 20 and the user loses.
  • the inside of the play field 20 may be surrounded by a block, and the real object 120a may be moved so as to bounce off the block.
  • Alternatively, blocks may be placed inside the field as rebound points. Even if the user places the blocks in arbitrary positions or shapes, the position coordinates of each block can be determined from the captured image, so the speed and moving direction of the real object 120a can be determined at each time step by simple physical calculation. If a material is assumed according to the color of a block, a change in the coefficient of restitution can be reflected in the speed calculation. A rebound sound may also be generated from the speaker 19 when the object bounces.
  • a “competition” mode in which the hand 196 follows the real object 120a may be realized.
  • a plurality of real objects 120a may be placed so as to escape all at once according to the movement of the hand 196.
  • the information processing unit 62 determines the moving direction of each real object 120a so as to move away from the hand based on the position information of the hand at each time step. At this time, if the moving direction is determined by generating random numbers or the like so that the directions are dispersed as much as possible, it becomes difficult to catch and the game performance is further improved.
  • Rather than actually grabbing the real object 120a, the user may be deemed to have caught it by driving it into a predetermined area of the play field 20.
  • By making the real object 120a into the shape of a living creature such as a sheep, a mouse, or a fish, or by generating a cry from the speaker 19, a more realistic feeling can be given.
  • FIG. 19 shows an example of a “cooperation” or “act” mode in which ball rolling is included in the movement of a real object.
  • a device such as a Rube Goldberg machine is assumed in which other objects and devices move so as to respond to the rolling of the ball 200.
  • On the play field 20, several real objects 120a controlled by the information processing apparatus 10 and some real objects 120b operated by the user are placed.
  • In addition, an approximate course along which the ball 200 rolls is formed by the user with blocks 202.
  • Real objects 120m, 120n, and 120p are further placed as real objects controlled by the information processing apparatus 10.
  • the real object 120m is composed of a quadrangular prism main body and a slide.
  • The quadrangular prism is provided with holes 204 and 206 at its lower and upper parts, and a mechanism that raises and lowers a platform inside its cavity like an elevator.
  • the actuator that moves the table up and down is driven by a control signal from the information processing apparatus 10.
  • the real objects 120n and 120p are provided with a mechanism that causes the whole to emit light or generate a sound effect according to a control signal from the information processing apparatus 10.
  • the real objects 120n and 120p may appear to emit light by projecting light from the projector, or sound effects may be generated from the speaker 19.
  • First, the real object 120a controlled by the information processing apparatus 10 puts the ball 200 carried by its gripper into the lower hole 204 of the real object 120m.
  • The platform on which the ball 200 rests rises inside the quadrangular prism of the real object 120m, and when the platform tilts, the ball 200 exits from the upper hole 206.
  • the ball 200 rolls on the slide and collides with the real object 120n.
  • the information processing apparatus 10 reacts by causing the real object 120n to emit light or generate sound.
  • the ball 200 that bounces and rolls off from the real object 120n is guided to a desired direction by the user operating the real object 120b to bounce or push it.
  • Finally, the real object 120a controlled by the information processing apparatus 10 bounces or pushes the ball 200 into the opening of the real object 120p, which serves as the goal.
  • To do so, the information processing apparatus 10 tracks the position of the ball 200 based on the captured image and moves the real object 120a to a position from which it can bounce the ball in the direction of the real object 120p, or pushes the ball 200 with the real object 120a. It further detects from the photographed image that the ball 200 has entered the opening, and causes the real object 120p to emit light or sound to celebrate the goal moment. As described above, in the present embodiment, changes in the field are detected based on the captured image, so various functions of real objects can be activated in accordance with the movement of a ball that is not itself controlled.
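The image-driven reactions in this course can be sketched as a per-frame event classifier. The positions, detection radius, and event names below are illustrative assumptions; a real system would obtain the ball position by tracking it in each captured frame:

```python
# Hypothetical sketch: map an observed ball position to the reaction
# for this time step (light/sound on 120n at a collision, goal effect
# when the ball enters the opening of 120p).

def classify_frame(ball_pos, bumper_pos, goal_pos, radius=0.05):
    """Return the event name for one observed frame."""
    def near(a, b):
        return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2 <= radius ** 2
    if near(ball_pos, goal_pos):
        return "goal"        # make 120p emit light and sound
    if near(ball_pos, bumper_pos):
        return "bumper_hit"  # make 120n emit light and sound
    return "rolling"
```

Because the events come from the captured image rather than from the ball itself, the same loop works for any uncontrolled object rolling through the course.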
  • the scenario storage unit 56 stores the rules related to the movement of the ball 200 as described above and the movement of the real object according to the movement.
  • FIG. 20 is a diagram for explaining a mode in which a plurality of users participate in one game using a network.
  • the information processing systems 1a, 1b, and 1c are similarly constructed in three places.
  • the information processing apparatuses 10a, 10b, and 10c of each system are connected to the server 210 via the network 212.
  • In each play field, real objects that are moved via the network 212 by the users of the other information processing systems are placed in addition to the real object operated by the local user.
  • the server 210 associates the information processing system of the participating user with the real object, and notifies the information processing apparatuses 10a, 10b, and 10c.
  • When a user performs an operation, the server 210 notifies the other information processing systems of the operation information.
  • Each of the information processing apparatuses 10a, 10b, and 10c moves a corresponding real object in accordance with an operation of the user via the input device 14, and responds to each of them according to operation information of other users notified from the server 210. Move a real object.
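The relay among the server 210 and the information processing apparatuses can be sketched as follows. The class names, message shape, and position bookkeeping are illustrative assumptions, not the patent's protocol:

```python
# Hypothetical sketch of operation relay: each apparatus applies its
# local user's operation and sends it to the server, which forwards it
# to every other participating system.

class Server:
    def __init__(self):
        self.systems = []

    def register(self, system):
        self.systems.append(system)

    def relay(self, sender, op):
        for system in self.systems:
            if system is not sender:
                system.apply(op)

class InfoSystem:
    def __init__(self, name, server):
        self.name = name
        self.server = server
        self.object_positions = {}   # user name -> position of their object
        server.register(self)

    def local_operation(self, user, move):
        op = {"user": user, "move": move}
        self.apply(op)               # move the object in the local field
        self.server.relay(self, op)  # mirror it in the other fields

    def apply(self, op):
        x, y = self.object_positions.get(op["user"], (0, 0))
        dx, dy = op["move"]
        self.object_positions[op["user"]] = (x + dx, y + dy)
```

Applying every operation in every system keeps the real object assigned to each user at the same position on all play fields.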
  • As a result, the real object corresponding to each user moves in the same manner in every play field.
  • the number of real objects may be increased, and at least one of the server 210 or the information processing apparatuses 10a, 10b, and 10c may determine the movement. In this way, for example, the battle game is executed in the play field as shown in FIG.
  • the game can be made more interesting by performing various adjustments. For example, the number of times the user has played in the past and the battle record are recorded in the server 210, and the information processing apparatuses 10a, 10b, and 10c are notified at the start of the game. The information processing apparatuses 10a, 10b, and 10c adjust the movement of the real object according to the information.
  • For example, a handicap is given to a skilled user: the real objects of the other users are moved at a higher speed, or a single hit is scored higher for the other users.
  • In a board game, the initial number of pieces may be made different. Proficiency may also be divided into multiple levels. In this way, players can compete on even terms regardless of age or skill level, and can enjoy the game without having to choose opponents. In some cases, it may also be possible to choose that only users of the same skill level play each other without handicaps.
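The record-based adjustment described above might be sketched as follows; the formulas and constants are purely illustrative assumptions, not taken from the patent:

```python
# Hypothetical sketch of a record-based handicap: stronger records get
# a slower real object and a smaller score per hit, so mismatched
# players stay roughly even.

def handicap(wins, losses):
    """Return (speed_factor, score_factor) from a user's past record."""
    played = wins + losses
    win_rate = wins / played if played else 0.5
    speed_factor = 1.2 - 0.4 * win_rate  # 0.8 (always wins) .. 1.2
    score_factor = 1.5 - win_rate        # 0.5 (always wins) .. 1.5
    return speed_factor, score_factor
```

The server 210 would supply each user's record at the start of a game, and each information processing apparatus would scale movement speed and per-hit scores by these factors.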
  • As described above, in the present embodiment, the position and movement of real objects on the play field are detected using an image of the play field. The information processing apparatus then controls and moves some of the real objects according to preset rules, making it possible to create in the real world a variety of spaces in which objects react in various ways. By determining the movement of the real object at each time step of a minute interval, it can be made to appear to respond flexibly to the movements of other real objects with relatively simple calculations.
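The detect-decide-control cycle summarized above can be sketched as a single decision step. The rule used here (head toward the nearest detected object at a fixed speed) is a placeholder assumption, standing in for whatever scenario rule is in force:

```python
# Hypothetical sketch of one decision step of the fixed-interval loop:
# given the positions extracted from the current frame, compute the
# controlled object's velocity command for this time step.

def decide_move(controlled_pos, other_positions, speed=0.1):
    """Return (dx, dy) heading toward the nearest detected object."""
    if not other_positions:
        return (0.0, 0.0)
    target = min(
        other_positions,
        key=lambda p: (p[0] - controlled_pos[0]) ** 2
                      + (p[1] - controlled_pos[1]) ** 2,
    )
    dx = target[0] - controlled_pos[0]
    dy = target[1] - controlled_pos[1]
    dist = (dx * dx + dy * dy) ** 0.5
    if dist == 0:
        return (0.0, 0.0)
    return (speed * dx / dist, speed * dy / dist)
```

Because the decision is recomputed every time step from freshly observed positions, the controlled object appears to react fluidly even though each step is a simple calculation.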
  • a real object that is operated by the user via the input device or a real object that is placed or moved by a hand is introduced to influence the movement of the real object that is controlled by the information processing apparatus.
  • play such as a computer game and collaborative work with a device can be realized using a real object, and even an infant who is not familiar with an input device can easily play and use it.
  • the same effect can be obtained by acquiring a voice and reacting a real object to it.
  • Since the real object operated by the user is also moved through the information processing apparatus, the moving speed can be limited depending on the place on the play field, and adjustments such as giving a particular user a handicap are possible.
  • various adjustments that were easy in a computer game are possible in the real world, and the rules can be made more complex and multifunctional.
  • the difficulty level can be changed easily.
  • Since the movement of the real object is determined based on the captured image, it is possible to cause the real object controlled by the information processing apparatus to interact with images displayed in the play field, with output sound, and with the movements of objects or human bodies that have no drive system. If the image is projected by a projector, the real object and the image can be linked, so the world view to be constructed can easily be changed.
  • The information processing apparatus itself can also be included as a participating member. For example, a relatively large-scale form can be realized in which an allied force of multiple users battles an army controlled by the information processing apparatus, using a large number of real objects.
  • 1 Information processing system, 10 Information processing apparatus, 12 Imaging device, 14 Input device, 16 Microphone, 18 Display device, 19 Speaker, 20 Play field, 22 CPU, 24 GPU, 26 Main memory, 50 Communication unit, 52 State specifying unit, 54 Real object information storage unit, 56 Scenario storage unit, 60 Real object control unit, 62 Information processing unit, 106a Drive unit, 108a Communication unit, 120a Real object, 122 Wheels, 126 Markers, 128 Motors.
  • the present invention can be used for toys, learning devices, computers, game devices, information processing devices, and systems including them.

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Multimedia (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Processing (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

In an information processing system 1, an image pickup device 12 picks up images of real objects 120a, 120b placed on a playfield 20. An information processing device 10 acquires the state of the playfield 20 from picked-up images or the like, and determines and controls the movement of the real object 120a according to the positional relationship between the real objects 120a, 120b, and user requests made through a microphone 16 and an input device 14. The information processing device also controls the real object 120b according to a user operation performed via the input device 14. The playfield 20 is formed by a projected image from a projector included in a display device 18. The projected image as well as audio from a speaker 19 are changed according to the movement of the real objects 120a, 120b.

Description

Information processing apparatus, information processing system, real object system, and information processing method
 The present invention relates to information processing technology using objects in real space.
 In recent years, techniques for measuring parameters of objects such as people and things in real space by some means, importing them into a computer as input values for analysis, or displaying them as images have been used in various fields. In the field of computer games, intuitive and easy operation is realized by acquiring the movement of the user or of a marker held by the user and moving a character in the virtual world on the display screen accordingly (see, for example, Patent Document 1). Techniques that reflect the movement and shape change of real-space objects in a displayed image in this way are expected to be applied not only to games but also to toys and learning materials (see, for example, Non-Patent Document 1 and Patent Document 2).
WO 2007/050885 A2; JP 2008-73256 A
 By using image display on a display device as the main output means, the conventional technology described above can easily replace simple real-world objects with diverse and attractive objects in a virtual world, and allows easy communication with the device. On the other hand, as the dependence on image representation is reduced, the uses of real-world objects and the room for a computer to exhibit its capabilities become limited, making diversification difficult. Information processing technology using real-world objects is highly effective in that it is intuitively easy for users to understand and readily conveys a sense of presence. Therefore, there is a need for a technique that can realize similar forms with real-world objects even with little reliance on image representation.
 The present invention has been made in view of such problems, and an object thereof is to provide an information processing technique that can realize diverse forms using real-world objects.
 One aspect of the present invention relates to an information processing apparatus. This information processing apparatus includes: a state specifying unit that sequentially acquires image frames of a moving image captured by an imaging device and, by detecting images in the image frames, detects state information of real objects present in the captured space at predetermined time intervals; an information processing unit that determines, at predetermined time intervals and according to a predetermined rule based on the state information detected by the state specifying unit, the operation content of a real object to be controlled among the real objects; and a real object control unit that controls the real object to be controlled so that it operates according to the content determined by the information processing unit.
 Another aspect of the present invention relates to an information processing system. This information processing system includes an information processing apparatus and a real object that moves under the control of the information processing apparatus. The information processing apparatus includes: a state specifying unit that sequentially acquires image frames of a moving image captured by an imaging device and, by detecting images in the image frames, detects state information of real objects present in the captured space at predetermined time intervals; an information processing unit that determines, at predetermined time intervals and according to a predetermined rule based on the state information detected by the state specifying unit, the operation content of the real object to be controlled among the real objects; and a real object control unit that controls the real object to be controlled so that it operates according to the content determined by the information processing unit.
 Still another aspect of the present invention relates to a real object system. This real object system includes a plurality of real objects, each of which includes a communication unit that receives a control signal from an information processing apparatus and a drive unit that operates an actuator in accordance with the received control signal, and thus operates based on the control signal. One of the real objects operates based on a control signal reflecting a user operation performed via an input device connected to the information processing apparatus, and another real object operates based on a control signal reflecting a movement determined by the information processing apparatus.
 Still another aspect of the present invention relates to an information processing method. This information processing method includes: a step of sequentially acquiring image frames of a moving image captured by an imaging device and, by detecting images in the image frames, detecting state information of real objects present in the captured space at predetermined time intervals; a step of determining, at predetermined time intervals and according to a predetermined rule based on the detected state information, the operation content of a real object to be controlled among the real objects; and a step of controlling the real object to be controlled so that it operates according to the determined content.
 Note that any combination of the above components, and any conversion of the expression of the present invention between a method, an apparatus, a system, a computer program, a recording medium on which the computer program is recorded, and the like, are also effective as aspects of the present invention.
 According to the present invention, play and learning using real-world objects can be easily realized in various forms.
FIG. 1 is a diagram showing a configuration example of an information processing system to which the present embodiment can be applied.
FIG. 2 is a diagram showing the internal circuit configuration of the information processing apparatus in the present embodiment.
FIG. 3 is a diagram showing in detail the configurations of the real object and the information processing apparatus in the present embodiment.
FIG. 4 is a diagram showing an example of the external configuration of the real object in the present embodiment.
FIG. 5 is a bottom view showing an example of coupling a plurality of real objects in the present embodiment.
FIG. 6 is a top view showing modifications of the number and positions of markers provided on the real object of the present embodiment.
FIG. 7 is a top view of the real object showing, as another example of markers provided on the real object of the present embodiment, a case where light emission of the markers is used to distinguish the front and rear of the real object.
FIG. 8 is a flowchart showing a processing procedure by which the information processing apparatus controls a real object according to the state on the play field in the present embodiment.
FIG. 9 is a diagram showing an implementation example of "battle/competition" among the forms that can be realized in the present embodiment.
FIG. 10 is a diagram showing an example of data prepared in the scenario storage unit to realize the form shown in FIG. 8.
FIG. 11 is a diagram showing an implementation example of "cooperation/assistance" among the forms that can be realized in the present embodiment.
FIG. 12 is a diagram showing an implementation example of "acting" among the forms that can be realized in the present embodiment.
FIG. 13 is a diagram showing an example of a form realized when a function of grasping an object is added to the real object in the present embodiment.
FIG. 14 is a diagram showing another example of a form realized when a function of grasping an object is added to the real object in the present embodiment.
FIG. 15 is a diagram for explaining an example of a block connection method in a form in which the real object of the present embodiment assembles blocks.
FIG. 16 is a diagram showing yet another example of a form realized when a function of grasping an object is added in the present embodiment.
FIG. 17 is a diagram showing an example of a form realized when a function of holding a pen is added to the real object in the present embodiment.
FIG. 18 is a diagram showing an example of a form in which an interaction is generated between the real object and the user's body in the present embodiment.
FIG. 19 is a diagram showing an example of a form in which ball rolling is included in the movement of a real object in the present embodiment.
FIG. 20 is a diagram for explaining a form in which a plurality of users participate in one game using a network in the present embodiment.
 FIG. 1 shows a configuration example of an information processing system to which the present embodiment can be applied. The information processing system 1 includes real objects 120a and 120b placed on a play field 20, an imaging device 12 that captures the space above the play field 20, an information processing apparatus 10 that performs predetermined information processing and moves at least one of the real objects 120a and 120b, an input device 14 that accepts user operations, a microphone 16 that acquires ambient sound, a display device 18 that displays images, and a speaker 19 that outputs sound. Depending on the mode of implementation, the input device 14, the microphone 16, the display device 18, and the speaker 19 need not be included in the information processing system 1.
 The play field 20 is a plane that defines a reference area within which the information processing apparatus 10 recognizes the real objects 120a and 120b and specifies their position coordinates. The material and shape of the play field 20 are not limited as long as such a planar area can be defined; it may be paper, a board, cloth, a desktop, a game board, or the like. Alternatively, it may be an image projected onto a desk or floor by a projector included in the display device 18.
 The shapes of the real objects 120a and 120b are not limited as long as they are objects existing in real space. They may have a simple shape as illustrated, or a more complex shape such as a miniature of a real-world object, for example a doll or a model car, a part thereof, or a game piece. The size, material, color, and number of the real objects 120a and 120b are also not limited. Furthermore, they may have a structure that the user can assemble and disassemble, or they may be finished products. At least one of the real objects 120a and 120b establishes communication with the information processing apparatus 10 and includes an actuator driven by transmitted control signals.
 Communication may be established by a connection using a wireless interface such as the Bluetooth (registered trademark) protocol or IEEE 802.11, or via a cable. In the illustrated example, the real objects 120a and 120b include wheels 122a and 122b, respectively, and are configured to travel on their own by rotating axle motors in response to control signals from the information processing apparatus 10. Here, the information processing apparatus 10 moves, for example, the real object 120b based on user operations via the input device 14, while moving the real object 120a with movements that the apparatus itself determines according to the position, movement, and so on of the real object 120b.
 As described above, the present embodiment is basically such that the information processing apparatus 10 moves one of the real objects (for example, the real object 120a) in accordance with predetermined rules corresponding to the state on the play field 20. In the following description, a real object whose movement is determined by the information processing apparatus 10 is specifically called a "real object controlled by the information processing apparatus 10" to distinguish it from a real object operated by the user via the input device 14. A real object other than those controlled by the information processing apparatus 10 (for example, the real object 120b) may be anything the user can freely place or move, and need not contain an actuator. Even when it does include an actuator, the means of operation is not limited to the input device 14; the object may be made directly operable with a dedicated controller or the like.
 Furthermore, the target driven by an actuator is not limited to wheels. For example, a gripper or an arm may be attached to a real object and its movable parts may be moved; any mechanism that serves as a control target in general robots or toys may be employed. Beyond such mechanical movement, light-emitting elements, displays, speakers, vibrators, and the like may be incorporated and operated. A plurality of mechanisms may be operated simultaneously. In any case, these mechanisms are controlled by the information processing apparatus 10.
 The imaging device 12 is a general digital video camera including an image sensor such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor, and captures moving images of the space above the play field 20 on which the real objects 120a and 120b are placed. Alternatively, it may be a camera that detects invisible light such as near-infrared light, or a combination of such a camera and a general camera that detects visible light. The frame data of the moving image is sequentially transmitted to the information processing apparatus 10 as it is captured, and is used to acquire the position coordinates of the real objects 120a and 120b in the plane formed by the play field 20. The imaging device 12 is therefore preferably arranged so as to overlook the play field 20.
 However, as long as the position coordinates of the real objects 120a and 120b can be determined, the position and angle of the imaging device 12 are not particularly limited. For example, when markers of a predetermined size are provided on the real objects 120a and 120b and their apparent size reveals their distance from the imaging plane of the imaging device 12, that distance may be used to acquire position coordinates in the camera coordinate system, which are then converted into position coordinates in a world coordinate system having the play field 20 as its horizontal plane. Alternatively, the imaging device 12 may be configured as a stereo camera that captures the same space from left and right viewpoints separated by a known interval, thereby acquiring the distances of the real objects 120a and 120b from the imaging plane.
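The apparent-size-to-distance conversion and the camera-to-world mapping described above can be sketched as follows, assuming an idealized pinhole camera looking straight down at the play field. All function names and numeric values are illustrative assumptions, not part of the disclosed embodiment.

```python
# Sketch: estimate a marker's distance from its apparent size (pinhole model),
# then back-project the pixel into play-field coordinates. With the camera
# overlooking the field, camera X/Y map directly onto the field plane.

def distance_from_apparent_size(focal_px, real_size_mm, apparent_size_px):
    """Distance along the optical axis: Z = f * S / s."""
    return focal_px * real_size_mm / apparent_size_px

def camera_to_world(u, v, z_mm, focal_px, cx, cy):
    """Back-project pixel (u, v) at depth z into metric field coordinates,
    where (cx, cy) is the principal point in pixels."""
    x_mm = (u - cx) * z_mm / focal_px
    y_mm = (v - cy) * z_mm / focal_px
    return x_mm, y_mm

# Example: a 40 mm marker imaged as 80 px wide by a camera with f = 1000 px.
z = distance_from_apparent_size(1000.0, 40.0, 80.0)        # -> 500.0 mm
x, y = camera_to_world(720, 420, z, 1000.0, 640.0, 360.0)  # -> (40.0, 30.0)
```

In a real system the intrinsic parameters (focal length, principal point) would come from camera calibration, and lens distortion would be corrected before back-projection.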
 Techniques for acquiring the position of a subject in the depth direction with a stereo camera are widely known. A moving-viewpoint camera may be used instead of a stereo camera. Alternatively, a separate device that emits reference light such as near-infrared light and detects its reflection may be provided, and the positions of the real objects 120a and 120b may be specified by an existing method such as TOF (Time of Flight). Furthermore, the positions of the real objects 120a and 120b may be specified by making the top surface of the play field 20 a touch pad and detecting contact positions. When TOF or a touch pad is used, the real objects 120a and 120b are distinguished by integrating this information with, for example, the color information of each image in the captured frames. Once detected, the real objects 120a and 120b can be tracked with existing visual tracking techniques, making subsequent acquisition of their position coordinates more efficient.
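For the stereo-camera alternative mentioned above, the standard rectified-stereo relation between disparity and depth can be written as a one-line sketch; the parameter values below are hypothetical.

```python
def depth_from_disparity(focal_px, baseline_mm, disparity_px):
    """Rectified stereo: Z = f * B / d, where B is the distance between
    the left and right viewpoints and d is the pixel disparity of the
    same point between the two images."""
    return focal_px * baseline_mm / disparity_px

# Example: f = 1000 px, 60 mm baseline, 120 px disparity -> 500 mm depth.
depth = depth_from_disparity(1000.0, 60.0, 120.0)
```

Larger disparities correspond to nearer objects, which is why a known, fixed baseline between the two viewpoints is required.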
 The information processing apparatus 10 acquires the positions and movements of the real objects on the play field 20, mainly those other than the real object 120a that it controls, and determines the movement of the real object 120a based on them. It then operates the real object 120a by transmitting control signals according to the determined result. The information processing apparatus 10 may be, for example, a game device or a personal computer, and may realize its information processing functions by loading the necessary application program. As described later, the information processing apparatus 10 may also establish communication with another information processing apparatus or a server via a network to exchange necessary information.
 Here, the movements of the real objects 120a and 120b are acquired by tracking the temporal change of their position coordinates using the moving image data captured by the imaging device 12, as described above. When the user is moving the real object 120b via the input device 14, once the initial position of the real object 120b has been acquired, the temporal change of its position can thereafter be specified from the content of the user's operations on the input device 14. In this way, the movements of the real objects 120a and 120b may be acquired by means other than captured images, and accuracy may be improved by integrating information acquired through a plurality of means.
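Tracking the temporal change of position coordinates, as described above, amounts to differencing successive position samples. A minimal sketch, with hypothetical names and a fixed sampling interval:

```python
def estimate_motion(track, dt):
    """Given position samples [(x, y), ...] taken every dt seconds,
    return the per-interval velocity vectors (mm/s if positions are mm)."""
    velocities = []
    for (x0, y0), (x1, y1) in zip(track, track[1:]):
        velocities.append(((x1 - x0) / dt, (y1 - y0) / dt))
    return velocities

# Three frames of a tracked marker, sampled once per second:
v = estimate_motion([(0.0, 0.0), (2.0, 0.0), (4.0, 2.0)], 1.0)
# -> [(2.0, 0.0), (2.0, 2.0)]: constant x-speed, then a turn.
```

In practice the raw camera-derived track would be smoothed (for example by averaging over several frames) before differencing, since differentiation amplifies measurement noise.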
 The input device 14 accepts user operations such as starting and ending processing and driving the real object 120b, and inputs signals representing the operation content to the information processing apparatus 10. The input device 14 may be any general input device, such as a game controller, keyboard, mouse, joystick, or touch pad, or any combination thereof. The microphone 16 acquires ambient sound, converts it into an electrical signal, and inputs it to the information processing apparatus 10. From the audio signal acquired from the microphone 16, the information processing apparatus 10 mainly recognizes the user's voice and reflects it in the movement of the real object 120a under its control.
 The display device 18 displays images generated by the information processing apparatus 10. In the illustrated example, the display device 18 is assumed to be a projector that displays images on the play field 20. Alternatively, the display device 18 may be a display such as a general television monitor, or both a projector and a display may be included in the configuration. The play field 20 itself may also serve as a display.
 The speaker 19 may be a general speaker, a sound-producing device such as a buzzer, or a combination thereof, and outputs predetermined sounds or voices in response to requests from the information processing apparatus 10. The information processing apparatus 10 may be connected to the other devices by wire or wirelessly, possibly via various networks. Alternatively, any two or more, or all, of the information processing apparatus 10, the imaging device 12, the input device 14, the microphone 16, the display device 18, and the speaker 19 may be combined and integrally provided. An external device such as a speaker may further be connected to the information processing apparatus 10.
 FIG. 2 shows the internal circuit configuration of the information processing apparatus 10. The information processing apparatus 10 includes a CPU (Central Processing Unit) 22, a GPU (Graphics Processing Unit) 24, and a main memory 26, which are interconnected via a bus 30. An input/output interface 28 is further connected to the bus 30. Connected to the input/output interface 28 are: a communication unit 32 comprising a peripheral interface such as USB or IEEE 1394 and a wired or wireless LAN network interface; a storage unit 34 such as a hard disk drive or nonvolatile memory; an output unit 36 that outputs data to output devices such as the display device 18 and the speaker 19; an input unit 38 that inputs data from the imaging device 12, the input device 14, and the microphone 16; and a recording medium drive unit 40 that drives a removable recording medium such as a magnetic disk, an optical disc, or a semiconductor memory.
 The CPU 22 controls processing and signal transmission among the components inside the information processing apparatus 10 by executing the operating system stored in the storage unit 34. The CPU 22 also executes various programs read from a removable recording medium and loaded into the main memory 26, or downloaded via the communication unit 32. The GPU 24 has the functions of a geometry engine and a rendering processor; it performs drawing processing according to drawing commands from the CPU 22 and outputs the result to the display device 18 as appropriate. The main memory 26 is composed of RAM (Random Access Memory) and stores the programs and data necessary for processing.
 The communication unit 32 establishes communication with the actuator-equipped real objects 120a and 120b and transmits control signals to them. In modes where the real object 120a or the like outputs sound or displays images, the corresponding data is also transmitted. In modes where sensors are provided on the real object 120a or the like, as described later, the communication unit 32 may receive the sensors' measurement values from the real object. The communication unit 32 may further establish communication with a network as necessary and exchange necessary files and data with external servers or information processing apparatuses.
 FIG. 3 shows the configurations of the real objects 120a and 120b and the information processing apparatus 10 in detail. In FIG. 3, each element described as a functional block performing various processes can be configured, in hardware, with a CPU (Central Processing Unit), memory, microprocessors, other LSIs, actuators, sensors, and the like, and is realized, in software, by programs loaded into memory and so on. Those skilled in the art will therefore understand that these functional blocks can be realized in various forms by hardware alone, software alone, or combinations thereof, and they are not limited to any one of these.
 In the figure, the real object controlled by the information processing apparatus 10 is the real object 120a, and the real object operated by the user via the input device 14 is the real object 120b. However, as described above, the play field 20 may also carry real objects that the user operates directly, or real objects that are simply placed or moved by hand. When the system is composed only of such real objects, the real object 120b may be omitted. A real object that the user merely moves or places by hand may be something like a flat card, a doll with no internal structure, a game piece, a block, or an ornament. Even in this case, varying the exterior, for example with different colors, patterns, or shapes, or with printed two-dimensional barcodes, allows each image to be identified with high accuracy in the captured frames.
 Each of the real objects 120a and 120b includes a drive unit 106a or 106b that operates according to control signals from the information processing apparatus 10, and a communication unit 108a or 108b that receives the necessary control signals and data from the information processing apparatus 10. However, the content of the control signals received by the real object 120a is determined by the information processing apparatus 10, whereas the control signals received by the real object 120b reflect user operations via the input device 14. The drive units 106a and 106b include actuators driven by the control signals from the information processing apparatus 10. As shown in FIG. 1, when the real objects 120a and 120b travel on their own by means of the wheels 122a and 122b, the actuators rotate the axles and change the steering angle.
 This allows the real objects 120a and 120b to travel in the direction and at the speed specified by the control signals from the information processing apparatus 10. In addition, as described above, the drive units 106a and 106b may include actuators that generate movements other than wheel rotation, as well as light-emitting elements, displays, speakers, vibrators, and the like, together with the mechanisms to operate them. These mechanisms are likewise operated by control signals from the information processing apparatus 10, using existing technologies.
 The communication units 108a and 108b receive the control signals transmitted from the information processing apparatus 10 and notify the respective drive units 106a and 106b. The communication units 108a and 108b hold the individual identification information of their own real objects 120a and 120b in internal memory. Based on the individual identification information transmitted together with a control signal, each determines whether the control signal transmitted from the information processing apparatus 10 is addressed to its own real object 120a or 120b.
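The addressing check described above can be sketched as a simple filter on received messages. The message layout (a `target_id` field alongside the control payload) is a hypothetical illustration; the embodiment does not specify a wire format.

```python
def handle_message(message, my_id):
    """Accept the control payload only when the individual identification
    attached to the message matches this object's own stored ID;
    otherwise ignore the message."""
    if message.get("target_id") != my_id:
        return None                      # addressed to another real object
    return message["payload"]            # forwarded to the drive unit

# A broadcast medium delivers every message to every object; each object
# keeps only its own commands.
msg = {"target_id": 7, "payload": {"left_wheel": 10, "right_wheel": 10}}
assert handle_message(msg, 7) == {"left_wheel": 10, "right_wheel": 10}
assert handle_message(msg, 8) is None
```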
 The real objects 120a and 120b may further include sensors (not shown) that measure their own state. In addition to acquiring the position coordinates of the real objects 120a and 120b from captured images as described above, having each real object measure its own state allows positional relationships, changes in the shapes of the real objects, and so on to be determined more accurately. In this case, the communication units 108a and 108b transmit the sensors' measurement values to the information processing apparatus 10 together with their own individual identification information. When the real objects 120a and 120b are provided with the wheels 122a and 122b as shown in FIG. 1, rotary encoders and steering angle sensors may be provided on the wheels 122a and 122b to specify the actual amount and direction of movement.
 Similarly, position sensors that acquire the absolute positions of the real objects 120a and 120b, or motion sensors such as acceleration sensors, gyro sensors, and geomagnetic sensors, may be provided. The real objects 120a and 120b basically move according to control signals from the information processing apparatus 10, but if actual measurement values are acquired in this way, feedback control can be performed to correct errors. The real objects 120a and 120b may also be given joints that the user can bend and extend, with potentiometers introduced to specify their angles. This makes it possible to recognize however the shapes of the real objects 120a and 120b change, so that their images can be tracked accurately in the captured frames. Whether or not sensors are needed is decided in view of the required position detection accuracy, manufacturing cost, and so on.
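The error-correcting feedback control mentioned above can be illustrated with a single proportional-control step: the commanded value is nudged by a fraction of the difference between what was requested and what the encoder actually measured. The function name and gain value are illustrative assumptions.

```python
def corrected_command(target_speed, measured_speed, gain=0.5):
    """One step of proportional feedback: adjust the commanded wheel speed
    by `gain` times the observed tracking error."""
    error = target_speed - measured_speed
    return target_speed + gain * error

# The encoder reports 8 units/s when 10 was requested (e.g. wheel slip),
# so the next command overshoots slightly to close the gap.
cmd = corrected_command(10.0, 8.0)   # -> 11.0
```

A complete controller would typically add integral and derivative terms (PID) and saturate the output to the motor's limits, but the proportional term alone already shows how the sensor readings feed back into the control signals.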
 The information processing apparatus 10 includes: a communication unit 50 that transmits control signals to the real objects 120a and 120b; a state specifying unit 52 that specifies the state on the play field 20, including the positional relationship of the real objects 120a and 120b; a real object information storage unit 54 that stores information on the real objects; an information processing unit 62 that performs information processing according to the positional relationship of the real objects 120a and 120b and the like; a scenario storage unit 56 that sets the correspondence between such positional relationships and the processing to be performed accordingly; and a real object control unit 60 that generates control signals as a result of user operations or of processing in the information processing unit 62.
 The state specifying unit 52 sequentially acquires the image frames of the moving image from the imaging device 12 and analyzes them to specify the positions of the real objects 120a and 120b at predetermined time intervals. Various techniques for detecting and tracking the image of a target object by image analysis are widely known, and any of them may be adopted in the present embodiment. Likewise, as described above, general methods can be used to integrate image information in captured frames with depth-direction position information obtained from stereo images, TOF, or the like, in order to specify positions in three-dimensional space.
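As one concrete instance of the widely known detection techniques mentioned above, a marker of a distinctive color can be located in a frame by thresholding pixels and taking their centroid. This is a minimal sketch over a plain 2-D pixel grid; a production system would use an image-processing library and a calibrated color model.

```python
def marker_centroid(frame, is_marker_color):
    """Scan a frame (a 2-D list of pixel values) and return the (x, y)
    centroid of the pixels matched by the predicate, or None if no
    pixel matches (marker absent or occluded)."""
    xs = ys = n = 0
    for y, row in enumerate(frame):
        for x, pix in enumerate(row):
            if is_marker_color(pix):
                xs += x
                ys += y
                n += 1
    if n == 0:
        return None
    return xs / n, ys / n

# Toy frame: 1 marks "marker-colored" pixels, 0 marks background.
frame = [[0, 0, 0],
         [0, 1, 1],
         [0, 1, 1]]
center = marker_centroid(frame, lambda p: p == 1)   # -> (1.5, 1.5)
```

Running this on every frame at a fixed interval yields the per-frame position coordinates that the state specifying unit tracks over time.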
 Furthermore, when the real objects 120a and 120b are provided with sensors as described above, the state specifying unit 52 may use the measurement values transmitted from the real objects 120a and 120b to specify their movement, position, shape, posture, and so on in detail. In this case, the communication unit 50 receives the measurement values transmitted from the real objects 120a and 120b and supplies them to the state specifying unit 52. To distinguish, in the captured frames, the real object 120a controlled by the information processing apparatus 10 from the real object 120b operated by the user, the real objects 120a and 120b may be given markers that emit light in different specific colors, or two-dimensional barcodes.
 The real objects 120a and 120b may also be distinguished by other appearance features such as color, shape, pattern, or size. The real object information storage unit 54 stores information that associates the individual identification information of each of the real objects 120a and 120b with their appearance features. The state specifying unit 52 refers to this information and specifies the individual identification information corresponding to the appearance features of an image detected in a captured frame, thereby distinguishing the real objects 120a and 120b at each position. The individual identification information of the real objects 120a and 120b and their position coordinates are sequentially supplied to the information processing unit 62.
 The information processing unit 62 executes the required information processing based on the positional relationship of the real objects 120a and 120b, user operations on the input device 14, the audio signal acquired by the microphone 16, and the like. For example, when realizing a game in which the real object 120b operated by the user races against the real object 120a controlled by the information processing apparatus 10, the information processing unit 62 determines the movement of the real object 120a and advances the game while calculating scores, deciding wins and losses, generating display images, determining output sounds, and so on. The game program, the rules that determine movement, the data necessary for image generation, and the like are stored in the scenario storage unit 56.
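The correspondence between positional relationships and the processing to be performed, held in the scenario storage unit, can be pictured as an ordered rule table consulted each cycle. The rules, thresholds, and action names below are hypothetical, chosen only to illustrate the lookup; they are not taken from the embodiment.

```python
# Hypothetical scenario: the distance between the user-operated object and
# the apparatus-controlled object decides the latter's next behavior.
# Rules are checked in order; the first matching condition wins.
SCENARIO_RULES = [
    (lambda d: d < 50.0,  "evade"),    # too close: move away
    (lambda d: d < 200.0, "chase"),    # mid range: pursue the user
    (lambda d: True,      "patrol"),   # otherwise: default behavior
]

def next_action(distance_mm):
    """Look up the action for the current inter-object distance."""
    for condition, action in SCENARIO_RULES:
        if condition(distance_mm):
            return action
```

The chosen action would then be translated into concrete movement by the real object control unit; richer scenarios could key the table on headings, scores, or elapsed time rather than distance alone.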
 When the audio signal acquired by the microphone 16 is used, the information processing unit 62 is provided with a speech recognition function based on general techniques in order to detect predetermined keywords in the signal. The information processing unit 62 outputs generated display image data to the display device 18, and outputs sound to the speaker 19 by, for example, decoding the audio data to be produced. When the display device 18 is a projector that projects images onto the play field 20, the image on the play field 20 can be changed according to the progress of the game, the positions of the real objects 120a and 120b, and so on. The image may include information displays such as scores.
 The real object control unit 60 generates control signals so that the real object 120a moves with the movement determined by the information processing unit 62. It also interprets the content of user operations on the input device 14 and generates control signals for the real object 120b, the target of the user's operations. These control signals are associated with the individual identification information of the real objects 120a and 120b supplied from the information processing unit 62. The communication unit 50 establishes communication with the real objects 120a and 120b and transmits the control signals generated by the real object control unit 60. The specific signals to be transmitted vary with the control method, and techniques commonly used in fields such as robotics may be employed as appropriate.
 FIG. 4 shows an example of the external configuration of the real object 120a; the real object 120b may be configured in the same way. The real object 120a of this example includes the wheels 122a at the lower part of a rectangular parallelepiped body, as shown in FIG. 1, and a marker 126 that emits light in a predetermined color at the top. The state specifying unit 52 of the information processing apparatus 10 acquires the position coordinates of the real object 120a by detecting the image of the marker 126 in the captured frames; that is, the center 129 of the marker 126 serves as the tracking point of the real object 120a. In the illustrated example, the marker 126 is placed close to one side of the rectangle forming the top surface 124a of the real object 120a. Offsetting the marker 126 from the center of the real object 120a in this way makes it possible, for example, to define the side face nearest the marker 126 as the front of the real object, so that the user and the system share the same understanding of forward and backward movement.
 The marker 126 is implemented with, for example, an infrared or white light-emitting diode. The color may be made switchable by attaching a color filter cap to the surface of a white light-emitting diode. By giving different emission colors to the real object 120a controlled by the information processing apparatus 10 and the real object 120b operated by the user, the two can be distinguished by the color of their images in the captured image; in this case the user can also tell them apart directly. Alternatively, a plurality of markers with different emission colors may be provided on a single real object 120a so that its orientation is defined by color. The emission may further be made switchable between invisible light such as infrared and visible light, depending on whether such color-based distinction is needed; for that purpose the marker 126 may incorporate a wavelength conversion element such as an up-conversion phosphor.
 As shown on the bottom surface 124b of the real object 120a, each wheel 122a is provided with a motor 128 that rotates its axle. The motor 128 corresponds to the drive unit 106a shown in FIG. 3 and is driven by control signals transmitted from the information processing apparatus 10 to rotate the wheel 122a at the requested speed and in the requested direction. The wheels 122a may also be given a mechanism that changes the steering angle with an actuator, enabling movements such as changing the travel direction of the real object 120a or turning on the spot. The real object 120a may further be provided with a switch that disengages the wheels 122a from the motor 128 so that, when desired, the user can roll the real object 120a by hand.
 A single real object may be assembled using the real object 120a shown in FIG. 4 as a minimum unit. FIG. 5 shows examples of coupling a plurality of real objects, in bottom view. The real object 130a shown on the left of the figure has two minimum-unit real objects 120a connected front to back; in this case the real object 130a travels in its longitudinal direction, as indicated by the arrow. The coupling may be realized by, for example, a plate into which two real objects 120a fit, a cover enclosing the whole, or a binding band. Alternatively, each real object 120a may be provided with interlocking concave-convex connectors, hook-and-loop fasteners, magnets, or the like. The same applies to the other examples.
 The real object 130b shown at the upper right of FIG. 5 has two minimum-unit real objects 120a connected side by side; in this case the real object 130b travels perpendicular to its longitudinal direction, as indicated by the arrow. The real object 130c shown at the lower right has minimum-unit real objects 120a connected to one another so as to form a triangle on the inside; in this case the real object 130c turns in place, as indicated by the arrow. Allowing a plurality of real objects 120a to be connected makes it possible to prepare, on the spot, a real object suited to the weight and size of another object to be carried on top or towed, and to vary the travel direction and motion according to the user's preference or the purpose of play.
 FIG. 6 shows, in top view, variations in the number and positions of markers provided on a real object. The real object 120c shown on the left of the figure has two markers 126a and 126b. The state specifying unit 52 of the information processing apparatus 10 detects the images of the markers 126a and 126b in the captured image and acquires the position of the midpoint 132 of the line segment connecting them as the tracking point of the real object 120c. As in the example of FIG. 4, offsetting the midpoint 132 from the center of the top surface of the real object 120c makes its front and rear distinguishable. Also, as illustrated, a boom 134 may be provided on the real object 120c with the markers 126a and 126b at its ends, so that their position can be adjusted to stay within the field of view of the imaging device 12 regardless of the surrounding conditions.
 The real object 120d shown on the right of the figure has three markers 126c, 126d, and 126e. The state specifying unit 52 detects the images of the markers 126c, 126d, and 126e in the captured image and acquires the position of the centroid 136 of the triangle having them as vertices as the tracking point of the real object 120d. In this case, the front and rear of the real object 120d can be distinguished by using different numbers of markers at its front and rear.
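Although the embodiment does not prescribe an implementation, the tracking-point computations above (the marker center 129, the midpoint 132 of two markers, and the centroid 136 of three markers) all reduce to the mean of the detected marker coordinates. The following sketch illustrates this; the function name and coordinate units are illustrative assumptions.

```python
import numpy as np

def tracking_point(marker_positions):
    """Return the tracking point for a set of detected marker centers:
    the marker center itself for one marker (FIG. 4), the midpoint 132
    for two markers (FIG. 6, left), and the centroid 136 for three
    markers (FIG. 6, right). All cases reduce to the coordinate mean."""
    pts = np.asarray(marker_positions, dtype=float)
    return pts.mean(axis=0)

# Two markers at the ends of the boom 134 -> midpoint 132
print(tracking_point([(0.0, 0.0), (4.0, 2.0)]))              # [2. 1.]
# Three markers 126c-126e -> centroid 136
print(tracking_point([(0.0, 0.0), (6.0, 0.0), (3.0, 3.0)]))  # [3. 1.]
```

One convenience of this formulation is that adding or removing markers does not change the tracking logic, only the offset of the tracking point relative to the body.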
 FIG. 7 shows, in top view, another example of markers provided on a real object, in which marker light emission is used to distinguish the front and rear of the real object. Like the real object 120c of FIG. 6, the real object 120e has markers 126f and 126g at both ends of a boom 134. However, the boom 134 may be placed at an arbitrary position, such as the center of the top surface of the real object 120e as illustrated, and the user may be allowed to position the markers 126f and 126g freely. In some cases the boom 134 may be omitted and the markers 126f and 126g mounted directly on the body of the real object 120e. In any case, a plurality of markers 126f and 126g are arranged on a straight line that can define the front-rear direction, such as a line connecting the left and right of the top surface of the real object 120e.
 In such a configuration, lighting only one of the two markers 126f and 126g defines the left and right of the real object 120e and thereby makes its front and rear explicit. For example, a rule is established in advance that, facing the forward direction, the left marker 126f is lit and the right marker 126g is unlit. When the direction of the arrow in the figure is fixed as the forward direction by the motor arrangement and rotation direction, the left and right of the markers are determined accordingly, and the information processing apparatus 10 lights the left marker 126f in accordance with that information.
 When the forward direction is not yet fixed, one of the markers 126f and 126g is lit while the wheels are rotated in both the forward and reverse directions; the rotation direction in which the lit marker ends up on the left is identified from the movement of the marker in the captured image and adopted as the forward rotation direction. The state specifying unit 52 of the information processing apparatus 10 records the information thus determined in the real object information storage unit 54 in association with the individual identification information of the real object 120e and uses it for subsequent control of the real object. As long as the user is aware of the above rule, no inconsistency arises with the control by the information processing apparatus 10.
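The calibration described above, which decides the forward rotation direction when it is not fixed in advance, amounts to a left-of-motion test on the lit marker. A minimal sketch, assuming play-field plane coordinates with the y axis pointing up; the function name and conventions are assumptions, not part of the embodiment:

```python
def is_forward(motion, lit_marker, tracking_point):
    """Return True if moving along `motion` places the lit marker on
    the left-hand side, matching the rule that the left marker 126f is
    lit when facing the forward direction. Coordinates are in the
    play-field plane with the y axis pointing up."""
    dx, dy = motion
    rx = lit_marker[0] - tracking_point[0]
    ry = lit_marker[1] - tracking_point[1]
    # z component of the 2D cross product: positive when the lit
    # marker lies to the left of the motion direction
    return dx * ry - dy * rx > 0

# A marker offset toward +y is on the left when moving toward +x,
# so +x would be adopted as the forward rotation direction here
print(is_forward((1, 0), (0, 1), (0, 0)))    # True
print(is_forward((-1, 0), (0, 1), (0, 0)))   # False
```

The apparatus would run this test once per candidate rotation direction, using marker displacement between captured frames as `motion`, and record the passing direction as forward.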
 Note that, as shown in FIG. 7, even when one marker is unlit, if its image can still be detected in the captured image from its shape, color, or the like, the tracking point is the midpoint 132 of the line segment connecting the two markers 126f and 126g, as with the real object 120c of FIG. 6. Besides distinguishing the left and right markers 126f and 126g by lit/unlit, they may be distinguished by lit/blinking, by visible versus invisible light, or by different emission colors. The arrangements and numbers of markers shown in FIGS. 6 and 7 are examples and may be determined as appropriate according to the size, shape, and so on of the real object. Further, when a real object can be recognized by appearance features other than markers, as described above, markers need not be provided.
 In the present embodiment, even real objects of the simple shapes described so far can stand in for a variety of things, so that an intended world can be constructed and make-believe play and games realized; that is, the appearance of the real objects need not be changed to match the world being constructed. Even so, by making the marker of one real object emit red light and the marker of another emit white light, for example, the user can intuitively grasp a distinction such as the former being an ambulance and the latter an ordinary car. Adding effects such as blinking the red light gives the user an even greater sense of realism.
 Also, as described above, using different marker colors lets the user personally distinguish the real object controlled by the information processing apparatus 10 from the real object the user operates. In this case, the information processing apparatus 10 may indicate in advance which color of real object is the user's operation target. For example, the information processing apparatus 10 lights the markers in their respective colors and then moves one of the real objects slightly. If a rule is established that the moved real object is the one controlled by the information processing apparatus 10 (or, conversely, the one operated by the user), the user knows which color of real object is theirs to operate and can start play or a game smoothly.
 Next, the basic operation of the information processing apparatus 10 with the configuration described so far will be explained. FIG. 8 is a flowchart showing the processing procedure by which the information processing apparatus 10 controls real objects according to the state of the play field 20. The flowchart starts, for example, when the user inputs a processing start request via the input device 14 with real objects 120a, 120b, and so on placed on the play field 20. When the user operates the real object 120b via the input device 14, the processing for that operation is performed at independent timing and is therefore not included in the illustrated flowchart.
 First, the state specifying unit 52 of the information processing apparatus 10 requests the imaging device 12 to start shooting (S10). The imaging device 12 accordingly begins capturing a moving image, so that image frames are acquired at a predetermined frame rate and transmitted to the information processing apparatus 10. Here the progress of the time step t at which an image frame to be processed is acquired is denoted t = 0, 1, 2, .... The period of the time step t may be the same as the frame-capture period of the imaging device 12 or longer; that is, every image frame of the moving image may be processed, or processing may be performed while thinning out the frames at predetermined intervals.
 When the image frame of the first time step t = 0 is acquired (S12, S14), the state specifying unit 52 analyzes it and acquires state information such as the position coordinates of the real objects 120a and 120b in the plane of the play field 20 (S16). In the simplest case, the image of each real object's marker is detected in the image frame, and the position coordinates on the play field 20 are obtained from it by performing coordinate conversion as appropriate. As described later, however, the information acquired here may vary depending on the embodiment. Values measured by sensors provided on the real objects, the content of user operations via the input device 14, audio information acquired by the microphone 16, and the like, as described above, may also be used; such information is reflected in the subsequent information processing whenever it is acquired.
 Next, the information processing unit 62 determines the movement of the real object 120a according to the current state, based on the rules stored in the scenario storage unit 56, while advancing the game or the like (S18). For example, to make the real object 120a chase the user-operated real object 120b, the direction vector from the real object 120a to the real object 120b is calculated, and the travel direction of the real object 120a is determined from it. Furthermore, at time steps t = 1 and later, the velocity and acceleration vectors of each real object may be obtained by taking the differences of their positions between time steps. Using these pieces of information together enables more advanced motion control, such as predicting the position of the user-operated real object 120b and moving the controlled real object 120a accordingly.
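The chase behavior of S18, including the velocity estimate from position differences and the one-step prediction mentioned above, can be sketched as follows. This is a hypothetical illustration; `chase_step` and its parameters are assumptions, not part of the embodiment.

```python
import numpy as np

def chase_step(chaser_pos, target_history, dt, speed):
    """One control-loop step (S18): estimate the target's velocity by
    finite differences over its last two tracked positions, predict
    where it will be one time step ahead, and head there at `speed`.

    target_history holds the target's positions at successive time
    steps, most recent last. Returns the commanded velocity vector."""
    p = np.asarray(target_history[-1], float)
    if len(target_history) >= 2:
        v = (p - np.asarray(target_history[-2], float)) / dt  # velocity estimate
        p = p + v * dt                                        # one-step prediction
    direction = p - np.asarray(chaser_pos, float)
    norm = np.linalg.norm(direction)
    if norm == 0:
        return np.zeros(2)
    return speed * direction / norm

# The target moved from (4, 3) to (5, 3); the chaser at the origin
# aims at the predicted position (6, 3) rather than the last seen one
cmd = chase_step((0, 0), [(4, 3), (5, 3)], dt=1.0, speed=1.0)
print(cmd)
```

With only one observed position the prediction step is skipped, which matches the text: at t = 0 only the direction vector is available, and velocity terms enter from t = 1 onward.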
 Not only for travel direction but for other movements as well, determining at each time step a movement that improves the situation on the play field 20 realizes real objects that respond flexibly to changes in their surroundings. Even in a complex system with many real objects, movements can be determined relatively easily by selecting the best course step by step through statistical processing.
 As described above, the information processing unit 62 may determine not only the travel direction of a real object but also the amount and direction of change of its movable parts such as joints, the start of lighting of its light-emitting elements, the image to be shown on a display built into the real object, the sound to be produced from a speaker built into the real object, and so on. It may further determine an image to be newly displayed by the display device 18 such as a projector, a sound effect to be produced from the speaker 19, and the like. The real object control unit 60 then generates control signals based on these determinations, adds image data and audio data as necessary, and transmits them from the communication unit 50, thereby controlling the real object 120a (S20). At this time the information processing unit 62 may also output display image and audio data to the display device 18 and the speaker 19 as appropriate.
 As long as the user does not request termination of the processing via the input device 14 (N in S22), the processing from S14 to S20 is repeated for the next time step t = 1 and onward (S24, S14 to S20). In this way, the present embodiment recognizes the current situation and determines movement at a very short period, such as the frame-capture period of the moving image; this simplifies the processing compared with computing changes over a long period and allows movement to track changes in the situation in detail. When the user requests termination, the entire processing ends (Y in S22).
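The flow of S10 through S24 can be summarized in code. The sketch below uses stub components standing in for the imaging device 12, the state specifying unit 52, the information processing unit 62, and the real object control unit 60; all class, function, and data names are illustrative assumptions.

```python
class StubCamera:
    """Stands in for the imaging device 12, yielding one frame per step."""
    def start(self):
        self.frames = iter([{"marker": (1, 2)},
                            {"marker": (2, 2)},
                            {"marker": (3, 2)}])
    def capture_frame(self):
        return next(self.frames)

def detect_positions(frame):          # S16: state specifying unit 52
    return frame["marker"]

def decide_motion(state, t):          # S18: information processing unit 62
    return ("move_toward", state)

sent = []
def send_control(motion):             # S20: real object control unit 60 / 50
    sent.append(motion)

def control_loop(camera, steps):
    camera.start()                    # S10: request shooting start
    for t in range(steps):            # S22/S24: repeat until end is requested
        frame = camera.capture_frame()      # S12/S14: frame for time step t
        state = detect_positions(frame)     # S16
        motion = decide_motion(state, t)    # S18
        send_control(motion)                # S20

control_loop(StubCamera(), steps=3)
print(sent[-1])   # ('move_toward', (3, 2))
```

Each iteration consumes exactly one frame and emits exactly one control decision, mirroring the per-time-step structure of the flowchart.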
 Providing real objects with audiovisual equipment such as displays and speakers can be expected to heighten immersion by increasing each real object's expressive capability, but it also brings disadvantages such as higher manufacturing cost, higher power consumption, and larger size. As an alternative, images or light may be projected onto the real objects by a projector or light projector connected to the information processing apparatus 10, or sound matching the movement may be produced from the speaker 19. Both may be done simultaneously, or only one of them. In either case the real object can be made to appear to emit light or speak, realizing as much expression as possible while holding down the above disadvantages.
 Next, specific forms that can be realized using the configurations and techniques described so far will be exemplified. The forms described below fall roughly into the following eight categories.
(1) Competition: the user and the information processing apparatus contend as opponents
(2) Cooperation/assistance: the user and the information processing apparatus form a team
(3) Obstruction: either the information processing apparatus or the user hinders the other
(4) Education: the information processing apparatus teaches the user
(5) Performance: the information processing apparatus entertains the user
(6) Recording/playback: the information processing apparatus records and replays movements produced by the user
(7) Imitation: movements produced by either the information processing apparatus or the user are mimicked by the other
(8) Activity: the information processing apparatus works independently in response to some trigger
 FIG. 9 shows a realization example of "competition" among the forms that can be realized in the present embodiment. On the play field 20 are placed a real object 120a controlled by the information processing apparatus 10 and a real object 120b operated by the user. The play field 20 also holds miniatures such as a tree 140 and houses 142a and 142b; these are real objects without drive mechanisms, which the user can place freely. On the play field 20 are further displayed a river 144, a bridge 146, and sandy ground 148. These may be projected by a projector or drawn on paper or the like. In the latter case, the information processing apparatus 10 acquires in advance information on the positions and extents of those regions on the play field 20; when such a printed play field 20 is used, that information may be stored in the scenario storage unit 56, and even when the user draws them on the spot, the information can be acquired by detection from the captured image.
 In such an environment the user moves the real object 120b to contend or race with the real object 120a controlled by the information processing apparatus 10. Various rules are conceivable. For example, the two may race to a predetermined goal such as the tree 140, or collide with each other directly. In the process, a score may be deducted for hitting a placed real object such as the tree 140 or the houses 142a and 142b. The real objects 120a and 120b may also be provided with laser irradiation means (not shown) or the like, treated as guns, and a score awarded for successfully firing while aimed at the opponent's real object. When realizing such a battle game, the real objects 120a and 120b may be shaped like tanks, for example, and producing tank running sounds from the speaker 19 or the like in time with the movements of the real objects 120a and 120b gives an even greater sense of presence.
 In such a battle game, the information processing apparatus 10 controls the real objects 120a and 120b so that their movement is restricted at the river 144 and the sandy ground 148 displayed on the play field 20. For example, when the user moves their real object 120b toward a part of the river 144 not spanned by the bridge 146, it is forcibly stopped at the river's edge; that is, even if a movement operation toward the river 144 is made on the input device 14, the information processing unit 62 controls the real object 120b so that its speed becomes zero at the edge of the river 144. As a result, the user resumes movement by backing the real object 120b up or changing its direction. On the sandy ground 148, for example, movement and turning are slowed: the information processing unit 62 controls the real object 120b by reducing the movement speed the user requests on the input device 14 by a predetermined proportion, or by lowering the responsiveness of steering-angle changes.
 For the real object 120a controlled by the information processing apparatus 10, speed and direction are likewise calculated so that the same restrictions apply. By providing no-entry regions like the river 144 outside the edges of the play field 20, the real objects 120a and 120b can be kept from leaving the play field 20. Besides restricting movement, regions where speed or turning becomes faster may also be provided. Team battles may be realized by placing a plurality each of the real objects 120a and 120b; for example, a plurality of users may each bring their own real object 120b to form a team and play against the same number of real objects 120a controlled by the information processing apparatus 10.
 FIG. 10 shows an example of the data prepared in the scenario storage unit 56 to realize the form shown in FIG. 9. When the play field 20 is an image projected by a projector, data of the image 110 to be projected is prepared in the scenario storage unit 56; in the example of FIG. 9, the river 144, the bridge 146, and the sandy ground 148 are drawn in the image 110. The information processing unit 62 reads the data of the image 110 from the scenario storage unit 56 at the start of a battle and outputs it to the projector serving as the display device 18 for projection. Rules for deforming the image 110 may also be stored in the scenario storage unit 56 so that the projected image changes over time. For example, by displaying images in which the water of the river 144 overflows as time passes, or the bridge 146 collapses just as the real object 120b crosses it, the real objects and the images can be fused to realize a more thrilling battle.
 Regardless of whether the image 110 is projected or a pre-drawn play field 20 is used, data 112 is prepared in which the regions of the river 144 and the sandy ground 148, acquired as described above, are mapped onto the plane corresponding to the play field 20. In the mapping data, a movement restriction amount is set for each region. In the example above, "V = 0" is set to make the speed zero in the region of the river 144 outside the bridge 146, and "V = 80%" is set to limit the speed to 80% in the region of the sandy ground 148.
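The movement restrictions recorded in the mapping data 112 can be applied, for example, as a per-region speed factor looked up from a real object's position at each time step. The sketch below is a hypothetical illustration; the rectangular region shapes and the coordinate values are assumptions.

```python
def in_rect(pos, rect):
    """True if 2D position `pos` lies inside axis-aligned `rect`."""
    (x, y), (x0, y0, x1, y1) = pos, rect
    return x0 <= x <= x1 and y0 <= y <= y1

# (region rectangle on the play-field plane, speed factor); later
# entries take precedence, so the bridge 146 overrides the river 144
REGIONS = [
    ((0, 40, 100, 60), 0.0),   # river 144: V = 0
    ((10, 45, 20, 55), 1.0),   # bridge 146: full speed
    ((60, 0, 90, 30), 0.8),    # sandy ground 148: V = 80%
]

def limited_speed(pos, requested_speed):
    """Return the speed actually commanded at `pos` after applying
    the restriction amount of the last matching region."""
    factor = 1.0
    for rect, f in REGIONS:
        if in_rect(pos, rect):
            factor = f
    return requested_speed * factor

print(limited_speed((50, 50), 10))   # in the river            -> 0.0
print(limited_speed((15, 50), 10))   # on the bridge           -> 10.0
print(limited_speed((70, 15), 10))   # on the sandy ground 148 -> 8.0
```

The same lookup serves both the user-operated real object 120b (clamping the requested speed) and the apparatus-controlled real object 120a (clamping the computed speed), so both experience identical terrain restrictions.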
 In the figure, dotted-line regions indicating the positions of the real objects acquired at each time step are overlaid on the data 112. These data are obtained by the state specifying unit 52 converting the positions of the images detected in the captured image into position coordinates in the plane of the play field 20 (the xy plane in the figure) and mapping them. As described above, individual identification information of each real object is obtained from its appearance features, so the figure also shows, near each real object's region, the identification information "#T0" and "#T1" of the trees 140, "#H0" and "#H1" of the houses 142a and 142b, and "#C0" and "#C1" of the real objects 120a and 120b.
 At each time step the state specifying unit 52 creates the individual identification information of each real object together with information on its position and supplies them to the information processing unit 62. Here the positions of the real objects 120a and 120b (the regions corresponding to the identification information "#C0" and "#C1") change through user operation and through control by the information processing apparatus 10. The information processing unit 62 monitors whether they enter a region on which a speed restriction is imposed and, if so, limits the speed by the set amount. It also compares the position of the real object 120a controlled by the information processing apparatus 10 (identification information "#C0") with the position of the real object 120b operated by the user (identification information "#C1") and determines and controls the optimal travel direction of the real object 120a at that moment, for example moving it to a position from which it can fire easily. Data concerning position information is handled in the same way in the forms described below.
FIG. 11 shows an example implementation of "cooperation/assistance" among the modes that can be realized in the present embodiment. A circular road 150 and a pedestrian crossing 156 are displayed on the play field 20, and a real object 120a controlled by the information processing apparatus 10 and a real object 120b operated by the user are placed there. Also placed on the play field 20 are miniatures such as trees 152a and 152b and a traffic light 158, as well as dolls 154a and 154b. The trees 152a and 152b and the dolls 154a and 154b are real objects without drive mechanisms, which the user can place freely or move by hand. The traffic light 158 has green, yellow, and red light-emitting diodes, and is configured so that the emission color can be switched either by the user via the input device 14 or by the information processing apparatus 10. In either case, the actual control is performed by the information processing apparatus 10, so the information processing apparatus 10 naturally knows when the emission color switches.
In this environment, the real objects 120a and 120b travel counterclockwise along the road 150. The user drives the real object 120b that he or she operates ahead of the real object 120a controlled by the information processing apparatus 10, and "cooperates" by clearing from the road 150 anything that obstructs the travel of the real object 120a, so that the real object 120a can travel without delay. In the illustrated example, the tree 152b has fallen and blocks the road 150, so the user pushes the tree 152b off the road 150 with the real object 120b. Even when the real object 120a controlled by the information processing apparatus 10 catches up to the position of the real object 120b, it is made to stop there until the tree 152b has been cleared.
That is, the shorter the time required to clear the tree 152b, the faster the real object 120a can complete a lap of the road 150. In the illustrated example there is only one pair, namely the real object 120a controlled by the information processing apparatus 10 and the real object 120b operated by the user, but multiple lanes may be provided so that multiple pairs can race, allowing a competition between multiple users. In such a mode, by casting the real object 120a controlled by the information processing apparatus 10 as an ambulance and the real object 120b operated by the user as a private car, a situation can be staged in which citizens cooperate so that an ambulance can travel quickly. Each real object may be given the shape of the corresponding vehicle. Driving sounds and brake sounds may also be generated from the speaker 19 or the like in accordance with the movement of each real object.
In such a competition, the "obstruction" mode described above may also be realized by blocking the opposing team's real object 120a with one's own real object 120b so that it cannot pass. In this case, the information processing apparatus 10 can realize movement that hurries onward while evading the obstruction, by determining a direction that moves away from the interfering real object 120b while involving little backtracking, and moving the real object 120a in that direction.
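A minimal sketch of such a direction decision, assuming a fixed set of candidate headings and equal weighting of evasion and forward progress (both of which are illustrative choices, not details of the embodiment):

```python
import math

def choose_heading(pos, blocker, goal, candidates=8):
    """Pick the heading that moves away from the interfering object while
    backtracking as little as possible: score each candidate direction by
    its gain in distance from the blocker plus its progress toward the
    goal, and return the best angle in radians."""
    best, best_score = None, -math.inf
    for k in range(candidates):
        a = 2 * math.pi * k / candidates
        nx, ny = pos[0] + math.cos(a), pos[1] + math.sin(a)
        away = (math.hypot(nx - blocker[0], ny - blocker[1])
                - math.hypot(pos[0] - blocker[0], pos[1] - blocker[1]))
        progress = (math.hypot(pos[0] - goal[0], pos[1] - goal[1])
                    - math.hypot(nx - goal[0], ny - goal[1]))
        score = away + progress  # weigh evasion and progress equally
        if score > best_score:
            best, best_score = a, score
    return best
```

For an object at the origin with the blocker directly behind it and the goal ahead, the chosen heading is straight toward the goal, since that direction both evades and advances.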
Alternatively, yet another real object, a third one controlled by the information processing apparatus 10 or operated by another user, may be prepared, and the game may be one in which the real object 120b operated by the user blocks the real object 120a controlled by the information processing apparatus 10 as it charges at the third real object. In this case, if the third real object is an ambulance, the real object 120a controlled by the information processing apparatus 10 is a tank, and the real object 120b operated by the user is a private car, a situation can be staged in which a private car protects an ambulance about to be attacked by a tank. Conversely, the game may be one in which the real object 120a controlled by the information processing apparatus 10 blocks the real object 120b operated by the user as it tries to link up with the third real object.
The same play field 20 can also be used to realize an "education" mode in which children learn traffic rules. In this case, for example, a child moves the dolls 154a and 154b by hand. While the traffic light 158 is red, the real object 120a controlled by the information processing apparatus 10 keeps traveling along the road 150. When the traffic light 158 turns green, it stops just before the pedestrian crossing 156 so that the child can move the doll 154b or the like across the crossing. If the child tries to cross while the light 158 is red, or tries to cross the road where there is no pedestrian crossing, the real object 120a stops abruptly just short of the doll or a brake sound is emitted from the speaker 19, making the child recognize that this is a dangerous act.
Thus, the object to which the real object 120a controlled by the information processing apparatus 10 reacts need not be a real object 120b of similar configuration; it may be something moved by hand, such as a doll. This mode is particularly effective for users, such as small children, for whom operating the input device 14 is difficult. In the "cooperation" mode described above, the tree 152b may be cleared by hand instead of with the real object 120b. Conversely, in the "education" mode, the user may operate the real object 120b instead of the dolls 154a and 154b and learn the traffic rules that apply to automobiles by generating interactions with the real object 120a controlled by the information processing apparatus 10 while driving on a more complex road.
Also on the same play field 20, a rule may be adopted under which the real object 120a controlled by the information processing apparatus 10 stops in front of the dolls 154a and 154b standing by the roadside. With a doll placed aboard the stopped real object 120a, the user lets the real object 120a depart and has it stop again in front of a separately placed miniature house. This realizes playing taxi. With a rule that the real object 120a stops in front of a miniature bus stop, playing bus is also possible by having dolls get on and off.
FIG. 12 shows an example implementation of "performance" among the modes that can be realized in the present embodiment. A plurality of real objects 120a controlled by the information processing apparatus 10 are placed on the play field 20, and the user watches them perform a mass game with unified movements. In this mode there need be no on-the-spot interaction with a real object operated by the user, but even if, for example, the user places the real objects 120a at random on the play field 20, their positions can be acquired from the captured image, so that they return to the set formation and the mass game begins. Likewise, even if the user interferes with a real object 120a by hand during the mass game, it can return to its prescribed position and continue the mass game.
In this case, data defining the change over time of the position coordinates of each real object 120a is stored in the scenario storage unit 56. Since this defines the initial formation and the intermediate positions, a real object 120a that has deviated from them can be returned to its prescribed position. So that the returning object does not hinder the movement of the other real objects 120a, the information processing unit 62 determines the optimal movement direction at each time step while referring to the position coordinate settings.
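Purely as an illustration, the per-time-step return to a prescribed formation might proceed along the following lines; the step size, the collision margin, and the greedy one-step scheme are assumptions, not details of the embodiment.

```python
import math

def formation_step(positions, targets, step=1.0, min_dist=1.0):
    """One time step of returning objects to a prescribed formation:
    each object moves up to `step` toward its target position, but
    stays put if the move would bring it within `min_dist` of another
    object (so returning objects do not hinder the others)."""
    new_positions = list(positions)
    for i, ((x, y), (tx, ty)) in enumerate(zip(positions, targets)):
        dx, dy = tx - x, ty - y
        d = math.hypot(dx, dy)
        if d == 0:
            continue  # already at its prescribed position
        s = min(step, d) / d
        cand = (x + dx * s, y + dy * s)
        if all(i == j or math.hypot(cand[0] - ox, cand[1] - oy) >= min_dist
               for j, (ox, oy) in enumerate(new_positions)):
            new_positions[i] = cand
    return new_positions
```

Called repeatedly, this moves each deviating object one step closer to its assigned formation slot while leaving objects already in place undisturbed.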
During the mass game, the information processing apparatus 10 may add staging through the movement of light, for example by directing a spotlight using a projector or a lighting device (not shown), or may play music from the real objects 120a themselves or from the speaker 19. The content of the mass game or the music may also switch depending on the combination of the shapes of the placed real objects 120a and the colors of their markers. Furthermore, some change may be triggered by the user's behavior. For example, if the applause and cheering from the user at the end of the mass game are loud, the mass game may start again as an encore. In this case, when the level of the audio signal acquired by the microphone 16 exceeds a threshold, the information processing unit 62 decides to start the encore.
Moreover, the mass game may progress in time with the user's conducting, or the rhythm, tempo, or musical style may switch. For example, when the user presses a predetermined button of the input device 14 in a desired meter, the real objects 120a move in time with that beat. The user's counting voice may be acquired by the microphone 16. Alternatively, the imaging device 12 may capture the user actually conducting by hand or waving a marker imitating a baton, and the movement may follow that beat.
One of the plurality of real objects 120a in this example may instead be a real object operated by the user, or a real object the user moves by hand. In that case the information processing apparatus 10 controls the other real objects 120a so that they move in accordance with that real object. The other real objects 120a may move exactly as the real object moved by the user moves, or may follow it in single file. These correspond to the "imitation" form described above.
Alternatively, the other real objects may surround or line up around the real object moved by the user so that it is always at the center, producing movement in which the user's real object plays the leading role. In this case, the distance and positional relationship to the real object moved by the user, the distances and positional relationships among the other real objects 120a, and their changes over time are set in advance and stored in the scenario storage unit 56. Then, by calculating the next movement direction at each time step for all the controlled real objects 120a, movement that follows the real object the user moves can be realized.
These modes, including those with a real object operated by the user, may simply be enjoyed on the spot, or the change over time of each real object's position may be recorded so that it can be reproduced later at a timing the user requests, realizing the "recording/playback" described above. On playback, the movement of the real object originally operated by the user is also reproduced by a real object 120a controlled by the information processing apparatus 10. This offers the pleasure of building up a mass game or a march with movements of one's own liking.
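The recording/playback described here reduces to storing a per-time-step snapshot of every object's position and later feeding the snapshots back as movement targets. A minimal sketch, with illustrative identifiers that are not part of the embodiment:

```python
class MotionRecorder:
    """Record per-time-step positions of every real object so the scene
    can be replayed later, with the object originally moved by the user
    replayed by a controlled real object 120a."""

    def __init__(self):
        self.trace = []  # one {object_id: (x, y)} snapshot per time step

    def record(self, snapshot):
        """Store a copy of the current positions of all objects."""
        self.trace.append(dict(snapshot))

    def replay(self):
        """Yield each snapshot in order; each one serves as the set of
        movement targets for the controlled objects at that time step."""
        for snapshot in self.trace:
            yield snapshot
```

Recording two steps of two objects and replaying them returns the same sequence of position snapshots, which the control side would then convert into movement commands.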
Alternatively, the game may be a challenge in which a real object operated by the user is mixed in among the plurality of real objects 120a controlled by the information processing apparatus 10, and the user tries to perform the mass game exactly as in a model. In this case the information processing apparatus 10 plays music from the speaker 19 and causes the display device 18 to display a moving image showing the model movements. The user places the real object he or she operates at one of the initial formation positions and operates it to match the model while watching it. The information processing unit 62 controls the other real objects, and detects, based on the captured image, events in which the real object operated by the user deviates from the model. A final score is then calculated by deducting points according to the number of deviations, and is displayed on the display device 18, for example at the end of the mass game.
When the user operates the real object 120b as shown so far, the information processing apparatus 10 may assist the operation. Specifically, when the actual movement of the real object 120b differs, for whatever reason, from the movement requested via the input device 14, the control signal reflecting the operation input to the input device 14 is finely adjusted to bring the actual movement closer to the request. For example, when too much load is placed on the real object 120b, it may meander even though the user's operation calls for going straight. In such a case, the information processing unit 62 adjusts the steering angle in the direction that suppresses the meandering so that the real object 120b can actually travel straight.
Specifically, the difference between the request and the actual movement is detected by comparing the actual movement, obtained from the change in the real object's position in the captured image or from sensor measurements, with the user's operation input from the input device 14. The control signal is then adjusted so that this difference becomes small. This form corresponds to the "assistance" described above. The target of assistance is not limited to the travel of a real object; the same applies to the movement of a gripper or an arm. Such adjustment can also be performed at any time, whichever other mode is being carried out.
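As a sketch of such assistance, a simple proportional correction on the heading could look like the following; the gain value and the degree-based angle representation are illustrative assumptions, not details of the embodiment.

```python
def assist_steering(requested_heading, actual_heading, gain=0.5):
    """Nudge the control signal so the actual movement approaches the
    requested one: apply a correction proportional to the heading error
    (a simple proportional controller). Headings are in degrees."""
    error = requested_heading - actual_heading
    # wrap the error into [-180, 180) degrees before correcting
    error = (error + 180.0) % 360.0 - 180.0
    return requested_heading + gain * error
```

If the user requests straight-ahead travel (0 degrees) but the object is drifting to 10 degrees, the corrected command steers to -5 degrees, counteracting the drift; with no error the command passes through unchanged.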
The examples shown so far are modes realized mainly through the travel of real objects, but the modes can be diversified by adding further functions to the real objects. FIG. 13 shows an example of a mode realized when a function for grasping objects is added. Placed on the play field 20 are a real object 120a controlled by the information processing apparatus 10, a real object 120b operated by the user, and a plurality of blocks 162. Each block 162 may be a simple lump of synthetic resin or the like with no internal mechanism. Each of the real objects 120a and 120b has a gripper 160a or 160b for grasping a block 162.
For example, the user operates the real object 120b via the input device 14 to bring it close to a block 162 and opens and closes the gripper 160b to grasp it. The grasped block 162 is then carried into the user's own territory 164b displayed on the play field 20. The information processing apparatus 10 likewise controls the real object 120a to carry blocks 162 into the other territory 164a. The side that carries more blocks 162 into its own territory wins. This realizes the "competition" and "obstruction" modes described above.
The grippers 160a and 160b grip a block 162 from both sides by opening and closing, but their base portions may additionally be made movable in the vertical direction so that the block 162 can be lifted. By providing the gripper at the tip of an arm whose joint angles can be controlled, more complex operations may be enabled, such as lifting a block 162 to a high position or turning it over. Instead of a gripper, a forklift mechanism that slides forks between the block 162 and the floor and lifts it may be provided. Alternatively, the real object may take the form of a crane truck with a load bed so that blocks 162 can be carried on the bed. The shape of the blocks 162 is also optimized as appropriate to match these forms of real object, making them easy to carry.
FIG. 14 shows another example of a mode realized when a function for grasping objects is added. This example realizes the "cooperation/assistance" form described above. Placed on the play field 20 are a gripper-equipped real object 120a controlled by the information processing apparatus 10 and a plurality of blocks 170 of various colors. The user 8 assembles the blocks 170 in a work area 172 provided on the play field 20. They may be built up three-dimensionally as illustrated, or arranged in a plane. In the former case, they may simply be stacked like building blocks, or the blocks 170 may have a structure that allows them to be connected to one another.
In either case, during assembly the user 8 specifies by voice the color of the block needed next. In the figure, "RED!" is specified. The real object 120a then searches the blocks 170 lying in the region of the play field 20 outside the work area 172 for a block of the specified color and carries it to near the user. If the work area 172 is fixed, it can be assumed that the user 8 is near it. The information processing unit 62 recognizes the specified color based on the audio signal acquired by the microphone 16, and controls the real object 120a to grasp a block 170 of that color and carry it to the estimated position.
The position of the user 8 may instead be detected by establishing a rule that the user 8 stays within the field of view of the imaging device 12. In this case, even without explicitly defining the work area 172, carrying blocks 170 that lie farther from the detected position of the user 8 avoids targeting blocks that are already being assembled. The voice designation of blocks is not limited to color; any attribute, such as size or shape, or a combination of attributes may be used. In this form, the user assembles the blocks as he or she likes, and work is made more efficient by having the real object 120a carry the blocks chosen, in no fixed order, along the way. Since the hands are occupied with assembly, realizing block designation by voice means the act of designation does not itself reduce work efficiency.
On the other hand, the user may assemble something selected from completed models prepared in advance. In this case the user selects what to build from models shown on paper or on the display device and designates it using the input device 14. The scenario storage unit 56 of the information processing apparatus 10 stores, for each model, the assembly order and the identification information of the blocks used. This realizes a mode in which, without the user designating blocks, the real object 120a selects and brings the next required block based on decisions by the information processing apparatus 10. At this time, information such as how the delivered blocks are to be connected may be shown on the display device 18 or projected onto the play field 20.
The real object 120a may also sort the plurality of blocks 170 by a predetermined criterion such as color, shape, or size. In this case, the information processing unit 62 first identifies, based on the captured image, the number of types of blocks 170 on the play field 20, for example the number of colors. It then sets that number of regions on the play field 20. By repeating control in which a block 170 grasped by the real object 120a is carried to the region corresponding to an attribute such as its color, a cluster of blocks 170 is formed for each type. This makes a desired block 170 far easier to find, particularly when there are a large number of blocks 170. The sorting may be performed in parallel while the user 8 assembles the blocks 170 as shown in FIG. 14, and not only during assembly but also, for example, during the pretend play described later. Furthermore, the real object 120a may be made to tidy up the blocks by gathering the scattered blocks 170 in one place regardless of attribute.
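The sorting described above amounts to assigning one destination region per attribute value found on the field and routing each block accordingly. A minimal sketch under that reading (identifiers are hypothetical):

```python
def plan_sorting(blocks):
    """Given detected blocks as (block_id, color) pairs, assign one
    region per color found on the field (region indices follow the
    order in which each color is first seen) and return, for each
    block, the region it should be carried to."""
    regions = {}  # color -> region index on the play field
    plan = []
    for block_id, color in blocks:
        if color not in regions:
            regions[color] = len(regions)
        plan.append((block_id, regions[color]))
    return regions, plan
```

For three blocks of which two are red and one is blue, two regions are created, and the two red blocks are both routed to the same region.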
In addition to the work area 172 of the user 8, a work area for the real object 120a may be provided, and the real object 120a may assemble blocks 170 in parallel. For example, the user and the real object 120a separately assemble the same model selected by the user. If the user 8 assembles while following the real object 120a's assembly as a model, this is equivalent to demonstrating the assembly procedure. Alternatively, the completed form may be shown on the display device 18 and both may start assembling at the same time, competing on speed to completion. They may also cooperate in a single work area to assemble a single object. When the real object 120a performs assembly in this way, placing each block 170 at its position suffices for planar arrangement, but for three-dimensional assembly the real object 120a is provided with the arm or crane described above so that blocks 170 can be lifted to a high position.
However, when assembling not by simply stacking like building blocks but by joining with some connection mechanism, the force required for connection and its reaction must be considered. For example, if the common connection technique of fitting a projection into a recess of a block is adopted, the arm of the real object 120a requires a great deal of force. The reaction may also keep the real object 120a from remaining stable. The blocks themselves may therefore be made controlled objects of the information processing apparatus 10, with the connection state realized under the control of the information processing apparatus 10. FIG. 15 is a diagram for explaining an example of a block connection technique in a mode in which a real object assembles blocks. The figure shows, viewed from the side, the process of connecting blocks by fitting a cylindrical recess 176 provided in a block 170b onto a columnar projection 174 provided on a block 170a.
First, in the unconnected state (a), the diameter r2 of the recess 176 of the block 170b is larger than the diameter r1 of the projection 174 of the block 170a. When it detects that the recess 176 of the block 170b has been fitted onto the projection 174 of the block 170a as in state (b), the information processing unit 62 causes the internal mechanism of the recess 176 of the block 170b to clamp the projection 174 as in state (c). As a result, the force the real object 120a needs for the fitting operation is no more than that for lifting the block 170b and placing it on the block 170a, and because the internal mechanism strengthens the connection, the assembled object does not collapse even when tilted.
Here, the inside of the recess 176 has a clamp structure in which at least part of the inner wall narrows, for example through the rotation of a screw. When the controlled real object 120a places the block 170b on the block 170a, the information processing unit 62 transmits to the block 170b a control signal that operates the actuator rotating the screw, tightening the screw so that the projection 174 of the block 170a is clamped. To release the connection from state (c), conversely, the screw is loosened to return to state (b). The block connection technique is not limited to this; an air suction mechanism may be provided at the connecting portion, and connection may be made by forming a vacuum at the connecting surface under control from the information processing apparatus 10. Alternatively, connection may be made with an adhesive surface such as a hook-and-loop fastener, and, for removal, an eject pin may be made to protrude under control from the information processing apparatus 10 to peel the surfaces apart.
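The transitions among states (a) through (c) can be summarized as a small state machine; the command names and the event representation below are illustrative assumptions, not the actual control protocol of the embodiment.

```python
def next_clamp_command(state, event):
    """State machine for the controllable block connection of FIG. 15.
    state: 'a' unconnected, 'b' fitted but loose, 'c' clamped.
    event: 'fitted' (the captured image shows the recess 176 seated on
    the projection 174, i.e. state (b) has been reached) or 'release'
    (a disassembly request). Returns (new_state, actuator_command)."""
    if state == "a" and event == "fitted":
        return "c", "tighten"  # fit observed: tighten the screw into (c)
    if state == "c" and event == "release":
        return "b", "loosen"   # loosen the screw back to the loose fit (b)
    return state, "hold"       # no transition: keep the actuator still
```

Observing the fit triggers a tighten command, a release request triggers a loosen command, and any other combination leaves the clamp untouched.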
Further, two gripper arms may be provided on the real object 120a so that the real object 120a itself does not lose stability from the reaction at connection time. If both of the blocks 170a and 170b being connected are held during connection, little reaction arises. Alternatively, if the play field 20 and the block 170a in contact with it can be connected so that the destination block 170a is fixed while the block 170b is being attached, the real object 120a is less likely to lose stability.
 Meanwhile, voice commands in an environment such as that shown in FIG. 14 can be used not only for assembling blocks but also for pretend play. For example, the blocks 170 are treated, according to their colors, as fruit such as bananas, apples, and melons. When the user 8 says "A banana, please," the real object 120a carries a yellow block 170 to the vicinity of the user 8. If, at the same time, the real object 120a or the speaker 19 produces a voice such as "Here you are," the situation of shopping at a greengrocer can be staged.
 In this case, the scenario storage unit 56 stores the colors of the blocks 170 in association with keywords such as "banana", "apple", and "melon". When the information processing unit 62 detects a keyword in the audio signal acquired by the microphone 16, it controls the real object 120a so as to carry a block 170 of the corresponding color. The blocks 170 may also be shaped like actual fruit. Blocks may instead be associated with broad categories such as produce, meat, and fish, staging shopping at different shops such as a greengrocer, a butcher, and a fishmonger. At the start of this pretend play, the real object 120a may sort the blocks by color as described above, forming each shop's area on the play field 20. Shopping can be made still more realistic by having the real object 120a carry away a block that the user 8 places on the play field 20 as money.
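The keyword-to-color association and dispatch just described can be sketched as follows. This is a minimal illustration, not code from the patent itself; the names `SCENARIO`, `detect_keyword`, and `handle_utterance` are hypothetical stand-ins for the scenario storage unit 56 and the control path through the information processing unit 62.

```python
# Hypothetical sketch of keyword-to-block-color dispatch.
# SCENARIO mimics the scenario storage unit 56: keyword -> block color.
SCENARIO = {
    "banana": "yellow",
    "apple": "red",
    "melon": "green",
}

def detect_keyword(utterance: str):
    """Return the first stored keyword found in the recognized utterance."""
    for keyword in SCENARIO:
        if keyword in utterance:
            return keyword
    return None

def handle_utterance(utterance: str):
    """Map an utterance to the color of the block the robot should carry."""
    keyword = detect_keyword(utterance)
    if keyword is None:
        return None           # no command recognized; do nothing
    return SCENARIO[keyword]  # a real system would command real object 120a

# e.g. handle_utterance("a banana, please") -> "yellow"
```

In a real system the return value would drive the carry behavior of real object 120a rather than simply being returned.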
 A block offered by the user 8 may also be treated as laundry, with the real object 120a carrying it away in response to an utterance such as "Cleaner, please take this." In this case, a keyword such as "take this" is associated with a movement rule of grasping a block placed near the user 8 and gathering such blocks in a predetermined area of the play field 20, and this association is stored in the scenario storage unit 56. Blocks may be treated as animals or characters, or figures of such shapes may be used instead of the blocks 170. In these modes a real object "cooperates" in pretend play, but they can also serve the purpose of "education" in that a small child learns how to shop.
 FIG. 16 shows yet another example of a mode realized when a grasping function is added to a real object. This example realizes the "competition" form described above, with the play field 20 serving as a board game board. Placed on the play field 20 are a real object 120a with a gripper, controlled by the information processing apparatus 10, and pieces 180 and 182 used in the board game. The pieces 180 and 182 may be solid lumps of synthetic resin or the like with no internal mechanism. The shapes of the pieces 180 and 182 and the squares of the board are not limited to those illustrated and may be chosen as appropriate for the game, which may be a common one such as chess, shogi, or Reversi.
 The real object 120a moves its own pieces 180 with the gripper, while the user 8 moves his or her pieces 182 by hand. The information processing unit 62 decides which piece 180 the real object 120a should move, and to where, according to the state of play including the positions of the pieces 182 moved by the user. The strategy itself may be computed in the same way as by a conventional computer game program. Instead of acting as the opponent, the real object 120a may also play a supporting role while two users play each other. For example, in chess or shogi, when one user's move captures an opponent's piece, it carries that piece over to the capturing user; in Reversi, it flips pieces over. In either case the scenario storage unit 56 stores the rules of each game, in particular the rules by which each move changes the other pieces. This mode can be realized not only with board games but likewise with card games, sugoroku, and the like.
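The move decision for the apparatus's side can be sketched in the spirit of a conventional game program: score each legal move with an evaluation function and pick the best. This is an illustrative skeleton only; `legal_moves` and `evaluate` are hypothetical placeholders for the per-game rules held in the scenario storage unit 56.

```python
# Illustrative greedy move selection for the apparatus-controlled player.
def choose_move(state, legal_moves, evaluate):
    """Return the (move, resulting_state) pair with the highest evaluation.

    legal_moves(state) yields (move, next_state) pairs;
    evaluate(next_state) returns a numeric score from the apparatus's view.
    """
    best = None
    for move, next_state in legal_moves(state):
        score = evaluate(next_state)
        if best is None or score > best[0]:
            best = (score, move, next_state)
    return best[1], best[2]

# A real implementation would search deeper (e.g. minimax), but the
# interface to the physical gripper is the same: one chosen move per turn.
```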
 Furthermore, a battle game may be realized in which the pieces 180 and 182 are character figures that attack, for example, by ramming the opposing figure. In this case the real object 120a grasps or pushes its own figure to strike the figure of the user 8, or to evade being struck. Alternatively, a real object with a gripper may also be placed on the user's side and operated by the user in opposition. A third real object controlled by the information processing apparatus 10 may further be introduced to fetch a figure's weapon from where the weapons are kept. The user may then designate a weapon by voice, as in the mode shown in FIG. 14.
 FIG. 17 shows an example of a mode realized when a pen-holding function is added to a real object. Placed on the play field 20 are a real object 120a controlled by the information processing apparatus 10 and a real object 120b operated by the user. Each of the real objects 120a and 120b has a holding mechanism 190a or 190b capable of holding a pen pointing downward. The holding mechanisms 190a and 190b are configured so that the pen's height can be changed by a control signal from the information processing apparatus 10, bringing the pen tip into contact with, or lifting it off, the play field 20. As the real objects 120a and 120b move with the pen tip in contact, a line drawing corresponding to the movement is created on the play field 20.
 For example, in response to the line drawing drawn by the user-operated real object 120b, the real object 120a controlled by the information processing apparatus 10 draws lines that complement it, so that one picture is completed through the joint work of the user and the information processing apparatus 10. The information processing apparatus 10 may decorate the user's own line drawing according to predetermined rules, or a finished sample picture may be shown on the display device 18 and both parties may draw toward it. In the former case, rules on how to decorate the user's line drawing are stored in the scenario storage unit 56. In the latter case, the finished picture is stored in the scenario storage unit 56, and by comparing it with the drawing in progress at each time step, the information processing apparatus 10 has the real object 120a draw the missing parts.
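The comparison step in the latter case reduces to a set difference between the stored finished picture and the strokes drawn so far. The sketch below is a minimal illustration under the assumption that both are represented as sets of occupied grid cells; `missing_cells` and the cell representation are inventions for this sketch, not part of the patent.

```python
# Hypothetical diff between the finished picture and the drawing in progress.
def missing_cells(target, drawn):
    """Cells of the finished picture not yet covered by the current drawing.

    Both arguments are sets of (row, col) grid cells; the result is the
    set of cells the apparatus should still have real object 120a draw.
    """
    return target - drawn

target = {(0, 0), (0, 1), (1, 1)}   # stored finished picture
drawn = {(0, 0)}                    # drawn so far, from the captured image
# missing_cells(target, drawn) -> {(0, 1), (1, 1)}
```

At each time step the apparatus would steer real object 120a toward the nearest remaining cell until the set is empty.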
 The play field 20 may be replaceable paper, or a board with a surface finish allowing line drawings to be erased and redrawn. The above is a "cooperation/assistance" form using pens, but an "activity" form may also be realized in which only the real object 120a controlled by the information processing apparatus 10 draws. In this case the user designates what is to be drawn, for example by voice or by operation via the input device 14. The information processing unit 62 reads the picture associated with the designated object from the scenario storage unit 56 and draws it faithfully, thereby realizing an interaction with the user. Alternatively, the user may score the picture drawn by the real object 120a or correct it, writing the score or corrections directly onto the play field 20 with a hand-held pen.
 The information processing apparatus 10 acquires the score and corrections from the captured image and learns from them. For example, a picture given a score below a threshold is taken as not to the user's liking and never drawn again, while a picture corrected by the user is redrawn with the corrections reproduced. To this end the information processing unit 62 associates the user's score and the corrected picture data with the finished-picture data stored in the scenario storage unit 56. This mode can also be realized, instead of having the real objects 120a and 120b actually hold pens, by using a projector to project onto the play field 20 an image in which lines are drawn along the paths the real objects 120a and 120b have traveled. This removes the need to replace paper or erase lines, and makes it easy to save the finished picture electronically.
 FIG. 18 shows an example of a mode in which interaction arises between a real object and the user's body. A real object 120a controlled by the information processing apparatus 10 is placed on the play field 20 and moves under its own power so as to bounce off three of the four edges of the play field 20. The user guards the remaining edge with his or her hand 196. That is, the real object 120a moves like a pinball, and a "battle/competition" form is realized by batting it back with the hand 196.
 The information processing unit 62 physically computes the movement and rebound a ball would exhibit if the play field 20 were tilted at a predetermined angle, and moves the real object 120a accordingly. Meanwhile, the state specifying unit 52 detects the image of the hand 196 in the captured image and tracks it by an existing method. This lets the information processing unit 62 also reproduce, with the real object 120a, the way the ball bounces off the moving hand. Even if the user does not actually strike the real object 120a, a return is judged successful if the hand 196 has caught up with the real object 120a before it crosses the edge, and the real object 120a is controlled to head back to the opposite side. If the hand does not catch up, the real object 120a leaves the play field 20 and the user loses. The interior of the play field 20 may also be enclosed with blocks, with the real object 120a moved so as to bounce off them.
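The bounce-and-return logic above can be sketched with elementary reflection physics: the ball-like object reflects off three walls, and reaching the guarded edge is a point unless the tracked hand has caught up. All names and numbers below are hypothetical illustrations, not the patent's implementation.

```python
# Minimal sketch of the pinball-like motion on the play field.
def step(pos, vel, width, height):
    """Advance one time step, reflecting off the left, right and far walls.

    The guarded edge is y <= 0; the far wall is y = height.
    """
    x, y = pos[0] + vel[0], pos[1] + vel[1]
    vx, vy = vel
    if x < 0 or x > width:         # side walls: reflect horizontally
        vx = -vx
        x = max(0.0, min(x, width))
    if y > height:                 # far wall: reflect vertically
        vy = -vy
        y = height
    return (x, y), (vx, vy)

def returned_by_hand(ball_x, hand_x, reach=1.0):
    """A return succeeds if the tracked hand is within reach of the ball
    when the ball arrives at the guarded edge."""
    return abs(ball_x - hand_x) <= reach

pos, vel = (5.0, 1.0), (2.0, -1.0)
pos, vel = step(pos, vel, width=8.0, height=6.0)   # ball reaches (7.0, 0.0)
# a hand tracked at x = 6.5 catches it; a hand at x = 2.0 does not
```

On a successful return the apparatus would invert the ball's y-velocity and drive real object 120a back toward the far side, as described above.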
 Blocks may further be placed inside the field as rebound points. Even if the user places blocks at arbitrary positions and in arbitrary shapes, the position coordinates of each block are known from the captured image, so the speed and direction of the real object 120a can be determined at each time step by simple physical calculation. If a material is assumed for each block according to, say, its color, changes in the coefficient of restitution can also be reflected in the speed calculation. A rebound sound may be produced from the speaker 19 or elsewhere on each bounce.
 As another example of interaction between the real object 120a and the hand 196, a "race" mode may be realized in which the hand 196 chases the real object 120a. In this case a plurality of real objects 120a may be placed, all fleeing together in response to the movement of the hand 196. The information processing unit 62 determines the direction of movement of each real object 120a at each time step so that it moves away from the hand, based on the hand's position information. If the directions are chosen with random numbers or the like so that they disperse as much as possible, the objects become harder to catch and the game becomes more engaging. The user may also be deemed to have caught a real object 120a not by actually grabbing it but by herding it into a predetermined area of the play field 20. Giving the real objects 120a the shapes of creatures such as sheep, mice, or fish, or producing animal cries from the speaker 19, adds to the realism.
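The per-step flee direction can be sketched as the away-from-hand heading plus a random perturbation so that several objects disperse. This is a minimal sketch; `flee_direction` and the jitter range are inventions for illustration.

```python
import math
import random

# Hypothetical per-time-step heading computation for a fleeing object.
def flee_direction(obj_pos, hand_pos, jitter=0.6, rng=random):
    """Heading in radians pointing away from the tracked hand, with a
    uniform random jitter so that multiple objects scatter rather than
    bunch up along the same escape line."""
    away = math.atan2(obj_pos[1] - hand_pos[1], obj_pos[0] - hand_pos[0])
    return away + rng.uniform(-jitter, jitter)

# e.g. an object directly to the right of the hand flees roughly along +x,
# give or take the jitter.
```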
 In the examples described so far, the real objects moved by actuator drive. Play can be further diversified by combining them with real objects moved by naturally arising forces such as gravity. FIG. 19 shows an example of a "cooperation" or "performance" mode in which the rolling of a ball is included among the movements of the real objects. This example assumes a device in which, like a Rube Goldberg machine, other objects and contraptions move in response to the rolling of a ball 200. Placed on the play field 20 are a real object 120a controlled by the information processing apparatus 10 and several real objects 120b operated by the user, and the user has built, out of blocks 202, a rough course along which the ball 200 rolls.
 As further real objects controlled by the information processing apparatus 10, real objects 120m, 120n, and 120p are placed. The real object 120m consists of a rectangular-column body and a slide. The column has holes 204 and 206 in its lower and upper parts, together with a mechanism by which a platform moves up and down the internal cavity like an elevator; the actuator raising and lowering the platform is driven by a control signal from the information processing apparatus 10. The real objects 120n and 120p have mechanisms that make the whole body light up or produce sound effects in response to control signals from the information processing apparatus 10. As noted above, the real objects 120n and 120p may instead be made to appear to light up by light projected from a projector, and the sound effects may come from the speaker 19. In this configuration, the real object 120a controlled by the information processing apparatus 10 carries the ball 200 with its gripper and drops it into the lower hole 204 of the real object 120m.
 Under the control of the information processing apparatus 10, the platform carrying the ball 200 then rises inside the column of the real object 120m and tilts, sending the ball 200 out through the upper hole 206. The ball 200 consequently rolls down the slide and collides with the real object 120n. When the information processing apparatus 10 detects the collision of the ball 200 from the captured image, it makes the real object 120n react by lighting up or producing sound. The user then operates the real object 120b to bounce or push the ball 200 that has rebounded off the real object 120n, guiding it in the desired direction. Next, the real object 120a controlled by the information processing apparatus 10 bounces or pushes the rolling ball 200 into the opening of the real object 120p, which serves as the goal.
 During this, the information processing apparatus 10 tracks the position of the ball 200 from the captured image and moves the real object 120a to a position from which the ball will rebound toward the real object 120p, or pushes the ball 200 with the real object 120a. It likewise detects from the captured image that the ball 200 has entered the opening, and stages the goal moment by lighting up the real object 120p or producing sound. Thus, because this embodiment detects changes in the scene from the captured image, the various functions of the real objects can be triggered in response to the movement of an uncontrolled object such as the ball. The scenario storage unit 56 stores the rules relating the movement of the ball 200, as described above, to the corresponding movements of the real objects.
 The illustrated course is relatively simple, but more elaborate constructions may be added, such as a domino section in which the collision of the ball 200 topples a row of standing plates one after another, or a section in which a compressed spring released by an actuator launches the ball 200. The moving agent may also change mid-course by transmitting the ball 200's rolling force to other objects. By combining basic real objects such as the real objects 120m, 120n, and 120p in various ways, a unique device fusing movement by natural forces with movement by drive systems can be created according to the user's ideas.
 The battle and race forms illustrated in FIGS. 9, 11, 16, and elsewhere may also be realized among multiple users over a network. FIG. 20 is a diagram for explaining a mode in which multiple users join one game over a network. In the illustrated example, information processing systems 1a, 1b, and 1c are built in the same way at three locations, and the information processing apparatuses 10a, 10b, and 10c of these systems connect to a server 210 via a network 212. Placed on the play field of the information processing system 1a are, besides the real object operated by the user there, real objects that the users of the information processing systems 1b and 1c move via the network 212.
 Likewise, in the other information processing systems 1b and 1c, a real object operated by the local user and real objects operated by the users of the other systems via the network 212 are placed on the play field; in the illustrated example, therefore, three real objects are placed on the play field of every one of the information processing systems 1a, 1b, and 1c. Before the game begins, the server 210 associates each participating user's information processing system with a real object and notifies the information processing apparatuses 10a, 10b, and 10c. When a user of any of the information processing systems 1a, 1b, and 1c then operates his or her own real object via the input device, the server 210 notifies the other information processing systems of that operation information.
 Each of the information processing apparatuses 10a, 10b, and 10c moves the corresponding real object according to its own user's operations via the input device 14, and moves each of the other real objects according to the other users' operation information notified by the server 210. As a result, the real object corresponding to each user moves identically in all of the information processing systems 1a, 1b, and 1c. The number of real objects may be increased further, with the server 210 or at least one of the information processing apparatuses 10a, 10b, and 10c determining their movements. In this way, a battle game is carried out on a play field such as that shown in FIG. 9.
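The relay performed by the server 210 amounts to forwarding each operation to every registered system except the sender, so that each local apparatus can replay it on the corresponding real object. The class below is a toy in-memory sketch of that pattern; `RelayServer`, its methods, and the inbox representation are all hypothetical.

```python
# Toy sketch of the operation relay performed by server 210.
class RelayServer:
    def __init__(self):
        self.systems = {}  # system id -> inbox of (sender, operation) pairs

    def register(self, system_id):
        """Associate a participating information processing system."""
        self.systems[system_id] = []

    def publish(self, sender_id, operation):
        """Forward one user's operation to every other registered system."""
        for system_id, inbox in self.systems.items():
            if system_id != sender_id:
                inbox.append((sender_id, operation))

server = RelayServer()
for sid in ("1a", "1b", "1c"):
    server.register(sid)
server.publish("1a", "move forward")
# systems 1b and 1c each receive the operation; the sender 1a does not
```

Each local apparatus would then drain its inbox every time step and apply the operations to the corresponding real objects.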
 If, as described above, the real objects are made tanks and points are awarded for shelling, a situation can be realized in which users in distant places actually battle in a miniature world. Since the actual movement of each real object is controlled by the local information processing apparatus 10a, 10b, or 10c, various adjustments can make the game more interesting. For example, the number of matches each user has played and his or her record are kept on the server 210 and reported to the information processing apparatuses 10a, 10b, and 10c at the start of a game, and the apparatuses adjust the movement of the real objects according to that information.
 For example, if comparison with a threshold shows that a user has played few matches or has a poor record, that user is given a handicap: his or her real object is moved faster than the other users' real objects, or a single shot is made to score more than one by the other users. In a board game, the initial numbers of pieces may differ. Proficiency may also be divided into several levels. This lets players compete on equal terms regardless of age or skill, so the game can be enjoyed without having to choose one's opponent. Depending on the occasion, users of the same proficiency may also choose to play one another without handicaps.
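The threshold-based handicap can be sketched as a simple rule on the recorded win rate. The multipliers and threshold below are arbitrary illustrative values, and `handicap` is a hypothetical name, not an interface from the patent.

```python
# Hypothetical sketch of the handicap adjustment applied by the local
# apparatus at game start, based on the record kept on server 210.
def handicap(wins, matches, base_speed=1.0, base_score=1, threshold=0.4):
    """Return (speed_factor, score_per_hit) for one user.

    A user with few matches or a win rate below the threshold is
    advantaged: a faster real object and a higher score per shot.
    """
    win_rate = wins / matches if matches else 0.0
    if win_rate < threshold:
        return base_speed * 1.5, base_score * 2  # advantaged
    return base_speed, base_score                # no handicap

# e.g. handicap(1, 10) -> (1.5, 2); handicap(6, 10) -> (1.0, 1)
```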
 According to the present embodiment described above, the positions and movements of the real objects on the play field are detected using captured images of the play field, and the information processing apparatus controls and moves some of the real objects according to preset rules. This makes it possible to create, in the real world, varied spaces in which objects react in many ways. By determining the movement of a controlled real object at each time step of a minute interval, the object can be made to appear to respond adaptively to the movements of the other real objects with relatively simple computation.
 Real objects operated by the user via an input device, or placed and moved by hand, are also introduced so as to influence the movement of the real objects controlled by the information processing apparatus. This allows play such as computer games and joint work with the apparatus to be realized with real objects, and even small children unaccustomed to input devices can easily play and use the system. A similar effect is obtained by capturing the user's voice and having a real object react to it.
 Because the real objects operated by the user are themselves moved through the information processing apparatus, adjustments become possible, such as limiting movement speed in certain places on the play field or giving handicaps to particular users. As a result, the kinds of adjustments that are easy in computer games become possible in the real world, allowing more complex rules and more functions, and the difficulty of the same game on the same play field can easily be varied. Moreover, because the movements of the real objects are determined from captured images, the real objects controlled by the information processing apparatus can also interact with images and output sound presented on the play field, and with objects and human bodies that have no drive system. If the images are projected by a projector, the real objects and the images can be linked, making it easy to change the world being constructed.
 Furthermore, connecting the systems of multiple users via a network realizes the situation of playing with real objects together with users in remote places. Here too, because the local information processing apparatus controls the real objects, adjustments such as movement limits and per-user handicaps are possible. Besides the users, the information processing apparatus itself can be included as a participant, so relatively large-scale modes can be realized, such as an allied force of multiple users battling an army of the information processing apparatus with many real objects.
 The present invention has been described above based on an embodiment. It will be understood by those skilled in the art that the embodiment is illustrative, that various modifications are possible in the combinations of its components and processing steps, and that such modifications are also within the scope of the present invention.
 1 information processing system, 10 information processing apparatus, 12 imaging device, 14 input device, 16 microphone, 18 display device, 19 speaker, 20 play field, 22 CPU, 24 GPU, 26 main memory, 50 communication unit, 52 state specifying unit, 54 real object information storage unit, 56 scenario storage unit, 60 real object control unit, 62 information processing unit, 106a drive unit, 108a communication unit, 120a real object, 122 wheel, 126 marker, 128 motor.
 As described above, the present invention is applicable to toys, learning equipment, computers, game machines, information processing apparatuses, and systems including them.

Claims (23)

  1.  An information processing apparatus comprising:
     a state specifying unit that sequentially acquires image frames of a moving image captured by an imaging device and, by detecting images in the image frames, detects state information of real objects present in the captured space at predetermined time intervals;
     an information processing unit that determines, at the time intervals and by a predetermined rule based on the state information detected by the state specifying unit, the content of the operation of a real object to be controlled among the real objects; and
     a real object control unit that controls the real object to be controlled so as to operate with the content determined by the information processing unit.
  2.  The information processing apparatus according to claim 1, wherein the real object control unit further controls a real object that is a target of user operation, among the real objects, so that it operates with content reflecting a user operation made via an input device.
  3.  The information processing apparatus according to claim 2, wherein the real object control unit controls light emitting elements provided on the real object to be controlled and on the real object operated by the user so that they are in mutually different light emission states, and then operates one of the two real objects with predetermined content, thereby clearly indicating to the user, by the difference in light emission states, the distinction between the real object to be controlled and the real object operated by the user.
  4.  The information processing apparatus according to any one of claims 1 to 3, wherein the information processing unit causes a speaker to output a sound determined on the basis of the state information.
  5.  The information processing apparatus according to any one of claims 1 to 4, wherein the information processing unit imposes a predetermined restriction on operation according to the position where the real object is present.
  6.  The information processing apparatus according to any one of claims 1 to 5, wherein the information processing unit generates an image to be projected onto the imaged space, causes a projector to display it, and changes the image according to a predetermined rule based on the state information.
  7.  The information processing apparatus according to any one of claims 1 to 6, wherein the real object control unit causes the real object to be controlled to travel by rotating its wheels, and
     the information processing unit determines the traveling direction of the real object to be controlled according to its positional relationship with at least one of: another real object traveling by user operation, another real object placed by the user, a real object that moves without using a drive system, and a part of the user's body.
  8.  The information processing apparatus according to any one of claims 1 to 7, wherein the information processing unit uses the state information of the real object to be controlled, while that object is operating under control of the real object control unit, to determine the content of its subsequent operation.
  9.  The information processing apparatus according to claim 2, wherein the information processing unit determines, according to a predetermined rule, an adjustment amount for the movement that would occur if a user operation made via the input device were reflected as-is, and the real object control unit controls the real object to be operated so that it operates with the adjusted movement.
  10.  The information processing apparatus according to claim 9, wherein the information processing unit determines the adjustment amount so as to reduce the difference between the movement requested by the user operation and the actual movement based on the state information.
  11.  The information processing apparatus according to claim 9, wherein the information processing unit determines the adjustment amount on the basis of the individual user's operation history.
  12.  The information processing apparatus according to claim 1 or 2, wherein the information processing unit determines the content of operation of the real object to be controlled so that it keeps moving in a predetermined positional relationship with the real object operated by the user, and stores in a storage device data recording the change over time in the positions of the user-operated real object and the real object to be controlled, so that the recorded movement can be reproduced by the real object to be controlled at another time.
  13.  The information processing apparatus according to claim 1 or 2, wherein the real object control unit controls the real object to be controlled so that it transports another real object by means of a transport mechanism provided on the real object, and
     the information processing unit determines the content of operation of the real object to be controlled so that a predetermined creation is completed from the transported real objects.
  14.  The information processing apparatus according to claim 13, wherein the creation includes a real object transported by a transport mechanism provided on a real object operated by the user, and
     the information processing unit determines the content of operation of the real object to be controlled so as to reduce the difference between the work in progress and the completed form.
  15.  The information processing apparatus according to claim 13 or 14, wherein the information processing unit determines the content of operation of the real object to be controlled so that a plurality of real objects constituting the creation are placed in contact with each other, and
     the real object control unit controls a coupling mechanism of the real objects so that the real objects placed in contact are coupled together.
  16.  The information processing apparatus according to claim 1 or 2, wherein the real object control unit further controls a real object corresponding to another information processing apparatus connected via a network, so that it operates with content reflecting an operation on an input device connected to that other information processing apparatus.
  17.  The information processing apparatus according to claim 1, wherein the information processing unit further recognizes words uttered by the user from an audio signal acquired by a microphone, and determines, at the time intervals, the content of operation of the real object to be controlled among the real objects according to a predetermined rule based on the words.
  18.  The information processing apparatus according to claim 1 or 2, wherein the information processing unit determines the content of operation of the real object to be controlled so that a line drawing is drawn, according to a predetermined rule, on a horizontal surface of the imaged space by a drawing implement attached to the real object.
  19.  An information processing system comprising an information processing apparatus and a real object that moves under the control of the information processing apparatus, wherein
     the information processing apparatus comprises:
     a state specifying unit that sequentially acquires image frames of a moving image captured by an imaging device and, by detecting images in the image frames, detects state information of real objects present in an imaged space at predetermined time intervals;
     an information processing unit that determines, at the time intervals, the content of operation of a real object to be controlled among the real objects, according to a predetermined rule based on the state information detected by the state specifying unit; and
     a real object control unit that controls the real object to be controlled so as to operate with the content determined by the information processing unit.
  20.  A real object system comprising a plurality of real objects that operate on the basis of control signals, each real object comprising:
     a communication unit that receives a control signal from an information processing apparatus; and
     a drive unit that operates an actuator in accordance with the received control signal,
     wherein one of the real objects operates on the basis of a control signal reflecting a user operation made via an input device connected to the information processing apparatus, and another real object operates on the basis of a control signal reflecting a movement determined by the information processing apparatus.
  21.  An information processing method performed by an information processing apparatus, comprising the steps of:
     sequentially acquiring image frames of a moving image captured by an imaging device and, by detecting images in the image frames, detecting state information of real objects present in an imaged space at predetermined time intervals;
     determining, at the time intervals, the content of operation of a real object to be controlled among the real objects, according to a predetermined rule based on the detected state information; and
     controlling the real object to be controlled so as to operate with the content determined in the determining step.
  22.  A computer program causing a computer to realize:
     a function of sequentially acquiring image frames of a moving image captured by an imaging device and, by detecting images in the image frames, detecting state information of real objects present in an imaged space at predetermined time intervals;
     a function of determining, at the time intervals, the content of operation of a real object to be controlled among the real objects, according to a predetermined rule based on the detected state information; and
     a function of controlling the real object to be controlled so as to operate with the content determined by the determining function.
  23.  A computer-readable recording medium having recorded thereon a computer program causing a computer to realize:
     a function of sequentially acquiring image frames of a moving image captured by an imaging device and, by detecting images in the image frames, detecting state information of real objects present in an imaged space at predetermined time intervals;
     a function of determining, at the time intervals, the content of operation of a real object to be controlled among the real objects, according to a predetermined rule based on the detected state information; and
     a function of controlling the real object to be controlled so as to operate with the content determined by the determining function.
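The architecture recited in claims 1 and 21 is a per-frame sense-decide-act loop: detect state information from each image frame, decide the controlled object's operation by a predetermined rule, and drive the object accordingly. A minimal sketch of one iteration follows; the class names, the stubbed frame format, and the toy steering rule (heading toward the user-operated object, cf. claim 7) are all hypothetical and not part of the claims:

```python
import math
from dataclasses import dataclass

@dataclass
class ObjectState:
    """State information detected from one image frame."""
    object_id: str
    x: float
    y: float
    heading: float  # radians

def detect_states(frame):
    """State specifying unit (sketch): the patent detects marker images in
    the camera frame; here 'frame' is stubbed as a dict of known poses."""
    return [ObjectState(name, *pose) for name, pose in frame.items()]

def decide_action(states):
    """Information processing unit (hypothetical rule): steer the controlled
    object toward the user-operated object."""
    by_id = {s.object_id: s for s in states}
    c, u = by_id["controlled"], by_id["user"]
    target = math.atan2(u.y - c.y, u.x - c.x)
    return {"turn": target - c.heading, "speed": 1.0}

def control(action):
    """Real object control unit (sketch): would encode the decided content
    into a control signal for the object's drive unit."""
    return action  # stand-in for wireless transmission

# One iteration of the loop, run at the predetermined time interval:
frame = {"controlled": (0.0, 0.0, 0.0), "user": (1.0, 1.0, 0.0)}
action = control(decide_action(detect_states(frame)))
```

In the claimed system, `detect_states` would be driven by image recognition on frames from the imaging device, and `control` would transmit to the real object's communication unit; the loop itself simply repeats at the predetermined interval.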
PCT/JP2015/074169 2014-11-07 2015-08-27 Information processing device, information processing system, real object system, and information processing method WO2016072132A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014-227275 2014-11-07
JP2014227275A JP6352151B2 (en) 2014-11-07 2014-11-07 Information processing apparatus, information processing system, and information processing method

Publications (1)

Publication Number Publication Date
WO2016072132A1 true WO2016072132A1 (en) 2016-05-12

Family

ID=55908862

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/074169 WO2016072132A1 (en) 2014-11-07 2015-08-27 Information processing device, information processing system, real object system, and information processing method

Country Status (2)

Country Link
JP (1) JP6352151B2 (en)
WO (1) WO2016072132A1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112106004A (en) * 2018-05-09 2020-12-18 索尼公司 Information processing apparatus, information processing method, and program
JP2022509986A (en) * 2019-08-30 2022-01-25 上▲海▼商▲湯▼智能科技有限公司 Vehicle positioning systems and methods, vehicle control methods and equipment
JP7335538B1 (en) 2022-12-22 2023-08-30 株式会社カプコン Information processing method, information processing system and program
US11944887B2 (en) 2018-03-08 2024-04-02 Sony Corporation Information processing device and information processing method

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102105093B1 (en) 2015-06-08 2020-04-28 배틀카트 유럽 Environment creation system
JP7091160B2 (en) * 2018-06-14 2022-06-27 鹿島建設株式会社 Structure construction method
JP2023102972A (en) * 2022-01-13 2023-07-26 凸版印刷株式会社 Aerial display device
WO2023233583A1 (en) * 2022-06-01 2023-12-07 株式会社ソニー・インタラクティブエンタテインメント Electronic device, and information processing system

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0866564A (en) * 1994-06-22 1996-03-12 Konami Co Ltd Remote controller for traveling object
JPH0944249A (en) * 1995-07-31 1997-02-14 Nippon Steel Corp Moving object controlling method
JP2001306145A (en) * 2000-04-25 2001-11-02 Casio Comput Co Ltd Moving robot device and program record medium therefor
JP2005063184A (en) * 2003-08-13 2005-03-10 Toshiba Corp Self-propelled moving device and position correction method
JP2005313303A (en) * 2004-04-30 2005-11-10 Japan Science & Technology Agency Robot remote control system
WO2006134778A1 (en) * 2005-06-14 2006-12-21 The University Of Electro-Communications Position detecting device, position detecting method, position detecting program, and composite reality providing system
JP2008030136A (en) * 2006-07-27 2008-02-14 Sony Corp Apparatus and method for compiling action of robot, as well as computer/program
WO2008065458A2 (en) * 2006-11-28 2008-06-05 Dalnoki Adam System and method for moving real objects through operations performed in a virtual environment
WO2009037679A1 (en) * 2007-09-21 2009-03-26 Robonica (Proprietary) Limited Display of information in a mobile toy gaming system
WO2010026710A1 (en) * 2008-09-03 2010-03-11 村田機械株式会社 Route planning method, route planning unit, and autonomous mobile device
JP2014136141A (en) * 2013-01-18 2014-07-28 Iai Corp Robot game system
WO2014178272A1 (en) * 2013-05-01 2014-11-06 村田機械株式会社 Autonomous moving body

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4787963B2 (en) * 2006-08-10 2011-10-05 国立大学法人北海道大学 Route estimation device and control method therefor, route estimation device control program, and recording medium recording the program



Also Published As

Publication number Publication date
JP6352151B2 (en) 2018-07-04
JP2016091423A (en) 2016-05-23

Similar Documents

Publication Publication Date Title
JP6352151B2 (en) Information processing apparatus, information processing system, and information processing method
JP7322122B2 (en) Information processing device, information processing method, and information medium
US11559751B2 (en) Toy systems and position systems
US10328339B2 (en) Input controller and corresponding game mechanics for virtual reality systems
JP6957218B2 (en) Simulation system and program
US20080076498A1 (en) Storage medium storing a game program, game apparatus and game controlling method
JP2010233671A (en) Program, information storage medium and game device
US9526987B2 (en) Storage medium, game apparatus, game system and game controlling method
US8075400B2 (en) Game apparatus
KR20170134675A (en) Portal devices and cooperative video game machines
JP3532898B2 (en) Video game toys
JP2022003549A (en) Entertainment system
JP3751626B2 (en) Game device
WO2019142227A1 (en) Mobile body and mobile body control method
JP2019170544A (en) Action device, action toy and system
JP5499001B2 (en) Game device and program
JP3751627B2 (en) Game device
JP5616010B2 (en) Game program and game system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15857909

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15857909

Country of ref document: EP

Kind code of ref document: A1