US20190102863A1 - Information processing apparatus and method, electronic device and computer readable medium - Google Patents


Info

Publication number
US20190102863A1
Authority
US
United States
Prior art keywords
environment data
data
information processing
processing apparatus
user equipment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/014,353
Other languages
English (en)
Inventor
Chen Sun
Xin Guo
Lingming Kong
Current Assignee
Sony Corp
Original Assignee
Sony Corp
Priority date
Filing date
Publication date
Application filed by Sony Corp
Assigned to SONY CORPORATION. Assignors: SUN, CHEN; GUO, XIN; KONG, LINGMING
Publication of US20190102863A1
Priority to US17/092,392 (US11715177B2)
Priority to US18/331,953 (US20230316459A1)
Legal status: Abandoned

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 - Geometric image transformations in the plane of the image
    • G06T3/40 - Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4038 - Image mosaicing, e.g. composing plane images from plane sub-images
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 - Network arrangements or protocols for supporting network services or applications
    • H04L67/01 - Protocols
    • H04L67/12 - Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
    • H04L67/125 - Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks involving control of end-device applications over a network
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 - Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/40 - Information retrieval; Database structures therefor; File system structures therefor of multimedia data, e.g. slideshows comprising image and additional audio data
    • G06F16/43 - Querying
    • G06F16/432 - Query formulation
    • G06F16/434 - Query formulation using image data, e.g. images, photos, pictures taken by a user
    • G06F17/30047
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/23 - Clustering techniques
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/25 - Fusion techniques
    • G06F18/251 - Fusion techniques of input or preprocessed data
    • G06K9/0063
    • G06K9/6218
    • G06K9/6289
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q20/00 - Payment architectures, schemes or protocols
    • G06Q20/04 - Payment circuits
    • G06Q20/06 - Private payment circuits, e.g. involving electronic currency used among participants of a common payment scheme
    • G06Q20/065 - Private payment circuits, e.g. involving electronic currency used among participants of a common payment scheme using e-cash
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q20/00 - Payment architectures, schemes or protocols
    • G06Q20/30 - Payment architectures, schemes or protocols characterised by the use of specific devices or networks
    • G06Q20/32 - Payment architectures, schemes or protocols characterised by the use of specific devices or networks using wireless devices
    • G06Q20/322 - Aspects of commerce using mobile devices [M-devices]
    • G06Q20/3224 - Transactions dependent on location of M-devices
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q20/00 - Payment architectures, schemes or protocols
    • G06Q20/30 - Payment architectures, schemes or protocols characterised by the use of specific devices or networks
    • G06Q20/32 - Payment architectures, schemes or protocols characterised by the use of specific devices or networks using wireless devices
    • G06Q20/327 - Short range or proximity payments by means of M-devices
    • G06Q20/3272 - Short range or proximity payments by means of M-devices using an audio code
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 - 3D [Three Dimensional] image rendering
    • G06T15/10 - Geometric effects
    • G06T15/20 - Perspective computation
    • G06T15/205 - Image-based rendering
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 - Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/05 - Geographic models
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/70 - Determining position or orientation of objects or cameras
    • G06T7/73 - Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/74 - Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/10 - Image acquisition
    • G06V10/17 - Image acquisition using hand-held instruments
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/10 - Terrestrial scenes
    • G06V20/13 - Satellite images
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 - Network arrangements or protocols for supporting network services or applications
    • H04L67/01 - Protocols
    • H04L67/06 - Protocols specially adapted for file transfer, e.g. file transfer protocol [FTP]
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 - Network arrangements or protocols for supporting network services or applications
    • H04L67/50 - Network services
    • H04L67/56 - Provisioning of proxy services
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 - Control of cameras or camera modules
    • H04N23/698 - Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 - Details of television systems
    • H04N5/222 - Studio circuitry; Studio devices; Studio equipment
    • H04N5/262 - Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N5/265 - Mixing

Definitions

  • the present disclosure generally relates to the field of information processing, and particularly to an information processing apparatus, an information processing method, an electronic device for the user equipment side, an information processing method for the user equipment side, and a computer readable medium.
  • An existing user equipment is capable of acquiring various environment data, such as video data and audio data, and is capable of uploading the acquired environment data to a server, etc.
  • a smart phone can capture an image and may upload the image to a social network, etc.
  • multiple users may capture the same scenario or event.
  • an information processing apparatus including processing circuitry configured to: compare environment data acquired by a user equipment with reference data; determine, based on the comparison, an adjustment for an acquisition manner of the environment data; and perform a control to notify the user equipment of indication information related to the adjustment.
  • an information processing method includes: comparing environment data acquired by a user equipment with reference data; determining, based on the comparison, an adjustment for an acquisition manner of the environment data; and performing a control to notify the user equipment of indication information related to the adjustment.
  • an information processing method includes: receiving environment data associated with a scenario and acquired by multiple user equipments, the environment data acquired by the multiple user equipments being at least partially different; determining the scenario; and fusing, based on the determined scenario, the environment data acquired by the multiple user equipments to generate scenario information.
  • an electronic device for the user equipment side including processing circuitry configured to: perform a control to acquire environment data; transmit the acquired environment data to a control node; and receive, from the control node, indication information related to an adjustment of an acquisition manner of the environment data, the adjustment being determined based on a comparison between the environment data and reference data.
  • an information processing method for the user equipment side includes: acquiring environment data; transmitting the acquired environment data to a control node; and receiving, from the control node, indication information related to an adjustment of an acquisition manner of the environment data, the adjustment being determined based on a comparison between the environment data and reference data.
  • Embodiments of the present disclosure also include a computer readable medium including executable instructions that, when executed by an information processing apparatus, cause the information processing apparatus to perform the methods according to the embodiments of the present disclosure.
  • the effect of acquiring environment data can be improved.
  • FIG. 1 is a block diagram showing a configuration example of an information processing apparatus according to an embodiment of the present disclosure
  • FIG. 2 is a block diagram showing a configuration example of an information processing apparatus according to another embodiment
  • FIG. 3 is a block diagram showing a configuration example of an information processing apparatus according to yet another embodiment
  • FIG. 4 is a flowchart showing a process example of an information processing method according to an embodiment of the present disclosure
  • FIG. 5 is a block diagram showing a configuration example of an electronic device for user equipment side according to an embodiment of the present disclosure
  • FIG. 6 is a block diagram showing a configuration example of an electronic device for user equipment side according to another embodiment
  • FIG. 7 is a block diagram showing a configuration example of an electronic device for user equipment side according to yet another embodiment
  • FIG. 8 is a flowchart showing a process example of an information processing method for user equipment side according to an embodiment of the present disclosure
  • FIG. 9 is a block diagram showing a configuration example of an information processing apparatus according to an embodiment.
  • FIG. 10 is a block diagram showing a configuration example of an information processing apparatus according to another embodiment.
  • FIG. 11 is a flowchart for illustrating an information processing procedure according to an exemplary embodiment
  • FIGS. 12A to 12E show an example of a procedure of acquiring panoramic video by multiple user equipments
  • FIG. 13 shows an example of presenting an adjustment indication
  • FIG. 14 is a block diagram of an exemplary structure of a computer which can implement the methods and apparatuses according to the present disclosure
  • FIG. 15 is a block diagram of an example of a schematic configuration of a smart phone to which the technology according to the present disclosure can be applied.
  • FIG. 16 is a block diagram of an example of a schematic configuration of a base station to which the technology according to the present disclosure can be applied.
  • an information processing apparatus 100 includes processing circuitry 110 .
  • the processing circuitry 110 may be implemented with a particular chip, a chip set, a central processing unit (CPU), or the like.
  • the information processing apparatus 100 may be implemented as a logic entity on the network side.
  • the logic entity on the network side includes, for example, a background server of a data sharing website.
  • the information processing apparatus 100 may also be implemented on user equipment side, e.g., a processing apparatus in a local area network formed by some user equipments.
  • a game pad may serve as a data collecting apparatus for a user
  • a game console may serve as a central processing apparatus.
  • the processing circuitry 110 includes a comparing unit 111 , a determining unit 113 and a notifying unit 115 .
  • although the comparing unit 111 , the determining unit 113 and the notifying unit 115 are illustrated in the figure as functional blocks, it is understood that their functions may also be realized by the processing circuitry 110 as a whole, rather than by discrete physical components within the processing circuitry 110 .
  • although the processing circuitry 110 is illustrated as one block in the figure, the information processing apparatus 100 may include multiple processing circuits, and the functions of the comparing unit 111 , the determining unit 113 and the notifying unit 115 may be distributed across the multiple processing circuits, so that they operate coordinately to perform these functions.
  • the comparing unit 111 is configured to compare environment data acquired by a user equipment with reference data.
  • the environment data refers to data related to an environment in which the user equipment is located.
  • the environment data may include image data, sound data, video data, smell data, pressure data, magnetic field data and tactile data, but the invention is not limited thereto.
  • the environment data may be acquired from the user equipment in various manners.
  • the user equipment may directly transmit environment data, acquired in advance or in real time, to the information processing apparatus 100 , or may provide environment data to the information processing apparatus 100 through forwarding by another device.
  • the reference data may include historical data stored in advance (which may be acquired by a same user or different users).
  • the reference data may be environment data acquired by another user equipment in a real time manner, for example.
  • the reference data may, for example, correspond to the same or similar scenario or object as the environment data acquired by the user equipment.
  • the determining unit 113 is configured to determine, based on a comparison result of the comparing unit 111 , an adjustment for an acquisition manner of the environment data.
  • the adjustment for the acquisition manner of the environment data may include changing a position, an angle, a parameter setting, etc., of the user equipment acquiring the environment data.
  • the adjustment determined by the determining unit 113 may include changing a data acquisition parameter for acquiring the environment data.
  • the determining unit 113 may determine, based on user appraisals of previously acquired environment data, the data acquisition parameter corresponding to environment data that received a high appraisal as a target parameter.
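The appraisal-based parameter selection described above could be sketched as follows. This is a minimal illustration; the data structures, function name and scores are assumptions, not part of the disclosure:

```python
# Sketch: choose the acquisition parameters of the highest-appraised
# previous capture as the target parameters for the current user.
def select_target_parameters(history):
    """history: list of (acquisition_params, appraisal_score) tuples."""
    if not history:
        return None
    best_params, _ = max(history, key=lambda entry: entry[1])
    return best_params

# Illustrative appraisal history (all values hypothetical).
history = [
    ({"aperture": "f/8", "iso": 100}, 4.1),
    ({"aperture": "f/2.8", "iso": 400}, 4.8),
    ({"aperture": "f/16", "iso": 200}, 3.2),
]
target = select_target_parameters(history)
```

The highest-rated capture's parameters would then be proposed to the user as the reference setting.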
  • the notifying unit 115 is configured to perform a control to notify the user equipment of indication information related to the adjustment determined by the determining unit 113 .
  • the indication information related to the adjustment may, for example, indicate that the user should capture a new image with a capturing location, capturing angle or capturing parameter (such as an aperture) different from the previous ones, so as to improve the quality or effect of the capture, or indicate that the user should provide an image of an object or scenario at a desired location or angle, so as to obtain a more complete and comprehensive image of the object or scenario.
  • processing circuitry 210 of an information processing apparatus 200 may include a recognizing unit 221 in addition to a comparing unit 211 , a determining unit 213 and a notifying unit 215 , which are similar to the comparing unit 111 , the determining unit 113 and the notifying unit 115 described above with reference to FIG. 1 .
  • the recognizing unit 221 may be configured to recognize environment data associated with a same scenario from environment data acquired by one or more user equipments.
  • the recognizing unit 221 may be configured to perform clustering on environment data acquired by the one or more user equipments, and recognize the environment data clustered into a same group as being associated with the same scenario.
  • the recognizing unit 221 may extract a feature from the environment data (taking image data as an example, a texture feature, a shape feature, a color feature and the like may be extracted), and cluster the environment data into several groups based on the extracted features using an existing clustering method.
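The feature-extraction-and-clustering step could be sketched with a deliberately simple stand-in for the "existing clustering method" the text refers to. The threshold scheme and toy 2-D features below are illustrative assumptions:

```python
import math

def cluster_by_features(features, threshold=1.0):
    """Group feature vectors: a vector joins the first cluster whose
    centroid lies within `threshold`, otherwise it starts a new cluster.
    A simple stand-in for e.g. k-means on texture/shape/color features."""
    clusters = []  # each cluster: list of vectors
    for vec in features:
        for cluster in clusters:
            centroid = [sum(c) / len(cluster) for c in zip(*cluster)]
            if math.dist(vec, centroid) <= threshold:
                cluster.append(vec)
                break
        else:
            clusters.append([vec])
    return clusters

# Two tight groups of toy 2-D features -> two "same scenario" groups.
groups = cluster_by_features([(0.0, 0.0), (0.1, 0.1), (5.0, 5.0), (5.1, 4.9)])
```

Environment data falling into the same group would then be treated as associated with the same scenario.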
  • the recognizing unit 221 may be configured to recognize the environment data associated with the same scenario based on the acquisition time, acquisition position and the like of the environment data, in addition to the features of the environment data.
  • the acquisition time, acquisition position and the like of the environment data may be provided together with the environment data by the user equipment acquiring the environment data to the information processing apparatus according to the present embodiment.
  • the information processing apparatus may be configured to acquire from the user equipment one or more of: a time of acquiring the environment data, a position of the user equipment when acquiring the environment data, and an orientation of the user equipment when acquiring the environment data.
  • taking a smart phone as an example, the time of acquiring the environment data may be determined by a clock of the smart phone, the position of the user equipment when acquiring the environment data may be determined by a GPS (Global Positioning System) unit of the smart phone, and the orientation of the user equipment when acquiring the environment data may be determined by a gyro sensor of the smart phone.
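Grouping captures by closeness in acquisition time and position could be sketched as below. The thresholds, field names and coordinates are illustrative assumptions, and the distance uses a small-area approximation:

```python
import math

def same_event(capture_a, capture_b, max_seconds=120, max_meters=200):
    """Heuristic sketch: two captures are treated as the same scenario if
    their acquisition times (from the clock) and positions (from the GPS
    unit) are close. Thresholds are illustrative assumptions."""
    dt = abs(capture_a["time"] - capture_b["time"])
    # Equirectangular approximation, adequate for small distances.
    lat_a, lon_a = capture_a["pos"]
    lat_b, lon_b = capture_b["pos"]
    mean_lat = math.radians((lat_a + lat_b) / 2)
    dx = math.radians(lon_b - lon_a) * math.cos(mean_lat) * 6_371_000
    dy = math.radians(lat_b - lat_a) * 6_371_000
    return dt <= max_seconds and math.hypot(dx, dy) <= max_meters

a = {"time": 1000, "pos": (48.8584, 2.2945)}  # hypothetical capture
b = {"time": 1060, "pos": (48.8585, 2.2950)}  # ~40 m away, 1 min later
c = {"time": 1030, "pos": (48.8800, 2.3500)}  # several km away
```

In a real system this check would be combined with the feature-based clustering described above.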
  • the processing circuitry 210 of the information processing apparatus 200 may further include a fusion unit 223 .
  • the fusion unit 223 may be configured to perform a fusion on the environment data recognized as being associated with the same scenario by the recognizing unit 221 .
  • the fusion unit 223 may generate, based on image data recognized as being associated with the same scenario, a three-dimensional image or a panoramic image of the scenario.
  • the fusion unit 223 may generate, based on video data recognized as being associated with the same scenario, panoramic video or stereoscopic video of the scenario.
  • a left-eye view and a right-eye view may be obtained based on images (which may be acquired by a same user equipment or by different user equipments) of the same object or scenario acquired at different locations and angles, so as to form a three-dimensional stereoscopic view of the object or scenario, for example.
  • images of the same object or scenario acquired at different locations and angles may be processed by an existing splicing or fusion technology to generate an image of the object or scenario with a larger view angle.
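The overlap-then-merge idea behind splicing could be illustrated with a toy one-dimensional analogue. Real systems match 2-D image features (e.g. with an existing stitching library); the lists and function below are purely illustrative:

```python
def splice(strip_a, strip_b):
    """Toy analogue of image splicing: join two overlapping 1-D 'scans'
    by detecting the longest suffix of strip_a that prefixes strip_b."""
    for size in range(min(len(strip_a), len(strip_b)), 0, -1):
        if strip_a[-size:] == strip_b[:size]:
            return strip_a + strip_b[size:]
    return strip_a + strip_b  # no overlap found: concatenate

# Two captures of the same scene with a shared middle region.
left = ["tree", "gate", "tower"]
right = ["gate", "tower", "river"]
panorama = splice(left, right)
```

The merged result covers a wider "view angle" than either capture alone, which is the effect the fusion unit aims for with images.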
  • the fusion unit 223 may generate, based on environment data recognized as being associated with the same scenario, virtual reality data or augmented reality data of the scenario.
  • the virtual reality data may include not only the image data, but also the above-mentioned sound data, video data, smell data, pressure data, magnetic field data, tactile data, etc. It is noted that, with the development of sensing and presentation technologies, the environment data used for generating, for example, virtual reality data may include other types of data not mentioned in the present disclosure.
  • the fusion unit 223 may be configured to select, based on data quality, candidate data from the environment data associated with the same scenario, and perform the fusion using the selected candidate data, so as to further improve the quality of the fused data.
  • the processing circuitry 210 of the information processing apparatus 200 may further include a rewarding unit 225 .
  • the rewarding unit 225 is configured to perform a control to give a reward to a user who has provided the environment data for the fusion.
  • the reward may include virtual currency.
  • the processing circuitry 210 of the information processing apparatus 200 may further include an access unit 227 .
  • the access unit 227 is configured to perform a control to provide the user equipment with an access to the fused environment data.
  • the user may be allowed to access the three-dimensional image, the panoramic image, or the virtual reality data generated by the fusion.
  • the user accessing the fused data may be required to pay a corresponding fee, for example in virtual currency.
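The reward-and-access scheme (contributors earn virtual currency, viewers spend it) could be sketched as a simple ledger. The class, method names and amounts are illustrative assumptions:

```python
class VirtualCurrencyLedger:
    """Sketch: contributors earn virtual currency for data used in a
    fusion and spend it to view the fused result."""

    def __init__(self):
        self.balances = {}

    def reward_contributor(self, user, amount=10):
        # Credit a user whose environment data was used in the fusion.
        self.balances[user] = self.balances.get(user, 0) + amount

    def pay_for_access(self, user, fee=5):
        # Debit a user who wants to view the fused data.
        if self.balances.get(user, 0) < fee:
            return False  # insufficient funds: access denied
        self.balances[user] -= fee
        return True

ledger = VirtualCurrencyLedger()
ledger.reward_contributor("alice")        # provided data for the fusion
granted = ledger.pay_for_access("alice")  # then views the panorama
denied = ledger.pay_for_access("bob")     # never contributed
```

This captures the incentive loop described here and again in the FIG. 12 walkthrough: contribution funds later viewing.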
  • the information processing apparatus may receive environment data collaboratively transmitted by the users.
  • processing circuitry 310 of an information processing apparatus 300 may include a receiving unit 321 in addition to a comparing unit 311 , a determining unit 313 and a notifying unit 315 , which are similar to the comparing unit 111 , the determining unit 113 and the notifying unit 115 described above with reference to FIG. 1 .
  • the receiving unit 321 is configured to perform a control to receive environment data collaboratively transmitted by two or more user equipments. In the collaborative transmission, one user equipment forwards environment data received using a proximity-based service communication from another user equipment.
  • a collaborative terminal may be discovered with reference to the following manners.
  • the terminal may search its neighborhood for a device which has a storage function and can provide a network connection now or for a period in the future; such a device is referred to as an assistance device.
  • the searching may be implemented by broadcast signaling (beacon).
  • the terminal device (the source device) may periodically transmit distributed transmission terminal request signaling.
  • the user may preset a condition for selecting an assistance device based on applications, and include the selection condition in the distributed transmission terminal request signaling transmitted to other terminals.
  • the user may set, for example, that an assistance device needs to have a flow rate greater than 1 Mbit/s and, if it cannot provide a network connection immediately, must be capable of providing one within 30 minutes according to an averaged history record.
  • distribution transmission control apparatuses of terminal devices in the neighborhood determine, according to history information of the devices' network connections, whether they meet the condition set by the user for participating in the distributed transmission.
  • the feedback information may include but is not limited to at least one of the memory capacity of the assistance device and statistical information on its network connection.
  • when a terminal has data to be uploaded, it broadcasts signaling to notify devices in the neighborhood that the data in the terminal can be read.
  • Transmission configuration may be performed after a collaborative user is found.
  • a transmission parameter may be set by a collaboration control apparatus, which may be provided on the public network side or installed in the terminal.
  • the part of traffic of the source terminal to be transmitted by the collaborative user is determined based on external network resources and internal network resources of the source terminal and collaborative terminal.
  • the transmission setting also includes various settings for the collaborative terminal to communicate with other terminals using the internal network after finishing the transmission, such as settings of routing and transmission time (the time sequence of transmission by the terminals).
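The resource-based split of the source terminal's traffic among collaborators could be sketched as a proportional division by uplink rate. The proportional rule and rounding policy are illustrative choices, not specified by the text:

```python
def split_traffic(total_bytes, uplink_rates):
    """Sketch: divide the source terminal's upload among terminals in
    proportion to their external-network uplink rates, as a stand-in
    for the split set by the collaboration control apparatus."""
    total_rate = sum(uplink_rates.values())
    shares = {
        term: total_bytes * rate // total_rate
        for term, rate in uplink_rates.items()
    }
    # Give any rounding remainder to the fastest terminal.
    remainder = total_bytes - sum(shares.values())
    fastest = max(uplink_rates, key=uplink_rates.get)
    shares[fastest] += remainder
    return shares

# Hypothetical rates: the source keeps a small share, helpers relay the rest.
shares = split_traffic(1000, {"source": 2, "helper_a": 5, "helper_b": 3})
```

Every byte is assigned exactly once, so the shares always sum to the original payload size.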
  • the processing circuitry 310 may further include a rewarding unit 323 according to an embodiment, which is configured to perform a control to give a reward to a user who has performed the forwarding.
  • the reward may include virtual currency.
  • a process of an exemplary embodiment is described below with reference to FIG. 11 , as a summary of various aspects of the above embodiments. It is noted that an embodiment of the present disclosure may not include all the aspects of the exemplary embodiment.
  • a user captures an event and transmits the captured image to an information processing apparatus (S 1 ).
  • the information of location and direction of the camera may be determined using a positioning function of the user equipment, and provided together with the image.
  • the user may provide information to indicate whether the image is to participate in an information fusion.
  • a remote server may determine a transmission speed of the user equipment after receiving the data.
  • a processing apparatus may determine whether there are multiple users capturing the same event, based on the time, location and orientation information transmitted with the data by the users, so as to group the users (S 2 ).
  • the processing apparatus gathers the information collected by the users, and determines whether the information collection result meets a requirement (S 3 ), such as a requirement on data amount and signal-to-noise ratio.
  • the processing apparatus may provide the capturing information of this photo as an optimal capturing parameter setting reference for the current user to capture a photo.
  • the remote server may determine which ones of the users capture the same event, based on the captured content and geographical location information of the users. For example, photos are classified using a deep learning algorithm, and users capturing the same event are grouped together.
  • the remote server selects users who are required to upload video from among the users capturing the same event.
  • the selected users may be those whose captured contents overlap the least.
  • the determination may be performed by an existing image processing algorithm.
  • users who have high transmission speeds may be selected from the users who are not required to upload data, and controlled to establish network connection with the devices of the users who are required to upload data.
  • each user who is not required to upload data establishes a D2D connection with a user who is required to upload data (by using a D2D network of 3GPP or a WiFi network).
  • the user who is required to upload images may upload part of a captured image through a user connected thereto.
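The server-side pairing of uploaders with D2D helpers could be sketched as a greedy matching by transmission speed. The greedy rule, names and speeds are illustrative assumptions:

```python
def pair_helpers(uploaders, helper_speeds):
    """Sketch: each user required to upload is matched with the fastest
    still-unassigned helper, which then relays part of the data over a
    D2D connection. Greedy matching is an illustrative choice."""
    # Sort helpers by transmission speed, fastest first.
    helpers = sorted(helper_speeds, key=helper_speeds.get, reverse=True)
    return dict(zip(uploaders, helpers))

pairs = pair_helpers(
    uploaders=["u1", "u2"],
    helper_speeds={"h1": 3.0, "h2": 9.0, "h3": 6.0},
)
```

Each selected uploader then splits its captured data with its assigned helper, as described in the preceding bullets.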
  • the remote server combines the data.
  • the remote server may splice the received images captured at different angles to form a panoramic image according to the angles by using an existing image matching algorithm, for example.
  • the processing apparatus sends a proposed reference setting of the capturing camera to the user, so that the user may adjust the camera to optimize the capturing effect (S 4 ).
  • a capturing proposal is sent to the user based on the picture captured by the user, to improve the effect of the panoramic picture.
  • the central management apparatus may determine an overlapping part of the pictures and then determine how to adjust the camera to cover the largest scope.
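Finding what the cameras have not yet covered could be sketched on angular intervals: merge the covered ranges, then report the gaps. The interval handling is simplified (no wrap-around across 360°), and all numbers are illustrative:

```python
def coverage_gaps(covered, full=(0.0, 360.0)):
    """Sketch: given angular intervals (degrees) already covered by the
    group's cameras, return the uncovered intervals. The apparatus could
    then point a user's on-screen arrow toward the largest gap."""
    merged = []
    for start, end in sorted(covered):
        if merged and start <= merged[-1][1]:
            merged[-1][1] = max(merged[-1][1], end)  # extend overlap
        else:
            merged.append([start, end])
    gaps, cursor = [], full[0]
    for start, end in merged:
        if start > cursor:
            gaps.append((cursor, start))
        cursor = max(cursor, end)
    if cursor < full[1]:
        gaps.append((cursor, full[1]))
    return gaps

# Three hypothetical cameras, two of them overlapping.
gaps = coverage_gaps([(0, 90), (80, 160), (210, 300)])
largest = max(gaps, key=lambda g: g[1] - g[0])
```

The largest gap is a natural target for the adjustment indication shown to a user (e.g. the arrow of FIG. 12C).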
  • FIG. 12A to FIG. 12E show an example where multiple user equipments acquire videos of the same scenario and a panoramic video of the scenario is generated based on the videos obtained by the multiple user equipments.
  • the multiple users capture the same scenario, for example, at multiple locations and angles.
  • the central processing device may expect a user equipment to adjust its capturing location or angle, so as to capture a region that is not captured by the other user equipments.
  • one of the user equipments may be controlled to display indication information, such as an arrow displayed in the screen as shown in FIG. 12C , to notify the user to move the user equipment.
  • the user may move the user equipment according to the indication information to capture a part of the scenario that is not captured by the other user equipments, as shown in FIG. 12D .
  • the central processing apparatus can fuse videos captured by the multiple user equipments to generate a stereoscopic video or image of the scenario, as shown in FIG. 12E .
  • the central processing apparatus may publish the final video or image on the Internet.
  • the user may access the Internet to view the captured content. For example, viewing the content may require payment in virtual currency.
  • after the central processing apparatus fuses the images, the users who participated in the collaborative capture and transmission may obtain virtual currency, which can be used to pay for viewing the video.
  • an information processing method includes the following steps.
  • in step S410, environment data acquired by a user equipment is compared with reference data.
  • in step S420, an adjustment for an acquisition manner of the environment data is determined based on the comparison.
  • in step S430, the user equipment is notified of indication information related to the adjustment.
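Steps S410 to S430 can be sketched as a small control-node pipeline. The dictionary-based data shapes, the tolerance value, and all function names below are assumptions for illustration; the disclosure does not prescribe a concrete data format.

```python
def compare(environment_data, reference_data):
    """S410: per-field difference between acquired data and the reference."""
    return {k: reference_data[k] - environment_data.get(k, 0)
            for k in reference_data}

def determine_adjustment(diff, tolerance=0.1):
    """S420: keep only deviations large enough to warrant an adjustment."""
    return {k: v for k, v in diff.items() if abs(v) > tolerance}

def notify(adjustment):
    """S430: render the adjustment as indication information for the UE."""
    return [f"adjust {k} by {v:+.1f}" for k, v in sorted(adjustment.items())]

diff = compare({"brightness": 0.4, "zoom": 1.0}, {"brightness": 0.7, "zoom": 1.0})
print(notify(determine_adjustment(diff)))  # ['adjust brightness by +0.3']
```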
  • the information processing method may include the following steps: receiving environment data associated with a scenario and acquired by multiple user equipments, the environment data acquired by the multiple user equipments being at least partially different; determining the scenario; and fusing, based on the determined scenario, the environment data acquired by the multiple user equipments to generate scenario information.
  • the scenario information may include but is not limited to one or more of a three-dimensional image, a panoramic image, a panoramic video, a stereoscopic video, virtual reality data and augmented reality data of an event or scenario.
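The fusion step described above can be illustrated with a deliberately simplified model: each user equipment contributes a partial view (a 1-D strip here, for brevity) at a known offset, and the control node fuses them by averaging wherever views overlap. The known offsets and the 1-D representation are assumptions; a real system would register and blend 2-D images or video frames.

```python
def fuse(strips, offsets, width):
    """Fuse partial views into one panorama by averaging overlapping samples."""
    acc = [0.0] * width
    count = [0] * width
    for strip, off in zip(strips, offsets):
        for i, v in enumerate(strip):
            acc[off + i] += v
            count[off + i] += 1
    # uncovered gaps stay at 0.0; covered positions are averaged
    return [a / c if c else 0.0 for a, c in zip(acc, count)]

pano = fuse([[1, 1, 1, 1], [3, 3, 3, 3]], offsets=[0, 2], width=6)
print(pano)  # [1.0, 1.0, 2.0, 2.0, 3.0, 3.0] — the overlap at indices 2–3 is averaged
```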
  • an electronic device for the user equipment side is also provided according to an embodiment of the present disclosure.
  • an electronic device 500 for the user equipment side includes processing circuitry 510 .
  • the processing circuitry 510 includes an acquiring unit 511 , a transmitting unit 513 and a receiving unit 515 .
  • the acquiring unit 511 is configured to perform a control to acquire environment data.
  • the environment data may include one or more of image data, sound data, video data, smell data, pressure data, magnetic field data, and tactile data, but the invention is not limited thereto.
  • the environment data may be acquired by a sensor provided on the user equipment, but the invention is not limited thereto. For example, some environment data may be sensed and described by the user himself/herself.
  • the transmitting unit 513 is configured to perform a control to transmit the acquired environment data to a control node (such as the information processing apparatus described above).
  • the receiving unit 515 is configured to perform a control to receive, from the control node, indication information related to an adjustment of an acquisition manner of the environment data. The adjustment is determined based on a comparison between the environment data and reference data.
  • the receiving unit 515 may be configured to perform a control to transmit the environment data collaboratively with another user equipment.
  • one user equipment forwards environment data received from another user equipment via proximity-based service communication.
  • the receiving unit 515 may be configured to perform a control to report to the control node, or to broadcast to another user equipment, a message indicating a predetermined event.
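The UE-side units described above (acquiring unit 511, transmitting unit 513, receiving unit 515, plus the collaborative forwarding) can be sketched as plain callables. The list standing in for the air interface, the `sensor` callable, and the class layout are assumptions; real implementations would use radio interfaces such as the ones described later for FIG. 15.

```python
class UserEquipment:
    def __init__(self, sensor, uplink):
        self.sensor = sensor        # acquiring unit: callable returning environment data
        self.uplink = uplink        # transmitting unit: list standing in for the air interface
        self.indications = []       # receiving unit: indications from the control node

    def acquire_and_transmit(self):
        """Acquire environment data and transmit it to the control node."""
        self.uplink.append(("data", self.sensor()))

    def forward(self, peer_data):
        """Relay data received from another UE over proximity-based services."""
        self.uplink.append(("relayed", peer_data))

    def receive(self, indication):
        """Accept indication information about an acquisition adjustment."""
        self.indications.append(indication)

uplink = []
ue = UserEquipment(sensor=lambda: {"image": "frame-1"}, uplink=uplink)
ue.acquire_and_transmit()
ue.forward({"image": "frame-from-peer"})
ue.receive("zoom_in")
print(len(uplink), ue.indications)  # 2 ['zoom_in']
```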
  • FIG. 6 shows a configuration example of an electronic device for user equipment side according to another embodiment.
  • the processing circuitry of the electronic device 600 as shown in FIG. 6 further includes a presenting unit 617, in addition to an acquiring unit 611, a transmitting unit 613 and a receiving unit 615 which are similar to the acquiring unit 511, the transmitting unit 513 and the receiving unit 515.
  • the presenting unit 617 is configured to perform a control to present indication information received by the receiving unit 615 .
  • the presenting unit 617 may present the indication information received by the receiving unit 615 by means of, but not limited to, image, text, voice or the like.
  • FIG. 13 shows an example of presented indication information.
  • a zoom-in indication and a rotation indication are displayed on a screen of a user equipment while the user equipment is capturing.
  • the screen may also display indication information corresponding to various adjustment manners, such as zoom-in, zoom-out, translation (upward, downward, leftward, rightward, or the like), and rotation (about a horizontal axis, a vertical axis, or the like).
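The presentation of the adjustment manners listed above can be sketched as a lookup from adjustment codes to on-screen indications. The code names and glyphs in this table are assumptions for illustration only; the disclosure leaves the concrete presentation (image, text, voice) open.

```python
# hypothetical mapping from adjustment codes to screen indications
INDICATIONS = {
    "zoom_in": "magnify (+)",
    "zoom_out": "magnify (-)",
    "translate_left": "← move left",
    "translate_right": "→ move right",
    "translate_up": "↑ move up",
    "translate_down": "↓ move down",
    "rotate_horizontal": "rotate about horizontal axis",
    "rotate_vertical": "rotate about vertical axis",
}

def present(adjustments):
    """Return the indication lines to draw on the capture screen,
    silently skipping unknown adjustment codes."""
    return [INDICATIONS[a] for a in adjustments if a in INDICATIONS]

print(present(["zoom_in", "rotate_vertical"]))
```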
  • FIG. 7 shows a configuration example of an electronic device for user equipment side according to another embodiment.
  • the processing circuitry of the electronic device 700 as shown in FIG. 7 further includes an indicating unit 717, in addition to an acquiring unit 711, a transmitting unit 713 and a receiving unit 715 which are similar to the acquiring unit 511, the transmitting unit 513 and the receiving unit 515.
  • the indicating unit 717 is configured to perform a control to indicate, to the control node, whether the environment data is to participate in a fusion.
  • FIG. 8 shows a procedure example of an information processing method for user equipment side according to another embodiment.
  • the procedure includes the following steps.
  • in step S810, environment data is acquired.
  • in step S820, the acquired environment data is transmitted to a control node.
  • in step S830, indication information related to an adjustment for an acquisition manner of the environment data is received from the control node.
  • the adjustment is determined based on a comparison between the environment data and reference data.
  • a computer readable medium is also provided according to an embodiment of the present disclosure, which includes executable instructions that, when executed by an information processing apparatus, cause the information processing apparatus to implement the methods according to the above embodiments.
  • FIG. 9 shows an information processing apparatus 900 according to an embodiment, which includes a comparing device 910 configured to compare environment data acquired by a user equipment with reference data, a determining device 920 configured to determine, based on the comparison, an adjustment for an acquisition manner of the environment data, and a notifying device 930 configured to perform a control to notify the user equipment of indication information related to the adjustment.
  • FIG. 10 shows an information processing apparatus 1000 according to an embodiment, which includes an acquiring device 1010 configured to acquire environment data, a transmitting device 1020 configured to transmit the acquired environment data to a control node, and a receiving device 1030 configured to receive, from the control node, indication information related to an adjustment of an acquisition manner of the environment data. The adjustment is determined based on a comparison between the environment data and reference data.
  • the steps of the above methods and the modules of the above apparatuses may be realized by software, firmware, hardware, or a combination thereof.
  • a program constituting the software for implementing the above methods is installed, from a storage medium or a network, into a computer having a dedicated hardware structure (such as the general-purpose computer 2000 shown in FIG. 14), which is capable of implementing various functions when various programs are installed thereon.
  • a central processing unit (CPU) 2001 executes various processing according to a program stored in a read-only memory (ROM) 2002 or a program loaded into a random access memory (RAM) 2003 from a storage section 2008.
  • the data needed for the various processing of the CPU 2001 may be stored in the RAM 2003 as needed.
  • the CPU 2001 , the ROM 2002 and the RAM 2003 are linked with each other via a bus 2004 .
  • An input/output interface 2005 is also linked to the bus 2004 .
  • the following components are linked to the input/output interface 2005: an input section 2006 (including a keyboard, a mouse and the like), an output section 2007 (including displays such as a cathode ray tube (CRT) and a liquid crystal display (LCD), and a loudspeaker), a storage section 2008 (including a hard disc and the like), and a communication section 2009 (including a network interface card such as a LAN card, a modem and the like).
  • the communication section 2009 performs communication processing via a network such as the Internet.
  • a driver 2010 may also be linked to the input/output interface 2005 .
  • a removable medium 2011, for example, a magnetic disc, an optical disc, a magneto-optical disc, a semiconductor memory or the like, may be installed in the driver 2010, so that a computer program read therefrom is installed into the storage section 2008 as appropriate.
  • programs forming the software are installed from a network such as the Internet or a memory medium such as the removable medium 2011 .
  • the memory medium is not limited to the removable medium 2011 shown in FIG. 14, which has programs stored therein and is distributed separately from the apparatus so as to provide the programs to users.
  • the removable medium 2011 may be, for example, a magnetic disc (including a floppy disc (registered trademark)), a compact disc (including a compact disc read-only memory (CD-ROM) and a digital versatile disc (DVD)), a magneto-optical disc (including a mini disc (MD) (registered trademark)), or a semiconductor memory.
  • alternatively, the memory medium may be the ROM 2002 or a hard disc included in the storage section 2008, in which programs are stored and which is distributed to users along with the device in which it is incorporated.
  • the present invention further includes an embodiment of a program product in which machine-readable instruction codes are stored.
  • the aforementioned methods according to the embodiments can be implemented when the instruction codes are read and executed by a machine.
  • the memory medium includes, but is not limited to, a floppy disc, an optical disc, a magneto-optical disc, a memory card, a memory stick and the like.
  • the embodiments of the present invention further include the following electronic device.
  • the electronic device may be implemented as any type of gNB or evolved Node B (eNB), such as a macro eNB or a small eNB.
  • the small eNB may be an eNB which covers a cell smaller than a macro cell, such as a pico eNB, a micro eNB and a home (femto) eNB.
  • the electronic device may be implemented as any other type of base station, such as a Node B and a base transceiver station (BTS).
  • the electronic device may include: a main body (also referred to as base station device) configured to control the wireless communication, and one or more remote radio heads (RRH) provided at a different site from the main body.
  • various types of terminal devices to be described below may function as a base station by performing the function of the base station temporarily or semi-permanently.
  • the electronic device may be implemented as a mobile terminal (such as smart phone, a panel personal computer (PC), a notebook PC, a portable game terminal, a portable/dongle mobile router and a digital camera) or an on-board terminal device (such as car navigation device). Further, the electronic device may be a wireless communication module mounted on each of the above terminals (such as integrated circuit module including a single chip or multiple chips).
  • FIG. 15 is a block diagram illustrating an example of a schematic configuration of a smartphone 2500 to which the technology of the present disclosure may be applied.
  • the smartphone 2500 includes a processor 2501 , a memory 2502 , a storage 2503 , an external connection interface 2504 , a camera 2506 , a sensor 2507 , a microphone 2508 , an input apparatus 2509 , a display apparatus 2510 , a speaker 2511 , a radio communication interface 2512 , one or more antenna switches 2515 , one or more antennas 2516 , a bus 2517 , a battery 2518 , and an auxiliary controller 2519 .
  • the processor 2501 may be, for example, a CPU or a system on a chip (SoC), and controls functions of an application layer and another layer of the smartphone 2500 .
  • the memory 2502 includes RAM and ROM, and stores a program that is executed by the processor 2501 , and data.
  • the storage 2503 may include a storage medium such as a semiconductor memory and a hard disk.
  • the external connection interface 2504 is an interface for connecting an external device (such as a memory card and a universal serial bus (USB) device) to the smartphone 2500 .
  • the camera 2506 includes an image sensor (such as a charge coupled device (CCD) and a complementary metal oxide semiconductor (CMOS)), and generates a captured image.
  • the sensor 2507 may include a group of sensors such as a measurement sensor, a gyro sensor, a geomagnetic sensor, and an acceleration sensor.
  • the microphone 2508 converts sounds that are input to the smartphone 2500 to audio signals.
  • the input apparatus 2509 includes, for example, a touch sensor configured to detect touch onto a screen of the display apparatus 2510 , a keypad, a keyboard, a button, or a switch, and receives an operation or an information input from a user.
  • the display apparatus 2510 includes a screen (such as a liquid crystal display (LCD) and an organic light-emitting diode (OLED) display), and displays an output image of the smartphone 2500 .
  • the speaker 2511 converts audio signals that are output from the smartphone 2500 to sounds.
  • the radio communication interface 2512 supports any cellular communication scheme (such as LTE and LTE-Advanced), and performs radio communication.
  • the radio communication interface 2512 may typically include, for example, a BB processor 2513 and an RF circuit 2514 .
  • the BB processor 2513 may perform, for example, encoding/decoding, modulating/demodulating, and multiplexing/demultiplexing, and performs various types of signal processing for radio communication.
  • the RF circuit 2514 may include, for example, a mixer, a filter, and an amplifier, and transmits and receives radio signals via the antenna 2516 .
  • the radio communication interface 2512 may be a one chip module having the BB processor 2513 and the RF circuit 2514 integrated thereon.
  • the radio communication interface 2512 may include the multiple BB processors 2513 and the multiple RF circuits 2514 , as illustrated in FIG. 15 .
  • although FIG. 15 illustrates the example in which the radio communication interface 2512 includes multiple BB processors 2513 and multiple RF circuits 2514, the radio communication interface 2512 may also include a single BB processor 2513 or a single RF circuit 2514.
  • the radio communication interface 2512 may support another type of radio communication scheme such as a short-distance wireless communication scheme, a near field communication scheme, and a radio local area network (LAN) scheme.
  • the radio communication interface 2512 may include the BB processor 2513 and the RF circuit 2514 for each radio communication scheme.
  • Each of the antenna switches 2515 switches connection destinations of the antennas 2516 among multiple circuits (such as circuits for different radio communication schemes) included in the radio communication interface 2512 .
  • Each of the antennas 2516 includes a single or multiple antenna elements (such as multiple antenna elements included in an MIMO antenna), and is used for the radio communication interface 2512 to transmit and receive radio signals.
  • the smartphone 2500 may include the multiple antennas 2516 , as illustrated in FIG. 15 . Although FIG. 15 illustrates the example in which the smartphone 2500 includes the multiple antennas 2516 , the smartphone 2500 may also include a single antenna 2516 .
  • the smartphone 2500 may include the antenna 2516 for each radio communication scheme.
  • the antenna switches 2515 may be omitted from the configuration of the smartphone 2500 .
  • the bus 2517 connects the processor 2501 , the memory 2502 , the storage 2503 , the external connection interface 2504 , the camera 2506 , the sensor 2507 , the microphone 2508 , the input apparatus 2509 , the display apparatus 2510 , the speaker 2511 , the radio communication interface 2512 , and the auxiliary controller 2519 to each other.
  • the battery 2518 supplies power to blocks of the smartphone 2500 illustrated in FIG. 15 via feeder lines, which are partially shown as dashed lines in the figure.
  • the auxiliary controller 2519 operates a minimum necessary function of the smartphone 2500 , for example, in a sleep mode.
  • the transceiver device of the apparatus for user equipment side may be implemented by the radio communication interface 2512 .
  • At least a part of the functions of the processing circuit and/or various units of the electronic device or information processing apparatus for user equipment side according to the embodiments of the present disclosure may also be implemented by the processor 2501 or the auxiliary controller 2519 .
  • when the auxiliary controller 2519 performs a part of the functions of the processor 2501, the power consumption of the battery 2518 can be reduced.
  • the processor 2501 or the auxiliary controller 2519 may perform at least a part of the functions of the processing circuit and/or various units of the electronic device or information processing apparatus for user equipment side according to the embodiments of the present disclosure, by executing a program stored in the memory 2502 or the storage 2503 .
  • FIG. 16 is a block diagram illustrating an example of a schematic configuration of a base station, such as an evolved Node B (eNB) to which the technology of the present disclosure may be applied.
  • An eNB 2300 includes one or more antennas 2310 and a base station apparatus 2320 . Each antenna 2310 and the base station apparatus 2320 may be connected to each other via a radio frequency (RF) cable.
  • Each of the antennas 2310 includes a single or multiple antenna elements (such as multiple antenna elements included in a multi-input and multi-output (MIMO) antenna), and is used for the base station apparatus 2320 to transmit and receive radio signals.
  • the eNB 2300 may include the multiple antennas 2310 , as illustrated in FIG. 16 .
  • the multiple antennas 2310 may be compatible with multiple frequency bands used by the eNB 2300 .
  • although FIG. 16 illustrates the example in which the eNB 2300 includes the multiple antennas 2310, the eNB 2300 may also include a single antenna 2310.
  • the base station apparatus 2320 includes a controller 2321 , a memory 2322 , a network interface 2323 , and a radio communication interface 2325 .
  • the controller 2321 may be, for example, a CPU or a DSP, and operates various functions of a higher layer of the base station apparatus 2320 .
  • the controller 2321 generates a data packet from data in signals processed by the radio communication interface 2325 , and transfers the generated packet via the network interface 2323 .
  • the controller 2321 may bundle data from multiple baseband processors to generate a bundled packet, and transfer the generated bundled packet.
  • the controller 2321 may have logical functions of performing control such as radio resource control, radio bearer control, mobility management, admission control, and scheduling. The control may be performed in cooperation with an eNB or a core network node in the vicinity.
  • the memory 2322 includes RAM and ROM, and stores a program that is executed by the controller 2321 , and various types of control data (such as a terminal list, transmission power data, and scheduling data).
  • the network interface 2323 is a communication interface for connecting the base station apparatus 2320 to a core network 2324 .
  • the controller 2321 may communicate with a core network node or another eNB via the network interface 2323 .
  • the eNB 2300 , and the core network node or the other eNB may be connected to each other through a logical interface (such as an S1 interface and an X2 interface).
  • the network interface 2323 may also be a wired communication interface or a radio communication interface for radio backhaul. If the network interface 2323 is a radio communication interface, the network interface 2323 may use a higher frequency band for radio communication than a frequency band used by the radio communication interface 2325 .
  • the radio communication interface 2325 supports any cellular communication scheme such as Long Term Evolution (LTE) and LTE-Advanced, and provides radio connection to a terminal positioned in a cell of the eNB 2300 via the antenna 2310 .
  • the radio communication interface 2325 may typically include, for example, a baseband (BB) processor 2326 and an RF circuit 2327 .
  • the BB processor 2326 may perform, for example, encoding/decoding, modulating/demodulating, and multiplexing/demultiplexing, and performs various types of signal processing of layers (such as L1, medium access control (MAC), radio link control (RLC), and packet data convergence protocol (PDCP)).
  • the BB processor 2326 may have a part or all of the above-described logical functions instead of the controller 2321 .
  • the BB processor 2326 may be a memory that stores a communication control program, or a module that includes a processor and a related circuit configured to execute the program. Updating the program may allow the functions of the BB processor 2326 to be changed.
  • the module may be a card or a blade that is inserted into a slot of the base station apparatus 2320 . Alternatively, the module may also be a chip that is mounted on the card or the blade.
  • the RF circuit 2327 may include, for example, a mixer, a filter, and an amplifier, and transmits and receives radio signals via the antenna 2310 .
  • the radio communication interface 2325 may include the multiple BB processors 2326 , as illustrated in FIG. 16 .
  • the multiple BB processors 2326 may be compatible with multiple frequency bands used by the eNB 2300.
  • the radio communication interface 2325 may include the multiple RF circuits 2327 , as illustrated in FIG. 16 .
  • the multiple RF circuits 2327 may be compatible with multiple antenna elements.
  • although FIG. 16 illustrates the example in which the radio communication interface 2325 includes the multiple BB processors 2326 and the multiple RF circuits 2327, the radio communication interface 2325 may also include a single BB processor 2326 or a single RF circuit 2327.
  • the transceiver device of the apparatus for base station side may be implemented by the radio communication interface 2325 .
  • At least a part of the functions of the processing circuit and/or various units of the electronic device or information processing apparatus for base station side according to the embodiments of the present disclosure may also be implemented by the controller 2321 .
  • the controller 2321 may perform at least a part of the functions of the processing circuit and/or various units of the electronic device or information processing apparatus for base station side according to the embodiments of the present disclosure, by executing a program stored in the memory 2322 .
  • the methods in the present disclosure are not limited to being performed in the time sequence described herein; they may be performed in parallel, independently, or in another time sequence. Therefore, the order in which the methods are performed does not limit the technical scope of the present disclosure.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Accounting & Taxation (AREA)
  • Data Mining & Analysis (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Business, Economics & Management (AREA)
  • Strategic Management (AREA)
  • Geometry (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Remote Sensing (AREA)
  • Computing Systems (AREA)
  • Software Systems (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Astronomy & Astrophysics (AREA)
  • Finance (AREA)
  • Databases & Information Systems (AREA)
  • Mathematical Physics (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Telephonic Communication Services (AREA)
US16/014,353 2017-09-29 2018-06-21 Information processing apparatus and method, electronic device and computer readable medium Abandoned US20190102863A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US17/092,392 US11715177B2 (en) 2017-09-29 2020-11-09 Information processing apparatus and method, electronic device and computer readable medium
US18/331,953 US20230316459A1 (en) 2017-09-29 2023-06-09 Information processing apparatus and method, electronic device and computer readable medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710912367.4 2017-09-29
CN201710912367.4A CN109587203A (zh) Information processing device and method, electronic apparatus, and computer-readable medium

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/092,392 Continuation US11715177B2 (en) 2017-09-29 2020-11-09 Information processing apparatus and method, electronic device and computer readable medium

Publications (1)

Publication Number Publication Date
US20190102863A1 true US20190102863A1 (en) 2019-04-04

Family

ID=65897771

Family Applications (3)

Application Number Title Priority Date Filing Date
US16/014,353 Abandoned US20190102863A1 (en) 2017-09-29 2018-06-21 Information processing apparatus and method, electronic device and computer readable medium
US17/092,392 Active US11715177B2 (en) 2017-09-29 2020-11-09 Information processing apparatus and method, electronic device and computer readable medium
US18/331,953 Pending US20230316459A1 (en) 2017-09-29 2023-06-09 Information processing apparatus and method, electronic device and computer readable medium

Family Applications After (2)

Application Number Title Priority Date Filing Date
US17/092,392 Active US11715177B2 (en) 2017-09-29 2020-11-09 Information processing apparatus and method, electronic device and computer readable medium
US18/331,953 Pending US20230316459A1 (en) 2017-09-29 2023-06-09 Information processing apparatus and method, electronic device and computer readable medium

Country Status (2)

Country Link
US (3) US20190102863A1 (zh)
CN (1) CN109587203A (zh)


Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111796980B * 2019-04-09 2023-02-28 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Data processing method and apparatus, electronic device, and storage medium
CN111294515B * 2020-02-25 2021-07-16 Vivo Software Technology Co., Ltd. Image acquisition method and first electronic device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100205035A1 (en) * 2009-02-09 2010-08-12 Baszucki David B Providing Advertisements in Virtual Environments
US20110013810A1 (en) * 2009-07-17 2011-01-20 Engstroem Jimmy System and method for automatic tagging of a digital image
US8413206B1 (en) * 2012-04-09 2013-04-02 Youtoo Technologies, LLC Participating in television programs
US20140210941A1 (en) * 2013-01-29 2014-07-31 Sony Corporation Image capture apparatus, image capture method, and image capture program
US20140368600A1 (en) * 2013-06-16 2014-12-18 Samsung Electronics Co., Ltd. Video call method and electronic device supporting the method
US9110988B1 (en) * 2013-03-14 2015-08-18 Google Inc. Methods, systems, and media for aggregating and presenting multiple videos of an event

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102157011B * 2010-12-10 2012-12-26 Peking University Method for dynamic texture acquisition and virtual-real fusion using a mobile capture device
US9363441B2 (en) * 2011-12-06 2016-06-07 Musco Corporation Apparatus, system and method for tracking subject with still or video camera
CN105306921A * 2014-06-18 2016-02-03 ZTE Corporation Mobile-terminal-based three-dimensional photo capturing method and mobile terminal
US10217022B2 (en) * 2015-03-06 2019-02-26 Ricoh Company, Ltd. Image acquisition and management
CN105049727B * 2015-08-13 2019-05-21 Xiaomi Technology Co., Ltd. Panoramic image capturing method, apparatus and system
US10453172B2 (en) * 2017-04-04 2019-10-22 International Business Machines Corporation Sparse-data generative model for pseudo-puppet memory recast
CN107103645B * 2017-04-27 2018-07-20 Tencent Technology (Shenzhen) Co., Ltd. Virtual reality media file generation method and apparatus


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190028707A1 (en) * 2016-01-12 2019-01-24 Shanghaitech University Compression method and apparatus for panoramic stereo video system
US10643305B2 (en) * 2016-01-12 2020-05-05 Shanghaitech University Compression method and apparatus for panoramic stereo video system
CN112135292A (zh) * 2020-09-24 2020-12-25 Vivo Mobile Communication Co., Ltd. Signal acquisition method and apparatus, electronic device, and storage medium
CN117708680A (zh) * 2024-02-06 2024-03-15 Qingdao Haier Technology Co., Ltd. Method and apparatus for improving classification model accuracy, storage medium, and electronic apparatus

Also Published As

Publication number Publication date
CN109587203A (zh) 2019-04-05
US20210056664A1 (en) 2021-02-25
US11715177B2 (en) 2023-08-01
US20230316459A1 (en) 2023-10-05

Similar Documents

Publication Publication Date Title
US11715177B2 (en) Information processing apparatus and method, electronic device and computer readable medium
US10431183B2 (en) Wireless device displaying images and matching resolution or aspect ratio for screen sharing during Wi-Fi direct service
EP3616424B1 (en) Electronic device and proximity discovery method thereof
JP6859590B2 (ja) Communication device and communication method
CN110089045B (zh) Base station, terminal device, method, and recording medium
KR101973934B1 (ko) Method for providing augmented reality service, and user terminal device and access point using the same
KR102391111B1 (ko) Control device and method
KR102623181B1 (ko) Wireless device and wireless system
CN111684857B (zh) Electronic device, user equipment, wireless communication method, and storage medium
CN108028961B (zh) Method and device for transmitting and receiving data in a communication system
CN103124442A (zh) Device and method for connecting to an apparatus in a wireless terminal
US10795831B2 (en) Information processing device, communication system, information processing method
WO2022028314A1 (zh) Electronic device and method for wireless communication, and computer-readable storage medium
KR20170111000A (ko) Display device and operating method thereof
KR20160092415A (ko) Mobile terminal and control method thereof
KR20170052353A (ko) Mobile terminal and operating method thereof
CN114449593A (zh) Cell handover method and communication apparatus
CN113994724B (zh) Electronic device, wireless communication method, and computer-readable storage medium
US11758221B2 (en) Wireless communication connection system including mobile terminal and electronic device to perform wireless communication connection through mobile terminal
WO2023138470A1 (zh) Electronic device, method for wireless communication, and computer-readable storage medium
EP4135471A1 (en) Electronic device, wireless communication method, and computer readable storage medium
EP4192176A1 (en) Wireless communication method and apparatus, and storage medium
WO2017094360A1 (ja) Device, method, and program
CN116830480A (zh) Information transmission method and apparatus, and storage medium

Legal Events

Date Code Title Description
AS Assignment

Owner name: SONY CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SUN, CHEN;GUO, XIN;KONG, LINGMING;SIGNING DATES FROM 20180507 TO 20180607;REEL/FRAME:046406/0382

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: ADVISORY ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION