CN106657922B - Scene switching type shared image processing system based on position - Google Patents


Info

Publication number
CN106657922B
CN106657922B (application CN201710017390.7A)
Authority
CN
China
Prior art keywords
information
image
stored
user equipment
storage unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710017390.7A
Other languages
Chinese (zh)
Other versions
CN106657922A (en)
Inventor
朱磊
韩琦
李建英
杨晓光
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
QINGDAO YISPACE TECHNOLOGY Co.,Ltd.
Original Assignee
Harbin Yishe Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Harbin Yishe Technology Co Ltd filed Critical Harbin Yishe Technology Co Ltd
Priority to CN201710017390.7A priority Critical patent/CN106657922B/en
Publication of CN106657922A publication Critical patent/CN106657922A/en
Application granted granted Critical
Publication of CN106657922B publication Critical patent/CN106657922B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/268Signal distribution or switching

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a position-based scene-switching shared image processing system that supports multiple users in independently selecting a viewing angle. The system comprises a storage unit, a processing unit, and a wireless transmission unit. The storage unit stores a plurality of groups of predetermined image sequences carrying position information and reference information, where the reference information comprises information such as season, time period, and weather. The processing unit receives transmission instructions from one or more user devices and, for each instruction, acquires from the storage unit the image frame sequence matching the relevant position information contained in the instruction, the reference information of the user equipment, and the sending gesture, then sends it to the user equipment. The invention provides users with a scene-switching viewing experience at the sightseeing site.

Description

Scene switching type shared image processing system based on position
Technical Field
The invention relates to an information processing technology, in particular to a scene switching type shared image processing system based on positions.
Background
With the development of the tourism economy, the tourism industry is flourishing, and natural landscapes are its most important part. The beauty of natural scenery often depends on the angle and time of viewing, and many beautiful views can only be seen from very particular positions and angles, as the saying goes, "the boundless scenery lies on the perilous peak". Unfortunately, for cost and safety reasons, most scenic spots confine viewing to established routes, and visitors have no opportunity to take in the natural scenery from the best viewing angles. Conversely, if visitors could enjoy a scenic spot's scenery from any position and any viewing angle, their sightseeing experience would certainly be greatly improved.
The existing means of expanding the sightseeing viewing angle and field of view are mainly telescopes, cable cars, hot air balloons, pictures, documentaries (including 3D documentaries), virtual reality (VR), and the like; none of these gives visitors an on-site, free-viewing-angle sightseeing experience at extremely low cost.
Disclosure of Invention
The following presents a simplified summary of the invention in order to provide a basic understanding of some aspects of the invention. It should be understood that this summary is not an exhaustive overview of the invention. It is not intended to identify key or critical elements of the invention, nor to limit the scope of the invention. Its sole purpose is to present some concepts in a simplified form as a prelude to the more detailed description discussed later.
In view of this, the present invention provides a position-based scene-switching shared image processing system, so as to at least solve the problem that the prior art cannot provide low-cost, real-time, multi-user shared viewing from arbitrary viewing angles at a sightseeing site.
According to an aspect of the present invention, there is provided a location-based scene-switching shared image processing system including a storage unit, a processing unit, and a wireless transmission unit. The storage unit is configured to store a plurality of groups of predetermined image sequences, each group comprising a plurality of pre-stored wide view images, each of which has position information and reference information. The processing unit is configured to receive, via the wireless transmission unit, at least one transmission instruction from one or more user devices and, for each transmission instruction: extract from it the relevant position information, the reference information of the corresponding user equipment, and the sending gesture; acquire, from the plurality of groups of predetermined image sequences stored in the storage unit, the image frame sequence matching the relevant position information, the reference information, and the sending gesture; and send that sequence to the user equipment. The reference information includes any one or more of season information, time period information, and weather information.
Further, each pre-stored wide view image in each group of predetermined image sequences pre-stored in the storage unit has been direction-corrected in advance, so that the direction information corresponding to each pixel position in the image is an absolute direction.
Alternatively, each pre-stored wide view image in each group of predetermined image sequences pre-stored in the storage unit has not been direction-corrected, so that the direction information corresponding to each pixel position is a relative direction; in this case the storage unit further stores related information for calculating the absolute direction corresponding to each pixel position in each pre-stored wide view image, and the position-based scene-switching shared image processing system further comprises a direction correction unit. The direction correction unit uses the related information to calculate, in advance, the absolute direction corresponding to each pixel position in each pre-stored wide view image, thereby correcting the direction information of each pixel position from a relative direction to an absolute direction.
Further, the processing unit is configured to obtain the image frame sequence matching the relevant position information, the reference information, and the sending gesture of the corresponding user equipment from the storage unit as follows: select, from the plurality of groups of predetermined image sequences stored in the storage unit, one or more frames of pre-stored wide view images matching the relevant position information and the reference information of the user equipment as candidate images; for each frame of the selected candidate images, capture from it, according to the direction information corresponding to each pixel position, a screenshot corresponding to the sending gesture included in the transmission instruction, the size of the screenshot being a preset value; and form an image frame sequence in chronological order from all the captured screenshots, as the image frame sequence matching the relevant position information, the reference information of the corresponding user equipment, and the sending gesture.
Further, the location-based scene-switching shared image processing system includes a user management unit, which is configured to receive an identity authentication request from each user device, authenticate the corresponding device based on the request, and, after successful authentication, send authentication success information to that device to establish a data connection with it.
Further, the processing unit is also configured to: when at least one transmission instruction is received from one or more user devices, determine, for each transmission instruction, whether it contains payment information, and establish a data connection with the user equipment that sent the instruction only when it does.
Further, the sending gesture includes: the screen normal direction of the display module in the corresponding user equipment, or the normal direction of the back of the screen.
Further, the location-based scene-switching shared image processing system is used for being arranged on any one of the following devices: a cable car, a sightseeing vehicle, a train, an airplane or a ship.
The position-based scene-switching shared image processing system of the present invention has the following beneficial effects:
1) a user can, at any time on site, view scenery under different conditions matching his or her current position (or a position related to the viewing site), such as scenery of the same spot in different seasons, different weather, and/or different time periods, thereby realizing position-based scene switching;
2) through the scene-switching shared image processing system, a user can watch, on his or her own user equipment, unique scenery from positions and viewing angles the user cannot physically reach;
3) multiple users can share the system and view scenes at the same time, and can use it to shoot landscape photos and videos from unique positions and viewing angles under particular weather, season, and time conditions;
4) the system supports multiple users in independently selecting their viewing angles, with the viewing direction of the image displayed on each user device matched to that device's screen orientation;
when the system is used and the transmission instruction from the user equipment includes the device's sending gesture, the user can obtain image frame sequences with different viewing angles from the system by changing the sending gesture (for example, by changing the physical posture of the device, or by setting different sending gestures through touch-screen operations); thus, with multiple users, the system returns a different image frame sequence to each user because each device's sending gesture differs, so the images displayed on each device differ and every user independently selects a viewing angle;
in other words, after direction correction by the direction correction unit, the viewing direction corresponding to each frame in the image frame sequence displayed on the screen of the user equipment coincides with the screen-back normal direction (or screen normal direction) of the display module of the user equipment that issued the transmission instruction. Moreover, the user can switch between the screen-back normal direction and the screen normal direction, realizing quick switching between the two viewing directions.
5) the user can pay the rental fee and viewing fee electronically through a mobile terminal such as a mobile phone;
6) scenic spots or related operators can share the best viewing experiences of their scenery with more users through the position-based scene-switching shared image processing system.
These and other advantages of the present invention will become more apparent from the following detailed description of the preferred embodiments of the present invention, taken in conjunction with the accompanying drawings.
Drawings
The invention may be better understood by referring to the following description in conjunction with the accompanying drawings, in which like reference numerals are used throughout the figures to indicate like or similar parts. The accompanying drawings, which are incorporated in and form a part of this specification, illustrate preferred embodiments of the present invention and, together with the detailed description, serve to further explain the principles and advantages of the invention. In the drawings:
fig. 1 is a block diagram schematically showing the configuration of one example of the location-based scene-switching shared image processing system of the present invention.
Skilled artisans appreciate that elements in the figures are illustrated for simplicity and clarity and have not necessarily been drawn to scale. For example, the dimensions of some of the elements in the figures may be exaggerated relative to other elements to help improve the understanding of the embodiments of the present invention.
Detailed Description
Exemplary embodiments of the present invention will be described hereinafter with reference to the accompanying drawings. In the interest of clarity and conciseness, not all features of an actual implementation are described in the specification. It will of course be appreciated that in the development of any such actual embodiment, numerous implementation-specific decisions must be made to achieve the developers' specific goals, such as compliance with system-related and business-related constraints, which will vary from one implementation to another. Moreover, it will be appreciated that such a development effort might be complex and time-consuming, but would nevertheless be a routine undertaking for those of ordinary skill in the art having the benefit of this disclosure.
It should be noted that, in order to avoid obscuring the present invention with unnecessary details, only the device structures and/or processing steps closely related to the solution according to the present invention are shown in the drawings, and other details not so relevant to the present invention are omitted.
The embodiment of the invention provides a position-based scene-switching shared image processing system comprising a storage unit, a processing unit, and a wireless transmission unit. The storage unit is configured to store a plurality of groups of predetermined image sequences, each group comprising a plurality of pre-stored wide view images, each of which has position information and reference information. The processing unit is configured to receive, via the wireless transmission unit, at least one transmission instruction from one or more user devices and, for each transmission instruction, extract from it the relevant position information, the reference information of the corresponding user equipment, and the sending gesture, acquire from the stored predetermined image sequences the image frame sequence matching these three items, and send that sequence to the user equipment. The reference information includes any one or more of season information, time period information, and weather information.
An example of the location-based scene-switching shared image processing system of the present invention is described below with reference to fig. 1.
The location-based scene-switching shared image processing system may be provided as part of a terminal device such as a smartphone, a tablet computer, or a head-mounted VR (virtual reality) system. In this case, the system of the present invention may be integrated with a user device.
In addition, the location-based scene-switching shared image processing system can also be installed as a separate device on a vehicle in a scenic spot (such as a cable car, a sightseeing bus, a train, an airplane, or a ship at a specific location) or at a fixed location in the scenic spot (i.e., the sightseeing site), such as a wall of a particular sight, and so on.
As shown in fig. 1, the location-based scene-switching shared image processing system includes a storage unit 1, a processing unit 2, and a wireless transmission unit 3. The processing unit 2 may be, for example, a CPU, a microprocessor, or another component with control and processing functions. The wireless transmission unit 3 is, for example, a Wi-Fi transmission module or a Bluetooth transmission module.
The storage unit 1 is configured to store a plurality of groups of predetermined image sequences, each group of predetermined image sequences including a plurality of pre-stored wide view images, where each pre-stored wide view image has position information and reference information.
The position information of each pre-stored wide view image refers to the position of the capture device at the time the image was captured, and may be obtained, for example, from a GPS positioning module in the capture device.
In addition, the reference information of each pre-stored wide view image may be any one or more of: season information, time period information, and weather information. The season information is, for example, any one of spring, summer, autumn, or winter; alternatively, seasons may be divided more finely, such as early spring, midsummer, early winter, and so on. The time period information is, for example, a period within the 24 hours of a day, such as noon (12:00-14:00) or evening (17:00-19:00); alternatively, the day may be divided into equal intervals, such as (0:00-4:00), (4:00-8:00), …, (20:00-24:00), and so on. The weather information is, for example, moderate rain, fine weather, light snow, and the like.
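As an illustrative sketch only (the field names and types below are assumptions, not taken from the patent), the position and reference information attached to each pre-stored wide view image could be modeled as:

```python
from dataclasses import dataclass

# Hypothetical metadata record for one pre-stored wide view image:
# capture position plus the "reference information" (season, time
# period, weather) described above.
@dataclass(frozen=True)
class ImageMetadata:
    position: tuple      # e.g. (latitude, longitude) from the capture device's GPS
    season: str          # e.g. "spring", "midsummer", "early winter"
    time_period: tuple   # e.g. (12, 14) for the 12:00-14:00 slot
    weather: str         # e.g. "fine", "moderate rain", "light snow"

meta = ImageMetadata(position=(45.75, 126.63),
                     season="winter",
                     time_period=(12, 14),
                     weather="fine")
print(meta.season)  # winter
```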
In one implementation (hereinafter referred to as the first implementation), each pre-stored wide view image in each group of predetermined image sequences stored by the storage unit 1 has not undergone direction correction processing. In other words, the direction information corresponding to each pixel position in each pre-stored wide view image is a relative direction: a direction relative to the overall orientation of the corresponding capture device, i.e., the device that captured that pre-stored wide view image. The relative direction corresponding to each pixel position can be obtained using the prior art, for example calculated from the orientation of the capture device in question and the respective orientations of the lenses it contains.
In another implementation (hereinafter referred to as a second implementation), each pre-stored wide view image in each set of predetermined image sequences stored by the storage unit 1 is subjected to the direction correction processing in advance. In other words, the direction information corresponding to each pixel position in each pre-stored wide view image in each set of the predetermined image sequences stored in the storage unit 1 is an absolute direction.
In yet another implementation (hereinafter referred to as the third implementation), the location-based scene-switching shared image processing system may further include a direction correction unit, while each pre-stored wide view image in each group of predetermined image sequences pre-stored in the storage unit 1 has not undergone direction correction processing; that is, the direction information corresponding to each pixel position is a relative direction. In addition, the storage unit 1 also stores related information for calculating the absolute direction corresponding to each pixel position in each pre-stored wide view image.
The related information is, for example, a plurality of groups of predetermined viewing direction sequences, each corresponding to one group of predetermined image sequences, where each predetermined viewing direction represents the viewing direction of the corresponding pre-stored wide view image in that group. The viewing direction of a pre-stored wide view image is the overall orientation of the corresponding capture device at the time the image was captured. For example, if the capture device is installed on a vehicle and its overall orientation matches the vehicle's forward direction, then when the vehicle is traveling or flying due south and the device captures a frame of wide view image (stored in the storage unit 1 as a pre-stored wide view image), the viewing direction information corresponding to that frame is due south, and so on.
For example, assume that one group of predetermined image sequences stored in the storage unit 1 can be represented as {I1, I2, I3, …, IM}, where M is a positive integer denoting the number of pre-stored wide view images in the group; and assume that the corresponding viewing direction information sequence can be represented as {D1, D2, D3, …, DM}. In this example, the pre-stored wide view image I1 corresponds to the viewing direction information D1 (i.e., D1 indicates the overall orientation of the capturing device when it captured I1), I2 corresponds to D2, and so on. The direction correction unit can thus use {D1, D2, D3, …, DM} to apply direction correction to {I1, I2, I3, …, IM}, correcting the direction information of each pixel position in each pre-stored wide view image from a relative direction to an absolute direction.
Taking the pre-stored wide view image I1 in the above example: before the direction correction unit performs correction, the direction information corresponding to each pixel position in I1 is a relative direction. The corresponding viewing direction information D1 gives the overall orientation of the capturing device when it captured I1, from which the absolute direction corresponding to each pixel position in I1 can be calculated. Replacing the relative direction at each pixel position with this absolute direction completes the direction correction processing for I1. The other pre-stored wide view images are processed similarly, and the details are not repeated.
In this way, the direction correction processing for each pre-stored wide view image in each set of the predetermined image sequence can be completed by the direction correction unit, so that the direction information corresponding to each pixel position in each pre-stored wide view image is corrected from the relative direction to the absolute direction. That is, in this implementation (i.e., the third implementation), the direction information corresponding to each pixel position in each pre-stored wide view image in each set of the predetermined image sequences stored by the storage unit 1 is an absolute direction.
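Under the simplifying assumption (for illustration only) that per-pixel direction information is stored as a compass bearing in degrees, the correction step described above amounts to adding the frame's viewing direction Di and wrapping into [0, 360); the function name and representation are a sketch, not the patent's actual encoding:

```python
# Hypothetical sketch of the direction-correction step: each pixel's
# relative bearing (degrees, relative to the capture device's overall
# orientation) becomes an absolute compass bearing by adding the
# viewing direction Di recorded for that frame.
def correct_direction(relative_bearings, viewing_direction_deg):
    """Map relative per-pixel bearings to absolute bearings in [0, 360)."""
    return [(b + viewing_direction_deg) % 360.0 for b in relative_bearings]

# Frame captured while the vehicle headed due south (180 degrees):
absolute = correct_direction([0.0, 90.0, 270.0], 180.0)
print(absolute)  # [180.0, 270.0, 90.0]
```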
The absolute direction of each pixel position is a direction relative to an absolute environmental coordinate system whose coordinate axes are fixed; the relative direction of each pixel position is relative to the orientation of the corresponding capture device at capture time, which in some cases (for example, when the device is mounted on a cable car) varies.
Furthermore, the processing unit 2 may receive at least one transmission instruction from one or more user devices via the wireless transmission unit 3. For each transmission instruction, the processing unit 2 extracts the relevant position information, the reference information of the corresponding user equipment, and the sending gesture, then acquires from the plurality of groups of predetermined image sequences stored in the storage unit 1 the image frame sequence matching these three items and sends it to the user equipment. The reference information of the user equipment includes, as above, any one or more of season information, time period information, and weather information, and is not described again here. Thus, a transmission instruction from a user device contains the relevant position information, the reference information of the device, and the sending gesture. The relevant position information may be the position of the user equipment itself (obtained, for example, through a GPS positioning module in the device), or the position of a preset spot at the scene (for example, a preset location on a viewing platform in a scenic area, obtained by the user scanning a two-dimensional code associated with that location). The sending gesture of the user equipment is the pose of the device at the moment it sends the transmission instruction, and may be, for example, the screen normal direction of the device's display module or the normal direction of the back of the screen.
The sending gesture of the user device may be set by the user through a touch-screen operation on the device; alternatively, a pose-sensing sub-module integrated in the user equipment may acquire the device's current pose as the sending gesture. The user device may be, for example, any terminal device such as a smartphone, a tablet computer, or a head-mounted VR (virtual reality) system.
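For illustration, the fields a transmission instruction carries according to the description above (relevant position information, reference information, and sending gesture) could be sketched as follows; all names and the bearing-based gesture encoding are assumptions, not the patent's wire format:

```python
from dataclasses import dataclass

# Hypothetical contents of one transmission instruction from a user device.
@dataclass
class TransmissionInstruction:
    position: tuple             # device GPS fix, or a preset spot's position
    season: str                 # reference information: season
    time_period: tuple          # reference information: time period
    weather: str                # reference information: weather
    gesture_mode: str           # "screen_normal" or "screen_back_normal"
    gesture_bearing_deg: float  # direction the chosen normal points at send time

instr = TransmissionInstruction(position=(45.75, 126.63),
                                season="winter",
                                time_period=(12, 14),
                                weather="fine",
                                gesture_mode="screen_back_normal",
                                gesture_bearing_deg=135.0)
print(instr.gesture_mode)  # screen_back_normal
```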
For example, the processing unit 2 may acquire a sequence of image frames matching the relevant location information and the reference information and the transmission pose of the corresponding user equipment from the storage unit 1 through steps 201 to 203 as will be described below.
Step 201 is performed first. In step 201, one or more frames of pre-stored wide view images matching the above-mentioned relevant position information and reference information of the user equipment are selected as candidate images in the plurality of sets of predetermined image sequences stored in the storage unit 1.
Take as an example a transmission instruction whose reference information simultaneously contains season information (denoted S1), time period information (denoted T1), and weather information (denoted W1), with relevant position information denoted P1, and assume each image in each group of predetermined image sequences stored in the storage unit 1 likewise carries position, season, time period, and weather information. Among all the predetermined image sequences stored in the storage unit 1, the image sequence satisfying "position information is P1, season information is S1, time period information is T1, and weather information is W1" is obtained; it may consist of one image or a plurality of images. The processing unit 2 determines this image sequence as the candidate image sequence (i.e., the image frames matching the relevant position information and the reference information) and may arrange it in chronological order.
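The matching in step 201 can be sketched as a simple filter over the stored metadata followed by a chronological sort; the data layout and function name below are assumptions for illustration:

```python
# Hypothetical sketch of step 201: keep only the stored frames whose
# position/season/time-period/weather all match the instruction
# (P1, S1, T1, W1), then order them chronologically.
def select_candidates(stored, position, season, time_period, weather):
    """Return stored frames matching all four keys, in chronological order."""
    hits = [f for f in stored
            if f["position"] == position and f["season"] == season
            and f["time_period"] == time_period and f["weather"] == weather]
    return sorted(hits, key=lambda f: f["timestamp"])

stored = [
    {"id": 1, "position": "P1", "season": "S1", "time_period": "T1",
     "weather": "W1", "timestamp": 10},
    {"id": 2, "position": "P1", "season": "S1", "time_period": "T1",
     "weather": "W2", "timestamp": 5},   # wrong weather: filtered out
    {"id": 3, "position": "P1", "season": "S1", "time_period": "T1",
     "weather": "W1", "timestamp": 3},
]
candidates = select_candidates(stored, "P1", "S1", "T1", "W1")
print([f["id"] for f in candidates])  # [3, 1]
```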
Next, in step 202, for each frame of pre-stored wide view image among the candidate images selected in step 201, a screenshot whose direction corresponds to the sending gesture in the transmission instruction is captured from the image, using the direction information corresponding to each pixel position. The size of the screenshot is a preset value, which may be set from an empirical value, or from a screen size parameter included in the transmission instruction sent by the user equipment, or the like.
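Step 202 can be sketched by treating the pre-stored wide view image as a 360-degree panorama whose columns map linearly to bearings, and cutting out a fixed-width window centred on the sending gesture's bearing; this column-to-bearing mapping and the function below are illustrative assumptions, not the patent's actual method:

```python
# Hypothetical sketch of step 202: locate the screenshot window in a
# panorama whose columns map linearly to compass bearings [0, 360).
def crop_by_bearing(panorama_width, bearing_deg, crop_width):
    """Return the (start, end) column range (may wrap) for the screenshot."""
    centre = int(round(bearing_deg / 360.0 * panorama_width)) % panorama_width
    start = (centre - crop_width // 2) % panorama_width
    return start, (start + crop_width) % panorama_width

# 3600-column panorama, gesture pointing due east (90 deg), 400-px screenshot:
print(crop_by_bearing(3600, 90.0, 400))  # (700, 1100)
```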
Then, in step 203, all the screenshots captured in step 202 are assembled in chronological order into an image frame sequence, which is the image frame sequence matching the relevant position information, the reference information of the user equipment, and the sending gesture contained in the transmission instruction.
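Steps 202 and 203 can be sketched as a direction-based crop applied to each candidate frame. The sketch below assumes the wide-view image is an equirectangular panorama whose columns map linearly to viewing directions in [0°, 360°); that mapping is an illustrative assumption, not something the disclosure mandates:

```python
def crop_by_direction(frame, direction_deg, crop_w, crop_h):
    """Step 202: cut a crop_w x crop_h window whose centre column corresponds
    to direction_deg, assuming each column of the equirectangular frame maps
    linearly to a viewing direction in [0, 360). Columns wrap at the seam."""
    h, w = len(frame), len(frame[0])
    centre = int(direction_deg % 360.0 / 360.0 * w)
    top = max(0, (h - crop_h) // 2)  # vertically centred window
    cols = [(centre - crop_w // 2 + i) % w for i in range(crop_w)]
    return [[frame[top + r][c] for c in cols] for r in range(crop_h)]

def build_frame_sequence(candidates, direction_deg, crop_w, crop_h):
    """Step 203: one screenshot per candidate frame, already in time order."""
    return [crop_by_direction(f, direction_deg, crop_w, crop_h)
            for f in candidates]
```

Here the preset screenshot size is passed in as `crop_w` and `crop_h`; in the system described above it could equally be derived from the screen size parameter carried by the transmission instruction.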
Note that, in the first implementation, the direction information corresponding to each pixel position in each frame of pre-stored wide-view image stored in the storage unit 1 is a relative direction. In this case, for a frame of pre-stored wide-view image, the "screenshot whose direction corresponds to the sending gesture in the transmission instruction" captured in step 202 is actually the "screenshot whose relative direction corresponds to the sending gesture in the transmission instruction". Thus, the image or video finally seen on the user equipment behaves as if the user were sitting in the vehicle: even if the user equipment is kept facing one direction, the displayed image or video changes as the moving direction of the vehicle changes.
Further, in the second or third implementation, the direction information corresponding to each pixel position in each frame of pre-stored wide-view image stored in the storage unit 1 is an absolute direction. In this case, for a frame of pre-stored wide-view image, the "screenshot whose direction corresponds to the sending gesture in the transmission instruction" captured in step 202 is actually the "screenshot whose absolute direction corresponds to the sending gesture in the transmission instruction". Therefore, the image or video finally seen on the user equipment corresponds to the absolute direction pointed to by the normal direction of the back of the screen (or the normal direction of the screen) of the user equipment; if the user equipment is kept facing one direction, the displayed image or video does not change as the moving direction of the vehicle changes. In other words, after the direction correction performed by the direction correction unit in the second or third implementation, the viewing direction corresponding to each frame in the image frame sequence displayed on the screen of the display module of the user equipment coincides with the screen-back normal direction (or screen normal direction) of the display module of the user equipment that issued the transmission instruction. In addition, the user can switch between the screen-back normal direction and the screen normal direction, for example, so as to switch quickly between the two viewing directions.
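The relative-to-absolute correction performed by the direction correction unit can be illustrated as follows, under the simplifying assumption that the stored "related information" is the vehicle heading at capture time, so that the correction reduces to a per-pixel rotation modulo 360°. This is a sketch of the idea, not the patented implementation:

```python
def to_absolute(relative_deg, vehicle_heading_deg):
    """Correct one relative direction to an absolute (compass) direction,
    assuming the related information is the vehicle heading at capture time."""
    return (relative_deg + vehicle_heading_deg) % 360.0

def correct_frame_directions(pixel_dirs, vehicle_heading_deg):
    """Apply the correction to the per-pixel direction map of one
    pre-stored wide-view frame (a 2-D grid of directions in degrees)."""
    return [[to_absolute(d, vehicle_heading_deg) for d in row]
            for row in pixel_dirs]
```

After this pre-processing, step 202 can match the sending gesture's absolute direction directly against the corrected per-pixel directions, which is why the displayed view stays fixed in world coordinates regardless of how the vehicle turns.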
In addition, when the scene-switching shared image processing system is provided on an unmanned vehicle, the unmanned vehicle is located at a sightseeing spot, and one or more users carrying their respective user devices are also located at the sightseeing spot and establish a wireless connection with the scene-switching shared image processing system for data transmission. When the scene-switching shared image processing system is provided on a manned vehicle such as a cable car, sightseeing bus, train, airplane, or ship, one or more users ride on the vehicle with their respective user devices, and the vehicle is located at a sightseeing spot. Furthermore, when the scene-switching shared image processing system is provided at a fixed location such as a tourist site, one or more users carrying their respective user devices are also located at the tourist site.
According to another implementation, when receiving at least one transmission instruction from one or more user devices, the processing unit 2 may further judge, for each of the at least one transmission instruction, whether the transmission instruction contains paid information; if it does, the processing unit establishes a data connection with the user equipment that issued the transmission instruction, and if it does not, no data connection is established with that user equipment. The paid information is obtained, for example, by an online payment method such as code-scanning payment.
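The paid-information gate described above amounts to a simple check before the data connection is established. In the sketch below the `paid` and `device_id` field names are illustrative, not taken from the disclosure:

```python
def handle_instruction(instruction, connect):
    """Establish a data connection only when the transmission instruction
    carries paid information; otherwise refuse the connection.
    `instruction` is a dict; `connect` is a callback taking a device id."""
    if instruction.get("paid"):
        connect(instruction["device_id"])
        return True
    return False  # no paid information: no data connection
```

In practice `connect` would wrap the wireless transmission unit's session setup; here it is a plain callback so the gating logic stays testable on its own.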
Further, according to one implementation, the location-based scene-switching shared image processing system may further include a user management unit. The user management unit can receive the identity authentication request from each user equipment through the wireless transmission unit 3, perform identity authentication on the corresponding user equipment based on the identity authentication request, and send authentication success information to the corresponding user equipment after the authentication is successful so as to establish data connection with the user equipment.
By applying the position-based scene-switching shared image processing system, a user can, through user equipment such as a mobile phone, take photographs or videos at a position that the user cannot reach in person, which greatly enhances the user experience. Moreover, the system can serve a plurality of users simultaneously and is convenient and fast to operate.
While the invention has been described with respect to a limited number of embodiments, those skilled in the art, having benefit of this description, will appreciate that other embodiments can be devised which do not depart from the scope of the invention as described herein. Furthermore, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the inventive subject matter. Accordingly, many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the appended claims. The present invention has been disclosed in an illustrative rather than a restrictive sense, and the scope of the present invention is defined by the appended claims.

Claims (4)

1. The position-based scene-switching type shared image processing system is characterized by comprising a storage unit, a processing unit and a wireless transmission unit;
the storage unit is used for storing a plurality of groups of predetermined image sequences, each group of predetermined image sequences comprising a plurality of pre-stored wide-view images, and each pre-stored wide-view image has position information and reference information;
the processing unit is configured to receive, via the wireless transmission unit, at least one transmission instruction from one or more user equipments, and, for each of the at least one transmission instruction: extract the relevant position information, the reference information, and the sending gesture of the corresponding user equipment from the transmission instruction, so as to acquire, from the plurality of groups of predetermined image sequences stored in the storage unit, an image frame sequence matching the relevant position information, the reference information, and the sending gesture of the corresponding user equipment, and send the image frame sequence to that user equipment;
the reference information comprises any one or more of season information, time period information and weather information;
each pre-stored wide-angle image in each group of pre-stored image sequences pre-stored in the storage unit is subjected to direction correction processing in advance, and direction information corresponding to each pixel position in the pre-stored wide-angle image subjected to the direction correction processing is an absolute direction;
or, each pre-stored wide view image in each group of pre-stored image sequences pre-stored in the storage unit is not subjected to direction correction processing, and the direction information corresponding to each pixel position in the pre-stored wide view image without being subjected to the direction correction processing is a relative direction, the storage unit further stores related information for calculating an absolute direction corresponding to each pixel position in each pre-stored wide view image, and the position-based scene switching type shared image processing system further includes a direction correction unit, the direction correction unit is configured to calculate an absolute direction corresponding to each pixel position in each pre-stored wide view image by using the related information in advance, so as to correct the direction information of each pixel position in the pre-stored wide view image from the relative direction to the absolute direction;
the sending gesture comprises a screen normal direction or a screen back normal direction of a display module in the corresponding user equipment;
the processing unit is used for acquiring the image frame sequence matched with the related position information and the reference information and the sending gesture of the corresponding user equipment from the storage unit in the following way:
selecting one or more frames of pre-stored wide-angle images matched with the relevant position information and the reference information of the user equipment from a plurality of groups of predetermined image sequences stored in the storage unit as candidate images;
capturing a screenshot corresponding to a sending gesture included by the transmission instruction in the pre-stored wide-view image according to direction information corresponding to each pixel position in the pre-stored wide-view image aiming at each frame of the selected candidate images, wherein the size of the screenshot is a preset value;
and forming an image frame sequence according to the time sequence by utilizing all the intercepted screenshots, wherein the image frame sequence is matched with the related position information, the reference information of the corresponding user equipment and the sending gesture.
2. The location-based scene-switching shared image processing system according to claim 1, further comprising a user management unit;
the user management unit is used for receiving the identity authentication request from each user device, authenticating the corresponding user device based on the identity authentication request, and sending authentication success information to the corresponding user device after the authentication is successful so as to establish data connection with the user device.
3. The location-based scene-switching shared image processing system of claim 1 or 2, wherein the processing unit is further configured to:
when the at least one transmission instruction from the one or more user equipment is received, judging whether the transmission instruction contains paid information or not aiming at each of the at least one transmission instruction, and establishing data connection with the user equipment sending the transmission instruction under the condition that the transmission instruction contains paid information.
4. The location-based scene-switching shared image processing system according to claim 1 or 2, wherein the location-based scene-switching shared image processing system is configured to be provided on any one of:
a cable car, a sightseeing vehicle, a train, an airplane or a ship.
CN201710017390.7A 2017-01-10 2017-01-10 Scene switching type shared image processing system based on position Active CN106657922B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710017390.7A CN106657922B (en) 2017-01-10 2017-01-10 Scene switching type shared image processing system based on position


Publications (2)

Publication Number Publication Date
CN106657922A CN106657922A (en) 2017-05-10
CN106657922B true CN106657922B (en) 2020-02-18

Family

ID=58843557

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710017390.7A Active CN106657922B (en) 2017-01-10 2017-01-10 Scene switching type shared image processing system based on position

Country Status (1)

Country Link
CN (1) CN106657922B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107105168A (en) * 2017-06-02 2017-08-29 哈尔滨市舍科技有限公司 Can virtual photograph shared viewing system
CN107277009A (en) * 2017-06-16 2017-10-20 苏州蜗牛数字科技股份有限公司 A kind of method showed in VR products round the clock
CN108415656B (en) * 2018-03-09 2022-07-12 网易(杭州)网络有限公司 Display control method, device, medium and electronic equipment in virtual scene
CN110399567A (en) * 2019-06-28 2019-11-01 深圳市华夏光彩股份有限公司 Sight spot information recommended method and Related product based on display screen

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101753813A (en) * 2008-12-17 2010-06-23 索尼株式会社 Imaging apparatus, imaging method, and program
CN101937535A (en) * 2010-09-21 2011-01-05 上海杰图度假网络科技有限公司 Panoramic electronic map-based virtual tourism platform system
CN102256154A (en) * 2011-07-28 2011-11-23 中国科学院自动化研究所 Method and system for positioning and playing three-dimensional panoramic video
CN103543831A (en) * 2013-10-25 2014-01-29 梁权富 Head-mounted panoramic player
CN205345350U (en) * 2016-01-26 2016-06-29 高士雅 Outer view system of aircraft passenger compartment
CN106095774A (en) * 2016-05-25 2016-11-09 深圳市创驰蓝天科技发展有限公司 A kind of unmanned plane image panorama methods of exhibiting
CN106162204A (en) * 2016-07-06 2016-11-23 传线网络科技(上海)有限公司 Panoramic video generation, player method, Apparatus and system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20130053535A (en) * 2011-11-14 2013-05-24 한국과학기술연구원 The method and apparatus for providing an augmented reality tour inside a building platform service using wireless communication device




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 150016 Heilongjiang Province, Harbin Economic Development Zone haping Road District Dalian road and Xingkai road junction

Applicant after: HARBIN YISHE TECHNOLOGY CO., LTD.

Address before: 150016 Heilongjiang City, Harbin province Daoli District, quiet street, unit 54, unit 2, layer 4, No. 3

Applicant before: HARBIN YISHE TECHNOLOGY CO., LTD.

GR01 Patent grant
CP03 Change of name, title or address

Address after: 266100 Block C 200-43, Chuangke Street, Qingdao, 306 Ningxia Road, Laoshan District, Qingdao City, Shandong Province

Patentee after: QINGDAO YISPACE TECHNOLOGY Co.,Ltd.

Address before: 150016 Heilongjiang Province, Harbin Economic Development Zone haping Road District Dalian road and Xingkai road junction

Patentee before: HARBIN YISHE TECHNOLOGY Co.,Ltd.