CN104407696B - Method for simulating and controlling a virtual ball on a mobile device - Google Patents

Method for simulating and controlling a virtual ball on a mobile device

Info

Publication number
CN104407696B
Authority
CN
China
Prior art keywords
virtual ball
screen
ball
described virtual
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201410619473.XA
Other languages
English (en)
Other versions
CN104407696A (zh)
Inventor
张斯聪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Jingdong three hundred and sixty degree e-commerce Co., Ltd.
Original Assignee
Beijing Jingdong Century Trading Co Ltd
Beijing Jingdong Shangke Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Jingdong Century Trading Co Ltd, Beijing Jingdong Shangke Information Technology Co Ltd filed Critical Beijing Jingdong Century Trading Co Ltd
Priority to CN201410619473.XA priority Critical patent/CN104407696B/zh
Publication of CN104407696A publication Critical patent/CN104407696A/zh
Priority to HK15105142.9A priority patent/HK1204690A1/zh
Priority to PCT/CN2015/093735 priority patent/WO2016070800A1/zh
Priority to US15/524,858 priority patent/US10401947B2/en
Priority to JP2017542256A priority patent/JP6810048B2/ja
Priority to RU2017119469A priority patent/RU2667720C1/ru
Application granted granted Critical
Publication of CN104407696B publication Critical patent/CN104407696B/zh
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/002Specific input/output arrangements not covered by G06F3/01 - G06F3/16
    • G06F3/005Input arrangements through a video camera
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/213Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/426Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55Controlling game characters or game objects based on the game progress
    • A63F13/57Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
    • A63F13/573Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game using trajectories of game objects, e.g. of a golf ball according to the point of impact
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04812Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • G06F3/0325Detection arrangements using opto-electronic means using a plurality of light emitters or reflectors or a plurality of detectors forming a reference frame from which to derive the orientation of the object, e.g. by triangulation or on the basis of reference deformation in the picked up image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0425Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)
  • Computer Vision & Pattern Recognition (AREA)

Abstract

The present invention discloses a method for simulating and controlling a virtual ball on a mobile device, comprising: capturing images with an image acquisition component of the mobile device to obtain a continuous image sequence; analyzing the image content; having the virtual ball interact with the image content and with the user in the captured image sequence according to specific rules; and displaying the interaction result on the screen of the mobile device.

Description

Method for simulating and controlling a virtual ball on a mobile device
Technical Field
The present invention relates to augmented reality systems for human-computer interaction and, more particularly, to a method for simulating and controlling a virtual ball on a mobile device.
Background Art
With the development of mobile devices and mobile technology, designing and building augmented reality systems on mobile devices is of great significance to the advancement of augmented reality technology. Augmented reality combines virtual information with real scenes and heightens the user's sense of immersion. A user's action commands can be captured through sensors or through the camera and screen.
Traditional computer-based human-computer interaction systems rely on a mouse and keyboard. In an augmented reality system on a mobile device, however, keyboard-based interaction is prone to jitter, which severely degrades interaction accuracy. In real life, people use sight, touch and force perception to manipulate real objects in space, such as a ball, freely with their hands; the hands have become the most intuitive, natural and practical medium through which people interact with the world around them.
In the prior art, however, interaction between a person and a ball essentially falls into two categories: (1) interaction between a person and a physical ball; and (2) interaction with a virtual ball in which the person's position is captured by special physical detection devices such as infrared detectors.
Therefore, there is a need for virtual ball simulation and control that lets a user interact with a virtual ball easily, without requiring any additional equipment.
Summary of the Invention
According to one embodiment of the present invention, a method for simulating and controlling a virtual ball on a mobile device is provided, comprising: capturing images with an image acquisition component of the mobile device to obtain a continuous image sequence; analyzing the image content; having the virtual ball interact with the image content and with the user in the captured image sequence according to at least one of the following: the virtual ball undergoes rigid-body collisions with color boundaries in the image; the virtual ball undergoes rigid-body collisions with the left, right and bottom edges of the screen; while on the screen, the virtual ball is subject to a virtual gravity field directed vertically from the top of the screen toward the bottom and "falls" toward the bottom of the screen; the virtual ball undergoes rigid-body collisions with the user's finger, the angle and force of the collision between the virtual ball and the finger being taken into account; the virtual ball may fly upward out of the screen, after which it only undergoes free fall back toward the bottom of the screen; only collisions of the lower half of the virtual ball with the image content and the user's finger are considered; and displaying the interaction result on the screen of the mobile device.
Preferably, the step of analyzing the image content further comprises: computing image edges near the virtual ball; computing the collision speed and direction; updating the position of the virtual ball; and displaying the virtual ball on the screen of the mobile device.
Preferably, the step of computing image edges near the virtual ball further comprises: converting the input image to a grayscale image; computing a gray-level histogram of the lower half of the virtual ball; generating a gray-level probability map from the histogram; and using the generated probability map as the basis for edge determination.
Preferably, the image edge is the boundary between regions of different gray-level probability.
Preferably, the collision speed has mutually independent horizontal and vertical components, the horizontal speed being proportional to the horizontal distance from the contact point to the center of the virtual ball, and the vertical speed being inversely proportional to the vertical distance from the contact point to the center of the ball.
Other objects, features and advantages will be apparent to those of ordinary skill in the art from the following detailed description of the present disclosure and the accompanying drawings.
Brief Description of the Drawings
The accompanying drawings illustrate embodiments of the present invention and, together with the description, serve to explain the principles of the invention. In the drawings:
Figure 1 illustrates a method of virtual ball simulation and control according to an embodiment of the present invention.
Figure 2 is a schematic diagram of an application scenario according to an embodiment of the present invention.
Figure 3 is a flowchart of how the image content analysis of Figure 1 is carried out.
Figure 4 is a flowchart of how the image edges near the ball are computed.
Figure 5 shows the grayscale image when no other object has entered the ball region.
Figure 6 shows the grayscale image when another object has entered the ball region.
Detailed Description
Embodiments of the present invention disclose a method for simulating and controlling a virtual ball on a mobile device. In the following description, numerous specific details are set forth for purposes of explanation in order to provide a thorough understanding of the embodiments of the invention. It will be apparent to those skilled in the art, however, that the embodiments of the invention can be practiced without these specific details.
The present invention uses common consumer-grade image acquisition devices (such as mobile phones, tablet computers and home cameras) to capture ordinary images and uses the results of intelligent image analysis to produce interaction effects between a person and a virtual ball. Specifically, the present invention discloses a system that acquires image information with an image acquisition device (such as a camera), generates a virtual ball with a computing device (such as a PC, mobile phone or tablet computer), intelligently analyzes the acquired image content with image processing methods, drives the interaction with the virtual ball on that basis, and finally displays the interaction result together with the image on a display device (such as a mobile phone or a monitor).
Figure 1 illustrates a method 100 of virtual ball simulation and control according to an embodiment of the present invention. The method 100 mainly comprises three steps: image acquisition, image content analysis, and display of the interaction result.
Specifically, in image acquisition, a continuous image sequence is obtained with a common image acquisition device (such as a mobile phone or a desktop PC camera). The image acquisition device may move without affecting the interaction effect of the virtual ball.
In image content analysis, the present invention assumes that the virtual ball interacts with the image content and with the user in the captured image sequence according to certain rules; virtual ball juggling is used below as an example. Note that virtual ball juggling serves only to illustrate how the virtual ball interacts with the image content and the user and does not limit the embodiments of the present invention; any application involving interaction between a virtual ball, the image and the user falls within the scope of the present invention.
Figure 2 is a schematic diagram of an application scenario according to an embodiment of the present invention. As shown in Figure 2, the user holds the device (such as a mobile phone) with the camera pointed at a scene S and the display screen facing the user. The user's finger enters the scene S faced by the image acquisition device (such as the phone camera) and interacts with the juggled virtual ball according to the following rules (a brief illustrative sketch follows the list):
(1) the ball undergoes rigid-body collisions with color boundaries in the image;
(2) the ball undergoes rigid-body collisions with the left, right and bottom edges of the screen;
(3) while on the screen, the ball is subject to a virtual gravity field directed vertically from the top of the screen toward the bottom and "falls" toward the bottom of the screen;
(4) the ball undergoes rigid-body collisions with the user's finger, the angle and force of the collision between the ball and the finger being taken into account;
(5) the ball may fly upward out of the screen; once off-screen, it only undergoes free fall back toward the bottom of the screen;
(6) only the lower half of the ball is considered for collisions with the image content and the user's finger; the upper half is not considered;
(7) "ball" is merely the colloquial term used to refer to the virtual sphere; what is actually displayed on the screen is a circle.
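By way of illustration only, rules (2), (3) and (5) can be approximated by a simple per-frame update of a ball state, sketched below in Python; the Ball class, the step function and the gravity constant are assumptions made for this sketch, not part of the claimed method.

    from dataclasses import dataclass

    G = 980.0  # assumed virtual gravity in pixels/s^2 (illustrative value)

    @dataclass
    class Ball:
        x: float   # horizontal position in screen pixels
        y: float   # vertical position; y grows toward the bottom of the screen
        vx: float  # horizontal speed
        vy: float  # vertical speed; positive values move the ball downward
        r: float   # radius of the displayed circle

    def step(ball: Ball, dt: float, width: int, height: int) -> None:
        """Advance the ball by dt seconds and apply rules (2), (3) and (5)."""
        ball.vy += G * dt                  # rule (3): virtual gravity field pulls the ball down
        ball.x += ball.vx * dt
        ball.y += ball.vy * dt
        if ball.y + ball.r < 0:            # rule (5): the ball has flown above the screen;
            return                         # only free fall applies, no collisions are checked
        if ball.x - ball.r < 0 or ball.x + ball.r > width:
            ball.vx = -ball.vx             # rule (2): rigid bounce off the left/right edges
            ball.x = min(max(ball.x, ball.r), width - ball.r)
        if ball.y + ball.r > height:
            ball.vy = -abs(ball.vy)        # rule (2): rigid bounce off the bottom edge
            ball.y = height - ball.r

In this sketch a rigid-body collision with a screen edge is realized simply by reflecting the corresponding velocity component.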
Those skilled in the art will understand that any interaction device that fulfills the function, and any interaction based on image processing, falls within the interaction modes of the present invention; for example, the virtual ball used in the present invention is not limited to a juggled ball and may also be a virtual curling stone, a virtual table tennis ball, a basketball, and so on.
Figure 3 is a flowchart of how the image content analysis of Figure 1 is carried out. In step 31, the next image frame is input. In step 32, the image edges near the ball are computed. In step 33, the collision speed and direction are computed. In step 34, the position of the ball is updated. In step 35, the ball is displayed. In step 36, it is determined whether the image content analysis has finished. If so, the flow ends; if not, the flow returns to step 31.
Computing the image edges near the ball
Figure 4 is a flowchart of how the image edges near the ball are computed. The principle is as follows: the present invention assumes that the lower half of the ball should contain only one color; when two or more gray levels appear, another object is assumed to have entered the ball region and to be interacting with the ball. The computation involves calculating a histogram and generating a probability map.
As shown in Figure 4, the input image is first converted to a grayscale image. A gray-level histogram H of the lower half of the ball is then computed, and a gray-level probability map is generated from the histogram H. Finally, the generated probability map is used as the basis for edge determination.
A. Computing the histogram
Figure 5 shows the grayscale image when no other object has entered the ball region. The gray area in the figure is the region over which the color histogram is computed. The color histogram is computed as

H(i) = Σ_{j=1}^{N} [p(j) = i]

where [p(j) = i] equals 1 when the condition holds and 0 otherwise. H is an array of 256 elements, and each element H(i) is the number of pixels in the grayscale image whose gray value is i. Every pixel p(j) in the gray region shown above is visited, where j denotes the j-th pixel and N is the total number of pixels in the gray region; whenever the gray value p(j) = i, H(i) = H(i) + 1.
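As an illustrative sketch of this step, assuming the grayscale frame is an 8-bit NumPy array and taking the lower half of the ball as the half-disc below its center (the function name and the mask construction are assumptions):

    import numpy as np

    def lower_half_histogram(gray: np.ndarray, cx: int, cy: int, r: int) -> np.ndarray:
        """Return H, a 256-element array in which H[i] counts the pixels with
        gray value i inside the lower half-disc of the ball centered at (cx, cy)."""
        h, w = gray.shape
        ys, xs = np.mgrid[0:h, 0:w]
        inside = (xs - cx) ** 2 + (ys - cy) ** 2 <= r * r
        lower = ys >= cy                   # image y grows downward, so this selects the lower half
        values = gray[inside & lower]      # the gray region of Figure 5
        return np.bincount(values, minlength=256)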
B. Generating the probability map
Figure 6 shows the grayscale image when another object has entered the ball region. After H has been computed as described above, the present invention treats the most frequently occurring gray level as the main background gray level; pixels with other gray levels represent an interacting object that has entered the ball region.
The occurrence probability is then computed: D(j) denotes the gray value of pixel j in the resulting occurrence probability map, and HMax denotes the maximum of the 256 histogram elements. In the result map D, a larger gray value means the corresponding color occurs less often within the ball region, i.e., the pixel is more likely to belong to an interacting object that has entered the ball region. The image edge is then the boundary between two regions with different occurrence probabilities, that is, the boundary between regions of different gray-level probability.
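The text above does not reproduce the formula for D. The sketch below assumes D(j) = 1 - H(p(j)) / HMax, which matches the stated behavior that rarer gray values within the ball region receive larger values, and it uses an illustrative 0.5 threshold to separate the two probability regions whose common boundary serves as the image edge; both the assumed normalization and the threshold are sketch choices.

    import numpy as np

    def occurrence_probability_map(gray: np.ndarray, hist: np.ndarray) -> np.ndarray:
        """Assumed form D = 1 - H(p)/HMax: pixels whose gray value is rare inside
        the ball region get values close to 1 (likely an interacting object)."""
        h_max = hist.max()
        if h_max == 0:
            return np.zeros(gray.shape, dtype=np.float64)
        return 1.0 - hist[gray] / float(h_max)   # look up H(p(j)) for every pixel at once

    def high_probability_region(prob: np.ndarray, threshold: float = 0.5) -> np.ndarray:
        """Split the map into two regions; their common boundary plays the role of
        the image edge described above (the threshold value is an assumption)."""
        return prob > threshold

Here hist would be the 256-element histogram of the previous step, and gray the grayscale frame or the patch of it around the ball.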
Computing the collision speed and direction
In the present invention, the centroid of the region of the occurrence probability map D generated in the previous step in which the probability of an interacting object is higher is taken as the interaction point PInt between the interacting object and the ball. The horizontal and vertical offsets (in pixels) of PInt from the ball center represent the magnitudes of the interaction force in the horizontal and vertical directions, and the speeds in the two directions are mutually independent.
The interaction point is computed as

PInt = (X_C, Y_C)

X_C = (1/M) Σ_{j=1}^{M} D(j) × X_j

Y_C = (1/M) Σ_{j=1}^{M} D(j) × Y_j

where X_j and Y_j are the horizontal and vertical coordinates of the pixels in the high-probability region, D(j) is the probability obtained in the previous step, M is the number of pixels in that region, and (X_C, Y_C) is the centroid. The horizontal speed is proportional to the distance in the X direction from the contact point to the ball center: the farther the contact point is from the center, the larger the contact speed. The vertical speed is inversely proportional to the distance in the Y direction from the contact point to the ball center: the closer the contact point is to the center, the larger the contact speed:

V_X = (|X_C - X_0| / R) × V_{X,Max}

V_Y = (1 - |Y_C - Y_0| / R) × V_{Y,Max}

where (X_0, Y_0) is the center of the ball, R is its radius, and the subscript Max denotes the maximum speed in the horizontal or vertical direction, a fixed value set in advance.
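An illustrative sketch of the interaction-point and speed computation under the formulas above; the threshold that selects the high-probability region and the handling of the case where no interacting object is present are assumptions.

    import numpy as np

    def collision_speed(prob: np.ndarray, x0: float, y0: float, r: float,
                        vx_max: float, vy_max: float, threshold: float = 0.5):
        """Return (V_X, V_Y, (X_C, Y_C)), or (0, 0, None) when no object is detected."""
        ys, xs = np.nonzero(prob > threshold)   # pixels of the high-probability region
        m = xs.size                             # M, the number of such pixels
        if m == 0:
            return 0.0, 0.0, None
        d = prob[ys, xs]                        # D(j) for those pixels
        xc = float(np.sum(d * xs)) / m          # X_C = (1/M) * sum of D(j) * X_j
        yc = float(np.sum(d * ys)) / m          # Y_C = (1/M) * sum of D(j) * Y_j
        vx = abs(xc - x0) / r * vx_max          # grows as the contact moves away from the center
        vy = (1.0 - abs(yc - y0) / r) * vy_max  # grows as the contact approaches the center
        return vx, vy, (xc, yc)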
Updating the ball position
In the present invention, the ball starts from rest and its position evolves as two independent motions, one horizontal and one vertical. In the horizontal direction, after each interaction event (contact with the hand, with an image edge, or with a screen edge), the ball moves in uniform rectilinear motion at the contact speed V_X. In the vertical direction, after an interaction the ball undergoes vertical projectile motion with the contact speed V_Y as its initial speed (reference: http://baike.***.com/link?url=tJ8e6IfmpCZqSxfLM98X7_qCWeuq2JQBHDCRJfb8GxDqZHhCqxeHuJ_9GokBCSzZ).
The current ball position is

X = X_t + V_X × T

Y = Y_t + (V_Y - 0.5 × g × T) × T

where the subscript t denotes the t-th contact, X_t and Y_t are the horizontal and vertical coordinates at the t-th contact, g is the gravitational acceleration, and T is the time elapsed between the current moment and the t-th contact.
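These two formulas can be evaluated directly. In the sketch below, as in the formula, a positive V_Y points away from gravity; the function name and the example values are illustrative.

    def ball_position(xt: float, yt: float, vx: float, vy: float,
                      g: float, t: float):
        """Position t seconds after the last contact at (xt, yt): uniform horizontal
        motion plus a vertical throw with initial speed vy against gravity g."""
        x = xt + vx * t
        y = yt + (vy - 0.5 * g * t) * t
        return x, y

    # e.g. 0.2 s after a contact at (120, 300) with V_X = 150, V_Y = 400, g = 980 (pixel units):
    print(ball_position(120.0, 300.0, 150.0, 400.0, 980.0, 0.2))  # -> (150.0, 360.4)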
Displaying the ball
The virtual ball is displayed on the screen at its current position (X, Y), which completes all the operations.
The present invention uses common consumer-grade image acquisition devices (such as mobile phones, tablet computers and home cameras) to capture ordinary images and uses the results of intelligent image analysis to produce interaction between a person and a virtual ball, so that the result of this image-processing-based interaction is presented to the user by means of augmented reality. The present invention therefore allows ordinary users, with common image acquisition and display devices such as mobile phones and tablet computers, to interact with a small virtual ball generated by augmented reality technology, producing a more realistic sense of the real scene, integrating the user more fully into the augmented reality system, and achieving a distinctive mode of interaction that existing approaches cannot provide.
The embodiments described above are merely preferred embodiments of the present invention and are not intended to limit it. It will be apparent to those skilled in the art that various modifications and changes can be made to the embodiments of the present invention without departing from its spirit and scope. The present invention is therefore intended to cover all modifications and variations that fall within the scope of the invention as defined by the claims.

Claims (5)

1. A method for simulating and controlling a virtual ball on a mobile device, comprising:
capturing images with an image acquisition component of the mobile device to obtain a continuous image sequence;
analyzing the image content;
having the virtual ball interact with the image content and with the user in the captured image sequence according to at least one of the following: the virtual ball undergoes rigid-body collisions with color boundaries in the image; the virtual ball undergoes rigid-body collisions with the left, right and bottom edges of the screen; while on the screen, the virtual ball is subject to a virtual gravity field directed vertically from the top of the screen toward the bottom and "falls" toward the bottom of the screen; the virtual ball undergoes rigid-body collisions with the user's finger, the angle and force of the collision between the virtual ball and the finger being taken into account; the virtual ball may fly upward out of the screen, after which it only undergoes free fall back toward the bottom of the screen; only collisions of the lower half of the virtual ball with the image content and the user's finger are considered; and
displaying the interaction result on the screen of the mobile device.
2. The method according to claim 1, wherein the step of analyzing the image content further comprises:
computing image edges near the virtual ball;
computing the collision speed and direction;
updating the position of the virtual ball; and
displaying the virtual ball on the screen of the mobile device.
3. The method according to claim 2, wherein the step of computing image edges near the virtual ball further comprises:
converting the input image to a grayscale image;
computing a gray-level histogram of the lower half of the virtual ball;
generating a gray-level probability map from the histogram; and
using the generated probability map as the basis for edge determination.
4. The method according to claim 3, wherein the image edge is the boundary between regions of different gray-level probability.
5. The method according to claim 2, wherein the collision speed has mutually independent horizontal and vertical components, the horizontal speed being proportional to the horizontal distance from the contact point to the center of the virtual ball, and the vertical speed being inversely proportional to the vertical distance from the contact point to the center of the ball.
CN201410619473.XA 2014-11-06 2014-11-06 Method for simulating and controlling a virtual ball on a mobile device Active CN104407696B (zh)

Priority Applications (6)

Application Number Priority Date Filing Date Title
CN201410619473.XA CN104407696B (zh) 2014-11-06 2014-11-06 Method for simulating and controlling a virtual ball on a mobile device
HK15105142.9A HK1204690A1 (zh) 2014-11-06 2015-05-29 Method for simulating and controlling a virtual ball on a mobile device
PCT/CN2015/093735 WO2016070800A1 (zh) 2014-11-06 2015-11-03 Method for simulating and controlling a virtual ball on a mobile device
US15/524,858 US10401947B2 (en) 2014-11-06 2015-11-03 Method for simulating and controlling virtual sphere in a mobile device
JP2017542256A JP6810048B2 (ja) 2014-11-06 2015-11-03 Method for simulating and controlling a virtual ball on a mobile device
RU2017119469A RU2667720C1 (ru) 2014-11-06 2015-11-03 Method for simulating and controlling a virtual sphere in a mobile device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410619473.XA CN104407696B (zh) 2014-11-06 2014-11-06 Method for simulating and controlling a virtual ball on a mobile device

Publications (2)

Publication Number Publication Date
CN104407696A CN104407696A (zh) 2015-03-11
CN104407696B true CN104407696B (zh) 2016-10-05

Family

ID=52645333

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410619473.XA Active CN104407696B (zh) 2014-11-06 2014-11-06 Method for simulating and controlling a virtual ball on a mobile device

Country Status (6)

Country Link
US (1) US10401947B2 (zh)
JP (1) JP6810048B2 (zh)
CN (1) CN104407696B (zh)
HK (1) HK1204690A1 (zh)
RU (1) RU2667720C1 (zh)
WO (1) WO2016070800A1 (zh)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104407696B (zh) * 2014-11-06 2016-10-05 Beijing Jingdong Shangke Information Technology Co., Ltd. Method for simulating and controlling a virtual ball on a mobile device
CN108735052B (zh) * 2018-05-09 2021-01-08 Qingdao Research Institute of Beihang University SLAM-based augmented reality free-fall experiment method
CN110517341A (zh) * 2018-05-21 2019-11-29 Beijing Jingdong Shangke Information Technology Co., Ltd. Method and device for implementing physical animation effects of a view
CN110278446B (zh) * 2019-06-20 2022-01-28 Beijing ByteDance Network Technology Co., Ltd. Method, apparatus and electronic device for determining virtual gift presentation information
CN113867523B (zh) * 2021-09-08 2024-05-28 Beijing University of Technology Modern curling simulation system and method based on virtual reality somatosensory interaction
CN114001731B (zh) * 2021-10-12 2023-03-07 Soochow University Phase-modulation damping method and system for polar-region inertial navigation under a virtual sphere model

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002082751A (ja) * 2000-09-08 2002-03-22 Mitsubishi Electric Corp Device for interaction with a virtual space and virtual space system using the same
CN101893935A (zh) * 2010-07-14 2010-11-24 Beihang University Method for constructing a collaborative augmented reality table tennis system based on a real racket
CN102163077A (zh) * 2010-02-16 2011-08-24 Microsoft Corp Capturing screen objects using colliders
CN102902355A (zh) * 2012-08-31 2013-01-30 Institute of Automation, Chinese Academy of Sciences Spatial interaction method for mobile devices

Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3463379B2 (ja) * 1994-10-19 2003-11-05 Casio Computer Co., Ltd. Image control device and image control method
JP2005253871A (ja) * 2004-03-15 2005-09-22 Vr Sports:Kk Communication battle type virtual reality tennis game system
US7847808B2 (en) * 2006-07-19 2010-12-07 World Golf Tour, Inc. Photographic mapping in a simulation
JP4883774B2 (ja) * 2006-08-07 2012-02-22 Canon Inc. Information processing apparatus, control method therefor, and program
RU2451982C1 (ru) * 2008-06-24 2012-05-27 Oleg Stanislavovich Rurin Method for acting on virtual objects
JP5288375B2 (ja) * 2008-09-08 2013-09-11 Hiroshima University Applied force estimation device and method
EP2193825B1 (en) 2008-12-03 2017-03-22 Alcatel Lucent Mobile device for augmented reality applications
US20110107216A1 (en) * 2009-11-03 2011-05-05 Qualcomm Incorporated Gesture-based user interface
US20120117514A1 (en) * 2010-11-04 2012-05-10 Microsoft Corporation Three-Dimensional User Interaction
US9529424B2 (en) * 2010-11-05 2016-12-27 Microsoft Technology Licensing, Llc Augmented reality with direct user interaction
US8717318B2 (en) * 2011-03-29 2014-05-06 Intel Corporation Continued virtual links between gestures and user interface elements
US9152306B2 (en) * 2011-03-29 2015-10-06 Intel Corporation Techniques for touch and non-touch user interaction input
US9183676B2 (en) * 2012-04-27 2015-11-10 Microsoft Technology Licensing, Llc Displaying a collision between real and virtual objects
US9041622B2 (en) * 2012-06-12 2015-05-26 Microsoft Technology Licensing, Llc Controlling a virtual object with a real controller device
JP2014006820A (ja) * 2012-06-26 2014-01-16 Honda Motor Co Ltd Vehicle periphery monitoring device
US9741145B2 (en) * 2012-06-29 2017-08-22 Disney Enterprises, Inc. Augmented reality simulation continuum
US9466121B2 (en) * 2012-09-11 2016-10-11 Qualcomm Incorporated Devices and methods for augmented reality applications
US9552673B2 (en) * 2012-10-17 2017-01-24 Microsoft Technology Licensing, Llc Grasping virtual objects in augmented reality
US9367136B2 (en) 2013-04-12 2016-06-14 Microsoft Technology Licensing, Llc Holographic object feedback
US20150185826A1 (en) * 2013-12-30 2015-07-02 Daqri, Llc Mapping gestures to virtual functions
CN104407696B (zh) 2014-11-06 2016-10-05 北京京东尚科信息技术有限公司 移动设备的虚拟球模拟及控制的方法

Also Published As

Publication number Publication date
RU2667720C1 (ru) 2018-09-24
CN104407696A (zh) 2015-03-11
WO2016070800A1 (zh) 2016-05-12
JP2017534135A (ja) 2017-11-16
HK1204690A1 (zh) 2015-11-27
US20170315609A1 (en) 2017-11-02
US10401947B2 (en) 2019-09-03
JP6810048B2 (ja) 2021-01-06

Similar Documents

Publication Publication Date Title
CN104407696B Method for simulating and controlling a virtual ball on a mobile device
CN103713737B Virtual keyboard system for smart glasses
US8564535B2 Physical model based gesture recognition
CN102799317B Intelligent interactive projection system
CN103793060B User interaction system and method
CN102915112A System and method for close-range motion tracking
CN104281397B Refocusing method and apparatus for multiple depth intervals, and electronic device
CN107038455A Image processing method and apparatus
WO2012063560A1 Image processing system, image processing method, and storage medium storing image processing program
CN102541256A Position-aware gestures with visual feedback as an input method
CN103020885A Depth image compression
CN102509092A Motion-sensing gesture recognition method
CN105107200A Face-changing system and method based on real-time depth somatosensory interaction and augmented reality
CN102335510A Human-computer interaction system
CN107272884A Control method and control system based on virtual reality technology
CN113052078A Aerial handwriting trajectory recognition method and apparatus, storage medium, and electronic device
CN105929946B Natural interaction method based on a virtual interface
KR102063408B1 Method and apparatus for interaction with virtual objects
Muller Multi-touch displays: design, applications and performance evaluation
CN104732570B Image generation method and apparatus
CN109375866A Screen touch click response method and implementation system
CN106547339A Control method and apparatus for a computer device
CN103389793B Human-computer interaction method and system
KR20140046197A Motion recognition apparatus and method, and computer-readable recording medium storing a program
CN105843479A Content interaction method and system

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 1204690

Country of ref document: HK

C14 Grant of patent or utility model
GR01 Patent grant
REG Reference to a national code

Ref country code: HK

Ref legal event code: GR

Ref document number: 1204690

Country of ref document: HK

TR01 Transfer of patent right

Effective date of registration: 20191128

Address after: 100176 room 222, 2f, building C, No. 18, Kechuang 11th Street, Beijing Economic and Technological Development Zone, Daxing District, Beijing

Patentee after: Beijing Jingdong three hundred and sixty degree e-commerce Co., Ltd.

Address before: 100080 Beijing city Haidian District xingshikou Road No. 65 west Shan Creative Park District 11C four floor East West 1-4 layer 1-4 layer

Co-patentee before: Beijing Jingdong Century Commerce Co., Ltd.

Patentee before: Beijing Jingdong Shangke Information Technology Co., Ltd.

TR01 Transfer of patent right