WO2013182142A1 - Method, system and terminal device for implementing a terminal device desktop - Google Patents

Method, system and terminal device for implementing a terminal device desktop Download PDF

Info

Publication number
WO2013182142A1
WO2013182142A1 PCT/CN2013/079519 CN2013079519W WO2013182142A1 WO 2013182142 A1 WO2013182142 A1 WO 2013182142A1 CN 2013079519 W CN2013079519 W CN 2013079519W WO 2013182142 A1 WO2013182142 A1 WO 2013182142A1
Authority
WO
WIPO (PCT)
Prior art keywords
desktop
message
user
scene-based
captured touch
Prior art date
Application number
PCT/CN2013/079519
Other languages
English (en)
French (fr)
Inventor
王大伟
范伟
Original Assignee
中兴通讯股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 中兴通讯股份有限公司 filed Critical 中兴通讯股份有限公司
Publication of WO2013182142A1 publication Critical patent/WO2013182142A1/zh

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object

Definitions

  • the present invention relates to the field of terminal device technologies, and in particular to a method, a system, and a terminal device for implementing a desktop of a terminal device. Background Art
  • the desktop of the terminal device is generally a two-dimensional plane, and various abstract and flat icons are arranged in order, which is relatively monotonous.
  • each element on the desktop is an icon represented by an abstract picture, and the desktop is actually a large container in which these icons are placed.
  • an embodiment of the present invention provides a method for implementing a desktop of a terminal device, applied to a terminal device having a touch screen, the method including: drawing a 3D scene-based desktop on the screen, where the elements on the 3D scene-based desktop are 3D objects of a real scene and the 3D objects satisfy the spatial positional relationships of the real scene; and listening for the user's touch messages, and if a captured touch message matches a predetermined message type, performing interaction between the user and the 3D scene-based desktop according to the captured touch message.
  • preferably, drawing the 3D scene-based desktop on the screen includes: loading a three-dimensional 3D scene-based desktop model, parsing the model to obtain desktop data, and drawing the 3D scene-based desktop on the screen according to the desktop data.
  • the predetermined message type includes one or more of the following: adding an object, deleting an object, moving an object, rotating an object, launching the application corresponding to an object, and simulating the user walking around the scene;
  • the captured touch message is a combination of one or more of the following: a tap on an object, a long press on an object, a slide on an object, a drag of an object, a long press in a blank area of the desktop, and a slide in a blank area of the desktop.
  • if the message type matched by the captured touch message is adding an object, a list of objects is popped up; after the user selects an object, the model file of that object is loaded and parsed to obtain its model data, the pre-saved spatial positional relationship between that object and the existing objects on the 3D scene-based desktop is queried, and the object is drawn on the screen;
  • if the captured touch message is deleting an object, the object is deleted from the 3D scene-based desktop after the user selects it and drags it into a pop-up trash bin.
  • if the message type matched by the captured touch message is moving an object, collision detection is performed while the user drags the selected object: it is detected whether the coordinates of the dragged object intersect the coordinates of other objects; if they intersect, the object stops moving, the pre-saved spatial positional relationship between the dragged object and the collided object is queried, and the dragged object is displayed at a position that conforms to that relationship; if they do not intersect, the object moves synchronously along the trajectory dragged by the user.
  • if the message type matched by the captured touch message is rotating an object, the object is rotated about its own central axis;
  • if the message type matched by the captured touch message is launching the application corresponding to an object, the application represented by that object is started;
  • if the message type matched by the captured touch message is simulating the user walking around the scene, the user's field of view in the real scene is simulated on the screen according to the user's touch trajectory on the screen.
  • the embodiment of the present invention further provides a system for implementing a desktop of a terminal device, which is applied to a terminal device having a touch screen, the system comprising:
  • a desktop display module, configured to draw a 3D scene-based desktop on the screen, where the elements on the 3D scene-based desktop are 3D objects of a real scene and the 3D objects satisfy the spatial positional relationships of the real scene;
  • an interactive operation processing module, configured to listen for the user's touch messages; if a captured touch message matches a predetermined message type, the user interacts with the 3D scene-based desktop according to the captured touch message.
  • the desktop display module drawing the 3D scene-based desktop on the screen includes: loading a three-dimensional 3D scene-based desktop model, parsing it to obtain desktop data, and drawing the 3D scene-based desktop on the screen according to the desktop data.
  • the interactive operation processing module matches the captured touch message against predetermined message types, where the predetermined message types include one or more of the following: adding an object, deleting an object, moving an object, rotating an object, launching the application corresponding to an object, and simulating the user walking around the scene;
  • the captured touch message is a combination of one or more of the following: a tap on an object, a long press on an object, a slide on an object, a drag of an object, a long press in a blank area of the desktop, and a slide in a blank area of the desktop.
  • the interactive operation processing module performing the interaction between the user and the 3D scene-based desktop according to the captured touch message includes:
  • if the message type matched by the captured touch message is adding an object, a list of objects is popped up; after the user selects an object, the model file of that object is loaded and parsed to obtain its model data, the pre-saved spatial positional relationship between that object and the existing objects on the 3D scene-based desktop is queried, and the object is drawn on the screen;
  • if the message type matched by the captured touch message is deleting an object, the object is deleted from the 3D scene-based desktop after the user selects it and drags it into a pop-up trash bin.
  • the interactive operation processing module performing the interaction between the user and the 3D scene-based desktop according to the captured touch message includes:
  • if the message type matched by the captured touch message is moving an object, collision detection is performed while the user drags the selected object: it is detected whether the coordinates of the dragged object intersect the coordinates of other objects; if they intersect, the object stops moving, the pre-saved spatial positional relationship between the dragged object and the collided object is queried, and the dragged object is displayed at a position that conforms to that relationship; if they do not intersect, the object moves synchronously along the trajectory dragged by the user.
  • the interactive operation processing module performing the interaction between the user and the 3D scene-based desktop according to the captured touch message includes:
  • if the message type matched by the captured touch message is rotating an object, the object is rotated about its own central axis; if the message type matched is launching the application corresponding to an object, the application represented by that object is started;
  • if the message type matched by the captured touch message is simulating the user walking around the scene, the user's field of view in the real scene is simulated on the screen according to the user's touch trajectory on the screen.
  • an embodiment of the present invention further provides a terminal device, including any implementation system of a terminal device desktop as described above.
  • the method, system, and terminal device for implementing the desktop of the terminal device provided by the embodiments of the present invention can provide the user with a realistic, scene-based three-dimensional desktop, enhance the interest and interactivity of using the terminal device, and improve the user experience. Brief Description of the Drawings
  • FIG. 1 is a schematic diagram of a conventional desktop in the prior art.
  • FIG. 2 is a flowchart of a method for implementing a desktop of a terminal device according to an embodiment of the present invention.
  • FIG. 3 is a schematic diagram of a 3D scene-based desktop according to an embodiment of the present invention.
  • FIG. 4 is a schematic structural diagram of a system for implementing a desktop of a terminal device according to an embodiment of the present invention. Preferred Embodiments of the Invention
  • an embodiment of the present invention provides a method for implementing a desktop of a terminal device, which is applied to a terminal device having a touch screen, and the method includes:
  • S10: the terminal device draws a 3D scene-based desktop on the screen, where the elements on the 3D scene-based desktop are 3D objects of a real scene and the 3D objects satisfy the spatial positional relationships of the real scene;
  • S20: the terminal device listens for the user's touch messages; if a captured touch message matches a predetermined message type, the user interacts with the 3D scene-based desktop according to the captured touch message.
  • each element on the 3D scene-based desktop is a 3D object of a real scene, and the 3D objects satisfy the spatial positional relationships of the real scene;
  • drawing the 3D scene-based desktop on the screen includes: loading a three-dimensional 3D scene-based desktop model, parsing it to obtain desktop data, and drawing the 3D scene-based desktop on the screen according to the desktop data.
  • before the 3D scene-based desktop model is loaded, the method further includes: presetting and saving the spatial positional relationships between the 3D objects of the real scene.
  • the 3D scene is, for example, a living room, an office, or a study.
  • a living-room scene includes at least 3D objects such as walls, a ceiling, a floor, windows, and doors, and may also include 3D furniture (e.g., a sofa or a TV cabinet), 3D appliances (e.g., a television or a telephone), and the like.
  • the 3D objects satisfy the spatial positional relationships of the real scene.
  • for example, the spatial positional relationship between the television and the table may be that the television is placed on the table, so on the 3D scene-based desktop the television will neither appear under the table nor pass vertically through it.
  • before the 3D scene-based desktop model is loaded, the method further includes: detecting whether the user has enabled the 3D scene-based desktop application, and if so, loading and parsing the 3D scene-based desktop model;
  • loading the three-dimensional 3D scene-based desktop model, parsing it to obtain desktop data, and drawing the 3D scene-based desktop on the screen according to the desktop data includes: loading the 3D scene-based desktop model file from the file system, parsing the model file to obtain the desktop data, drawing the outline of each object on the screen according to the desktop data, and then attaching the texture image to the object's outline.
  • the predetermined message type includes one or more of the following: adding an object, deleting an object, moving an object, rotating an object, launching the application corresponding to an object, and simulating the user walking around the scene;
  • the user's touch message is a combination of one or more of the following: a tap on an object, a long press on an object, a slide on an object, a drag of an object, a long press in a blank area of the desktop, and a slide in a blank area of the desktop;
  • a captured touch message matching a predetermined message type means that a correspondence between each kind of touch message and a predetermined message type is pre-saved; if the captured touch message is one of the touch messages having such a correspondence, the captured touch message matches the corresponding predetermined message type.
  • for example, a long press in a blank area of the desktop matches the message type of adding an object; a long press on an object followed by dragging it to the trash bin matches the message type of deleting an object; a long press on an object followed by dragging matches the message type of moving an object; a slide on an object matches the message type of rotating an object; a tap on an object matches the message type of launching the application corresponding to that object; and a slide in a blank area of the desktop matches the message type of simulating the user walking around the scene;
  • the interaction between the user and the 3D scene-based desktop is performed according to the captured touch message, including:
  • if the message type matched by the captured touch message is adding an object, a list of objects is popped up; after the user selects an object, the model file of that object is loaded and parsed to obtain its model data, the pre-saved spatial positional relationship between that object and the existing objects on the 3D scene-based desktop is queried, and the object is drawn on the screen;
  • if the message type matched by the captured touch message is deleting an object, the object is deleted from the 3D scene-based desktop after the user selects it and drags it into a pop-up trash bin;
  • if the message type matched by the captured touch message is moving an object, collision detection is performed while the user drags the selected object: it is detected whether the coordinates of the dragged object intersect the coordinates of other objects; if they intersect, the object stops moving, the pre-saved spatial positional relationship between the dragged object and the collided object is queried, and the dragged object is displayed at a position that conforms to that relationship; if they do not intersect, the object moves synchronously along the trajectory dragged by the user;
  • if the message type matched by the captured touch message is rotating an object, the object is rotated about its own central axis; for example, the object is gradually rotated about its central axis according to the sliding trajectory of the touch operation on the object;
  • or, if the captured touch message is rotating an object, a list of rotation options is popped up, and after the user selects a rotation option the object is rotated as required by the selected option; if the message type matched by the captured touch message is launching the application corresponding to an object, the application represented by that object is started;
  • an embodiment of the present invention provides a system for implementing a desktop of a terminal device, which is applied to a terminal device having a touch screen, and the system includes:
  • the desktop display module 11 is configured to draw a 3D scene-based desktop on the screen, where the elements on the 3D scene-based desktop are 3D objects of a real scene and the 3D objects satisfy the spatial positional relationships of the real scene;
  • the interactive operation processing module 12 is configured to listen for the user's touch messages; if a captured touch message matches a predetermined message type, the user interacts with the 3D scene-based desktop according to the captured touch message.
  • the desktop display module 11 drawing the 3D scene-based desktop on the screen includes: loading a three-dimensional 3D scene-based desktop model, parsing it to obtain desktop data, and drawing the 3D scene-based desktop on the screen according to the desktop data.
  • the predetermined message type includes one or more of the following: adding an object, deleting an object, moving an object, rotating an object, launching the application corresponding to an object, or simulating the user walking around the scene;
  • the captured touch message is a combination of one or more of the following: a tap on an object, a long press on an object, a slide on an object, a drag of an object, a long press in a blank area of the desktop, and a slide in a blank area of the desktop.
  • the interactive operation processing module 12 performing the interaction between the user and the 3D scene-based desktop according to the captured touch message includes:
  • if the message type matched by the captured touch message is adding an object, a list of objects is popped up; after the user selects an object, the model file of that object is loaded and parsed to obtain its model data, the pre-saved spatial positional relationship between that object and the existing objects on the 3D scene-based desktop is queried, and the object is drawn on the screen;
  • if the message type matched by the captured touch message is deleting an object, the object is deleted from the 3D scene-based desktop after the user selects it and drags it into a pop-up trash bin; if the message type matched by the captured touch message is moving an object, collision detection is performed while the user drags the selected object: it is detected whether the coordinates of the dragged object intersect the coordinates of other objects; if they intersect, the object stops moving, the pre-saved spatial positional relationship between the dragged object and the collided object is queried, and the dragged object is displayed at a position that conforms to that relationship; if they do not intersect, the object moves synchronously along the trajectory dragged by the user;
  • if the message type matched by the captured touch message is rotating an object, the object is rotated about its own central axis;
  • if the message type matched by the captured touch message is launching the application corresponding to an object, the application represented by that object is started;
  • the embodiment of the invention further provides a terminal device, which has the implementation system of the desktop of the terminal device.
  • the method, the system and the terminal device for implementing the desktop of the terminal device provided by the foregoing embodiments can provide a real-life three-dimensional desktop to the user, enhance the user's interest and interaction with the terminal device, and enhance the user experience.
  • each module/unit in the foregoing embodiments may be implemented in the form of hardware or in the form of software functional modules. The present invention is not limited to any specific combination of hardware and software.
  • compared with the related art, the method, system, and terminal device for implementing a desktop of a terminal device provided by the embodiments of the present invention draw a 3D scene-based desktop on the screen and listen for the user's touch messages; if a captured touch message matches a predetermined message type, the user interacts with the 3D scene-based desktop according to the captured touch message. This provides the user with a realistic, scene-based three-dimensional desktop, enhances the interest and interactivity of using the terminal device, and improves the user experience.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Telephone Function (AREA)

Abstract

A method and system for implementing a terminal device desktop, and a corresponding terminal device. The implementation system in the terminal device includes a desktop display module and an interactive operation processing module. In operation, the desktop display module draws a 3D scene-based desktop on the screen, and the interactive operation processing module listens for the user's touch messages; if a captured touch message matches a predetermined message type, the user interacts with the 3D scene-based desktop according to the captured touch message, where the elements on the 3D scene-based desktop are 3D objects of a real scene and the 3D objects satisfy the spatial positional relationships of the real scene. The above solution can provide the user with a realistic, scene-based three-dimensional desktop, enhance the interest and interactivity of using the terminal device, and improve the user experience.

Description

Method, System and Terminal Device for Implementing a Terminal Device Desktop

Technical Field

The present invention relates to the field of terminal device technologies, and in particular to a method, a system, and a terminal device for implementing a terminal device desktop.

Background Art

With the rapid development of science and technology, portable intelligent terminal devices such as mobile phones and tablet computers have become increasingly widespread. However, the desktop of a terminal device is currently a two-dimensional plane on which various abstract, flat icons are arranged in order, which is rather monotonous. In the conventional desktop shown in FIG. 1, every element on the desktop is an icon represented by an abstract picture, and the desktop is in fact a large container in which these icons are placed.

As the hardware configuration of terminal devices grows ever higher, a flat, abstract desktop can no longer satisfy users' increasing demands. How to provide users with a novel desktop that improves the user experience is therefore a technical problem to be solved.

Summary of the Invention
The technical problem to be solved by the embodiments of the present invention is to provide a method, a system, and a terminal device for implementing a terminal device desktop, which can provide the user with a realistic, scene-based three-dimensional desktop, enhance the interest and interactivity of using the terminal device, and improve the user experience. To solve the above technical problem, an embodiment of the present invention provides a method for implementing a terminal device desktop, applied to a terminal device having a touch screen, the method including:

drawing a 3D scene-based desktop on the screen, where the elements on the 3D scene-based desktop are 3D objects of a real scene and the 3D objects satisfy the spatial positional relationships of the real scene; and listening for the user's touch messages, and if a captured touch message matches a predetermined message type, performing interaction between the user and the 3D scene-based desktop according to the captured touch message.

Preferably, drawing the 3D scene-based desktop on the screen includes: loading a three-dimensional 3D scene-based desktop model, parsing the 3D scene-based desktop model to obtain desktop data, and drawing the 3D scene-based desktop on the screen according to the desktop data.

Preferably, the predetermined message types include one or more of the following: adding an object, deleting an object, moving an object, rotating an object, launching the application corresponding to an object, and simulating the user walking around the scene; the captured touch message is a combination of one or more of the following: a tap on an object, a long press on an object, a slide on an object, a drag of an object, a long press in a blank area of the desktop, and a slide in a blank area of the desktop.
Preferably, performing the interaction between the user and the 3D scene-based desktop according to the captured touch message includes:

if the message type matched by the captured touch message is adding an object, popping up a list of objects; after the user selects an object, loading the model file of that object, parsing the model file to obtain the object's model data, querying the pre-saved spatial positional relationship between that object and the existing objects on the 3D scene-based desktop, and drawing the object on the screen;

if the captured touch message is deleting an object, deleting the object from the 3D scene-based desktop after the user selects it and drags it into a pop-up trash bin.
Preferably, performing the interaction between the user and the 3D scene-based desktop according to the captured touch message includes:

if the message type matched by the captured touch message is moving an object, performing collision detection while the user drags the selected object: detecting whether the coordinates of the dragged object intersect the coordinates of other objects; if they intersect, stopping the movement, querying the pre-saved spatial positional relationship between the dragged object and the collided object, and displaying the dragged object at a position that conforms to that relationship; if they do not intersect, moving the object synchronously along the trajectory dragged by the user.
Preferably, performing the interaction between the user and the 3D scene-based desktop according to the captured touch message includes:

if the message type matched by the captured touch message is rotating an object, rotating the object about its own central axis;

if the message type matched by the captured touch message is launching the application corresponding to an object, starting the application represented by that object;

if the message type matched by the captured touch message is simulating the user walking around the scene, simulating on the screen the user's field of view in the real scene according to the user's touch trajectory on the screen.
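By way of illustration only (this sketch is not part of the original disclosure), one simple way to simulate the user's field of view while "walking" is to translate a virtual camera along the touch trajectory. The Java sketch below makes that assumption; the class name, the tuning constant, and the fixed eye height are invented for the example.

```java
/** Sketch: move a virtual camera through the scene following the touch trajectory. */
public class WalkController {

    // How far the camera moves per pixel of finger travel (invented tuning constant).
    private static final float METERS_PER_PIXEL = 0.01f;

    private float cameraX = 0f;
    private float cameraZ = 0f;

    /** Called for each touch-move event in the blank area of the desktop. */
    public void onTouchMove(float dxPixels, float dyPixels) {
        // Horizontal finger travel strafes the camera, vertical travel moves it
        // forward or backward, giving the impression of walking through the room.
        cameraX += dxPixels * METERS_PER_PIXEL;
        cameraZ += dyPixels * METERS_PER_PIXEL;
    }

    /** The renderer would place the eye here before drawing the next frame. */
    public float[] cameraPosition() {
        return new float[] { cameraX, 1.6f /* assumed fixed eye height */, cameraZ };
    }
}
```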
To solve the above technical problem, an embodiment of the present invention further provides a system for implementing a terminal device desktop, applied to a terminal device having a touch screen, the system including:

a desktop display module, configured to draw a 3D scene-based desktop on the screen, where the elements on the 3D scene-based desktop are 3D objects of a real scene and the 3D objects satisfy the spatial positional relationships of the real scene;

an interactive operation processing module, configured to listen for the user's touch messages and, if a captured touch message matches a predetermined message type, perform the interaction between the user and the 3D scene-based desktop according to the captured touch message.

Preferably, the desktop display module drawing the 3D scene-based desktop on the screen includes: loading a three-dimensional 3D scene-based desktop model, parsing the 3D scene-based desktop model to obtain desktop data, and drawing the 3D scene-based desktop on the screen according to the desktop data.

Preferably, the interactive operation processing module matches the captured touch message against predetermined message types, where: the predetermined message types include one or more of the following: adding an object, deleting an object, moving an object, rotating an object, launching the application corresponding to an object, and simulating the user walking around the scene; the captured touch message is a combination of one or more of the following: a tap on an object, a long press on an object, a slide on an object, a drag of an object, a long press in a blank area of the desktop, and a slide in a blank area of the desktop.
Preferably, the interactive operation processing module performing the interaction between the user and the 3D scene-based desktop according to the captured touch message includes:

if the message type matched by the captured touch message is adding an object, popping up a list of objects; after the user selects an object, loading the model file of that object, parsing the model file to obtain the object's model data, querying the pre-saved spatial positional relationship between that object and the existing objects on the 3D scene-based desktop, and drawing the object on the screen;

if the message type matched by the captured touch message is deleting an object, deleting the object from the 3D scene-based desktop after the user selects it and drags it into a pop-up trash bin.

Preferably, the interactive operation processing module performing the interaction between the user and the 3D scene-based desktop according to the captured touch message includes:

if the message type matched by the captured touch message is moving an object, performing collision detection while the user drags the selected object: detecting whether the coordinates of the dragged object intersect the coordinates of other objects; if they intersect, stopping the movement, querying the pre-saved spatial positional relationship between the dragged object and the collided object, and displaying the dragged object at a position that conforms to that relationship; if they do not intersect, moving the object synchronously along the trajectory dragged by the user.
Preferably, the interactive operation processing module performing the interaction between the user and the 3D scene-based desktop according to the captured touch message includes:

if the message type matched by the captured touch message is rotating an object, rotating the object about its own central axis;

if the message type matched by the captured touch message is launching the application corresponding to an object, starting the application represented by that object;

if the message type matched by the captured touch message is simulating the user walking around the scene, simulating on the screen the user's field of view in the real scene according to the user's touch trajectory on the screen.

To solve the above technical problem, an embodiment of the present invention further provides a terminal device, including any one of the systems for implementing a terminal device desktop described above.
Compared with the prior art, the method, system, and terminal device for implementing a terminal device desktop provided by the embodiments of the present invention can provide the user with a realistic, scene-based three-dimensional desktop, enhance the interest and interactivity of using the terminal device, and improve the user experience.

Brief Description of the Drawings

FIG. 1 is a schematic diagram of a conventional desktop in the prior art.

FIG. 2 is a flowchart of a method for implementing a terminal device desktop according to an embodiment of the present invention.

FIG. 3 is a schematic diagram of a 3D scene-based desktop according to an embodiment of the present invention.

FIG. 4 is a schematic structural diagram of a system for implementing a terminal device desktop according to an embodiment of the present invention.

Preferred Embodiments of the Invention

To make the objectives, technical solutions, and advantages of the present invention clearer, the embodiments of the present invention are described in detail below with reference to the accompanying drawings. It should be noted that, provided there is no conflict, the embodiments of this application and the features in the embodiments may be combined with one another arbitrarily. As shown in FIG. 2, an embodiment of the present invention provides a method for implementing a terminal device desktop, applied to a terminal device having a touch screen, the method including:
S10: the terminal device draws a 3D scene-based desktop on the screen, where the elements on the 3D scene-based desktop are 3D objects of a real scene and the 3D objects satisfy the spatial positional relationships of the real scene;

S20: the terminal device listens for the user's touch messages; if a captured touch message matches a predetermined message type, the user interacts with the 3D scene-based desktop according to the captured touch message.

As shown in FIG. 3, every element on the 3D scene-based desktop is a 3D object of a real scene, and the 3D objects satisfy the spatial positional relationships of the real scene.

Drawing the 3D scene-based desktop on the screen includes: loading a three-dimensional 3D scene-based desktop model, parsing the 3D scene-based desktop model to obtain desktop data, and drawing the 3D scene-based desktop on the screen according to the desktop data.

Before the 3D scene-based desktop model is loaded, the method further includes: presetting and saving the spatial positional relationships between the 3D objects of the real scene. The 3D scene may be, for example, a living room, an office, or a study. Specifically, a living-room scene includes at least 3D objects such as walls, a ceiling, a floor, windows, and doors, and may also include 3D furniture (for example, a sofa or a TV cabinet), 3D appliances (for example, a television or a telephone), and the like. The 3D objects satisfy the spatial positional relationships of the real scene; for example, the spatial positional relationship between the television and the table may be that the television is placed on the table, so on the 3D scene-based desktop the television will neither appear under the table nor pass vertically through it.
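For illustration only, the pre-saved spatial positional relationships between 3D objects could be kept as simple constraint records keyed by object pairs. The following Java sketch assumes invented class, enum, and method names; it is not the patent's actual data model.

```java
import java.util.HashMap;
import java.util.Map;

/** Minimal sketch of a pre-saved spatial-relationship table (all names invented). */
public class SpatialRelationStore {

    /** Kinds of placement constraints a scene might pre-define. */
    public enum Relation { ON_TOP_OF, AGAINST_WALL, ON_FLOOR }

    // Key: "object->reference", value: the constraint between the two objects.
    private final Map<String, Relation> relations = new HashMap<>();

    public void save(String object, String reference, Relation relation) {
        relations.put(object + "->" + reference, relation);
    }

    public Relation query(String object, String reference) {
        return relations.get(object + "->" + reference);
    }

    public static void main(String[] args) {
        SpatialRelationStore store = new SpatialRelationStore();
        // "The television is placed on the table" from the living-room example above.
        store.save("television", "table", Relation.ON_TOP_OF);
        store.save("sofa", "floor", Relation.ON_FLOOR);

        // When the television is drawn or dragged, the renderer can look up the rule
        // and refuse any position below or passing through the table.
        System.out.println(store.query("television", "table")); // prints ON_TOP_OF
    }
}
```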
Before the 3D scene-based desktop model is loaded, the method further includes: detecting whether the user has enabled the 3D scene-based desktop application; if it is detected that the user has enabled the 3D scene-based desktop application, the 3D scene-based desktop model is loaded and parsed.

Loading the three-dimensional 3D scene-based desktop model, parsing it to obtain desktop data, and drawing the 3D scene-based desktop on the screen according to the desktop data includes: loading the 3D scene-based desktop model file from the file system, parsing the model file to obtain the desktop data, drawing the outline of each object on the screen according to the desktop data, and then attaching the texture image to the object's outline.
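As an illustrative sketch of the load-parse-draw pipeline just described (the file format, type names, and renderer interface are assumptions made for this example, not the actual implementation):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

/** Sketch of the load -> parse -> draw flow; every type here is illustrative. */
public class SceneDesktopLoader {

    /** Parsed desktop data: one entry per 3D object in the scene. */
    record ObjectData(String name, float[] outlineVertices, String texturePath) {}

    /** Hypothetical renderer interface; a real one might sit on top of OpenGL ES. */
    interface Renderer {
        void drawOutline(float[] vertices);
        void applyTexture(String texturePath);
    }

    /** Step 1: load the 3D scene-based desktop model file from the file system. */
    static String loadModelFile(Path modelPath) throws IOException {
        return Files.readString(modelPath);
    }

    /** Step 2: parse the model text into desktop data (format left abstract here). */
    static List<ObjectData> parseModel(String modelText) {
        // A real parser would read geometry and texture references from the file;
        // this stub only marks where that step belongs in the pipeline.
        return List.of(new ObjectData("table", new float[0], "textures/table.png"));
    }

    /** Step 3: draw each object's outline, then attach its texture image. */
    static void drawDesktop(List<ObjectData> desktopData, Renderer renderer) {
        for (ObjectData obj : desktopData) {
            renderer.drawOutline(obj.outlineVertices());
            renderer.applyTexture(obj.texturePath());
        }
    }
}
```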
The predetermined message types include one or more of the following: adding an object, deleting an object, moving an object, rotating an object, launching the application corresponding to an object, and simulating the user walking around the scene.

The user's touch message is a combination of one or more of the following: a tap on an object, a long press on an object, a slide on an object, a drag of an object, a long press in a blank area of the desktop, and a slide in a blank area of the desktop.

A captured touch message matching a predetermined message type means that a correspondence between each kind of touch message and a predetermined message type is pre-saved on the terminal device; if the captured touch message is one of the touch messages having such a correspondence, the captured touch message matches the corresponding predetermined message type.

For example, a long press in a blank area of the desktop matches the message type of adding an object; a long press on an object followed by dragging it to the trash bin matches the message type of deleting an object; a long press on an object followed by dragging matches the message type of moving an object; a slide on an object matches the message type of rotating an object; a tap on an object matches the message type of launching the application corresponding to that object; and a slide in a blank area of the desktop matches the message type of simulating the user walking around the scene.
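The saved correspondence between touch messages and predetermined message types can be expressed as a lookup table; the Java sketch below does exactly that with invented enum and class names (it is only an illustration of the matching step, not the described system's API).

```java
import java.util.EnumMap;
import java.util.Map;

/** Sketch of the pre-saved touch-message to message-type correspondence (names invented). */
public class TouchMessageMatcher {

    enum TouchMessage {
        LONG_PRESS_ON_BLANK, LONG_PRESS_DRAG_TO_TRASH, LONG_PRESS_DRAG_ON_OBJECT,
        SLIDE_ON_OBJECT, TAP_ON_OBJECT, SLIDE_ON_BLANK
    }

    enum MessageType {
        ADD_OBJECT, DELETE_OBJECT, MOVE_OBJECT, ROTATE_OBJECT, LAUNCH_APP, WALK_IN_SCENE
    }

    private final Map<TouchMessage, MessageType> table = new EnumMap<>(TouchMessage.class);

    public TouchMessageMatcher() {
        // Pre-saved correspondences mirroring the examples in the paragraph above.
        table.put(TouchMessage.LONG_PRESS_ON_BLANK,       MessageType.ADD_OBJECT);
        table.put(TouchMessage.LONG_PRESS_DRAG_TO_TRASH,  MessageType.DELETE_OBJECT);
        table.put(TouchMessage.LONG_PRESS_DRAG_ON_OBJECT, MessageType.MOVE_OBJECT);
        table.put(TouchMessage.SLIDE_ON_OBJECT,           MessageType.ROTATE_OBJECT);
        table.put(TouchMessage.TAP_ON_OBJECT,             MessageType.LAUNCH_APP);
        table.put(TouchMessage.SLIDE_ON_BLANK,            MessageType.WALK_IN_SCENE);
    }

    /** Returns the matched type, or null when the captured message has no correspondence. */
    public MessageType match(TouchMessage captured) {
        return table.get(captured);
    }
}
```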
Performing the interaction between the user and the 3D scene-based desktop according to the captured touch message includes:

if the message type matched by the captured touch message is adding an object, popping up a list of objects; after the user selects an object, loading the model file of that object, parsing the model file to obtain the object's model data, querying the pre-saved spatial positional relationship between that object and the existing objects on the 3D scene-based desktop, and drawing the object on the screen;

if the message type matched by the captured touch message is deleting an object, deleting the object from the 3D scene-based desktop after the user selects it and drags it into a pop-up trash bin; if the message type matched by the captured touch message is moving an object, performing collision detection while the user drags the selected object: detecting whether the coordinates of the dragged object intersect the coordinates of other objects; if they intersect, stopping the movement, querying the pre-saved spatial positional relationship between the dragged object and the collided object, and displaying the dragged object at a position that conforms to that relationship; if they do not intersect, moving the object synchronously along the trajectory dragged by the user;
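The check "do the coordinates of the dragged object intersect the coordinates of other objects" is commonly implemented with axis-aligned bounding boxes. The sketch below illustrates that approach under that assumption; the types are invented for the example.

```java
/** Minimal axis-aligned bounding-box collision check for dragged objects (illustrative). */
public class CollisionDetector {

    /** A 3D box described by its minimum and maximum corners. */
    record Box(float minX, float minY, float minZ, float maxX, float maxY, float maxZ) {

        /** True if this box overlaps the other box on all three axes. */
        boolean intersects(Box other) {
            return minX <= other.maxX && maxX >= other.minX
                && minY <= other.maxY && maxY >= other.minY
                && minZ <= other.maxZ && maxZ >= other.minZ;
        }
    }

    /**
     * Called on every drag update: if the dragged box hits any other object's box,
     * the caller stops the move and snaps the object to the pre-saved position that
     * conforms to the spatial relationship; otherwise it keeps following the drag.
     */
    static boolean collides(Box dragged, Iterable<Box> others) {
        for (Box other : others) {
            if (dragged.intersects(other)) {
                return true;
            }
        }
        return false;
    }
}
```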
If the message type matched by the captured touch message is rotating an object, the object is rotated about its own central axis, for example gradually rotating the object about its central axis according to the sliding trajectory of the touch operation on the object; or, if the captured touch message is rotating an object, a list of rotation options is popped up and, after the user selects one, the object is rotated as required by the selected option. If the message type matched by the captured touch message is launching the application corresponding to an object, the application represented by that object is started.
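One straightforward way to drive the gradual rotation from the sliding trajectory is to map horizontal finger travel to an angle about the object's central axis; the Java sketch below makes that assumption (the constant and class name are invented).

```java
/** Sketch: map the horizontal component of a slide on an object to a rotation angle. */
public class RotationController {

    // Degrees of rotation per pixel of horizontal finger travel (invented tuning constant).
    private static final float DEGREES_PER_PIXEL = 0.5f;

    private float currentAngleDegrees = 0f;

    /** Called as the slide progresses; dxPixels is the finger movement since the last event. */
    public float onSlide(float dxPixels) {
        currentAngleDegrees = (currentAngleDegrees + dxPixels * DEGREES_PER_PIXEL) % 360f;
        // The renderer would then rotate the model by this angle about its own
        // central (vertical) axis before drawing the next frame.
        return currentAngleDegrees;
    }
}
```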
If the message type matched by the captured touch message is simulating the user walking around the scene, the user's field of view in the real scene is simulated on the screen according to the user's touch trajectory on the screen. As shown in FIG. 4, an embodiment of the present invention provides a system for implementing a terminal device desktop, applied to a terminal device having a touch screen, the system including:

a desktop display module 11, configured to draw a 3D scene-based desktop on the screen, where the elements on the 3D scene-based desktop are 3D objects of a real scene and the 3D objects satisfy the spatial positional relationships of the real scene;

an interactive operation processing module 12, configured to listen for the user's touch messages and, if a captured touch message matches a predetermined message type, perform the interaction between the user and the 3D scene-based desktop according to the captured touch message.
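A minimal object-oriented sketch of how the two modules might be wired together is given below; the interface and method names are invented for illustration and are not taken from the disclosure.

```java
/** Sketch of the two-module desktop system described above (all names illustrative). */
public class SceneDesktopSystem {

    /** Desktop display module: draws the 3D scene-based desktop on the screen. */
    interface DesktopDisplayModule {
        void drawSceneDesktop();
    }

    /** Interactive operation processing module: handles captured touch messages. */
    interface InteractiveOperationModule {
        /** Matches the captured touch message and, on a match, drives the interaction. */
        void onTouchMessage(String capturedTouchMessage);
    }

    private final DesktopDisplayModule display;
    private final InteractiveOperationModule interaction;

    public SceneDesktopSystem(DesktopDisplayModule display, InteractiveOperationModule interaction) {
        this.display = display;
        this.interaction = interaction;
    }

    /** Start-up: draw the desktop once, then forward every touch message to the processor. */
    public void start() {
        display.drawSceneDesktop();
    }

    public void dispatchTouch(String capturedTouchMessage) {
        interaction.onTouchMessage(capturedTouchMessage);
    }
}
```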
The desktop display module 11 drawing the 3D scene-based desktop on the screen includes: loading a three-dimensional 3D scene-based desktop model, parsing the model to obtain desktop data, and drawing the 3D scene-based desktop on the screen according to the desktop data. When the interactive operation processing module 12 matches the captured touch message against the predetermined message types, the predetermined message types include one or more of the following: adding an object, deleting an object, moving an object, rotating an object, launching the application corresponding to an object, or simulating the user walking around the scene; the captured touch message is a combination of one or more of the following: a tap on an object, a long press on an object, a slide on an object, a drag of an object, a long press in a blank area of the desktop, and a slide in a blank area of the desktop. The interactive operation processing module 12 performing the interaction between the user and the 3D scene-based desktop according to the captured touch message includes:

if the message type matched by the captured touch message is adding an object, popping up a list of objects; after the user selects an object, loading the model file of that object, parsing the model file to obtain the object's model data, querying the pre-saved spatial positional relationship between that object and the existing objects on the 3D scene-based desktop, and drawing the object on the screen;

if the message type matched by the captured touch message is deleting an object, deleting the object from the 3D scene-based desktop after the user selects it and drags it into a pop-up trash bin; if the message type matched by the captured touch message is moving an object, performing collision detection while the user drags the selected object: detecting whether the coordinates of the dragged object intersect the coordinates of other objects; if they intersect, stopping the movement, querying the pre-saved spatial positional relationship between the dragged object and the collided object, and displaying the dragged object at a position that conforms to that relationship; if they do not intersect, moving the object synchronously along the trajectory dragged by the user;

if the message type matched by the captured touch message is rotating an object, rotating the object about its own central axis;

if the message type matched by the captured touch message is launching the application corresponding to an object, starting the application represented by that object;

if the message type matched by the captured touch message is simulating the user walking around the scene, simulating on the screen the user's field of view in the real scene according to the user's touch trajectory on the screen. An embodiment of the present invention further provides a terminal device having the above system for implementing a terminal device desktop. The method, system, and terminal device for implementing a terminal device desktop provided by the above embodiments can provide the user with a realistic, scene-based three-dimensional desktop, enhance the interest and interactivity of using the terminal device, and improve the user experience.
A person of ordinary skill in the art will understand that all or part of the steps of the above method may be performed by relevant hardware instructed by a program, and the program may be stored in a computer-readable storage medium such as a read-only memory, a magnetic disk, or an optical disk. Optionally, all or part of the steps of the above embodiments may also be implemented with one or more integrated circuits; accordingly, each module/unit in the above embodiments may be implemented in the form of hardware or in the form of software functional modules. The present invention is not limited to any specific combination of hardware and software.

It should be noted that the present invention may have various other embodiments. Without departing from the spirit and essence of the present invention, those skilled in the art may make various corresponding changes and modifications according to the embodiments of the present invention, and all such changes and modifications shall fall within the protection scope of the appended claims of the present invention.

Industrial Applicability

Compared with the related art, the method, system, and terminal device for implementing a terminal device desktop provided by the embodiments of the present invention draw a 3D scene-based desktop on the screen and listen for the user's touch messages; if a captured touch message matches a predetermined message type, the user interacts with the 3D scene-based desktop according to the captured touch message. This provides the user with a realistic, scene-based three-dimensional desktop, enhances the interest and interactivity of using the terminal device, and improves the user experience.

Claims

Claims

1. A method for implementing a terminal device desktop, applied to a terminal device having a touch screen, the method comprising:

drawing a 3D scene-based desktop on a screen, wherein elements on the 3D scene-based desktop are 3D objects of a real scene and the 3D objects satisfy spatial positional relationships of the real scene; and

listening for a user's touch messages, and if a captured touch message matches a predetermined message type, performing interaction between the user and the 3D scene-based desktop according to the captured touch message.
2. The method according to claim 1, wherein:

drawing the 3D scene-based desktop on the screen comprises: loading a three-dimensional 3D scene-based desktop model, parsing the 3D scene-based desktop model to obtain desktop data, and drawing the 3D scene-based desktop on the screen according to the desktop data.

3. The method according to claim 1, wherein:

the predetermined message type comprises one or more of the following: adding an object, deleting an object, moving an object, rotating an object, launching an application corresponding to an object, or simulating the user walking around the scene;

the captured touch message is a combination of one or more of the following: a tap on an object, a long press on an object, a slide on an object, a drag of an object, a long press in a blank area of the desktop, and a slide in a blank area of the desktop.

4. The method according to claim 3, wherein:

performing the interaction between the user and the 3D scene-based desktop according to the captured touch message comprises:

if the message type matched by the captured touch message is adding an object, popping up a list of objects; after the user selects an object, loading a model file of that object, parsing the model file to obtain model data of the object, querying a pre-saved spatial positional relationship between that object and the existing objects on the 3D scene-based desktop, and drawing the object on the screen;

if the message type matched by the captured touch message is deleting an object, deleting the object from the 3D scene-based desktop after the user selects the object and drags it into a pop-up trash bin.

5. The method according to claim 3, wherein:

performing the interaction between the user and the 3D scene-based desktop according to the captured touch message comprises:

if the message type matched by the captured touch message is moving an object, performing collision detection while the user drags a selected object: detecting whether coordinates of the dragged object intersect coordinates of other objects; if they intersect, stopping the movement, querying a pre-saved spatial positional relationship between the dragged object and the collided object, and displaying the dragged object at a position that conforms to the spatial positional relationship; if they do not intersect, moving the object synchronously along the trajectory dragged by the user.

6. The method according to claim 3, wherein:

performing the interaction between the user and the 3D scene-based desktop according to the captured touch message comprises:

if the message type matched by the captured touch message is rotating an object, rotating the object about its own central axis;

if the message type matched by the captured touch message is launching an application corresponding to an object, starting the application represented by the object;

if the message type matched by the captured touch message is simulating the user walking around the scene, simulating on the screen the user's field of view in the real scene according to the user's touch trajectory on the screen.
7. A system for implementing a terminal device desktop, applied to a terminal device having a touch screen, the system comprising:

a desktop display module, configured to draw a 3D scene-based desktop on a screen, wherein elements on the 3D scene-based desktop are 3D objects of a real scene and the 3D objects satisfy spatial positional relationships of the real scene; and

an interactive operation processing module, configured to listen for a user's touch messages and, if a captured touch message matches a predetermined message type, perform interaction between the user and the 3D scene-based desktop according to the captured touch message.

8. The system according to claim 7, wherein: the desktop display module drawing the 3D scene-based desktop on the screen comprises: loading a three-dimensional 3D scene-based desktop model, parsing the 3D scene-based desktop model to obtain desktop data, and drawing the 3D scene-based desktop on the screen according to the desktop data.

9. The system according to claim 7, wherein:

the interactive operation processing module matches the captured touch message against the predetermined message type, wherein: the predetermined message type comprises: adding an object, deleting an object, moving an object, rotating an object, launching an application corresponding to an object, or simulating the user walking around the scene;

the captured touch message is a combination of one or more of the following: a tap on an object, a long press on an object, a slide on an object, a drag of an object, a long press in a blank area of the desktop, and a slide in a blank area of the desktop.
10. The system according to claim 9, characterized in that:

the interactive operation processing module performing the interaction between the user and the 3D scene-based desktop according to the captured touch message comprises:

if the message type matched by the captured touch message is adding an object, popping up a list of objects; after the user selects an object, loading a model file of that object, parsing the model file to obtain model data of the object, querying a pre-saved spatial positional relationship between that object and the existing objects on the 3D scene-based desktop, and drawing the object on the screen;

if the message type matched by the captured touch message is deleting an object, deleting the object from the 3D scene-based desktop after the user selects the object and drags it into a pop-up trash bin.

11. The system according to claim 9, characterized in that:

the interactive operation processing module performing the interaction between the user and the 3D scene-based desktop according to the captured touch message comprises:

if the message type matched by the captured touch message is moving an object, performing collision detection while the user drags a selected object: detecting whether coordinates of the dragged object intersect coordinates of other objects; if they intersect, stopping the movement, querying a pre-saved spatial positional relationship between the dragged object and the collided object, and displaying the dragged object at a position that conforms to the spatial positional relationship; if they do not intersect, moving the object synchronously along the trajectory dragged by the user.

12. The system according to claim 9, characterized in that:

the interactive operation processing module performing the interaction between the user and the 3D scene-based desktop according to the captured touch message comprises:

if the message type matched by the captured touch message is rotating an object, rotating the object about its own central axis;

if the message type matched by the captured touch message is launching an application corresponding to an object, starting the application represented by the object;

if the message type matched by the captured touch message is simulating the user walking around the scene, simulating on the screen the user's field of view in the real scene according to the user's touch trajectory on the screen.

13. A terminal device, characterized by comprising the system for implementing a terminal device desktop according to any one of claims 7 to 12.
PCT/CN2013/079519 2012-12-18 2013-07-17 Method, system and terminal device for implementing a terminal device desktop WO2013182142A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN2012105515236A 2012-12-18 2012-12-18 Method and system for implementing a three-dimensional scene-based desktop
CN201210551523.6 2012-12-18

Publications (1)

Publication Number Publication Date
WO2013182142A1 true WO2013182142A1 (zh) 2013-12-12

Family

ID=48107261

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2013/079519 WO2013182142A1 (zh) 2012-12-18 2013-07-17 Method, system and terminal device for implementing a terminal device desktop

Country Status (2)

Country Link
CN (1) CN103064617A (zh)
WO (1) WO2013182142A1 (zh)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103064617A (zh) 2012-12-18 2013-04-24 中兴通讯股份有限公司 Method and system for implementing a three-dimensional scene-based desktop
CN103309562A (zh) 2013-06-28 2013-09-18 北京小米科技有限责任公司 Desktop display method and apparatus, and mobile terminal
CN104346054A (zh) 2013-07-30 2015-02-11 维沃移动通信有限公司 Method and system for implementing a simulated 3D scene desktop
CN103761307A (zh) 2014-01-22 2014-04-30 华为技术有限公司 Data processing device and data processing method
CN103984475A (zh) 2014-05-06 2014-08-13 广州市久邦数码科技有限公司 Method and system for implementing pop-up of a stereoscopic desktop menu bar
CN103984474B (zh) 2014-05-06 2017-02-15 广州市久邦数码科技有限公司 Method and system for implementing expansion of a stereoscopic desktop function menu
CN103984553B (zh) 2014-05-26 2017-10-24 中科创达软件股份有限公司 3D desktop display method and system
CN106030523B (zh) 2015-09-21 2019-03-29 上海欧拉网络技术有限公司 Method and apparatus for implementing 3D motion-effect interaction on a mobile phone desktop
CN107506110B (zh) 2017-08-24 2021-02-05 深圳依偎控股有限公司 Method and system for operating a 3D cube mobile terminal desktop, and mobile terminal
CN109408851B (zh) 2018-08-30 2020-04-14 百度在线网络技术(北京)有限公司 Furniture display method and apparatus, storage medium and electronic device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1885233A (zh) 2006-06-27 2006-12-27 刘金刚 Display and operation method of a three-dimensional desktop system
CN102081493A (zh) 2009-12-01 2011-06-01 宏碁股份有限公司 Mobile electronic device and control method of its three-dimensional operation interface
CN102541531A (zh) 2010-12-31 2012-07-04 福建星网视易信息系统有限公司 System and method for implementing a window-cube rotation switching effect based on OpenGL ES
CN103064617A (zh) 2012-12-18 2013-04-24 中兴通讯股份有限公司 Method and system for implementing a three-dimensional scene-based desktop

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101309471A (zh) 2007-05-18 2008-11-19 希姆通信息技术(上海)有限公司 Method for implementing three-dimensional scene wallpaper on a mobile communication terminal
WO2011072456A1 (en) 2009-12-18 2011-06-23 Nokia Corporation Desktop display apparatus
CN102819400A (zh) 2012-08-14 2012-12-12 北京小米科技有限责任公司 Desktop system, interface interaction method and apparatus for a mobile terminal

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1885233A (zh) 2006-06-27 2006-12-27 刘金刚 Display and operation method of a three-dimensional desktop system
CN102081493A (zh) 2009-12-01 2011-06-01 宏碁股份有限公司 Mobile electronic device and control method of its three-dimensional operation interface
CN102541531A (zh) 2010-12-31 2012-07-04 福建星网视易信息系统有限公司 System and method for implementing a window-cube rotation switching effect based on OpenGL ES
CN103064617A (zh) 2012-12-18 2013-04-24 中兴通讯股份有限公司 Method and system for implementing a three-dimensional scene-based desktop

Also Published As

Publication number Publication date
CN103064617A (zh) 2013-04-24

Similar Documents

Publication Publication Date Title
WO2013182142A1 (zh) Method, system and terminal device for implementing a terminal device desktop
JP6830447B2 (ja) Information processing method, terminal, and computer storage medium
CN108255304B (zh) Augmented-reality-based video data processing method, apparatus and storage medium
JP6529659B2 (ja) Information processing method, terminal and computer storage medium
KR102407623B1 (ko) User terminal apparatus and control method thereof
CN107562316B (zh) Interface display method, apparatus and terminal
CN109905754B (zh) Virtual gift receiving method, apparatus and storage device
TWI571792B (zh) Touch control method and device for multi - touch terminal
US9628744B2 (en) Display apparatus and control method thereof
US11604580B2 (en) Configuration of application execution spaces and sub-spaces for sharing data on a mobile touch screen device
CN104182125B (zh) Method and apparatus for triggering and running a floating window
EP2696271A2 (en) Method and apparatus for controlling a display
CN105190486A (zh) Display apparatus and method for providing a user interface screen thereof
CN106537326A (zh) Mobile device input controller for a secondary display
WO2015058623A1 (zh) Multimedia data sharing method and system, and electronic device
WO2015184770A1 (zh) Information processing method, system and terminal
US9495064B2 (en) Information processing method and electronic device
US20180307387A1 (en) Electronic device and method for operating the electronic device
CN102819391B (zh) Multi-touch gesture feedback system and method with parallelism in multiple scenes
US20150121301A1 (en) Information processing method and electronic device
WO2014019207A1 (zh) Widget processing method, apparatus and mobile terminal
US20150019976A1 (en) Portable terminal and method for providing information using the same
CN107831981A (zh) Terminal control method and apparatus, terminal, and computer-readable storage medium
WO2024045985A1 (zh) Screen control method, screen control apparatus, electronic device, program and medium
CN107728809A (zh) Application interface display method, apparatus and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13800687

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13800687

Country of ref document: EP

Kind code of ref document: A1