CN113760090A - Business process execution method based on trusted execution environment and electronic equipment - Google Patents

Business process execution method based on trusted execution environment and electronic equipment

Info

Publication number
CN113760090A
CN113760090A
Authority
CN
China
Prior art keywords
algorithm
picture
algorithms
handle
result
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110681590.9A
Other languages
Chinese (zh)
Other versions
CN113760090B (en)
Inventor
陈志辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Glory Smart Technology Development Co ltd
Original Assignee
Honor Device Co Ltd
Priority date
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202110681590.9A priority Critical patent/CN113760090B/en
Publication of CN113760090A publication Critical patent/CN113760090A/en
Application granted granted Critical
Publication of CN113760090B publication Critical patent/CN113760090B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60Protecting data
    • G06F21/62Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F21/6218Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
    • G06F21/6245Protecting personal data, e.g. for financial or medical purposes

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Bioethics (AREA)
  • Human Computer Interaction (AREA)
  • Medical Informatics (AREA)
  • Databases & Information Systems (AREA)
  • Computer Hardware Design (AREA)
  • Computer Security & Cryptography (AREA)
  • Software Systems (AREA)
  • Telephone Function (AREA)
  • Storage Device Security (AREA)

Abstract

The present application provides a trusted execution environment based business process execution method and an electronic device. Multiple algorithms that fulfill a functional requirement of a business process are integrated into a single TA, so that the TA can run the algorithms in succession to process the service data. This reduces the number of interactions between the REE side and the TEE side as well as the number of times a memory mapping relationship is established and released, thereby improving the interaction efficiency between the REE side and the TEE side. The method includes: the CA sends a service request to the TA only once, where the service request carries a service flow identifier indicating the service flow to be executed by the TA, and the service flow includes N algorithms; in response to the service request, the TA inputs at least the handle of the service data into each of the N algorithms to process the service data and obtain N operation results; the TA then feeds back the execution result of the business process to the CA, where the N algorithms include the N-th algorithm executed last and the execution result includes the operation result of the N-th algorithm.

Description

Business process execution method based on trusted execution environment and electronic equipment
Technical Field
The present application relates to the field of data processing technologies, and in particular, to a method for executing a business process based on a trusted execution environment and an electronic device.
Background
With the rapid development of intelligent terminals, mobile terminals handle more and more sensitive information, such as business secrets and personal privacy, and therefore face various security threats. To build a more secure intelligent terminal, the Trusted Execution Environment (TEE) was introduced. The TEE is a stand-alone operating environment isolated from the Rich Execution Environment (REE). A Trusted Application (TA) may run in the TEE and provide security services, such as password input, transaction signature generation, and face recognition, for a Client Application (CA) running outside the TEE.
At present, if the security service required by the CA involves multiple algorithms, each algorithm typically requires a separate interaction between the TEE side and the REE side, which introduces unnecessary delays and results in low interaction efficiency between the REE side and the TEE side.
Disclosure of Invention
The present application provides a trusted execution environment based business process execution method and an electronic device. Multiple algorithms that fulfill a functional requirement of the business process are integrated into one TA, so that after receiving a request from a CA the TA can run the algorithms in succession to process the service data. This reduces both the number of interactions between the REE side and the TEE side and the number of times a memory mapping relationship is established and released, thereby improving the interaction efficiency between the REE side and the TEE side.
To this end, the following technical solutions are adopted:
in a first aspect, the present application provides a service flow execution method based on a trusted execution environment, which is applied to an electronic device including a trusted execution environment TEE and a rich execution environment REE, where a client application CA is run in the REE, and a trusted application TA is run in the TEE, and the service flow execution method based on the trusted execution environment includes:
The CA sends a service request to the TA only once. The service request carries a service flow identifier and a handle of the service data; the service flow identifier indicates the service flow to be executed by the TA; and the service flow includes N algorithms that together fulfill one functional requirement of the service flow, where N is greater than or equal to 2. In response to the service request, the TA inputs at least the handle of the service data into each of the N algorithms to process the service data and obtain N operation results, where the N operation results correspond to the N algorithms one to one. The TA feeds back the execution result of the business process to the CA, where the N algorithms include the N-th algorithm executed last and the execution result includes the operation result of the N-th algorithm.
It can be seen that, by integrating the N algorithms that fulfill one functional requirement into one TA, the CA need only send a service request once for the TA to run all N algorithms. That is, a single interaction between the REE and the TEE satisfies the functional requirement of the TA, effectively reducing the number of interactions between the REE and the TEE and thus the delay incurred during interaction. In addition, the TA maps the handle into memory only when it receives the service request, so the memory mapping relationship needs to be established only once to satisfy the functional requirement. Reducing the number of memory mappings further cuts the delay during interaction between the REE and the TEE and improves the interaction efficiency between the REE side and the TEE side.
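As an illustration of this single-request design, the following Python sketch shows a TA-side routine that maps the handle once and then chains N algorithms, returning the last algorithm's result. All names (`map_handle`, `run_service_flow`, etc.) are hypothetical and not taken from the patent.

```python
def map_handle(handle):
    """Stand-in for establishing the memory mapping once per request
    and resolving the handle to the shared service-data buffer."""
    return {"handle": handle, "data": "service-data"}

def run_service_flow(handle, algorithms):
    """Run all N algorithms inside one TA for a single CA request.

    Each algorithm receives the mapped buffer plus the list of results
    produced so far, so later algorithms can consume earlier outputs;
    the last (N-th) result is what gets fed back to the CA.
    """
    buffer = map_handle(handle)   # one mapping per request, not per algorithm
    results = []
    for algorithm in algorithms:
        results.append(algorithm(buffer, results))
    return results[-1]            # execution result of the N-th algorithm
```

Whatever N is, the CA-visible cost stays at one request and one memory mapping.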
In one possible implementation, the N algorithms include an exposure algorithm, a face detection algorithm, and an eye gaze algorithm, which together fulfill the eye gaze detection requirement of the service flow, and the service data includes a picture.
The TA inputting at least the handle of the service data into each of the N algorithms to process the service data and obtain N operation results includes: the TA inputs the handle of the picture into the exposure algorithm to obtain the next exposure parameter; the TA inputs the next exposure parameter and the handle of the picture into the face detection algorithm to obtain the coordinates of the face in the picture; and the TA inputs the coordinates of the face in the picture, the next exposure parameter, and the handle of the picture into the eye gaze algorithm to obtain an eye gaze detection result, which indicates whether a face in the picture is gazing at the screen.
It can be seen that the exposure algorithm, the face detection algorithm, and the eye gaze algorithm are integrated into one TA, so that the TA can run them in sequence to satisfy the eye gaze detection requirement of the CA. The CA does not need to initiate a separate request for each algorithm; a single service request yields both the next exposure parameter and the eye gaze detection result, which is more efficient.
In one possible implementation, the execution result includes the next exposure parameter and the eye gaze detection result.
In this way, the CA obtains the operation results of multiple algorithms by initiating the service request only once, which improves the interaction efficiency between the REE side and the TEE side.
In one possible implementation, the TA inputting the coordinates of the face in the picture, the next exposure parameter, and the handle of the picture into the eye gaze algorithm to obtain the eye gaze detection result includes:
if the number of faces detected by the face detection algorithm is greater than 0, the TA inputs the coordinates of the face in the picture, the next exposure parameter, and the handle of the picture into the eye gaze algorithm to obtain the eye gaze detection result.
When the number of faces is 0, running the eye gaze algorithm is meaningless. Therefore, by checking whether the number of faces is greater than 0 and only then inputting the coordinates of the face in the picture, the next exposure parameter, and the handle of the picture into the eye gaze algorithm, invalid runs of the eye gaze algorithm are avoided and unnecessary power consumption of the TA is reduced.
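A minimal sketch of this guard, assuming hypothetical result shapes (the patent does not specify data formats):

```python
def run_gaze_stage(face_result, next_exposure, picture_handle, gaze_algorithm):
    """Run the eye gaze algorithm only when face detection found a face.

    face_result is assumed to carry the face count and face coordinates
    produced by the face detection algorithm.
    """
    if face_result["num_faces"] > 0:
        return gaze_algorithm(face_result["coords"],
                              next_exposure, picture_handle)
    # No face in the picture, so no face can be gazing at the screen;
    # skip the algorithm entirely to save TA power.
    return {"gazing": False}
```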
In one possible implementation, the N algorithms include an exposure algorithm, a face detection algorithm, and an owner identification algorithm, which together fulfill the owner identification requirement of the service flow, and the service data includes a picture. The TA inputting at least the handle of the service data into each of the N algorithms to process the service data and obtain N operation results includes:
the TA inputs the handle of the picture into the exposure algorithm to obtain the next exposure parameter; the TA inputs the next exposure parameter and the handle of the picture into the face detection algorithm to obtain the coordinates of the face in the picture; and the TA inputs the coordinates of the face in the picture, the next exposure parameter, and the handle of the picture into the owner identification algorithm to obtain an owner identification result, which indicates whether the face in the picture matches a preset face.
It can be seen that the exposure algorithm, the face detection algorithm, and the owner identification algorithm are integrated into one TA, so that the TA runs them in sequence to satisfy the owner identification requirement of the CA. The CA does not need to initiate a separate request for each algorithm; a single service request yields both the next exposure parameter and the owner identification result, which is more efficient.
In one possible implementation, the execution result includes the next exposure parameter and the owner identification result.
In this way, the CA obtains the operation results of multiple algorithms by initiating the service request only once, which improves the interaction efficiency between the REE side and the TEE side.
In one possible implementation, inputting the coordinates of the face in the picture, the next exposure parameter, and the handle of the picture into the owner identification algorithm includes: if the number of faces detected by the face detection algorithm is greater than 0, inputting the coordinates of the face in the picture, the next exposure parameter, and the handle of the picture into the owner identification algorithm to obtain the owner identification result.
When the number of faces is 0, running the owner identification algorithm is meaningless. Therefore, by checking whether the number of faces is greater than 0 and only then inputting the coordinates of the face in the picture, the next exposure parameter, and the handle of the picture into the owner identification algorithm, invalid runs of the owner identification algorithm are avoided and unnecessary power consumption of the TA is reduced.
In one possible implementation, the execution result further includes the operation result of at least one of the first N-1 algorithms.
It can be seen that there may be strong dependencies among the N algorithms, i.e., the output of one algorithm serves as the input of the next, in which case the execution result includes the operation result of the N-th (last) algorithm. Alternatively, when there is no strong dependency among the N algorithms, the TA may output the operation result of the last algorithm together with the operation results of one or more of the first N-1 algorithms.
In one possible implementation, the service request further carries a data identifier of the service data, and the method further includes: for any one of the N algorithms, judging whether a first operation result exists, where the first operation result is the operation result previously obtained by that algorithm processing the service data corresponding to the data identifier.
The TA inputting at least the handle of the service data into each of the N algorithms to process the service data and obtain N operation results includes: if the first operation result does not exist, the TA inputs at least the handle of the service data into the algorithm to process the service data and obtain the algorithm's operation result; if the first operation result exists, the TA uses the first operation result as the algorithm's operation result.
As can be seen, for service data that has already been processed by an algorithm and for which a historical operation result (i.e., the first operation result) exists, when the algorithm receives the same service data again the TA may directly read the historical operation result instead of executing the algorithm again. For example, suppose the TA has already run the exposure algorithm on the service data while fulfilling the eye gaze detection requirement and then receives a service request for the owner identification requirement carrying the same service data. The TA need not invoke the exposure algorithm again; it directly reads the exposure algorithm's historical operation result. This avoids repeatedly executing the same algorithm on the same service data and further improves the interaction efficiency between the REE side and the TEE side.
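The reuse logic can be sketched as a cache keyed by (algorithm, data identifier). The names below are illustrative, not taken from the patent:

```python
class AlgorithmResultCache:
    """Reuse a historical ('first') operation result when the same
    algorithm is asked to process the same service data again."""

    def __init__(self):
        self._results = {}   # (algorithm name, data identifier) -> result

    def run(self, algo_name, algo, data_id, handle):
        key = (algo_name, data_id)
        if key in self._results:       # first operation result exists:
            return self._results[key]  # read it back, skip execution
        result = algo(handle)          # otherwise run the algorithm once
        self._results[key] = result
        return result
```

For instance, an exposure result produced during eye gaze detection would be served from the cache when owner identification later requests the same picture.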
In a second aspect, the present application provides an electronic device, where a trusted execution environment TEE and a rich execution environment REE are integrated on the electronic device, a client application CA is run in the REE, and a trusted application TA is run in the TEE, and the electronic device includes: a wireless communication module, memory, and one or more processors; the wireless communication module and the memory are coupled with the processor;
wherein the memory is to store computer program code, the computer program code comprising computer instructions; the computer instructions, when executed by the processor, cause the electronic device to perform the trusted execution environment based business process execution method of any of the first aspects.
In a third aspect, the present application provides a chip system that includes one or more interface circuits and one or more processors. The interface circuit and the processor are interconnected by a line. The chip system may be applied to an electronic device including a communication module and a memory. The interface circuit may read instructions stored in a memory in the electronic device and send the instructions to the processor. The instructions, when executed by the processor, may cause the electronic device to perform the method as any one of the first aspect.
In a fourth aspect, the present application provides a computer storage medium comprising computer instructions that, when executed on an electronic device, cause the electronic device to perform the trusted execution environment based business process execution method according to any one of the first aspect.
In a fifth aspect, the present application provides a computer program product that, when run on an electronic device, causes the electronic device to perform the trusted execution environment based business process execution method according to any one of the first aspect.
It is to be understood that the electronic device of the second aspect, the chip system of the third aspect, the computer storage medium of the fourth aspect, and the computer program product of the fifth aspect are all configured to execute the corresponding method provided above. For their beneficial effects, reference may be made to those of the corresponding method, which are not repeated here.
Drawings
FIG. 1 is a schematic diagram of the interaction between a TEE and a REE in the prior art;
FIG. 2 is a schematic diagram of the interaction between a TEE and a REE according to an embodiment of the present application;
FIG. 3 is a TEE + REE architecture diagram according to an embodiment of the present application;
FIG. 4 is a flowchart of a trusted execution environment based business process execution method according to an embodiment of the present application;
FIG. 5 is an application scenario diagram according to an embodiment of the present application;
FIG. 6 is a schematic diagram of searching for service data according to an embodiment of the present application;
FIG. 7 is a schematic processing diagram of a face detection algorithm according to an embodiment of the present application;
FIG. 8 is a schematic diagram of another interaction between a TEE and a REE according to an embodiment of the present application;
FIG. 9 is a schematic structural diagram of a chip system according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described below with reference to the drawings. In the description of the present application, unless otherwise specified, "at least one" means one or more and "a plurality" means two or more. In addition, to describe the technical solutions clearly, terms such as "first" and "second" are used to distinguish between identical or similar items having substantially the same functions and effects. Those skilled in the art will appreciate that such terms do not limit quantity or execution order and do not denote relative importance.
In order to make the following description of the various embodiments clear and concise and to facilitate an understanding of those skilled in the art, a brief description of the relevant concepts or techniques may be first presented.
REE, i.e., rich execution environment, which may also be referred to as a normal execution environment. The REE generally refers to a running environment without specific security functions, such as the Android or iOS operating systems. It should be noted that besides "rich execution environment", the REE may also be called an "untrusted execution environment", a "normal execution environment", an "insecure execution environment", and the like, which is not limited in the embodiments of the present application.
TEE, i.e., trusted execution environment. The TEE is an operating environment that coexists with the REE in the electronic device and is isolated from the REE through hardware support. It has security capabilities and can resist the software attacks that the conventional REE side is vulnerable to. The TEE has its own operating space and defines strict protection measures; it therefore has a higher security level than the REE and can protect assets in the TEE, such as data and software, from software attacks and resist certain types of security threats.
The REE + TEE architecture refers to an architecture in which the TEE and the REE jointly provide services for applications, i.e., the TEE and the REE coexist in the electronic device. Illustratively, the TEE may implement an operating mechanism isolated from the REE through hardware support. Only authorized security software can execute in the TEE, and the confidentiality of its resources and data is protected. Compared with the REE, the TEE better protects the security of data and resources owing to protection mechanisms such as isolation and permission control.
The TA, trusted application, is an application running in the TEE, and can provide security services for CAs running outside the TEE, such as entering passwords, generating transaction signatures, face recognition, and the like.
CA, i.e., client application. A CA generally refers to an application running in the REE, but in some cases a TA may be invoked by another TA, and the TA that actively initiates the invocation can also act as a CA. The CA may call the TA through a Client Application Programming Interface (API) and instruct the TA to perform the corresponding security operation.
A handle may be understood as a fixed address that points to a fixed memory space (e.g., memory space A). That memory space stores the address of the object in memory at the current moment. Regardless of where the object is located in memory, memory space A can always be found through the handle, and the object can then be found from the address stored in memory space A. In the embodiments of the present application, the object may be the service data that the TA needs to process, which may include, for example, a picture, signature data, or a fingerprint.
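The double indirection described above can be sketched as a toy model (this is an illustration, not the actual TEE memory layout):

```python
# slots models the fixed memory space (memory space A) each handle points to;
# heap models object storage, where an object's address may change over time.
slots = {}   # handle -> current address of the object
heap = {}    # address -> object

def deref(handle):
    """Always reachable: handle -> memory space A -> current address -> object."""
    return heap[slots[handle]]

slots["h1"] = 0x10
heap[0x10] = "picture-bytes"

heap[0x20] = heap.pop(0x10)   # the object is relocated in memory...
slots["h1"] = 0x20            # ...and only memory space A is updated
```

Because only the slot is updated when the object moves, `deref("h1")` keeps resolving to the same picture through the same handle.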
The following uses a smart screen-on application as an example to describe how an electronic device executes a business process implementing the smart screen-on function in the prior art. A smart screen-on application is an application that automatically unlocks and lights the display screen when the electronic device recognizes the device owner. In such an application, operations such as processing or detecting a picture can run on the TEE side, which guarantees the security of the picture. Referring to FIG. 1, in this business process, the CA sends the handle of the picture (captured by the camera) to the TEE and requests TA1 in the TEE to call the exposure algorithm to process the picture; the exposure algorithm outputs the next exposure parameter to the CA. After receiving the next exposure parameter, the CA sends the handle of the picture to the TEE again and requests TA2 in the TEE to call the face detection algorithm to process the picture; the face detection algorithm outputs the face position to the CA. After receiving the face position, the CA sends the handle of the picture and the face position to the TEE and requests TA3 in the TEE to call the owner identification algorithm to process the picture based on the face position; the owner identification algorithm outputs to the CA the result of whether the face in the picture is the owner.
That is, if the business process involves multiple algorithms, the CA must initiate a request to the TA hosting each algorithm. The TEE side and the REE side therefore interact many times, and a delay exists between each request the CA initiates and the operation result it obtains, so one business process takes a long time. In addition, after each TA receives a request, it must establish the memory mapping relationship before it can find the object (e.g., the service data) through the handle, and it must release the memory mapping relationship after the algorithm finishes. Establishing and releasing the memory mapping relationship many times thus also causes unnecessary delay. The current technical solution therefore suffers from low interaction efficiency between the REE side and the TEE side.
In view of these problems, an embodiment of the present application provides a trusted execution environment based business process execution method, which can be applied to an electronic device deploying a TEE and a REE. In the method, multiple algorithms that fulfill a functional requirement of the business process are integrated into one TA, so that after receiving a request from the CA the TA can run the algorithms in succession to process the service data. This reduces both the number of interactions between the REE side and the TEE side and the number of times a memory mapping relationship is established and released, thereby improving the interaction efficiency between the REE side and the TEE side.
Illustratively, as shown in FIG. 2, the multiple algorithms may include an exposure algorithm, a face detection algorithm, and an eye gaze algorithm. The CA sends the handle of the picture to the TA in the TEE. The TA first inputs the handle of the picture into the exposure algorithm, which outputs the next exposure parameter. The TA then inputs the next exposure parameter and the handle of the picture into the face detection algorithm to obtain the coordinates of the face in the picture (i.e., the face position). Next, the TA inputs the next exposure parameter, the handle of the picture, and the coordinates of the face in the picture into the eye gaze algorithm to obtain the eye gaze detection result. Finally, the TA feeds back the next exposure parameter and the eye gaze detection result to the CA together. The REE and the TEE therefore interact only once in the whole process while the CA obtains both the next exposure parameter and the eye gaze detection result, effectively reducing the number of interactions between the REE side and the TEE side. In addition, the TA establishes the memory mapping relationship only when it receives the request carrying the handle, so the mapping needs to be established only once, which further improves the interaction efficiency between the REE side and the TEE side.
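The FIG. 2 flow can be condensed into a short sketch; the function names are hypothetical:

```python
def handle_gaze_request(picture_handle, exposure, face_detect, eye_gaze):
    """One TA-side handler for a single CA request: three chained
    algorithms, whose earlier outputs feed the later stages."""
    next_exposure = exposure(picture_handle)
    face_coords = face_detect(next_exposure, picture_handle)
    gaze_result = eye_gaze(face_coords, next_exposure, picture_handle)
    # Both results go back to the CA in one reply.
    return {"next_exposure": next_exposure, "gaze_result": gaze_result}
```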
In addition, when the TA again receives a request to call the same algorithm (e.g., a first algorithm) to process the same service data, the TA may directly read the historical operation result obtained when the first algorithm processed that service data, without running the first algorithm again. This avoids repeatedly executing the same algorithm on the same service data and further improves the interaction efficiency between the REE side and the TEE side.
It should be noted that the electronic device according to the embodiments of the present application may be any electronic device with an REE + TEE architecture, such as a portable computer (e.g., a mobile phone), a tablet computer, a desktop computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a cellular phone, a Personal Digital Assistant (PDA), an Augmented Reality (AR), a Virtual Reality (VR) device, a media player, and an intelligent door lock. The embodiment of the present application does not particularly limit the specific form of the electronic device. A mobile phone (mobile phone), a tablet computer, a notebook computer, a palm top computer, a Mobile Internet Device (MID), a wearable device (e.g., a watch, a bracelet, a smart helmet, etc.), a Virtual Reality (VR) device, an Augmented Reality (AR) device, an Ultra Mobile Personal Computer (UMPC), a netbook, a Personal Digital Assistant (PDA), a wireless terminal in industrial control (industrial control), a wireless terminal in self driving (driving), a wireless terminal in remote surgery (remote medical supply), a wireless terminal in smart grid (smart grid), a wireless terminal in transportation security (security), a wireless terminal in city smart terminal, a wireless terminal in home (smart) and the like.
It should be noted that the electronic device according to the embodiment of the present application may also be any electronic device that supports two or more operating environments. An electronic device that supports the REE and the TEE is taken as an example here, but this is not a limitation; an electronic device that supports two other operating environments may also be used.
Please refer to FIG. 3, which illustrates a TEE + REE architecture according to an embodiment of the present application. As shown in FIG. 3, the architecture includes an REE 101 and a TEE 102. The REE 101 may include one or more CAs and a Client API. The TEE 102 may include a TA. A plurality of algorithms (e.g., algorithm 1, algorithm 2, ..., algorithm n) may be integrated in the TA. A CA in the REE 101 may send a request to the TA in the TEE 102 through the Client API. After receiving the request, the TA may execute a service flow (including a plurality of algorithms) corresponding to the request and feed back the execution result of the service flow to the CA.
The following describes in detail a business process execution method based on a trusted execution environment according to an embodiment of the present application. Please refer to FIG. 4, which is a flowchart of the method as applied to an electronic device. As shown in FIG. 4, the method includes:
S101, the CA sends a service request to the TA.
In general, a CA may have a correspondence with a TA. The CA is used to provide a channel for interaction with a user (e.g., to provide an operation interface), and the TA is used to implement the related security functions that the CA provides for the user. Thus, when a CA in the REE needs a TA to perform some service operation (to meet a related functional requirement), the CA may send a service request to the corresponding TA through the Client API. The service request may carry service information and a handle of the service data.
The service data can be understood as an object that the TA needs to process. The service data may be, for example, information such as a picture, a fingerprint, a signature, and a password. The type of service data may vary according to application scenarios.
The service information may include a Universally Unique Identifier (UUID) of the TA and a service flow identifier. The UUID is used to indicate which service the CA requests, and the service flow identifier is used to indicate the service flow that the TA needs to execute. It will be appreciated that by executing the business process, the TA can implement the security functions required by the CA (i.e., meet the functional requirements of the CA for the business process). A plurality of different service flows can be integrated in the TA, different service flows have different service flow identifiers, and different service flows can implement different security functions.
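As a rough illustration of the pieces described above, the one-shot service request could be modeled as follows; the field names are illustrative assumptions, not part of the actual Client API:

```python
from dataclasses import dataclass

# Hypothetical layout of the service request sent once from the CA to the
# TA; field names are illustrative.
@dataclass
class ServiceRequest:
    ta_uuid: str      # universally unique identifier: which TA/service the CA requests
    flow_id: str      # service flow identifier: which preset flow the TA should execute
    data_handle: int  # handle of the service data (the data itself stays in shared memory)

req = ServiceRequest(ta_uuid="example-uuid", flow_id="ID1", data_handle=1)
```

Note that the request carries only the handle of the service data, never the data itself, which keeps the request small regardless of the size of the picture.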
Each business process may include multiple algorithms, and different combinations of algorithms may form different business processes. If the business process includes N algorithms, the functional requirements of the business process can be realized by executing the N algorithms. In one possible design, there may be a strict order of execution among the N algorithms. For example, a business process includes algorithm 1, algorithm 2, and algorithm 3 (i.e., N is 3), and the algorithms must be executed in sequence in the order algorithm 1, algorithm 2, algorithm 3 to implement the security function provided by the business process. In another possible design, there may be no strict execution order among the N algorithms. For example, the business process includes algorithm 1, algorithm 2, and algorithm 3, and the functional requirements of the business process can be realized as long as all three algorithms are executed, regardless of order.
In one possible design, the TA may preset different service flows according to different functional requirements, so as to implement different security functions. Illustratively, an exposure algorithm, a face detection algorithm, a human eye gaze algorithm, and an owner identification algorithm may be integrated in the TA. On this basis, the different business processes set by the TA to meet different functional requirements can be as shown in Table 1.
TABLE 1

Functional requirement                                          Service flow     Identifier    Algorithms included
Function requirement 1 (human eye gaze recognition)             Service flow 1   ID1           exposure algorithm, face detection algorithm, human eye gaze algorithm
Function requirement 2 (owner identification)                   Service flow 2   ID2           exposure algorithm, face detection algorithm, owner identification algorithm
Function requirement 3 (gaze recognition + owner identification) Service flow 3   ID3           exposure algorithm, face detection algorithm, human eye gaze algorithm, owner identification algorithm
When the electronic device needs to meet function requirement 1, it needs to perform human eye gaze recognition; that is, the electronic device can detect through the camera whether a user is paying attention to the display screen (e.g., watching or gazing at it). If a user is paying attention to the display screen, the electronic device can automatically light up the display screen. For example, as shown in (a) of FIG. 5, when the user pays attention to the display screen, the display screen of the electronic device is lit. Further, after the display screen is lit, if no user pays attention to it within a preset time, the electronic device can automatically turn the screen off. For example, as shown in (b) of FIG. 5, when no user pays attention to the display screen, the display screen is off. Therefore, when the TA needs to satisfy function requirement 1, the TA may provide the CA with the service flow related to implementing the human eye gaze recognition function (i.e., service flow 1). The service flow identifier of service flow 1 is ID1, and service flow 1 includes executing an exposure algorithm, a face detection algorithm, and a human eye gaze algorithm on the acquired picture.
When the electronic device needs to meet function requirement 2, it needs to perform owner identification; that is, the electronic device can detect through the camera whether the current user is the owner. If the current user is the owner, the electronic device can execute the unlocking operation. If the current user is not the owner, the electronic device does not execute the unlocking operation. When the TA needs to satisfy function requirement 2, the TA may provide the CA with the service flow related to implementing the owner identification function (i.e., service flow 2). The service flow identifier of service flow 2 is ID2, and service flow 2 includes executing an exposure algorithm, a face detection algorithm, and an owner identification algorithm on the collected picture.
When the electronic device needs to meet function requirement 3, it needs to perform both human eye gaze recognition and owner identification; that is, the electronic device can detect through the camera whether a user is paying attention to the display screen and whether that user is the owner. If a user is paying attention to the display screen and the user is the owner, the electronic device can automatically light up the display screen. Further, while the display screen is off, if no user pays attention to it or the user is not the owner, the display screen remains off. When the TA needs to satisfy function requirement 3, the TA may provide the CA with the service flow related to implementing both the owner identification function and the human eye gaze recognition function (i.e., service flow 3). The service flow identifier of service flow 3 is ID3, and service flow 3 includes executing an exposure algorithm, a face detection algorithm, a human eye gaze algorithm, and an owner identification algorithm on the collected picture.
Thus, if the CA initiates a request related to implementing the human eye gaze recognition function to the corresponding TA, the service flow identifier in the service information should be ID1; after receiving the request, the TA can execute service flow 1 on the service data (the picture acquired by the camera). If the CA initiates a request related to implementing the owner identification function, the service flow identifier should be ID2; after receiving the request, the TA can execute service flow 2 on the picture acquired by the camera. If the CA initiates a request related to implementing both the owner identification function and the human eye gaze recognition function, the service flow identifier should be ID3; after receiving the request, the TA can execute service flow 3 on the picture acquired by the camera.
It should be noted that the above algorithms integrated in the TA and the three scenarios are merely exemplary and not limiting; the embodiments of the present application are applicable to any scenario involving the combined use of multiple algorithms, and such scenarios are not enumerated one by one here.
S102, in response to the service request, the TA at least inputs the handle of the service data into each of the N algorithms to process the service data, obtaining N operation results.
The N operation results correspond to the N algorithms one to one. The TA may input the handle of the service data into an algorithm alone, or input it together with other parameters (e.g., the output of another algorithm).
In the embodiment of the present application, before the TA searches for the service data according to the handle, the TA needs to map the handle to the shared memory, so that the TA can read and modify the object (e.g., the service data) by reading and modifying the shared memory. Illustratively, this may be accomplished using the mmap function. Furthermore, the TA may release the mapping relationship once the service data is no longer needed (e.g., after the N algorithms have been executed). Illustratively, this may be implemented using the munmap function. It should be noted that, in the embodiment of the present application, since the CA only needs to initiate one request for one service flow, the process of establishing/releasing the memory mapping needs to be executed only once. Compared with the delay caused in the prior art by establishing/releasing the memory mapping relationship many times, this reduces unnecessary delay between the TEE and the REE and improves the interaction efficiency of the REE and the TEE.
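As a rough illustration of the map-once/unmap-once pattern (using Python's mmap module in place of the TEE primitives; the stand-in "algorithms" are placeholders):

```python
import mmap

# A single anonymous mapping stands in for the REE/TEE shared buffer: the
# TA maps once per service request and unmaps once after all N algorithms run.
def run_service_flow(shm, algorithms):
    results = []
    for algo in algorithms:
        shm.seek(0)
        results.append(algo(shm.read()))  # every algorithm reads the same mapping
    return results

shm = mmap.mmap(-1, 16)                   # establish the mapping once
shm.write(b"picture-bytes\x00\x00\x00")
outs = run_service_flow(shm, [len, lambda data: data[:7]])
shm.close()                               # release the mapping once
```

The point of the sketch is that the mapping is created before the loop and released after it, rather than once per algorithm.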
Then, the TA looks up the service data according to the handle. It can be understood that the storage space pointed to by the handle stores the current address of the service data in memory. Thus, the TA can find the storage space pointed to by the handle according to the value of the handle, and thereby find the service data. For example, as shown in FIG. 6, the address "0X00000AC6" of storage space 2 is stored in storage space 1 where handle 1 is located, and the address "0X00100016" of the service data is stored in storage space 2. Therefore, the TA may first find the value of storage space 2 (i.e., 0X00100016) according to handle 1, and then find the service data in the storage space pointed to by the address "0X00100016" (i.e., storage space 3).
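The double indirection of FIG. 6 can be modeled as follows; the addresses are the hypothetical values from the example, and Python dicts stand in for raw memory:

```python
# Illustrative model of the handle lookup: the handle points at storage
# space 2, which in turn stores the address of the service data.
memory = {
    0x00000AC6: 0x00100016,       # storage space 2 holds the address of the service data
    0x00100016: b"service-data",  # storage space 3 holds the service data itself
}
handle_table = {1: 0x00000AC6}    # storage space 1: handle 1 points at storage space 2

def resolve(handle):
    data_addr = memory[handle_table[handle]]  # read storage space 2 to get the data address
    return memory[data_addr]                  # read the service data from storage space 3

picture = resolve(1)
```

Because the handle stays valid while the underlying address may change, only storage space 2 needs updating when the data moves.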
It can be understood that executing the corresponding service flow on the service data means processing the service data by running the N algorithms of the service flow. In one possible design, if the last executed algorithm of the N algorithms is the Nth algorithm, the process by which the TA at least inputs the handle of the service data into the N algorithms to obtain N operation results includes: first, the handle is sequentially input into the first N-1 of the N algorithms to obtain the operation results of the first N-1 algorithms; then, the handle and the operation result of at least one of the first N-1 algorithms are input into the Nth algorithm.
Exemplarily, taking service flow 1 in Table 1 as an example, the following describes the process in which the TA at least inputs the handle of the service data into each of the N algorithms to process the service data, obtaining N operation results.
In an alternative embodiment, as also shown in FIG. 2, the business process may include: inputting the handle of the picture into the exposure algorithm, which outputs the next exposure parameters; inputting the next exposure parameters together with the handle of the picture into the face detection algorithm to obtain the coordinates of the face in the picture; inputting the next exposure parameters, the handle of the picture, and the coordinates of the face into the human eye gaze algorithm to obtain the human eye gaze detection result; and finally feeding back the next exposure parameters and the human eye gaze detection result to the CA.
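The chaining of service flow 1 can be sketched as follows; the three functions are stubs whose signatures mirror the data flow above, not the real algorithms:

```python
# Hedged sketch of service flow 1: each stage receives the picture handle
# plus the outputs of earlier stages, and the TA replies to the CA once.
def exposure(handle):
    return {"shutter": 1 / 60, "aperture": 2.0, "iso": 100}  # next exposure parameters

def face_detect(handle, exposure_params):
    return [(80, 60, 200, 180)]  # coordinates of one face (bounding box)

def eye_gaze(handle, exposure_params, faces):
    return len(faces) > 0        # whether a face in the picture gazes at the screen

def service_flow_1(handle):
    params = exposure(handle)                 # stage 1: only the handle
    faces = face_detect(handle, params)       # stage 2: handle + exposure output
    gazing = eye_gaze(handle, params, faces)  # stage 3: handle + both earlier outputs
    return {"next_exposure": params, "gaze": gazing}  # fed back to the CA in one reply

result = service_flow_1(1)
```

Only the small handle and intermediate results move between stages; the picture itself never leaves the shared memory.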
It should be noted that after the handle of the picture is input into the exposure algorithm, the face detection algorithm, or the human eye gaze algorithm, the corresponding picture needs to be found according to the handle before the algorithm is run on it. For the process of finding the picture according to its handle, refer to FIG. 6 and the related text, which are not repeated here. It can be understood that the space occupied by the handle of the picture is usually much smaller than the space occupied by the picture itself, so transmitting only the handle improves the transmission efficiency between the different algorithms.
The exposure algorithm has the function of calculating the next exposure parameters according to the input picture. It can be understood that, in the process in which the electronic device determines through the camera whether the display screen is being gazed at, the service flow described in FIG. 2 needs to be executed on each of multiple consecutive frames. Therefore, the handle of the current frame can first be input into the exposure algorithm to obtain the next exposure parameters, so as to ensure the brightness of the next frame.
The next exposure parameters may include a shutter speed, an aperture value, and a sensitivity. The next exposure parameters can be used to adjust the exposure parameters of the camera so as to improve the brightness of the picture the camera shoots next (i.e., the next frame). It should be noted that the next exposure parameters are associated with the ambient light and the hardware parameters of the electronic device; when neither the ambient light nor the hardware parameters change, the exposure algorithm tends to produce a converged result even for different input pictures.
It should be noted that, after the picture is found according to its handle, the exposure algorithm may fail to run for reasons such as the size or format of the picture not meeting requirements. When the exposure algorithm fails, the next exposure parameters cannot be obtained, and subsequent processing (e.g., executing the face detection algorithm on the picture) cannot continue. In this case, the TA can feed back the failure result to the CA. In addition, the exposure algorithm provided in the embodiment of the present application may be implemented using histogram equalization, gamma transformation, logarithmic transformation, local histogram equalization, or the like, which is not specifically limited here.
After obtaining the next exposure parameters, the TA may continue to input the handle of the picture and the next exposure parameters into the face detection algorithm, thereby obtaining the coordinates of the face in the picture. In one possible design, the coordinates of the face may include the coordinates of five key points of the face (the tip of the nose, the left corner of the left eye, the right corner of the right eye, the left mouth corner, and the right mouth corner), from which the face position can be determined. In another possible design, the face detection algorithm may select the face with a circumscribed rectangular frame and use the coordinates of that frame in the picture as the coordinates of the face. For example, as shown in FIG. 7, after input 1 is fed into the face detection algorithm, output 1 can be obtained; the coordinates of the rectangular frame in output 1 are the coordinates of the face in the picture. In addition, the face detection algorithm can also detect tilt information of the face, such as its tilt angle.
It should be noted that, when there is no face in the picture, the face detection algorithm cannot determine the face position. When a plurality of faces exist in the picture, the face position of each face in the picture can be determined.
In a possible design, if the number of faces detected by the face detection algorithm is 0, the service flow may be ended and a failure result fed back to the CA. If the number of faces detected is greater than 0, the handle of the picture, the next exposure parameters, and the coordinates of the faces in the picture can be further input into the human eye gaze algorithm to obtain the human eye gaze detection result, which indicates whether any face in the picture is gazing at the screen. It should be noted that the number of faces corresponds one to one to the face positions: the face detection algorithm outputs one face position per detected face, so the number of faces can be determined by counting the number of face positions.
The human eye gaze algorithm has the function of detecting whether anyone in the picture is gazing at the screen. Its processing flow may include: first, performing first processing on the found picture according to the coordinates of the face, where the first processing may include image cropping. For example, if the coordinates of the face include the coordinates of five key points, the picture may be cropped to a preset size centered on the tip of the nose, e.g., a size of 256 × 256 pixels. For another example, if the coordinates of the face include the coordinates of a circumscribed rectangular frame, the picture can be cropped directly according to the length and width of that frame, retaining the face inside it.
Optionally, the first processing may further include image rotation. That is, before the picture is cropped, if the face in the picture is deflected or skewed (e.g., the head is tilted), the picture may be rotated by a corresponding angle to correct the face position (e.g., so that the angle between the line connecting the two eyes and the horizontal is smaller than a preset threshold).
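The cropping step can be illustrated with a small helper; this is an assumption standing in for the first processing, with the crop clamped to the picture bounds:

```python
# Illustrative helper: compute a square crop of `size` pixels centered on
# the nose tip, shifted inward if it would extend past the picture edge.
def crop_box(picture_w, picture_h, nose_x, nose_y, size=256):
    half = size // 2
    left = max(0, min(nose_x - half, picture_w - size))
    top = max(0, min(nose_y - half, picture_h - size))
    return (left, top, left + size, top + size)

box = crop_box(640, 480, nose_x=320, nose_y=240)  # a 640x480 frame, nose at the center
```

The returned tuple is (left, top, right, bottom) in pixel coordinates, the usual convention for a crop rectangle.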
After the first processing is performed on the picture, a plurality of key points can be obtained by performing key point detection on it. The key points include several for the eyes (eye contour and pupil) and several for the nose. Illustratively, the picture may include 25 key points, where the eyes contribute 16 (each eye having 5 eye-contour and 3 pupil key points) and the nose 9. It should be noted that, when different key point detection methods are used, the picture may include key points of other parts (such as the eyebrows, mouth, or face outline) or other numbers of key points (such as 30 or 60), which is not specifically limited in this application.
After the plurality of key points are obtained, the relative coordinates of the left pupil within the left eye (denoted the first coordinates), the relative coordinates of the right pupil within the right eye (denoted the second coordinates), and the distance between the two eyes and the nose can be calculated. Whether the human eyes are gazing at the screen is then judged according to the first coordinates, the second coordinates, and the distance.
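A hypothetical decision rule (standing in for the gaze judgment, which the description does not spell out) could treat the gaze as "on screen" when both pupils sit near the center of their eye contours:

```python
# Illustrative gaze check: each argument is a pupil position normalized to
# [0, 1] inside its eye contour; (0.5, 0.5) is the eye center.
def is_gazing(first_coords, second_coords, tol=0.2):
    for x, y in (first_coords, second_coords):
        if abs(x - 0.5) > tol or abs(y - 0.5) > tol:
            return False  # a pupil far from the eye center suggests looking away
    return True

centered = is_gazing((0.52, 0.48), (0.47, 0.50))
averted = is_gazing((0.90, 0.50), (0.50, 0.50))
```

A real implementation would also use the eye-to-nose distance mentioned above to compensate for head pose; that term is omitted here for brevity.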
It should be noted that the above-mentioned process of performing the first processing on the picture may also be integrated on other algorithm modules, and operates independently of the human eye gaze algorithm.
For another example, taking service flow 2 in Table 1, the following describes the same process. In an alternative embodiment, as shown in FIG. 8, the business process may include: inputting the handle of the picture into the exposure algorithm, which outputs the next exposure parameters; inputting the next exposure parameters together with the handle of the picture into the face detection algorithm to obtain the coordinates of the face in the picture; inputting the next exposure parameters, the handle of the picture, and the coordinates of the face into the owner identification algorithm to obtain the owner identification result; and finally feeding back the next exposure parameters and the owner identification result to the CA.
It should be noted that the processing flow of the exposure algorithm and the face detection algorithm on the picture may refer to the content in the service flow 1, and details are not described herein again.
That is, if the number of faces detected by the face detection algorithm is greater than 0, the handle of the picture, the next exposure parameters, and the coordinates of the faces in the picture can be further input into the owner identification algorithm to obtain the owner identification result. The owner identification result indicates whether the face in the picture matches the preset face (i.e., the owner's face).
Likewise, the process of processing the picture with the owner identification algorithm may include: performing the first processing on the found picture according to the coordinates of the face, where the first processing may include image cropping, image rotation, and the like. After the first processing, the picture can be compared with the preset face. If the similarity between the processed picture and the preset face is greater than or equal to a preset threshold, an identification result indicating that the face is the owner is generated; if the similarity is smaller than the preset threshold, an identification result indicating that the face is not the owner is generated.
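The threshold comparison can be sketched as follows; the embedding vectors and cosine similarity are placeholders for whatever face representation the algorithm actually uses:

```python
import math

# Hedged sketch of the owner identification decision: compare a face
# representation against the preset face and apply a preset threshold.
def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

def identify_owner(face_vec, preset_vec, threshold=0.8):
    return cosine_similarity(face_vec, preset_vec) >= threshold

same = identify_owner([0.9, 0.1, 0.4], [0.9, 0.1, 0.4])       # identical representations
different = identify_owner([1.0, 0.0, 0.0], [0.0, 1.0, 0.0])  # orthogonal representations
```

The threshold value 0.8 is arbitrary here; in practice it would be tuned against a false-accept/false-reject trade-off.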
S103, the TA feeds back the execution result of the business process to the CA.
The execution result may include the operation result of the Nth algorithm. For example, as shown in FIG. 2, the execution result includes the human eye gaze detection result output by the human eye gaze algorithm (where N is 3). For another example, as shown in FIG. 8, the execution result includes the owner identification result output by the owner identification algorithm (where N is 3). In addition, the execution result may also include the operation results of the first N-1 algorithms. For example, as shown in FIG. 2 and FIG. 8, the execution result further includes the next exposure parameters output by the exposure algorithm.
Consider that the algorithms of different service flows on the TA may overlap. For example, in Table 1, service flow 1 and service flow 2 both include the exposure algorithm and the face detection algorithm. In one possible design, before the service data is input into any algorithm (e.g., the first algorithm, i.e., any one of the N algorithms included in the business process), it may be determined whether a historical operation result of the first algorithm processing this service data exists. If no such historical operation result exists, the service data is input into the first algorithm. If an operation result previously obtained by the first algorithm for this service data exists, that historical operation result can be read directly and used as the output of the first algorithm this time, without running the first algorithm again. This avoids repeatedly executing the same algorithm on the same picture and further improves the interaction efficiency between the REE side and the TEE side.
In one possible design, the service request in S101 may carry a service data identifier. After each algorithm obtains an operation result based on the service data, the corresponding relation among the service data identifier, the algorithm and the operation result can be established and stored. In this way, the TA may query whether the first operation result exists according to the service data identifier before operating the first algorithm (any one of the N algorithms) based on the service data. And the first operation result is a historical operation result of the first algorithm processing the service data corresponding to the service data identifier. And if the first operation result exists, the first algorithm does not need to be operated again, and the next algorithm in the service flow is continuously operated.
Illustratively, algorithm 1, algorithm 2, algorithm 3, and algorithm 4 are integrated on the TA. In addition, the TA is configured with a service flow a and a service flow b in advance. The business process a comprises an algorithm 1, an algorithm 2 and an algorithm 3; the business process b comprises an algorithm 1, an algorithm 2 and an algorithm 4. The TA may first receive a request for executing the service process a on the service data 1, and store the correspondence table shown in table 2.
TABLE 2

Service data identifier    Algorithm      Operation result
ID101                      Algorithm 1    Result 1_1
ID101                      Algorithm 2    Result 2_1
ID101                      Algorithm 3    Result 3_1
In this way, when the TA subsequently receives a request for executing service flow b on service data 1, the historical operation result of algorithm 1, i.e., result 1_1, can be queried directly according to the identifier of service data 1 (ID101). Then, before executing algorithm 2, the historical operation result of algorithm 2, i.e., result 2_1, can likewise be queried according to ID101. Finally, result 2_1 is input into algorithm 4 to obtain the operation result of service flow b.
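The correspondence-table lookup amounts to a cache keyed by (service data identifier, algorithm); a minimal sketch, with illustrative names:

```python
# Results are cached per (service data identifier, algorithm name), so a
# repeated request reuses the historical operation result instead of
# re-running the algorithm; `runs` records which algorithms actually ran.
cache = {}
runs = []

def run_cached(data_id, algo_name, algo, *args):
    key = (data_id, algo_name)
    if key not in cache:          # no historical result: run and store
        runs.append(algo_name)
        cache[key] = algo(*args)
    return cache[key]             # historical result exists: read it directly

algo_1 = lambda x: x + 1
# service flow a and service flow b both start with algorithm 1 on data ID101
first = run_cached("ID101", "algorithm 1", algo_1, 10)
second = run_cached("ID101", "algorithm 1", algo_1, 10)
```

The second call returns the stored result without invoking the algorithm again, which is the behavior Table 2 is meant to enable.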
In one possible design, the log information may be generated each time the CA sends a service request to the TA, or each time the TA feeds back an operation result to the CA. The log information may record the interaction of the CA with the TA. Illustratively, the interaction scenario may include a number of interactions, interaction data, interaction time, and the like.
Therefore, in the service flow execution method based on the trusted execution environment provided by the embodiment of the application, the TA can set different service flows according to different functional requirements and realize each functional requirement with the N algorithms included in one service flow. Thus, after the TA receives a request from the CA, it can run a plurality of algorithms in succession to process the service data, reducing both the number of interactions between the REE side and the TEE side and the number of times the memory mapping relationship is established/released, thereby improving the interaction efficiency between the REE side and the TEE side.
Another embodiment of the present application provides a chip system 900, as shown in fig. 9, which includes at least one processor 901 and at least one interface circuit 902. The processor 901 and the interface circuit 902 may be interconnected by wires. For example, the interface circuit 902 may be used to receive signals from other devices (e.g., a memory of an electronic device). Also for example, the interface circuit 902 may be used to send signals to other devices, such as the processor 901.
For example, the interface circuit 902 may read instructions stored in a memory in the electronic device and send the instructions to the processor 901. The instructions, when executed by the processor 901, may cause the electronic device to perform the steps in the embodiments described above.
Through the above description of the embodiments, it is clear to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional modules is merely used as an example, and in practical applications, the above function distribution may be completed by different functional modules according to needs, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the above described functions. For the specific working processes of the system, the apparatus and the unit described above, reference may be made to the corresponding processes in the foregoing method embodiments, and details are not described here again.
Each functional unit in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application may be essentially implemented or make a contribution to the prior art, or all or part of the technical solutions may be implemented in the form of a software product stored in a storage medium and including several instructions for causing a computer device (which may be a personal computer, a server, or a network device) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: flash memory, removable hard drive, read only memory, random access memory, magnetic or optical disk, and the like.
The above description is only a specific implementation of the embodiments of the present application, but the scope of the embodiments of the present application is not limited thereto, and any changes or substitutions within the technical scope disclosed in the embodiments of the present application should be covered by the scope of the embodiments of the present application. Therefore, the protection scope of the embodiments of the present application shall be subject to the protection scope of the claims.

Claims (12)

1. A business process execution method based on a trusted execution environment, applied to an electronic device comprising a trusted execution environment (TEE) and a rich execution environment (REE), wherein a client application (CA) runs in the REE and a trusted application (TA) runs in the TEE, the method comprising:
the CA sending a service request to the TA only once, wherein the service request carries a business process identifier and a handle of business data, the business process identifier indicates the business process to be executed by the TA, the business process comprises N algorithms that together fulfill the functional requirements of the business process, and N is greater than or equal to 2;
in response to the service request, the TA inputting at least the handle of the business data into each of the N algorithms to process the business data, obtaining N operation results, wherein the N operation results correspond one-to-one to the N algorithms;
the TA feeding back an execution result of the business process to the CA, wherein the N algorithms include an Nth algorithm that is executed last, and the execution result includes the operation result of the Nth algorithm.
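The single-request flow of claim 1 can be sketched as follows. This is an illustrative sketch only: the names (`TrustedApp`, `handle_request`, the toy pipeline) are assumptions for exposition, not the patent's actual API, and the lambdas merely stand in for the N algorithms that would run inside the TEE.

```python
from typing import Callable, Dict, List

# Hypothetical sketch of claim 1: the CA sends ONE request carrying a
# business-process identifier and a handle to the business data; the TA
# resolves the identifier to an ordered pipeline of N >= 2 algorithms,
# runs them all inside the TEE, and feeds back the final result.

class TrustedApp:
    def __init__(self, pipelines: Dict[str, List[Callable]]):
        # business-process identifier -> ordered list of N algorithms
        self.pipelines = pipelines

    def handle_request(self, process_id: str, data_handle: int):
        results: List = []
        for algorithm in self.pipelines[process_id]:
            # each algorithm receives the data handle (the raw data never
            # leaves the TEE) plus any earlier results it needs
            results.append(algorithm(data_handle, results))
        # the execution result fed back to the CA includes the operation
        # result of the Nth (last) algorithm
        return results[-1]

# toy two-algorithm pipeline standing in for the N algorithms
ta = TrustedApp({
    "demo_flow": [
        lambda h, prev: h * 2,         # algorithm 1: uses the handle
        lambda h, prev: prev[0] + 1,   # algorithm 2: uses result 1
    ]
})
print(ta.handle_request("demo_flow", 21))
```

The point of the single request is that the CA crosses the REE/TEE boundary once per business process, instead of once per algorithm.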
2. The trusted execution environment based business process execution method of claim 1, wherein the N algorithms comprise an exposure algorithm, a face detection algorithm, and a human eye gaze algorithm, which together fulfill the human eye gaze detection requirement of the business process, and the business data comprises a picture;
the TA inputting at least the handle of the business data into each of the N algorithms to process the business data and obtaining N operation results comprises:
the TA inputting the handle of the picture into the exposure algorithm to obtain a next exposure parameter;
the TA inputting the next exposure parameter and the handle of the picture into the face detection algorithm to obtain coordinates of a face in the picture;
the TA inputting the coordinates of the face in the picture, the next exposure parameter, and the handle of the picture into the human eye gaze algorithm to obtain a human eye gaze detection result, wherein the human eye gaze detection result indicates whether a face gazing at the screen is present in the picture.
3. The trusted execution environment based business process execution method of claim 2, wherein the execution result comprises the next exposure parameter and the human eye gaze detection result.
4. The method of claim 2, wherein the TA inputting the coordinates of the face in the picture, the next exposure parameter, and the handle of the picture into the human eye gaze algorithm to obtain the human eye gaze detection result comprises:
if the number of faces detected by the face detection algorithm is greater than 0, the TA inputting the coordinates of the face in the picture, the next exposure parameter, and the handle of the picture into the human eye gaze algorithm to obtain the human eye gaze detection result.
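The three-algorithm gaze pipeline of claims 2-4 can be sketched like this. The function names and the stand-in lambdas are assumptions for illustration; the real exposure, face-detection, and eye-gaze models run inside the TEE and are not described in the claims.

```python
# Hypothetical sketch of claims 2-4: exposure -> face detection -> eye
# gaze, chained on the picture handle, with the gaze step gated on at
# least one face having been detected (claim 4).

def run_eye_gaze_flow(picture_handle,
                      exposure_algorithm,
                      face_detection_algorithm,
                      eye_gaze_algorithm):
    # step 1: picture handle -> next exposure parameter
    next_exposure = exposure_algorithm(picture_handle)
    # step 2: exposure parameter + handle -> face coordinates
    face_coords = face_detection_algorithm(next_exposure, picture_handle)
    # step 3: run the gaze algorithm only if faces were found (claim 4)
    gaze_result = None
    if len(face_coords) > 0:
        gaze_result = eye_gaze_algorithm(face_coords, next_exposure,
                                         picture_handle)
    # per claim 3, the execution result carries both values
    return {"next_exposure": next_exposure, "gaze_detected": gaze_result}

# toy stand-ins for the three algorithms
result = run_eye_gaze_flow(
    picture_handle=1,
    exposure_algorithm=lambda h: 0.5,
    face_detection_algorithm=lambda e, h: [(10, 20, 64, 64)],
    eye_gaze_algorithm=lambda coords, e, h: True,
)
print(result)
```

Claims 5-7 have the same shape with an owner identification algorithm in place of the eye gaze algorithm, so the same skeleton applies there.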
5. The trusted execution environment based business process execution method of claim 1, wherein the N algorithms comprise an exposure algorithm, a face detection algorithm, and an owner identification algorithm, which together fulfill the owner identification requirement of the business process, and the business data comprises a picture;
the TA inputting at least the handle of the business data into each of the N algorithms to process the business data and obtaining N operation results comprises:
the TA inputting the handle of the picture into the exposure algorithm to obtain a next exposure parameter;
the TA inputting the next exposure parameter and the handle of the picture into the face detection algorithm to obtain coordinates of a face in the picture;
the TA inputting the coordinates of the face in the picture, the next exposure parameter, and the handle of the picture into the owner identification algorithm to obtain an owner identification result, wherein the owner identification result indicates whether the face in the picture matches a preset face.
6. The trusted execution environment based business process execution method of claim 5, wherein the execution result comprises the next exposure parameter and the owner identification result.
7. The method of claim 6, wherein inputting the coordinates of the face in the picture, the next exposure parameter, and the handle of the picture into the owner identification algorithm comprises:
if the number of faces detected by the face detection algorithm is greater than 0, inputting the coordinates of the face in the picture, the next exposure parameter, and the handle of the picture into the owner identification algorithm to obtain the owner identification result.
8. The trusted execution environment based business process execution method of any one of claims 1-7, wherein the execution result further comprises the operation result of at least one of the first N-1 algorithms.
9. The trusted execution environment based business process execution method of any one of claims 1-7, wherein the service request further carries a data identifier of the business data, and the method further comprises:
for any one of the N algorithms, determining whether a first operation result exists, wherein the first operation result is an operation result obtained by that algorithm processing the business data corresponding to the data identifier;
the TA inputting at least the handle of the business data into each of the N algorithms to process the business data and obtaining N operation results comprises:
if the first operation result does not exist, the TA inputting at least the handle of the business data into that algorithm to process the business data, obtaining the operation result of that algorithm.
10. The trusted execution environment based business process execution method of claim 9, further comprising:
if the first operation result exists, the TA using the first operation result as the operation result of that algorithm.
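The caching behavior of claims 9-10 can be sketched as below. `ResultCache` and its method names are illustrative assumptions; the claims only require that a result keyed by the data identifier be reused instead of recomputed.

```python
# Hypothetical sketch of claims 9-10: the TA keeps each algorithm's
# operation result keyed by (data identifier, algorithm), so re-running
# a business process on already-processed data skips recomputation.

class ResultCache:
    def __init__(self):
        self._cache = {}   # (data_id, algorithm_name) -> operation result
        self.computed = 0  # counts actual algorithm executions

    def run(self, algorithm, algorithm_name, data_id, data_handle):
        key = (data_id, algorithm_name)
        if key in self._cache:        # claim 10: first result exists, reuse it
            return self._cache[key]
        self.computed += 1            # claim 9: no first result, compute it
        result = algorithm(data_handle)
        self._cache[key] = result
        return result

cache = ResultCache()
double = lambda handle: handle * 2
first = cache.run(double, "double", data_id="img-001", data_handle=7)
second = cache.run(double, "double", data_id="img-001", data_handle=7)
print(first, second, cache.computed)  # second call hits the cache
```

Keying on the data identifier rather than the handle matters here: two requests may carry different handles to the same underlying business data, and the identifier is what makes the cached result reusable.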
11. An electronic device integrating a trusted execution environment (TEE) and a rich execution environment (REE), wherein a client application (CA) runs in the REE and a trusted application (TA) runs in the TEE, the electronic device comprising: a wireless communication module, a memory, and one or more processors, the wireless communication module, the memory, and the processors being coupled;
wherein the memory is configured to store computer program code comprising computer instructions which, when executed by the processors, cause the electronic device to perform the trusted execution environment based business process execution method of any one of claims 1-10.
12. A computer-readable storage medium comprising computer instructions which, when run on an electronic device, cause the electronic device to perform the trusted execution environment based business process execution method of any one of claims 1-10.
CN202110681590.9A 2021-06-18 2021-06-18 Business process execution method based on trusted execution environment and electronic equipment Active CN113760090B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110681590.9A CN113760090B (en) 2021-06-18 2021-06-18 Business process execution method based on trusted execution environment and electronic equipment

Publications (2)

Publication Number Publication Date
CN113760090A true CN113760090A (en) 2021-12-07
CN113760090B CN113760090B (en) 2022-09-13

Family

ID=78787469

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110681590.9A Active CN113760090B (en) 2021-06-18 2021-06-18 Business process execution method based on trusted execution environment and electronic equipment

Country Status (1)

Country Link
CN (1) CN113760090B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116382896A (en) * 2023-02-27 2023-07-04 荣耀终端有限公司 Calling method of image processing algorithm, terminal equipment, medium and product

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE2353421A1 (en) * 1972-10-30 1974-05-09 Hewlett Packard Co ELECTRONIC CALCULATOR
US20150286853A1 (en) * 2014-04-08 2015-10-08 Disney Enterprises, Inc. Eye gaze driven spatio-temporal action localization
CN109766152A (en) * 2018-11-01 2019-05-17 华为终端有限公司 A kind of exchange method and device
CN109960582A (en) * 2018-06-19 2019-07-02 华为技术有限公司 The method, apparatus and system of multi-core parallel concurrent are realized in the side TEE
WO2019205108A1 (en) * 2018-04-27 2019-10-31 华为技术有限公司 Constructing common trusted application for a plurality of applications
CN110555706A (en) * 2019-08-30 2019-12-10 北京银联金卡科技有限公司 Face payment security method and platform based on security unit and trusted execution environment
CN111274019A (en) * 2019-12-31 2020-06-12 深圳云天励飞技术有限公司 Data processing method and device and computer readable storage medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Yang Suishan (杨穗珊): "Research on a Secure Mobile Phone Architecture Based on a Trusted Execution Environment", Mobile Communications (《移动通信》) *
Chen Yanguang (陈彦光) et al.: "EZMS: Workflow Management *** Supporting EZ-Flow", Computer Technology and Development (《计算机技术与发展》) *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116382896A (en) * 2023-02-27 2023-07-04 荣耀终端有限公司 Calling method of image processing algorithm, terminal equipment, medium and product
CN116382896B (en) * 2023-02-27 2023-12-19 荣耀终端有限公司 Calling method of image processing algorithm, terminal equipment, medium and product

Similar Documents

Publication Publication Date Title
US11321308B2 (en) Asset management method and apparatus, and electronic device
US20220224677A1 (en) User inviting method and apparatus, computer device, and computer-readable storage medium
US20160253519A1 (en) Apparatus and method for trusted execution environment file protection
US11853793B2 (en) Methods and system for on-device AI model parameter run-time protection
US11017066B2 (en) Method for associating application program with biometric feature, apparatus, and mobile terminal
CN110457963B (en) Display control method, display control device, mobile terminal and computer-readable storage medium
EP3176719B1 (en) Methods and devices for acquiring certification document
US10504289B2 (en) Method and apparatus for securely displaying private information using an augmented reality headset
US11240230B2 (en) Automatic authentication processing method and system using dividing function
US20230089622A1 (en) Data access control for augmented reality devices
US20230360044A1 (en) Digital asset transfers in a virtual environment based on a physical object
CN113760090B (en) Business process execution method based on trusted execution environment and electronic equipment
US11374898B1 (en) Use of partial hash of domain name to return IP address associated with the domain name
US11615205B2 (en) Intelligent dynamic data masking on display screens based on viewer proximity
US10902101B2 (en) Techniques for displaying secure content for an application through user interface context file switching
CN116311389B (en) Fingerprint identification method and device
US12026684B2 (en) Digital and physical asset transfers based on authentication
US10430571B2 (en) Trusted UI authenticated by biometric sensor
CN113673676A (en) Electronic device, method for implementing neural network model, system on chip, and medium
CN111475779A (en) Image processing method, device, equipment and storage medium
US20240161375A1 (en) System and method to display profile information in a virtual environment
US11604938B1 (en) Systems for obscuring identifying information in images
US20230379159A1 (en) System and method to validate a rendered object using non-fungible tokens
CN115134473B (en) Image encryption method and device
US20240022553A1 (en) Authenticating a virtual entity in a virtual environment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20230914

Address after: 201306 building C, No. 888, Huanhu West 2nd Road, Lingang New District, China (Shanghai) pilot Free Trade Zone, Pudong New Area, Shanghai

Patentee after: Shanghai Glory Smart Technology Development Co.,Ltd.

Address before: Unit 3401, unit a, building 6, Shenye Zhongcheng, No. 8089, Hongli West Road, Donghai community, Xiangmihu street, Futian District, Shenzhen, Guangdong 518040

Patentee before: Honor Device Co.,Ltd.