
CN116820320A - Gesture intelligent monitoring method, device, equipment and medium for automobile diagnosis equipment - Google Patents


Info

  • Publication number: CN116820320A
  • Application number: CN202310208397.2A
  • Authority: CN (China)
  • Prior art keywords: gesture, view, finger touch, core management, touch action
  • Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
  • Other languages: Chinese (zh)
  • Inventors: 谭斌, 尹欣荣, 肖灵聪
  • Current and original assignee: Shenzhen Xingka Software Technology Development Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
  • Application filed by: Shenzhen Xingka Software Technology Development Co Ltd
  • Priority: CN202310208397.2A (the priority date is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed)


Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object, using icons
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application relates to the technical field of automobile diagnosis, and in particular to an intelligent gesture monitoring method, device, equipment and medium for automobile diagnostic equipment. The gesture intelligent monitoring method comprises the following steps: creating a corresponding core management class for the view layout existing in the source code; creating a gesture view through the core management class; generating a custom animation icon in the gesture view; displaying the animation icon on the screen; registering a listening area for the gesture in the core management class so as to monitor finger touch actions in real time, where the periodic listening frequency differs for operators with different permissions and for different devices; judging the validity of the finger touch action and the touch area according to the monitored touch position and touch distance of the finger touch action in the gesture view; and, when the action is judged valid, returning the corresponding animation icon and finger-touch-action response. The gesture return operation of the automobile diagnostic equipment can thus adapt to different conditions, and hardware resources are saved.

Description

Gesture intelligent monitoring method, device, equipment and medium for automobile diagnosis equipment
This application is a divisional of Chinese patent application No. 2022108501455, filed on 20 July 2022, which relates to a method, device, equipment and medium for intelligent gesture response processing of automobile diagnostic equipment.
Technical Field
The application relates to the technical field of automobile diagnosis, in particular to an intelligent gesture monitoring method, device, equipment and medium for automobile diagnosis equipment.
Background
In the technical field of automobile diagnosis, with the rapid development of the intelligent age, the design of diagnostic equipment is increasingly biased toward improving user experience. If returning from one interface of a diagnostic device to the upper interface can only be done through a navigation button fixed on the interface, the interaction appears single; moreover, the gesture graphics themselves are single and cannot meet the requirements of diversification and personalization.
Patent document CN105549838A, published on 2016.05.04, discloses a method for destroying the current Activity triggered by a gesture slide, which includes: opening an application program to enter a first Activity interface; pressing one end of one side of the screen and then slowly sliding toward the other side; when the slide reaches one third of the screen width, the application program executes the finish() method; the current Activity interface is destroyed and the second Activity interface is returned to. That technical scheme aims to provide a gesture response: sliding from left to right returns to the upper layer and destroys the current Activity, achieving the effect of clicking a return button, making user operation convenient and improving the experience.
Such a scheme can achieve the return operation through gestures, but the gesture response technology is functionally simple overall: it cannot meet the requirements of intelligent monitoring and intelligent response during the running of automobile diagnostic equipment, its regulation of hardware resources is insufficient, hardware resources are wasted, and it still needs further optimization.
Disclosure of Invention
In order to overcome the defects of the single gesture-response UI and single function of existing automobile diagnostic equipment, the application provides an intelligent gesture monitoring method for automobile diagnostic equipment, comprising the following steps:
s100, creating a corresponding core management class in view layout existing in source codes;
s200, creating a gesture view through the core management class;
s300, generating a custom animation icon in the gesture view;
s400, displaying the animation icons in a screen;
s500, registering a listening area for the gesture in the core management class to monitor finger touch actions in real time, where the periodic listening frequency differs for operators with different permissions and for different devices;
s600, judging the effectiveness of the finger touch action and the touch area according to the monitored touch position and touch distance of the finger touch action in the gesture view;
and S700, returning corresponding animation icons and finger touch action responses through the core management class when the judgment of S600 is valid.
Preferably, the gesture view in step S200 is carried on the whole screen.
Preferably, the listening area of the gesture is registered through WindowManagerService (WMS) in step S500.
Preferably, in step S500, the monitoring is awakened according to the time of the diagnostic operation of the vehicle diagnostic device.
Preferably, in step S600, the sliding distance of the finger is determined to realize different responses.
Preferably, the gesture view generated in step S200 is associated with the operating user's permission ID; that is, the same gesture view responds differently under different operating user permissions.
Preferably, in step S700, the number of times of returning the response within the preset time is dynamically set according to the frequency of detecting the finger touch motion in real time.
The application also provides an intelligent gesture monitoring device of the automobile diagnosis equipment, which comprises
The core management module is used for creating a corresponding core management class in view layout existing in the source code;
the creation module is used for creating a gesture view through the core management class;
the drawing module is used for generating a custom animation icon in the gesture view;
the display module is used for displaying the animation icons in a screen;
the monitoring module is used for registering a listening area for the gesture in the core management class so as to monitor finger touch actions in real time, where the periodic listening frequency differs for operators with different permissions and for different devices;
the judging module is used for judging the effectiveness of the finger touch action and the touch area according to the monitored touch position and touch distance of the finger touch action in the gesture view;
and the response module is used for returning the corresponding animation icon and finger-touch-action response through the core management class when the judging module judges the action valid.
The application also provides an automobile diagnosis device, which comprises a memory and a processor, wherein the memory stores a computer program, and the processor realizes the steps of the gesture intelligent monitoring method of the automobile diagnosis device when executing the computer program.
The application also provides a computer-readable storage medium on which a computer program is stored, which, when executed by a processor, implements the steps of the intelligent gesture monitoring method for automobile diagnostic equipment as described in any of the above.
Based on the above, compared with the prior art, the gesture intelligent monitoring method for automobile diagnostic equipment provided by the application enables the gesture return operation to meet the requirements of intelligent monitoring and intelligent response during device operation, regulates and controls hardware resources, allows the device to respond according to different conditions, and effectively saves device resources.
Additional features and advantages of the application will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the application. The objectives and other advantages of the application will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
Drawings
For a clearer description of embodiments of the application or of the solutions of the prior art, the drawings that are needed in the description of the embodiments or of the prior art will be briefly described, it being obvious that the drawings in the description below are some embodiments of the application, and that other drawings can be obtained from them without inventive effort for a person skilled in the art; the positional relationships described in the drawings in the following description are based on the orientation of the elements shown in the drawings unless otherwise specified.
FIG. 1 is a flow chart of an intelligent gesture monitoring method for an automobile diagnostic device;
FIG. 2 is an overall flow chart of the practical operation of embodiment 1 of the present application;
FIG. 3 is a schematic diagram of an intelligent gesture monitoring apparatus for automobile diagnostic equipment;
fig. 4 is a schematic structural diagram of an automobile diagnostic device according to the present application.
Reference numerals:
10 core management module; 20 creation module; 30 drawing module; 40 display module; 50 monitoring module; 60 judging module; 70 response module.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present application more clear, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application, and it is apparent that the described embodiments are some embodiments of the present application, but not all embodiments of the present application; the technical features designed in the different embodiments of the application described below can be combined with each other as long as they do not conflict with each other; all other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
In the description of the present application, it should be noted that all terms used in the present application (including technical terms and scientific terms) have the same meaning as commonly understood by one of ordinary skill in the art to which the present application belongs and are not to be construed as limiting the present application; it will be further understood that terms used herein should be interpreted as having a meaning that is consistent with their meaning in the context of this specification and the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
Example 1
In order to solve the problem that the gesture-response UI is single and functionally simple and cannot meet the needs of diversification and personalization, this embodiment provides an intelligent gesture monitoring method for an automobile diagnostic device, referring to FIG. 1 and FIG. 2, applied to the automobile diagnostic device and specifically comprising the following steps:
s100, creating a corresponding core management class in view layout existing in source codes;
s200, creating a gesture view through the core management class;
s300, generating a custom animation icon in the gesture view;
s400, displaying the animation icons in a screen;
s500, registering a monitoring area of the gesture in a core management class to monitor the finger touch action in real time;
s600, judging the effectiveness of the finger touch action and the touch area according to the monitored touch position and touch distance of the finger touch action in the gesture view;
and S700, returning corresponding animation icons and finger touch action responses through the core management class when the judgment of S600 is valid.
The gesture intelligent monitoring method of the automobile diagnostic equipment can be developed secondarily based on Android 10. During actual development, an EdgeBackGestureHandler class can be created in SystemUI through the DI (dependency injection) framework; the EdgeBackGestureHandler class is the core management class of the whole gesture intelligent monitoring method. When the core management class is constructed, the preset gesture-judgment parameters and variables need to be initialized, and when the core management class creates a gesture view (NavigationBarEdgePanel), it injects these preset parameters and variables into the gesture view. A custom animation icon is then generated in the gesture view and displayed on the screen. The core management class registers the listening area to monitor finger touch actions in real time; the specific listening-area setup flow is constructed with reference to InputChannel. The gesture view judges the validity of the finger touch action and the touch area according to the monitored data such as the touch position and touch distance, and when the action is judged valid, the corresponding animation icon and finger-touch-action response are returned through the core management class.
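The wiring above can be sketched as follows. EdgeBackGestureHandler and NavigationBarEdgePanel are the class names given in the text, but this is a simplified illustration, not actual AOSP code; the specific parameters injected (edge width, minimum swipe distance) are assumptions, since the patent does not name them.

```java
// Hypothetical sketch: the core management class initializes preset parameters
// and injects them into the gesture view it creates.
class NavigationBarEdgePanel {
    final int edgeWidthPx;     // injected preset parameter (assumed)
    final int minSwipeDistPx;  // injected preset variable (assumed)

    NavigationBarEdgePanel(int edgeWidthPx, int minSwipeDistPx) {
        this.edgeWidthPx = edgeWidthPx;
        this.minSwipeDistPx = minSwipeDistPx;
    }
}

class EdgeBackGestureHandler {
    private final int edgeWidthPx;
    private final int minSwipeDistPx;

    // On construction, the preset gesture-judgment parameters are initialized.
    EdgeBackGestureHandler(int edgeWidthPx, int minSwipeDistPx) {
        this.edgeWidthPx = edgeWidthPx;
        this.minSwipeDistPx = minSwipeDistPx;
    }

    // Creating the gesture view injects the preset parameters into it.
    NavigationBarEdgePanel createGestureView() {
        return new NavigationBarEdgePanel(edgeWidthPx, minSwipeDistPx);
    }
}
```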
Referring to FIG. 2, S100-S400 cover establishing multiple sets of gesture UIs using the factory design pattern, so that functions and experience requirements can be switched. For example, user A wants a prompt UI with a white arrow inside a red dot, with sliding either left or right having the same effect of returning the interface to the upper layer; API method 1 is established for this requirement. User B wants an intuitive water-ripple animation effect, with sliding left returning to the upper page and sliding right pulling out the sidebar; API method 2 is established for this. After the whole method is established, responses follow the actual operation. The advantage of custom animation icons is that, on the one hand, operation can be customized according to each user's behavioral habits to improve efficiency; on the other hand, personalized service can be provided to improve the experience of use.
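The factory-mode idea above can be sketched as follows; the UI names, response strings and factory keys are hypothetical stand-ins for "API method 1" and "API method 2".

```java
// Each user preference maps to a gesture-UI variant (icon style + response rule).
interface GestureUi {
    String icon();                // which animation icon to draw
    String respond(String swipe); // response for "left" / "right"
}

// "API method 1": red dot with white arrow, both directions return to the upper layer.
class RedDotArrowUi implements GestureUi {
    public String icon() { return "red-dot-white-arrow"; }
    public String respond(String swipe) { return "back"; }
}

// "API method 2": water-ripple effect, left goes back, right pulls out the sidebar.
class WaterRippleUi implements GestureUi {
    public String icon() { return "water-ripple"; }
    public String respond(String swipe) {
        return swipe.equals("left") ? "back" : "sidebar";
    }
}

class GestureUiFactory {
    static GestureUi create(String preference) {
        switch (preference) {
            case "red-dot": return new RedDotArrowUi();
            case "ripple":  return new WaterRippleUi();
            default: throw new IllegalArgumentException(preference);
        }
    }
}
```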
To complete the response, the corresponding listening step in S500 is needed. S500 monitors the distance between the finger's press position and the edge of the screen: when the finger touches within 30-40 pixels of the left or right edge of the screen, a response instruction is sent to the API method designated by the system, and the custom icon effect is displayed on the screen; then, as the finger slides left or right, a sliding distance within 0-30 pixels falls inside the preset valid listening area. Step S600 then judges validity: if the monitored area is valid, step S700 performs the return-to-previous-page operation; if the distance between the monitored press position and the screen edge in S500 exceeds 30-40 pixels but is smaller than the whole screen width, the area is invalid and the judgment of S600 is not entered. Further, when the application is in the normal immersive mode, a slide from the edge is not recognized as a gesture and no signal can be sent; the sticky immersive mode is therefore adopted globally, which allows the device to optimize the sliding effect without requiring a notification bar at the top of the system. At the code level, the creation of the gesture view and the registration of the listening area may be performed at the same time.
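A minimal sketch of the press-position and slide-distance checks described above, using the 30-40 pixel edge zone and 0-30 pixel slide window from the text; the exact comparison boundaries are assumptions.

```java
// Validity checks for the S500/S600 flow: press must start near an edge,
// and the slide distance must fall in the preset valid window.
class EdgeGestureValidator {
    static final int EDGE_ZONE_PX = 40; // press must start this close to an edge (assumed upper bound)
    static final int MAX_SLIDE_PX = 30; // valid slide-distance window from the text

    /** True if the press lands in the left or right edge listening zone. */
    static boolean inEdgeZone(int x, int screenWidth) {
        return x <= EDGE_ZONE_PX || x >= screenWidth - EDGE_ZONE_PX;
    }

    /** True if both the press position and the slide distance are valid. */
    static boolean isValidGesture(int downX, int upX, int screenWidth) {
        int slide = Math.abs(upX - downX);
        return inEdgeZone(downX, screenWidth) && slide > 0 && slide <= MAX_SLIDE_PX;
    }
}
```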
Preferably, the gesture view in step S200 may be carried on the whole screen.
Specifically, the listening area of the gesture may be registered through WindowManagerService (WMS) in step S500.
Constant listening in the gesture intelligent monitoring method wastes device resources. The periodic listening frequency in step S500 therefore differs for operators with different permissions and for different devices. For example, a senior manager's permissions allow large and complex operations, so high-frequency listening is performed; an ordinary user's gesture-response interface is relatively small and the gesture-response types are single, so the listening frequency is set low.
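A minimal sketch of this permission-dependent listening frequency; the role names and millisecond periods are assumptions, since the patent only says senior managers get high-frequency listening and ordinary users a lower frequency.

```java
import java.util.Map;

// Maps an operator role to a polling period; unknown roles fall back to the
// slow rate so unprivileged access never triggers high-frequency listening.
class ListeningSchedule {
    private static final Map<String, Long> PERIOD_MS = Map.of(
            "senior_manager", 16L,   // high-frequency: roughly once per frame (assumed)
            "ordinary_user", 100L);  // low-frequency polling (assumed)

    /** Returns the listening period for a role, defaulting to the slow rate. */
    static long periodFor(String role) {
        return PERIOD_MS.getOrDefault(role, 100L);
    }
}
```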
Preferably, listening can also be woken according to the timing of the diagnostic work of the automobile diagnostic device, such as stopping listening after the diagnostic work is started and waking it when the diagnosis reaches its final stage. Preferably, when the automobile diagnostic device is idle for a set time (the specific time can be set by the user), listening stops after that unattended period and the device lowers the screen brightness; upon receiving an operation, the device restores the brightness and wakes the listening module.
Example 2
To further solve the problem that a fixed response area is unsuitable for some application scenarios, the application also provides embodiment 2 to adapt to different application scenarios, which specifically comprises the following steps:
s100, creating a corresponding core management class in view layout existing in source codes;
s200, creating a gesture view through the core management class;
s300, generating a custom animation icon in the gesture view;
s400, displaying the animation icons in a screen;
s500, registering a listening area for the gesture in the core management class to monitor finger touch actions in real time, and dynamically adjusting the range and position of the listening area according to the usage scenario and the operating user's permission, where the shape of the area may be a custom, non-geometric shape adapted to the device's screen layout;
s600, judging the effectiveness of the finger touch action and the touch area according to the monitored touch position and touch distance of the finger touch action in the gesture view;
and S700, returning corresponding animation icons and finger touch action responses through the core management class when the judgment of S600 is valid.
If the gesture response area is fixed, it may be unsuitable for some application scenarios: for example, the automobile diagnostic device may be placed high or low, and a fixed screen area may be inconvenient to reach; there are also cases where some areas are physically damaged and cannot respond to gestures, so gesture response cannot be used there. Therefore, in step S500, a person skilled in the art may preset the listening area according to the usage scenario and the operating user's permission, dynamically adjusting its range and position; the specific shape may be a standard geometric shape or a custom shape adapted to the device's screen layout. In actual operation, this can be realized by dividing regions in advance, selecting regions in a follow-up mode, or customizing regions. Since the specific range and position need not cover the whole screen, hardware resources are saved, and the problem of screen areas being unusable due to physical damage can be avoided.
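One way to sketch such a dynamically adjustable, non-full-screen listening area is as a union of rectangular patches, which can exclude a damaged strip of the screen; real Android code might use android.graphics.Region instead, and all names here are illustrative.

```java
import java.util.ArrayList;
import java.util.List;

// A listening region built from rectangular patches; the union of patches
// approximates an arbitrary (even non-geometric) shape and can skip damaged areas.
class ListeningRegion {
    private final List<int[]> rects = new ArrayList<>(); // {left, top, right, bottom}

    /** Adds one rectangular patch; the union of all patches forms the region. */
    void add(int l, int t, int r, int b) { rects.add(new int[]{l, t, r, b}); }

    /** True if the touch point falls inside any patch of the region. */
    boolean contains(int x, int y) {
        for (int[] r : rects)
            if (x >= r[0] && x < r[2] && y >= r[1] && y < r[3]) return true;
        return false;
    }
}
```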
Moreover, a non-geometric regular shape can prevent use by operators without professional training, thereby avoiding faults caused by misoperation or disorderly operation by non-professional users.
Example 3
To further solve the problem that, in practical applications, gesture response is inefficient and needs to be repeated many times, the application also provides embodiment 3 to improve the efficiency of gesture operation response, comprising the following steps:
s100, creating a corresponding core management class in view layout existing in source codes;
s200, creating a gesture view through the core management class;
s300, generating a custom animation icon in the gesture view;
s400, displaying the animation icons in a screen;
s500, registering a monitoring area of the gesture in a core management class to monitor the finger touch action in real time;
s600, judging the effectiveness of the finger touch action and the touch area according to the monitored touch position and touch distance of the finger touch action in the gesture view, and judging the sliding distance of the finger to realize different responses;
and S700, returning corresponding animation icons and finger touch action responses through the core management class when the judgment of S600 is valid.
If the diagnostic equipment has many software interfaces, using gestures to turn the screen page by page is inefficient. The sliding distance of the finger in S600 is therefore judged to realize different responses. Specifically, the 0-30 pixel finger sliding distance judged in step S600 may be divided into 3 segments, each segment corresponding to turning 1, 2 or 3 pages. Thus, if the user wants to turn directly to page 2 or page 3, the device can be controlled through the sliding distance of the finger, improving efficiency and saving the device's software and hardware resources.
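The three-segment rule can be sketched as follows; the equal 10-pixel split is an assumption, since the patent only says the 0-30 pixel range is divided into 3 segments.

```java
// Maps a finger slide distance to a number of pages to turn, per the
// segmented 0-30 px rule of this embodiment.
class SwipePager {
    /** Returns pages to turn for a slide distance, or 0 if outside the valid window. */
    static int pagesFor(int slidePx) {
        if (slidePx <= 0 || slidePx > 30) return 0; // outside the valid window
        if (slidePx <= 10) return 1;                // segment 1: turn 1 page
        if (slidePx <= 20) return 2;                // segment 2: turn 2 pages
        return 3;                                   // segment 3: turn 3 pages
    }
}
```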
Example 4
In order to solve the problem that operations are complicated and misoperation easily occurs when the automobile diagnostic equipment is actually used, the application also provides embodiment 4 to facilitate the operation and management of the equipment, comprising the following steps:
s100, creating a corresponding core management class in view layout existing in source codes;
s200, creating a gesture view through the core management class, and binding the generated gesture view to the operating user's permission ID through an association mapping; that is, the same user gesture responds with different results under different operating user permissions;
s300, generating a custom animation icon in the gesture view;
s400, displaying the animation icons in a screen;
s500, registering a monitoring area of the gesture in a core management class to monitor the finger touch action in real time;
s600, judging the effectiveness of the finger touch action and the touch area according to the monitored touch position and touch distance of the finger touch action in the gesture view;
and S700, returning corresponding animation icons and finger touch action responses through the core management class when the judgment of S600 is valid.
The existing gesture response lacks management-permission settings: once entered, it can be operated regardless of permission, which is inconvenient for management. Therefore, the gesture view generated in step S200 is associated with the operating user's permission ID; that is, the same gesture view responds differently under different operating user permissions (for example, when persons with different permissions slide horizontally left or right, the responding functions may differ). For instance, when a senior manager enters the device interface, the gesture response area is relatively large and more types of gestures can be operated, while an ordinary user's gesture-response interface is relatively small and the gesture-response types are relatively single. This solves the management problem of ordinary operators using the device after a senior manager has logged in. In a specific implementation, different permission accounts are set at the device system level, each permission account is provided with a corresponding operating-user permission ID, and a gesture library corresponding to each permission ID is provided. According to the device model and configuration, the corresponding permission account is activated by means such as fingerprint recognition, gesture input or an activation password, and the corresponding operating-user permission ID is configured. When the core management class creates the gesture view, the content judged from the operating-user permission ID is injected; the gesture view completes the association mapping binding with the permission ID after being created, and can then traverse the gesture library and make the corresponding judgments.
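A minimal sketch of binding a gesture view to an operating-user permission ID; the permission names and per-permission gesture libraries below are illustrative assumptions.

```java
import java.util.Map;

// A gesture view bound to a permission ID: the same gesture resolves against
// the gesture library associated with that ID, so responses differ by permission.
class PermissionGestureView {
    private static final Map<String, Map<String, String>> GESTURE_LIBRARY = Map.of(
            "admin", Map.of("swipe_left", "back", "swipe_right", "open_sidebar"),
            "user",  Map.of("swipe_left", "back")); // ordinary users get a smaller set

    private final String permissionId;

    PermissionGestureView(String permissionId) { this.permissionId = permissionId; }

    /** Resolves a gesture against the library bound to this permission ID. */
    String respond(String gesture) {
        return GESTURE_LIBRARY
                .getOrDefault(permissionId, Map.of())
                .getOrDefault(gesture, "ignored");
    }
}
```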
Example 5
To solve the problem that an operator may operate too fast and cause misoperation when the gesture intelligent monitoring method of the automobile diagnostic equipment is actually used, the application also provides embodiment 5 as an optimization, comprising the following steps:
s100, creating a corresponding core management class in view layout existing in source codes;
s200, creating a gesture view through the core management class;
s300, generating a custom animation icon in the gesture view;
s400, displaying the animation icons in a screen;
s500, registering a monitoring area of the gesture in a core management class to monitor the finger touch action in real time;
s600, judging the effectiveness of the finger touch action and the touch area according to the monitored touch position and touch distance of the finger touch action in the gesture view;
and S700, when the judgment of S600 is valid, corresponding animation icons and finger touch action responses are returned through the core management class, and the number of times of returning responses in a preset time is dynamically adjusted according to the frequency of detecting the finger touch actions in real time.
The existing gesture intelligent monitoring method has the problem that, when a return operation is performed, a user operating too fast within a unit time causes misoperation. Therefore, the number of times a response is returned within a preset time may be limited in S700. Specifically, only one gesture response is made per unit time, for example only one gesture response within 0.5 s, so that misoperation is avoided and overly frequent operation does not waste the hardware resources of the device.
The frequency of finger touch actions can be detected in real time and the number of responses returned within the preset time adjusted dynamically. For example, when finger touch actions are detected frequently within a certain period, operation is frequent during that period and quick response is required, so the number of responses returned within the preset time can be increased appropriately; conversely, when finger touch actions are detected infrequently within a certain period, operation is infrequent and quick response is not needed, so the number of responses returned within the preset time can be reduced appropriately. This both satisfies operational use and saves the software and hardware resources of the device.
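The dynamic rate limiting of S700 can be sketched as follows. The class name, thresholds, and allowance values are hypothetical, not taken from the patent; only the scheme (a fixed window whose response allowance rises and falls with observed touch frequency) comes from the text above:

```java
// Hypothetical sketch of S700's dynamic rate limiting: at most N responses per
// preset window, where N is raised when touches arrive frequently and lowered
// otherwise to save software and hardware resources.
class GestureRateLimiter {
    private final long windowMs;        // preset time window, e.g. 500 ms
    private int maxResponses = 1;       // responses allowed per window
    private long windowStart = 0;
    private int responsesInWindow = 0;

    GestureRateLimiter(long windowMs) { this.windowMs = windowMs; }

    // Adjust the allowance from the observed touch frequency (touches/second).
    void adjustForTouchFrequency(double touchesPerSecond) {
        if (touchesPerSecond > 4.0)      maxResponses = 3;  // frequent operation
        else if (touchesPerSecond > 2.0) maxResponses = 2;
        else                             maxResponses = 1;  // infrequent: save resources
    }

    // Returns true if a response may be returned at time nowMs.
    boolean tryRespond(long nowMs) {
        if (nowMs - windowStart >= windowMs) {  // window elapsed: start a new one
            windowStart = nowMs;
            responsesInWindow = 0;
        }
        if (responsesInWindow < maxResponses) {
            responsesInWindow++;
            return true;
        }
        return false;                            // drop the extra response
    }
}
```

With the default allowance of one response per 500 ms window this reproduces the "only 1 gesture response within 0.5 s" behaviour described above.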
Example 6
In order to solve the above problems, the application also provides embodiment 6 for optimizing the automobile diagnostic device, comprising the following steps:
s100, creating a corresponding core management class in view layout existing in source codes;
s200, creating a gesture view through the core management class;
s300, generating a custom animation icon in the gesture view;
s400, displaying the animation icons in a screen;
s500, registering a monitoring area of gestures in a core management class to monitor finger touch actions in real time; meanwhile, the periodic monitoring frequency differs for users with different authorities and for different devices;
s600, judging the effectiveness of the finger touch action and the touch area according to the monitored touch position and touch distance of the finger touch action in the gesture view;
and S700, returning corresponding animation icons and finger touch action responses through the core management class when the judgment of S600 is valid.
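The variant of S500 in embodiment 6 (a periodic monitoring frequency that depends on the operator's authority and on the device) can be sketched as below. The class name, permission IDs, device labels, and periods are all hypothetical; the patent specifies only that the frequency differs, not any concrete values:

```java
// Hypothetical sketch of embodiment 6's S500: the periodic polling frequency of
// the touch listener depends on the operator's authority and the device model.
class MonitorScheduler {
    // Returns the listening period in milliseconds (values are illustrative).
    static long listenPeriodMs(String permissionId, String deviceModel) {
        long base = "ADMIN".equals(permissionId) ? 10 : 40;  // admins: denser sampling
        if ("LOW_END".equals(deviceModel)) base *= 2;         // weaker hardware: poll less often
        return base;
    }
}
```

A shorter period means more frequent monitoring, so in this sketch an administrator on capable hardware is sampled most densely while an ordinary user on a low-end device is sampled least.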
Example 7
The embodiment also provides an intelligent gesture monitoring device of the automobile diagnosis equipment, as shown in fig. 3, which comprises a core management module, used for creating a corresponding core management class in a view layout existing in the source code;
the creation module is used for creating a gesture view through the core management class;
the drawing module is used for generating a custom animation icon in the gesture view;
the display module is used for displaying the animation icons in a screen;
the monitoring module is used for registering a monitoring area of the gesture in the core management class so as to monitor the finger touch action in real time;
the judging module is used for judging the effectiveness of the finger touch action and the touch area according to the monitored touch position and touch distance of the finger touch action in the gesture view;
and the response module is used for returning corresponding animation icons and finger touch action responses through the core management class when the judging module judges that the animation icons and the finger touch action responses are valid.
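The judging module's validity check (touch position inside the registered monitoring area, slide distance over a threshold) can be sketched as follows. The class name, region bounds, and threshold are hypothetical illustrations, not values from the patent:

```java
// Hypothetical sketch of the judging module: a touch is valid when it starts
// inside the registered monitoring area and its slide distance exceeds a threshold.
class TouchValidator {
    private final int left, top, right, bottom;   // registered monitoring region
    private final double minSlidePx;              // minimum valid slide distance

    TouchValidator(int left, int top, int right, int bottom, double minSlidePx) {
        this.left = left; this.top = top; this.right = right; this.bottom = bottom;
        this.minSlidePx = minSlidePx;
    }

    boolean inRegion(int x, int y) {
        return x >= left && x < right && y >= top && y < bottom;
    }

    // Valid if the touch-down point lies in the region and the finger moved far enough.
    boolean isValid(int downX, int downY, int upX, int upY) {
        double dist = Math.hypot(upX - downX, upY - downY);
        return inRegion(downX, downY) && dist >= minSlidePx;
    }
}
```

The distance threshold is what separates a deliberate slide from an accidental tap, which is the role S600 assigns to the touch distance.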
It should be noted that the division of the above apparatus into modules is merely a division of logical functions; in actual implementation the modules may be fully or partially integrated into one physical entity or may be physically separate. These modules may all be implemented in the form of software called by a processing element, or all in hardware; alternatively, some modules may be implemented as software called by a processing element while others are implemented in hardware. For example, the processing module may be a separately arranged processing element, may be integrated in a chip of the above apparatus, or may be stored in a memory of the above apparatus in the form of program code whose functions are called and executed by a processing element of the apparatus. The other modules are implemented similarly. In addition, all or some of the modules may be integrated together or implemented independently. The processing element described here may be an integrated circuit with signal processing capability. In implementation, each step of the above method, or each of the above modules, may be completed by an integrated logic circuit of hardware in a processor element or by instructions in the form of software.
Example 8
The embodiment provides an automobile diagnosis device, as shown in fig. 4, including a memory and a processor, where the memory stores a computer program, and the processor implements the steps of the gesture intelligent monitoring method of the automobile diagnosis device when executing the computer program.
The embodiment of the application also provides a computer readable storage medium, on which a computer program is stored, wherein the computer program realizes the steps of the intelligent gesture monitoring method of the automobile diagnosis equipment when being executed by a processor.
It will be clearly understood by those skilled in the art that, for convenience and brevity of description, specific working procedures of the apparatus, device and unit described above may refer to corresponding procedures in the foregoing method embodiments, which are not repeated herein. Those of ordinary skill in the art will appreciate that the modules and steps of the examples described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both, and that the elements and steps of the examples have been described generally in terms of functionality in the foregoing description to clearly illustrate the interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus, device and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, and for example, the division of the units is merely a logical function division, there may be another division manner in actual implementation, or units having the same function may be integrated into one unit, for example, multiple units or components may be combined or may be integrated into another system, or some features may be omitted, or not performed. In addition, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices, or elements, or may be an electrical, mechanical, or other form of connection.
In addition, it should be understood by those skilled in the art that although many problems exist in the prior art, each embodiment or technical solution of the present application may improve on only one or several aspects, and need not solve all of the technical problems listed in the prior art or the background art at the same time. Those skilled in the art will understand that content not mentioned in a claim should not be taken as a limitation on that claim.
Although terms such as source code, core management class, and monitoring are used frequently herein, the possibility of using other terms is not excluded. These terms are used merely to describe and explain the nature of the application more conveniently; they are not to be interpreted as imposing any additional limitation inconsistent with the spirit of the present application. It is further noted that relational terms such as first and second are used solely to distinguish one entity or action from another, without necessarily requiring or implying any actual such relationship or order between those entities or actions. Moreover, the terms "comprises", "comprising", or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or electronic device that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or electronic device. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in the process, method, article, or electronic device that comprises the element.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present application, and not for limiting the same; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some or all of the technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit of the application.

Claims (10)

1. The intelligent gesture monitoring method for the automobile diagnosis equipment is characterized by comprising the following steps of:
s100, creating a corresponding core management class in view layout existing in source codes;
s200, creating a gesture view through the core management class;
s300, generating a custom animation icon in the gesture view;
s400, displaying the animation icons in a screen;
s500, registering a monitoring area of the gesture in a core management class to monitor the finger touch action in real time; the periodic monitoring frequency differs for users with different rights and for different devices;
s600, judging the effectiveness of the finger touch action and the touch area according to the monitored touch position and touch distance of the finger touch action in the gesture view;
and S700, returning corresponding animation icons and finger touch action responses through the core management class when the judgment of S600 is valid.
2. The intelligent monitoring method for gestures of the automobile diagnostic equipment according to claim 1 is characterized in that: the gesture view is carried on the whole screen in step S200.
3. The intelligent monitoring method for gestures of the automobile diagnostic equipment according to claim 1 is characterized in that: in step S500, the monitoring area of the gesture is registered through WindowManagerService.
4. The intelligent monitoring method for gestures of the automobile diagnostic equipment according to claim 1 is characterized in that: in step S500, the monitoring is awakened according to the time of the diagnostic operation of the vehicle diagnostic device.
5. The intelligent monitoring method for gestures of the automobile diagnostic equipment according to claim 1 is characterized in that: in step S600, the sliding distance of the finger is determined to realize different responses.
6. The intelligent monitoring method for gestures of the automobile diagnostic equipment according to claim 1 is characterized in that: the gesture view generated in step S200 is associated with the operation user permission ID, that is, the same gesture view responds differently due to different operation user permissions.
7. The intelligent monitoring method for gestures of an automobile diagnostic device according to any one of claims 1 to 6, characterized in that: in step S700, the number of times of returning the response in the preset time is dynamically adjusted according to the frequency of detecting the finger touch action in real time.
8. An intelligent gesture monitoring device of automobile diagnosis equipment is characterized in that: comprising
The core management module is used for creating a corresponding core management class in view layout existing in the source code;
the creation module is used for creating a gesture view through the core management class;
the drawing module is used for generating a custom animation icon in the gesture view;
the display module is used for displaying the animation icons in a screen;
the monitoring module is used for registering a monitoring area of the gesture in the core management class so as to monitor the finger touch action in real time; the periodic monitoring frequency differs for users with different rights and for different devices;
the judging module is used for judging the effectiveness of the finger touch action and the touch area according to the monitored touch position and touch distance of the finger touch action in the gesture view;
and the response module is used for returning corresponding animation icons and finger touch action responses through the core management class when the judging module judges that the animation icons and the finger touch action responses are valid.
9. An automotive diagnostic apparatus comprising a memory and a processor, the memory storing a computer program, characterized in that the processor, when executing the computer program, implements the steps of the method for intelligent monitoring of gestures of an automotive diagnostic apparatus according to any one of claims 1 to 7.
10. A computer-readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the intelligent monitoring method for gestures of an automotive diagnostic device according to any one of claims 1 to 7.
CN202310208397.2A 2022-07-20 2022-07-20 Gesture intelligent monitoring method, device, equipment and medium for automobile diagnosis equipment Pending CN116820320A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310208397.2A CN116820320A (en) 2022-07-20 2022-07-20 Gesture intelligent monitoring method, device, equipment and medium for automobile diagnosis equipment

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202310208397.2A CN116820320A (en) 2022-07-20 2022-07-20 Gesture intelligent monitoring method, device, equipment and medium for automobile diagnosis equipment
CN202210850145.5A CN114924686B (en) 2022-07-20 2022-07-20 Intelligent gesture response processing method, device, equipment and medium for automobile diagnosis equipment

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN202210850145.5A Division CN114924686B (en) 2022-07-20 2022-07-20 Intelligent gesture response processing method, device, equipment and medium for automobile diagnosis equipment

Publications (1)

Publication Number Publication Date
CN116820320A true CN116820320A (en) 2023-09-29

Family

ID=82815606

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202210850145.5A Active CN114924686B (en) 2022-07-20 2022-07-20 Intelligent gesture response processing method, device, equipment and medium for automobile diagnosis equipment
CN202310208397.2A Pending CN116820320A (en) 2022-07-20 2022-07-20 Gesture intelligent monitoring method, device, equipment and medium for automobile diagnosis equipment

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202210850145.5A Active CN114924686B (en) 2022-07-20 2022-07-20 Intelligent gesture response processing method, device, equipment and medium for automobile diagnosis equipment

Country Status (1)

Country Link
CN (2) CN114924686B (en)

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8799775B2 (en) * 2009-09-25 2014-08-05 Apple Inc. Device, method, and graphical user interface for displaying emphasis animations for an electronic document in a presentation mode
US8509986B1 (en) * 2012-04-27 2013-08-13 Innova Electronics, Inc. Automotive diagnostic tool with projection display and virtual input
US10444819B2 (en) * 2015-06-19 2019-10-15 Intel Corporation Techniques to control computational resources for an electronic device
US20170102697A1 (en) * 2015-10-08 2017-04-13 General Motors Llc Selecting a vehicle function to control using a wearable electronic device
CN105425966A (en) * 2015-12-14 2016-03-23 珠海全志科技股份有限公司 Gesture control method and device based on Android system
WO2017113407A1 (en) * 2015-12-31 2017-07-06 华为技术有限公司 Gesture recognition method and apparatus, and electronic device
KR20170121930A (en) * 2016-04-26 2017-11-03 현대자동차주식회사 Wearable device and apparatus for vehicles diagnosis having the same
CN107357479B (en) * 2016-05-10 2022-05-06 中兴通讯股份有限公司 Application program management method and device
CN106445243B (en) * 2016-11-07 2020-04-10 深圳Tcl数字技术有限公司 Touch response device and method of intelligent equipment
CN106886331B (en) * 2017-01-12 2020-04-24 青岛海信移动通信技术股份有限公司 Data processing method and device of touch terminal and touch terminal
CN109271220A * 2018-08-16 2019-01-25 Guangzhou UCWeb Network Technology Co Ltd Method, computing device and storage medium for controlling page return through gesture operation
US20200090430A1 (en) * 2018-09-17 2020-03-19 Westinghouse Air Brake Technologies Corporation Diagnostic System for a Transit Vehicle
CN110069207B * 2019-04-24 2024-03-19 Nubia Technology Co Ltd Method, mobile terminal and mobile terminal readable storage medium
CN113849090B (en) * 2020-02-11 2022-10-25 荣耀终端有限公司 Card display method, electronic device and computer readable storage medium
CN120085771A * 2020-06-18 2025-06-03 Petal Cloud Technology Co Ltd Terminal device and gesture operation method and medium thereof
CN113110771A (en) * 2021-04-01 2021-07-13 Tcl通讯(宁波)有限公司 Desktop application icon display control method and device, terminal equipment and storage medium

Also Published As

Publication number Publication date
CN114924686B (en) 2023-03-14
CN114924686A (en) 2022-08-19

Similar Documents

Publication Publication Date Title
US10607097B2 (en) Method and device for guiding fingerprint recognition
CN108323195B (en) Press detection method and device of fingerprint identification system and terminal equipment
CN107219988B (en) A kind of interface operation guidance method and mobile terminal
CN111665938A (en) Application starting method and electronic equipment
CN111201501A (en) Method for providing haptic feedback to an operator of a touch sensitive display device
CN113721808B (en) Control method and device
CN112506597B (en) Software interface color matching method and device, computer equipment and storage medium
CN113138818A (en) Interface display method and device and electronic equipment
CN112764611B (en) Application program control method and device and electronic equipment
CN111158806A (en) Interface display method and device, computer equipment and storage medium
CN112631492A (en) Task creation method and device
CN116820320A (en) Gesture intelligent monitoring method, device, equipment and medium for automobile diagnosis equipment
CN113360060B (en) Task implementation methods, devices and electronic equipment
CN107621967B (en) Method and device for realizing system user interface
CN114222355A (en) Terminal power saving display method and device and electronic equipment
CN111796736B (en) Application sharing method and device and electronic equipment
CN105425957A (en) Terminal equipment control method and device
CN104516481A (en) Information processing method and electronic equipment
TWI607369B (en) System and method for adjusting image display
CN113094137B (en) Application installation prompting method, device, electronic equipment and readable storage medium
WO2023155812A1 (en) Application permission management method and apparatus, and electronic device
CN114270298B (en) Touch event processing method and device, mobile terminal and storage medium
CN109896370A (en) A kind of elevator floor authority setting method, device and equipment
CN106604094A (en) Input method and system thereof
CN114265530A (en) Button construction and response method, device and terminal based on iOS system

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination