CN114253417B - Multi-touch point identification method and device, computer readable medium and electronic equipment - Google Patents
- Publication number
- CN114253417B (application number CN202111455883.1A)
- Authority
- CN
- China
- Prior art keywords
- touch
- points
- direct
- point
- distance
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
- G06F3/0418—Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
- G06F3/04182—Filtering of noise external to the device and not generated by digitiser components
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/23—Clustering techniques
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/041—Indexing scheme relating to G06F3/041 - G06F3/045
- G06F2203/04104—Multi-touch detection in digitiser, i.e. details about the simultaneous detection of a plurality of touching locations, e.g. multiple fingers or pen and finger
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Data Mining & Analysis (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Artificial Intelligence (AREA)
- Life Sciences & Earth Sciences (AREA)
- Human Computer Interaction (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Bioinformatics & Computational Biology (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Evolutionary Biology (AREA)
- Evolutionary Computation (AREA)
- Position Input By Displaying (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The embodiments of the present application provide a multi-touch point identification method and apparatus, a computer readable medium, and an electronic device. The multi-touch point identification method comprises the following steps: acquiring all touch points on the large screen; determining the distances between the touch points; clustering the touch points according to the distances between them to obtain touch aggregation areas; and determining the positions of the touch identification points according to the positions of the touch aggregation areas. The technical scheme of the present application reduces highly discrete noise points more effectively, and addresses the problems in the prior art that touch point identification is too rigid and that noise reduction performs poorly on highly discrete point locations.
Description
Technical Field
The present disclosure relates to the field of display technologies, and in particular, to a method and an apparatus for identifying multiple touch points, a computer readable medium, and an electronic device.
Background
Currently, touch screens are very widely used in display devices such as projectors, gaming machines, and televisions. Multi-touch point identification and positioning generally relies on digital image processing: the position of an actual touch point is determined by computing the position of the touch point in an image. In practical applications, the touch point image captured by the sensor often contains noise of varying strength. The common treatment is to set a fixed threshold and use it to suppress noise interference. This approach is too rigid for touch point identification, performs poorly at reducing highly discrete noise, and cannot accurately identify and locate touch identification points when the noise is highly discrete.
Disclosure of Invention
The embodiments of the present application provide a multi-touch point identification method and apparatus, a computer readable medium, and an electronic device, aiming to solve the problems in the prior art that touch point identification is too rigid and that noise reduction performs poorly on highly discrete point locations.
Other features and advantages of the present application will be apparent from the following detailed description, or may be learned in part by the practice of the application.
According to an aspect of the embodiments of the present application, there is provided a multi-touch point identification method, including:
acquiring all touch points on the large screen;
determining the distance between the touch points;
clustering the touch points according to the distance between the touch points to obtain a touch aggregation area;
and determining the position of the touch identification point according to the position of the touch aggregation area.
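The four claimed steps can be sketched end-to-end. This is an illustrative sketch only, not the patented implementation: the helper names and the `eps` and `min_pts` parameters are hypothetical, and plain single-link clustering with an unweighted centroid stands in for the density-based clustering and brightness-weighted centroid detailed in the later embodiments.

```python
import math

def identify_touch_points(points, eps=2.0, min_pts=2):
    """Sketch of the claimed pipeline. `points` is a list of (x, y)
    touch points already extracted from the binarized touch image;
    `eps` and `min_pts` are hypothetical tuning parameters."""
    def dist(a, b):
        # Step 2: distance between two touch points.
        return math.hypot(a[0] - b[0], a[1] - b[1])

    # Step 3: single-link clustering. A point joins every cluster it is
    # within `eps` of; clusters it bridges are merged into one.
    clusters = []
    for p in points:
        hits = [c for c in clusters if any(dist(p, q) <= eps for q in c)]
        merged = [p]
        for c in hits:
            merged.extend(c)
            clusters.remove(c)
        clusters.append(merged)

    # Step 4: one identification point (plain centroid here) per
    # sufficiently large cluster; tiny clusters are treated as noise.
    result = []
    for c in clusters:
        if len(c) >= min_pts:
            result.append((sum(x for x, _ in c) / len(c),
                           sum(y for _, y in c) / len(c)))
    return result
```

Run on two tight groups plus an isolated noise point, this yields one identification point per group and discards the stray point.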
In some embodiments of the present application, the acquiring all touch points on the large screen specifically includes:
collecting touch characteristic values of all points on the large screen, and generating a touch gradient image;
binarizing touch characteristic values of all points in the touch gradient image to obtain a target binary image;
and taking the point position with the touch characteristic value as a first target value in the target binary image as the touch point.
In some embodiments of the present application, binarizing the touch characteristic value of each point in the touch gradient image to obtain a target binary image specifically includes:
threshold processing is carried out on touch characteristic values of all points in the touch gradient image according to the initial threshold value, and a transition binary image is obtained;
selecting a point position with the touch characteristic value as a second target value in the transition binary image to calculate a histogram;
and performing Otsu binarization processing according to the histogram to obtain a target binary image.
In some embodiments of the present application, performing Otsu binarization processing according to the histogram to obtain a target binary image specifically includes:
performing an Otsu's method calculation according to the histogram to obtain an Otsu threshold;
and performing binarization processing on the transition binary image according to the Otsu threshold to obtain a target binary image.
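Otsu's method, referenced in the embodiment above, selects the threshold that maximizes the between-class variance over a histogram. A minimal sketch, assuming an 8-bit (256-bin) histogram:

```python
def otsu_threshold(hist):
    """Compute the Otsu threshold for a 256-bin intensity histogram by
    maximizing the between-class variance of background vs foreground."""
    total = sum(hist)
    sum_all = sum(i * h for i, h in enumerate(hist))
    sum_bg = 0.0      # weighted sum of background bins so far
    w_bg = 0          # background pixel count so far
    best_t, best_var = 0, -1.0
    for t in range(256):
        w_bg += hist[t]
        if w_bg == 0:
            continue
        w_fg = total - w_bg
        if w_fg == 0:
            break
        sum_bg += t * hist[t]
        mean_bg = sum_bg / w_bg
        mean_fg = (sum_all - sum_bg) / w_fg
        var_between = w_bg * w_fg * (mean_bg - mean_fg) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t
```

For a clearly bimodal histogram, the returned threshold falls between the two peaks.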
In some embodiments of the present application, the determining the distance between the touch points specifically includes:
storing touch information of each touch point into a touch information storage table, wherein the stored information comprises position information of the touch point;
and determining the distance between the touch points according to the position information in the touch information storage table.
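The two steps above can be sketched directly. The table layout used here, a mapping from a point id to its coordinates, is a hypothetical stand-in for the claimed touch information storage table:

```python
import math

def build_distance_table(touch_table):
    """Given a touch-information table mapping point id -> (x, y),
    return the Euclidean distance between every pair of touch points."""
    ids = sorted(touch_table)
    dists = {}
    for i, a in enumerate(ids):
        for b in ids[i + 1:]:
            (x1, y1), (x2, y2) = touch_table[a], touch_table[b]
            dists[(a, b)] = math.hypot(x2 - x1, y2 - y1)
    return dists
```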
In some embodiments of the present application, clustering each touch point according to a distance between the touch points to obtain a touch aggregation area specifically includes:
clustering the touch points based on the density direct distance according to the distance between the touch points to obtain a direct aggregation area;
determining the distances between the direct aggregation areas;
and merging the direct aggregation areas based on the density connection distance according to the distances between them to obtain the touch aggregation areas.
In some embodiments of the present application, clustering each of the touch points based on the density direct distance according to the distance between each of the touch points to obtain a direct aggregation area specifically includes:
determining two touch points whose distance is smaller than the density direct distance to be density direct points of each other, and aggregating all touch points having density direct points together to form a density direct area;
and dividing the density direct region into a plurality of direct aggregation regions according to the touch characteristic value of each touch point.
In some embodiments of the present application, after two touch points whose distance is smaller than the density direct distance are determined to be density direct points and all touch points having density direct points are aggregated together to form a density direct area, the method further includes:
And if the number of the touch points in the density direct area is smaller than a preset number threshold value, filtering the density direct area.
In some embodiments of the present application, dividing the density direct area into a plurality of direct aggregation areas according to the touch characteristic value of each touch point specifically includes:
selecting a core point in the density direct area according to the touch characteristic values;
selecting the touch points that are density direct to the core point;
separating the direct aggregation area formed by the core point and those touch points from the density direct area to obtain a residual density direct area;
and continuing to partition the residual density direct area until the density direct area is divided into a plurality of direct aggregation areas.
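The splitting procedure above can be sketched as follows. This is an illustrative reading rather than the patented implementation: choosing the remaining point with the highest touch characteristic value as each core point is an assumption, as are the parameter names.

```python
import math

def split_region(points, features, eps):
    """Split a density direct region into direct aggregation areas:
    repeatedly take the remaining point with the highest touch
    characteristic value as a core point, peel off every remaining
    point within distance `eps` of it, and continue until the region
    is exhausted. `points` maps id -> (x, y); `features` maps
    id -> touch characteristic value. Returns (core_id, area_ids)."""
    remaining = set(points)
    areas = []
    while remaining:
        core = max(remaining, key=lambda i: features[i])
        cx, cy = points[core]
        area = {i for i in remaining
                if math.hypot(points[i][0] - cx, points[i][1] - cy) <= eps}
        areas.append((core, area))
        remaining -= area
    return areas
```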
In some embodiments of the present application, merging the direct aggregation areas based on the density connection distance according to the distances between them to obtain a touch aggregation area specifically includes:
determining the distance between the core points of two direct aggregation areas as the distance between those areas;
and if the distance between two direct aggregation areas is smaller than a preset merging threshold value, merging the two direct aggregation areas.
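The merging step can be sketched as below. Representing an area as a (core point, member set) pair, and keeping the first area's core point after a merge, are illustrative assumptions not stated in the claim:

```python
import math

def merge_areas(areas, merge_threshold):
    """`areas` is a list of (core_xy, member_set). Two areas are merged
    when the distance between their core points is below the threshold;
    the first-seen core point represents the merged area."""
    merged = []
    for core, members in areas:
        hit = None
        for mc, mm in merged:
            if math.hypot(core[0] - mc[0], core[1] - mc[1]) < merge_threshold:
                hit = mm
                break
        if hit is None:
            merged.append((core, set(members)))
        else:
            hit.update(members)   # absorb this area into the nearby one
    return merged
```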
In some embodiments of the present application, the determining, according to the position of the touch aggregation area, the position of the touch identification point specifically includes:
determining the centroid position of the touch aggregation area according to the position and brightness of each touch point in the touch aggregation area;
and taking the centroid position as the position of the touch identification point.
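The brightness-weighted centroid described above, sketched minimally with each touch point carried as an (x, y, brightness) triple:

```python
def touch_identification_point(points):
    """`points` is a list of (x, y, brightness) triples for one touch
    aggregation area. The identification point is the centroid with
    each point weighted by its brightness."""
    total = sum(b for _, _, b in points)
    x = sum(px * b for px, _, b in points) / total
    y = sum(py * b for _, py, b in points) / total
    return x, y
```

A brighter point pulls the identification point toward itself, so the result lands nearer the strongest part of the touch spot.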
According to an aspect of the embodiments of the present application, there is provided a multi-touch point recognition apparatus, including: the acquisition module is used for acquiring all touch points on the large screen; the distance module is used for determining the distance between the touch points; the aggregation module is used for clustering the touch points according to the distance between the touch points to obtain a touch aggregation area; and the identification module is used for determining the position of the touch identification point according to the position of the touch aggregation area.
In some embodiments of the present application, the acquiring module specifically includes: the acquisition sub-module is used for acquiring touch characteristic values of all points on the large screen and generating a touch gradient image; the binary submodule is used for binarizing the touch characteristic values of all the points in the touch gradient image to obtain a target binary image; and the screening sub-module is used for taking the point position of which the touch characteristic value is the first target value in the target binary image as the touch point.
In some embodiments of the present application, the binary submodule specifically includes: the transition processing unit is used for carrying out threshold processing on touch characteristic values of all points in the touch gradient image according to the initial threshold value to obtain a transition binary image; the histogram computing unit is used for selecting point positions with the touch characteristic value in the transition binary image as a second target value to compute a histogram; and the Otsu binarization unit is used for carrying out Otsu binarization processing according to the histogram to obtain a target binary image.
In some embodiments of the present application, the Otsu binarization unit is specifically configured to perform the following steps: performing an Otsu's method calculation according to the histogram to obtain an Otsu threshold; and performing binarization processing on the transition binary image according to the Otsu threshold to obtain a target binary image.
In some embodiments of the present application, the distance module specifically includes: the storage sub-module is used for storing touch information of each touch point into a touch information storage table, and the storage information comprises position information of the touch point; and the determining submodule is used for determining the distance between the touch points according to the position information in the touch information storage table.
In some embodiments of the present application, the aggregation module specifically includes: the first aggregation sub-module is used for clustering the touch points based on the density direct distance according to the distance between the touch points to obtain direct aggregation areas; a distance determination submodule for determining the distances between the direct aggregation areas; and the second aggregation submodule is used for merging the direct aggregation areas based on the density connection distance according to the distances between them to obtain a touch aggregation area.
In some embodiments of the present application, the first aggregation sub-module specifically includes: the aggregation unit is used for determining two touch points whose distance is smaller than the density direct distance to be density direct points of each other, and aggregating all touch points having density direct points together to form a density direct area; and the dividing unit is used for dividing the density direct area into a plurality of direct aggregation areas according to the touch characteristic value of each touch point.
In some embodiments of the present application, the first aggregation sub-module further comprises: and the filtering unit is used for filtering the density direct area if the number of the touch points in the density direct area is smaller than a preset number threshold value.
In some embodiments of the present application, the dividing unit is specifically configured to perform the following steps: selecting a core point in the density direct area according to the touch characteristic values; selecting the touch points that are density direct to the core point; separating the direct aggregation area formed by the core point and those touch points from the density direct area to obtain a residual density direct area; and continuing to partition the residual density direct area until the density direct area is divided into a plurality of direct aggregation areas.
In some embodiments of the present application, the second aggregation sub-module specifically includes: a distance determining unit configured to determine the distance between the core points of two direct aggregation areas as the distance between those areas; and the region merging unit is used for merging two direct aggregation areas if the distance between them is smaller than a preset merging threshold value.
In some embodiments of the present application, the identification module specifically includes: the centroid unit is used for determining the centroid position of the touch aggregation area according to the position and brightness of each touch point in the touch aggregation area; and the determining unit is used for taking the centroid position as the position of the touch identification point.
According to an aspect of the embodiments of the present application, there is provided a computer readable medium having stored thereon a computer program which, when executed by a processor, implements a multi-touch point recognition method as described in the above embodiments.
According to an aspect of an embodiment of the present application, there is provided an electronic device including: one or more processors; and a storage device for storing one or more programs, which when executed by the one or more processors, cause the one or more processors to implement the multi-touch point recognition method as described in the above embodiments.
According to an aspect of embodiments of the present application, there is provided a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium and executes the computer instructions to cause the computer device to perform the multi-touch point recognition method provided in the various alternative implementations described above.
In the technical scheme provided by some embodiments of the present application, touch information of each point on the display module is first collected, and all touch points on the large screen are filtered and identified according to that information, achieving a first stage of noise reduction and preparing the touch information for further processing. The distances between the touch points are then determined, so that nearby points, or points belonging to the same area, can be clustered and merged according to those distances into touch aggregation areas, achieving the final stage of noise reduction. Finally, the position of each touch identification point is determined from the position of its touch aggregation area; the touch identification point is the point ultimately regarded as the location of the user's touch operation. By performing noise reduction several times at different levels, noise points with low touch characteristic values are filtered out during preprocessing, and nearby noise points are absorbed through merging. This yields a good noise reduction effect on highly discrete noise, and solves the prior-art problems that touch point identification is too rigid and that noise reduction performs poorly on highly discrete point locations.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and together with the description, serve to explain the principles of the application. It is apparent that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained from these drawings without inventive effort for a person of ordinary skill in the art. In the drawings:
fig. 1 shows a schematic diagram of an exemplary system architecture to which the technical solutions of the embodiments of the present application may be applied.
Fig. 2 schematically illustrates a flow chart of a multi-touch point recognition method according to one embodiment of the present application.
Fig. 3 is a flowchart of a specific implementation of step S100 in the method for multi-touch point recognition according to the corresponding embodiment of fig. 2.
Fig. 4 is a flowchart of a specific implementation of step S120 in the method for multi-touch point recognition according to the corresponding embodiment of fig. 3.
Fig. 5 is a flowchart of a specific implementation of step S300 in the method for multi-touch point recognition according to the corresponding embodiment of fig. 2.
Fig. 6 is a flowchart showing a specific implementation of step S310 in the method of multi-touch point recognition according to the corresponding embodiment of fig. 5.
Fig. 7 is a flowchart showing a specific implementation of step S330 in the method of multi-touch point recognition according to the corresponding embodiment of fig. 5.
Fig. 8 is a flowchart of a specific implementation of step S400 in the method for multi-touch point recognition according to the corresponding embodiment of fig. 2.
Fig. 9 schematically illustrates a block diagram of a multi-touch point recognition device according to one embodiment of the present application.
Fig. 10 shows a schematic diagram of a computer system suitable for use in implementing the electronic device of the embodiments of the present application.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. However, the exemplary embodiments may be embodied in many forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the example embodiments to those skilled in the art.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the present application. One skilled in the relevant art will recognize, however, that the aspects of the application can be practiced without one or more of the specific details, or with other methods, components, devices, steps, etc. In other instances, well-known methods, devices, implementations, or operations are not shown or described in detail to avoid obscuring aspects of the application.
The block diagrams depicted in the figures are merely functional entities and do not necessarily correspond to physically separate entities. That is, the functional entities may be implemented in software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
The flow diagrams depicted in the figures are exemplary only, and do not necessarily include all of the elements and operations/steps, nor must they be performed in the order described. For example, some operations/steps may be decomposed, and some operations/steps may be combined or partially combined, so that the order of actual execution may be changed according to actual situations.
Fig. 1 is a diagram of an implementation environment of a multi-touch point recognition method according to an embodiment, as shown in fig. 1, in the implementation environment, a large-screen display device is included, which includes a control module 100, a display module 200, and a touch module 300.
The display module 200 is a module that can be installed in any large-screen display device requiring a display interface and provides the display interface for the device, and the display module 200 can be installed in various large-screen devices. The large screen device refers to a device with a screen diagonal size of more than 40 inches. The touch module 300 is mounted on the display module 200, and is a module for collecting touch information of each point on the display module 200. The control module 100 is a module for controlling the display device, and can control the display module 200 to display, or collect and process touch information of each point on the display module 200 through the touch module 300.
In the use process, the control module 100 collects the touch information of each point on the display module 200 through the touch module 300, and finally identifies and locates the touch identification point through noise reduction and clustering.
It should be noted that the control module 100, the display module 200, and the touch module 300 may be connected by wired, wireless, or other communication methods, which are not limited herein.
The implementation details of the technical solutions of the embodiments of the present application are described in detail below:
fig. 2 illustrates a flow chart of a multi-touch point recognition method that may be performed by the control module 100 according to one embodiment of the present application, and the control module 100 may be the control module 100 illustrated in fig. 1. Referring to fig. 2, the multi-touch point recognition method at least includes steps S100 to S400, and is described in detail as follows:
step S100, obtaining all touch points on the large screen.
Step S200, determining a distance between the touch points.
Step S300, clustering the touch points according to the distance between the touch points to obtain a touch aggregation area.
Step S400, determining the position of the touch recognition point according to the position of the touch aggregation area.
In this embodiment, touch information of each point on the display module 200 is collected by the touch module 300, and all touch points on the large screen are filtered and identified according to that information, achieving a first stage of noise reduction and preparing the touch information for further processing. The distances between the touch points are then determined, so that nearby points, or points belonging to the same area, can be clustered and merged according to those distances into touch aggregation areas, achieving the final stage of noise reduction. Finally, the position of the touch identification point is determined from the position of the touch aggregation area; the touch identification point is the point ultimately regarded as the location of the user's touch operation, based on which the display device responds and performs the subsequent touch operations.
The above touch point acquisition modes can be acquired by using different touch modules 300 according to different touch modes. For example, when the touch mode is light source touch, the touch module 300 may be a light spot touch module; when the touch mode is press touch, the touch module 300 may be a pressure touch module; when the touch mode is touch, the touch module 300 may be a current touch module or a voltage touch module; when the touch mode is magnetic induction touch, the touch module 300 may be a magnetic induction module.
Specifically, in some embodiments, the specific implementation of step S100 may refer to fig. 3. Fig. 3 is a detailed description of step S100 in the multi-touch point recognition method according to the corresponding embodiment of fig. 2, where step S100 may include the following steps:
step S110, collecting touch characteristic values of all points on the large screen, and generating a touch gradient image.
Step S120, binarizing the touch characteristic value of each point in the touch gradient image to obtain a target binary image.
And step S130, taking the point position with the touch characteristic value as the first target value in the target binary image as the touch point.
In the present application, the touch points are acquired as follows: the touch module 300 collects the touch information of each point on the display module 200, the touch information including the position of the point and its touch characteristic value; a touch gradient image containing the touch characteristic values of the points is generated from that information; the touch characteristic values in the touch gradient image are binarized to obtain a target binary image; and finally the points whose value in the target binary image is the first target value are taken as the touch points.
The touch characteristic value may be different values based on different touch modes. For example, when the touch mode is light source touch, the touch characteristic value may be a light spot brightness value; when the touch mode is press touch, the touch characteristic value can be a contact pressure value; when the touch mode is touch, the touch characteristic value can be a voltage value or a current value; when the touch mode is magnetic induction touch, the touch characteristic value may be magnetic flux. In the subsequent embodiments of the present application, a form of light source touch will be taken as an example for explanation.
Depending on the particular binarization scheme, the binary values may take forms such as 0 and 1, or 1 and 2, and the target value may accordingly be 0, 1, or 2.
Specifically, in some embodiments, the specific implementation of step S120 may refer to fig. 4. Fig. 4 is a detailed description of step S120 in the multi-touch point recognition method according to the corresponding embodiment of fig. 3, where step S120 may include the following steps:
step S121, threshold processing is carried out on touch characteristic values of all points in the touch gradient image according to an initial threshold value, and a transition binary image is obtained.
Step S122, selecting the point location with the touch characteristic value as the second target value in the transition binary image to calculate the histogram.
And step S123, performing Otsu binarization processing on the transition binary image according to the histogram to obtain a target binary image.
In the embodiment of the present application, the binarization processing is implemented by setting an initial threshold and thresholding the touch characteristic values of the points in the touch gradient image against it to obtain a transition binary image, which serves as an intermediate result; point locations whose touch characteristic value equals a second target value are then selected from the transition binary image to compute a histogram, and Otsu binarization is performed on the transition binary image based on that histogram.
The initial threshold in step S121 may be selected or set manually according to the actual parameters. In the specific implementation of step S121, the touch characteristic value of each point in the touch gradient image is thresholded against the initial threshold to obtain the transition binary image: points whose touch characteristic value is greater than the initial threshold are assigned a first set value, and points whose touch characteristic value is less than the initial threshold are assigned a second set value. Then, in step S122, the point locations whose touch characteristic value equals the second target value are selected from the transition binary image to compute the histogram, where the second target value is the first set value. Finally, in step S123, Otsu binarization processing is performed on the transition binary image according to the histogram to obtain the target binary image.
In the above embodiment, a conventional binarization is first performed once so that the obtained histogram exhibits two symmetrical peaks, which makes it better suited to the Otsu algorithm for the subsequent Otsu binarization.
Specifically, in some embodiments, the specific implementation of step S123 may refer to the following embodiment, which is a detailed description of step S123 in the multi-touch point identification method according to the embodiment corresponding to fig. 4. Step S123 may include the following steps:
And carrying out an Otsu method calculation according to the histogram to obtain an Otsu threshold.
And performing binarization processing on the transition binary image according to the Otsu threshold to obtain a target binary image.
In this embodiment, the target binary image is obtained by performing an Otsu method calculation according to the histogram to obtain an Otsu threshold, and then binarizing the transition binary image based on the Otsu threshold. The first target value of the target binary image used in step S130 is the third set value.
In this embodiment, the first set value, the second set value, the third set value and the fourth set value may be four different values, for example 0, 1, 2 and 3 respectively; alternatively, the first set value and the third set value may be equal (for example, both 1), and the second set value and the fourth set value may be equal (for example, both 0).
In an implementation of the foregoing embodiment, binarization processing is performed on the gradient image based on the selected initial threshold: points whose touch characteristic value is greater than the initial threshold are assigned 1, and points whose touch characteristic value is less than the initial threshold are assigned 0, finally yielding the transition binary image. The points valued 1 in the transition binary image are then selected, a histogram is calculated from their parameter information in the original gradient image, and an Otsu threshold is obtained; binarization processing is then performed based on the Otsu threshold: among the points valued 1 in the transition binary image, those whose touch characteristic value is greater than the Otsu threshold keep the value 1, while those whose touch characteristic value is less than the Otsu threshold are changed to 0. In step S130, the points whose touch characteristic value is 1 in the target binary image are output as touch points for the subsequent steps.
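The second, Otsu-based pass of steps S122–S123 can be sketched as follows. This is a hedged illustration: the histogram range of 0–255, the plain nested-list data layout, and the function names are assumptions, and the Otsu step is the standard maximum-between-class-variance computation rather than a reconstruction of the patent's exact code:

```python
def otsu_threshold(hist):
    """Return the threshold maximizing between-class variance (standard Otsu)."""
    total = sum(hist)
    sum_all = sum(i * h for i, h in enumerate(hist))
    best_t, best_var = 0, -1.0
    w0, sum0 = 0, 0.0
    for t, h in enumerate(hist):
        w0 += h            # weight of class 0 (values <= t)
        sum0 += t * h      # sum of values in class 0
        if w0 == 0:
            continue
        w1 = total - w0    # weight of class 1 (values > t)
        if w1 == 0:
            break
        m0, m1 = sum0 / w0, (sum_all - sum0) / w1
        var = w0 * w1 * (m0 - m1) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t

def refine(gradient, transition):
    """Histogram only the points valued 1, then re-binarize with Otsu."""
    hist = [0] * 256  # assumed value range 0-255
    for grow, trow in zip(gradient, transition):
        for g, t in zip(grow, trow):
            if t == 1:
                hist[g] += 1
    thr = otsu_threshold(hist)
    # keep 1 only where the original value also exceeds the Otsu threshold
    return [[1 if t == 1 and g > thr else 0
             for g, t in zip(grow, trow)]
            for grow, trow in zip(gradient, transition)]
```

The points still valued 1 in the returned target binary image are the touch points output in step S130.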
Specifically, in some embodiments, the following embodiments may be referred to for a specific implementation of step S200. The following embodiment is a detailed description of step S200 in the multi-touch point identification method according to the corresponding embodiment shown in fig. 2, where step S200 may include the following steps:
storing touch information of each touch point into a touch information storage table, wherein the stored information comprises position information of the touch point;
and determining the distance between the touch points according to the position information in the touch information storage table.
In this embodiment, the specific manner of determining the distance between the touch points is to store the touch information of each touch point into a touch information storage table, where the stored information includes the position information of the touch point and the corresponding touch characteristic value. And then, the position information in the touch information storage table is called, and the distance between the touch points is determined.
Specifically, the touch information storage table may be provided with a point number, an abscissa, an ordinate, a touch characteristic value and a category label of the point.
The calculation can be performed according to the following formula:
dist = (data(i,1) - data(j,1))^2 + (data(i,2) - data(j,2))^2
wherein i and j in the first dimension each represent the number of a touch point, 1 in the second dimension represents the first column (the abscissa column) in the touch information storage table, and 2 in the second dimension represents the second column (the ordinate column). That is, data(i,1) represents the abscissa of the touch point numbered i, data(i,2) represents its ordinate, data(j,1) represents the abscissa of the touch point numbered j, and data(j,2) represents its ordinate. The calculated value dist is the square of the distance between the touch point numbered i and the touch point numbered j.
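The formula above amounts to the following sketch, in which a plain dictionary stands in for the touch information storage table (the layout and names are illustrative assumptions). Using the squared distance avoids a square root; comparing it against a squared threshold is equivalent:

```python
def squared_distance(data, i, j):
    """data[k] = (x, y) for touch point k; returns dist as in the formula."""
    dx = data[i][0] - data[j][0]
    dy = data[i][1] - data[j][1]
    return dx * dx + dy * dy

# Two touch points forming a 3-4-5 triangle; squared distance is 25.
table = {1: (0, 0), 2: (3, 4)}
```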
Specifically, in some embodiments, the specific implementation of step S300 may refer to fig. 5. Fig. 5 is a detailed description of step S300 in the multi-touch point recognition method according to the corresponding embodiment of fig. 2, where step S300 may include the following steps:
step S310, clustering the touch points based on the density direct distance according to the distance between the touch points, to obtain a direct aggregation area.
Step S320, determining the distance between the direct aggregation areas.
Step S330, merging the direct aggregation areas based on the density connection distance according to the distance between the direct aggregation areas, so as to obtain a touch aggregation area.
In this embodiment, the touch points are clustered as follows: first clustering is performed based on the density direct distance according to the distance between the touch points, and touch points meeting the density direct condition are clustered together to form direct aggregation areas; the distance between the direct aggregation areas is then determined; finally, second clustering is performed based on the density connection distance according to the distance between the direct aggregation areas, and direct aggregation areas meeting the density connection condition are merged to form a touch aggregation area.
The first clustering is to cluster touch points with similar distances into an aggregation area, and the second clustering is to cluster connected aggregation areas into a larger aggregation area.
When the distance between the touch points is determined by calling the data in the touch information storage table, in step S310 the touch points in the same direct aggregation area are given the same category mark to indicate that they belong to the same direct aggregation area, so that subsequent calls can perform the second clustering. In step S330, after the second clustering is completed, the category marks of the touch points in the same merged touch aggregation area are modified to the same value to indicate that they belong to the same touch aggregation area, so that subsequent calculation can be invoked to determine the final touch identification points.
The embodiment of the first clustering is specifically shown in fig. 6. In some embodiments, the specific implementation of step S310 may refer to fig. 6, which is a detailed description of step S310 in the multi-touch point recognition method according to the corresponding embodiment of fig. 5; step S310 may include the following steps:
step S311, judging that two touch points with the distance smaller than the density direct distance are mutually density direct points, and aggregating all the touch points mutually being the density direct points to form a density direct area;
Step S313, dividing the density direct region into a plurality of direct aggregation regions according to the touch characteristic value of each touch point.
In an embodiment, two touch points whose distance is smaller than the density direct distance are judged to be density direct points of each other; all touch points that are density direct points of each other are then aggregated together to form a density direct area; finally, the density direct area is divided into a plurality of direct aggregation areas according to the touch characteristic values of the touch points in the density direct area.
In some embodiments of the present application, between step S311 and step S313, the following steps may be further included:
and if the number of the touch points in the density direct area is smaller than a preset number threshold value, filtering the density direct area.
When the number of the touch points in the density direct area is smaller than the preset number threshold, the touch points in the density direct area can be considered as noise points and should be filtered.
The preset number threshold is related to the density of the collectors of the touch module 300 arranged on the display module 200; generally, the threshold may be set to 2 when the collector density is higher and to 5 when the collector density is lower.
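The noise-filtering step between steps S311 and S313 can be sketched as a simple count filter; the representation of a density direct area as a list of points, and the function name, are assumptions for illustration:

```python
def filter_noise(direct_areas, preset_number_threshold):
    """Drop density direct areas with too few touch points (treated as noise)."""
    return [area for area in direct_areas
            if len(area) >= preset_number_threshold]
```

With a threshold of 2, a single stray point is discarded while a three-point area survives.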
Specifically, in some embodiments, the following embodiments may be referred to for a specific implementation of step S313. The following embodiment is a detailed description of step S313 in the multi-touch point identification method according to the corresponding embodiment shown in fig. 6, where step S313 may include the following steps:
Selecting a core point in the density direct region according to each touch characteristic value;
selecting the touch points that are density direct to the core point;
separating the direct aggregation area formed by the core point and the touch points from the density direct area to obtain a remaining density direct area;
and continuing to separate the remaining density direct area until the density direct area is divided into a plurality of direct aggregation areas.
In this embodiment, after the density direct area is obtained, the point with the highest touch characteristic value in the density direct area is selected as a core point, the touch points that are density direct to the core point are selected, and the direct aggregation area formed by the core point and those touch points is separated from the density direct area to obtain a remaining density direct area; the point with the highest touch characteristic value in the remaining density direct area is then selected as the next core point, and the separation is repeated until the density direct area is divided into a plurality of direct aggregation areas.
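The peel-off procedure of step S313 can be sketched as follows, under assumed data structures: each touch point is a (x, y, value) tuple, and the density direct distance is supplied as a squared distance. The names are illustrative, not the patent's:

```python
def split_region(points, reach_sq):
    """Divide a density direct area into direct aggregation areas.

    points:   list of (x, y, touch_characteristic_value) tuples
    reach_sq: squared density direct distance
    """
    remaining = list(points)
    regions = []
    while remaining:
        # pick the point with the highest touch characteristic value as core
        core = max(remaining, key=lambda p: p[2])
        # gather the core plus every point density direct to it
        group = [p for p in remaining
                 if (p[0] - core[0]) ** 2 + (p[1] - core[1]) ** 2 <= reach_sq]
        regions.append(group)
        # peel the group off and continue on the remainder
        remaining = [p for p in remaining if p not in group]
    return regions
```

Two well-separated pairs of points, for example, come out as two direct aggregation areas.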
In other embodiments of the present application, the first clustering may be performed by selecting the touch point with the highest touch characteristic value among all touch points as a core point, selecting the touch points that are density direct to the core point to form a direct aggregation area, then selecting the point with the highest touch characteristic value among the remaining touch points as the next core point, and continuing to form direct aggregation areas based on the density direct distance until all touch points are traversed. Meanwhile, a direct aggregation area containing touch points that do not meet the requirement may be regarded as noise and filtered out.
The embodiment of the second clustering is specifically shown in fig. 7. In some embodiments, the specific implementation of step S330 may refer to fig. 7, which is a detailed description of step S330 in the multi-touch point recognition method according to the corresponding embodiment of fig. 5; step S330 may include the following steps:
step S331, determining the distance between the core points of the direct aggregation areas as the distance between the direct aggregation areas;
in step S332, if the distance between two direct aggregation areas is smaller than a preset merging threshold, the two direct aggregation areas are merged.
In this embodiment, the distance between the core points of the direct aggregation areas is taken as the distance between the direct aggregation areas; when the distance between two direct aggregation areas is smaller than the preset merging threshold, the two areas are judged to be density connected, and the two direct aggregation areas are merged.
In some embodiments of the present application, the distance between the direct aggregation areas may also be obtained by calling the data in the touch information storage table and calculating according to the following formula:
dist_center = (center(m,1) - center(n,1))^2 + (center(m,2) - center(n,2))^2
wherein m and n in the first dimension each represent the category mark of a touch point, 1 in the second dimension represents the first column (the abscissa column) in the touch information storage table, and 2 in the second dimension represents the second column (the ordinate column). That is, center(m,1) represents the abscissa of the core point with category mark m, center(m,2) represents its ordinate, center(n,1) represents the abscissa of the core point with category mark n, and center(n,2) represents its ordinate. The calculated value dist_center is the square of the distance between the core point of direct aggregation area m and the core point of direct aggregation area n.
In other embodiments, the merging of the direct aggregation areas may also be performed by judging, based on the distances between touch points determined in the foregoing steps, whether density direct points exist between two direct aggregation areas, and merging the two direct aggregation areas if such points exist.
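Steps S331–S332 can be sketched as follows under assumed data structures: each direct aggregation area is represented by its core point's (x, y), keyed by category mark, and the merging threshold is given as a squared distance to match the squared-distance formula. The union-find bookkeeping is an implementation choice of this sketch, not something the patent specifies:

```python
def merge_areas(cores, merge_sq):
    """cores: {category_mark: (x, y)}; returns list of sets of merged marks."""
    labels = list(cores)
    parent = {l: l for l in labels}

    def find(l):
        # path-halving union-find lookup
        while parent[l] != l:
            parent[l] = parent[parent[l]]
            l = parent[l]
        return l

    # merge every pair of areas whose core points are close enough
    for a in labels:
        for b in labels:
            if a < b:
                ax, ay = cores[a]
                bx, by = cores[b]
                if (ax - bx) ** 2 + (ay - by) ** 2 < merge_sq:
                    parent[find(a)] = find(b)

    groups = {}
    for l in labels:
        groups.setdefault(find(l), set()).add(l)
    return list(groups.values())
```

Two nearby areas collapse into one touch aggregation area while a distant one stays separate.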
Specifically, in some embodiments, the specific implementation of step S400 may refer to fig. 8. Fig. 8 is a detailed description of step S400 in the multi-touch point recognition method according to the corresponding embodiment of fig. 2, where step S400 may include the following steps:
step S410, determining the centroid position of the touch aggregation area according to the position and brightness of each touch point in the touch aggregation area;
Step S420, taking the centroid position as the position of the touch recognition point.
After the cluster merging is completed, the position of the touch recognition point representing a touch aggregation area can be determined according to the positions of the touch points in that area. Specifically, the centroid position of the touch aggregation area is determined according to the position and brightness of each touch point in the touch aggregation area, and the centroid position is output as the position of the touch identification point.
The specific calculation is shown in the following formulas:
X_c = Σ_i (L_i · X_i) / Σ_i L_i
Y_c = Σ_i (L_i · Y_i) / Σ_i L_i
wherein X_c is the abscissa of the centroid point, Y_c is the ordinate of the centroid point, i is the number of each touch point in the touch aggregation area, L_i is the touch characteristic value of the touch point numbered i, X_i is the abscissa of the touch point numbered i, and Y_i is the ordinate of the touch point numbered i.
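The centroid is the touch-characteristic-value-weighted mean of the touch point coordinates; a minimal sketch (the (L, X, Y) tuple layout is an assumption for illustration):

```python
def centroid(points):
    """points: list of (L, X, Y); L is the touch characteristic value (weight)."""
    total = sum(L for L, _, _ in points)
    xc = sum(L * X for L, X, _ in points) / total
    yc = sum(L * Y for L, _, Y in points) / total
    return xc, yc
```

A brighter touch point pulls the centroid toward itself in proportion to its weight.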
In this way, a more accurate touch identification point position can be obtained and output. The method can accurately detect laser bright points with large discreteness, perform high-precision positioning, has strong anti-interference performance, and can eliminate weak noise interference introduced by the system; the calculation is simple and occupies little memory.
The following describes an embodiment of the apparatus of the present application, which may be used to execute the multi-touch point recognition method in the above embodiment of the present application. It will be appreciated that the apparatus may be a computer program (including program code) running in a computer device, for example the apparatus being an application software; the device can be used for executing corresponding steps in the method provided by the embodiment of the application. For details not disclosed in the embodiments of the device of the present application, please refer to the embodiments of the multi-touch point identification method described in the present application.
Fig. 9 shows a block diagram of a multi-touch point recognition device according to one embodiment of the present application.
Referring to fig. 9, a multi-touch point recognition apparatus 900 according to an embodiment of the present application includes: an acquiring module 910, configured to acquire all touch points on the large screen; a distance module 920, configured to determine a distance between the touch points; the aggregation module 930 is configured to cluster each of the touch points according to a distance between the touch points, to obtain a touch aggregation area; and the identification module 940 is configured to determine a position of a touch identification point according to the position of the touch aggregation area.
In some embodiments of the present application, the acquiring module specifically includes: the acquisition sub-module is used for acquiring touch characteristic values of all points on the large screen and generating a touch gradient image; the binary submodule is used for binarizing the touch characteristic values of all the points in the touch gradient image to obtain a target binary image; and the screening sub-module is used for taking the point position of which the touch characteristic value is the first target value in the target binary image as the touch point.
In some embodiments of the present application, the binary submodule specifically includes: a transition processing unit, configured to perform threshold processing on the touch characteristic values of the points in the touch gradient image according to the initial threshold to obtain a transition binary image; a histogram computing unit, configured to select the points whose touch characteristic value is the second target value in the transition binary image to calculate a histogram; and an Otsu binary unit, configured to perform Otsu binarization processing according to the histogram to obtain a target binary image.
In some embodiments of the present application, the Otsu binary unit is specifically configured to perform the following steps: performing an Otsu method calculation according to the histogram to obtain an Otsu threshold; and performing binarization processing on the transition binary image according to the Otsu threshold to obtain a target binary image.
In some embodiments of the present application, the distance module specifically includes: the storage sub-module is used for storing touch information of each touch point into a touch information storage table, and the storage information comprises position information of the touch point; and the determining submodule is used for determining the distance between the touch points according to the position information in the touch information storage table.
In some embodiments of the present application, the aggregation module specifically includes: a first aggregation sub-module, configured to cluster the touch points based on the density direct distance according to the distance between the touch points to obtain direct aggregation areas; a distance determination sub-module, configured to determine the distance between the direct aggregation areas; and a second aggregation sub-module, configured to merge the direct aggregation areas based on the density connection distance according to the distance between the direct aggregation areas to obtain a touch aggregation area.
In some embodiments of the present application, the first aggregation sub-module specifically includes: the aggregation unit is used for judging two touch points with the distance smaller than the density direct distance as density direct points, and aggregating all the touch points with the density direct points together to form a density direct area; the dividing unit is used for dividing the density direct region into a plurality of direct aggregation regions according to the touch characteristic value of each touch point.
In some embodiments of the present application, the first aggregation sub-module further comprises: and the filtering unit is used for filtering the density direct area if the number of the touch points in the density direct area is smaller than a preset number threshold value.
In some embodiments of the present application, the dividing unit is specifically configured to perform the following steps: selecting a core point in the density direct area according to the touch characteristic values; selecting the touch points that are density direct to the core point; separating the direct aggregation area formed by the core point and the touch points from the density direct area to obtain a remaining density direct area; and continuing to separate the remaining density direct area until the density direct area is divided into a plurality of direct aggregation areas.
In some embodiments of the present application, the second aggregation sub-module specifically includes: a distance determining unit, configured to determine the distance between the core points of the direct aggregation areas as the distance between the direct aggregation areas; and an area merging unit, configured to merge two direct aggregation areas if the distance between the two direct aggregation areas is smaller than a preset merging threshold.
In some embodiments of the present application, the identification module specifically includes: the centroid unit is used for determining the centroid position of the touch aggregation area according to the position and brightness of each touch point in the touch aggregation area; and the determining unit is used for taking the centroid position as the position of the touch identification point.
Fig. 10 shows a schematic diagram of a computer system suitable for use in implementing the electronic device of the embodiments of the present application.
It should be noted that, the computer system 1500 of the electronic device shown in fig. 10 is only an example, and should not impose any limitation on the functions and the application scope of the embodiments of the present application.
As shown in fig. 10, the computer system 1500 includes a central processing unit (Central Processing Unit, CPU) 1501, which can perform various appropriate actions and processes, such as performing the methods described in the above embodiments, according to a program stored in a Read-Only Memory (ROM) 1502 or a program loaded from a storage section 1508 into a random access Memory (Random Access Memory, RAM) 1503. In the RAM 1503, various programs and data required for the operation of the system are also stored. The CPU 1501, ROM 1502, and RAM 1503 are connected to each other through a bus 1504. An Input/Output (I/O) interface 1505 is also connected to bus 1504.
The following components are connected to I/O interface 1505: an input section 1506 including a keyboard, mouse, and the like; an output portion 1507 including a Cathode Ray Tube (CRT), a liquid crystal display (Liquid Crystal Display, LCD), and a speaker; a storage section 1508 including a hard disk and the like; and a communication section 1509 including a network interface card such as a LAN (Local Area Network ) card, a modem, or the like. The communication section 1509 performs communication processing via a network such as the internet. A drive 1510 is also connected to the I/O interface 1505 as needed. Removable media 1511, such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like, is mounted on the drive 1510 as needed so that a computer program read therefrom is mounted into the storage section 1508 as needed.
In particular, according to embodiments of the present application, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present application include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising a computer program for performing the method shown in the flowchart. In such an embodiment, the computer program can be downloaded and installed from a network via the communication portion 1509, and/or installed from the removable medium 1511. When executed by a Central Processing Unit (CPU) 1501, performs the various functions defined in the system of the present application.
It should be noted that the computer readable medium shown in the embodiments of the present application may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium may be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a Read-Only Memory (ROM), an Erasable Programmable Read-Only Memory (EPROM), flash memory, an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present application, however, a computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, carrying a computer-readable program therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer program embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wired, etc., or any suitable combination of the foregoing.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. Where each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units involved in the embodiments of the present application may be implemented by means of software, or may be implemented by means of hardware, and the described units may also be provided in a processor. Wherein the names of the units do not constitute a limitation of the units themselves in some cases.
According to one aspect of the present application, there is provided a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The computer instructions are read from the computer-readable storage medium by a processor of a computer device, and executed by the processor, cause the computer device to perform the methods provided in the various alternative implementations described above.
As another aspect, the present application also provides a computer-readable medium that may be contained in the electronic device described in the above embodiment; or may exist alone without being incorporated into the electronic device. The computer-readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to implement the methods described in the above embodiments.
It should be noted that although in the above detailed description several modules or units of a device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functions of two or more modules or units described above may be embodied in one module or unit, in accordance with embodiments of the present application. Conversely, the features and functions of one module or unit described above may be further divided into a plurality of modules or units to be embodied.
From the above description of embodiments, those skilled in the art will readily appreciate that the example embodiments described herein may be implemented in software, or may be implemented in software in combination with the necessary hardware. Thus, the technical solution according to the embodiments of the present application may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (may be a CD-ROM, a usb disk, a mobile hard disk, etc.) or on a network, and includes several instructions to cause a computing device (may be a personal computer, a server, a touch terminal, or a network device, etc.) to perform the method according to the embodiments of the present application.
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the embodiments disclosed herein. This application is intended to cover any variations, uses, or adaptations of the application following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the application pertains.
It is to be understood that the present application is not limited to the precise arrangements and instrumentalities shown in the drawings, which have been described above, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the application is limited only by the appended claims.
Claims (8)
1. A multi-touch point identification method applied to a large screen, the method comprising:
acquiring all touch points on the large screen;
determining the distance between the touch points;
clustering the touch points according to the distance between the touch points to obtain a touch aggregation area;
determining the position of a touch identification point according to the position of the touch aggregation area;
the acquiring all touch points on the large screen comprises: collecting touch characteristic values of all points on the large screen, and generating a touch gradient image; binarizing the touch characteristic values of the points in the touch gradient image to obtain a target binary image; and taking the point positions whose touch characteristic value is a first target value in the target binary image as the touch points;
the binarizing the touch characteristic values of the points in the touch gradient image to obtain a target binary image comprises: thresholding the touch characteristic values of the points in the touch gradient image according to an initial threshold to obtain a transition binary image; selecting the point positions whose touch characteristic value is a second target value in the transition binary image and calculating a histogram over them; and performing Otsu binarization according to the histogram to obtain the target binary image;
the performing Otsu binarization according to the histogram to obtain the target binary image comprises: performing an Otsu's method calculation on the histogram to obtain an Otsu threshold; and binarizing the transition binary image according to the Otsu threshold to obtain the target binary image.
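The two-stage binarization recited in claim 1 (coarse initial threshold, then an Otsu threshold computed from the histogram of the surviving points) can be sketched as follows. This is a minimal NumPy illustration, not the patented implementation; the function names, the 256-bin value range, and the default initial threshold are assumptions.

```python
import numpy as np

def otsu_threshold(hist):
    """Compute the Otsu threshold from a 256-bin histogram by
    maximizing the between-class variance."""
    total = hist.sum()
    bins = np.arange(256)
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0 = hist[:t].sum()          # weight of the "background" class
        w1 = total - w0              # weight of the "touch" class
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (bins[:t] * hist[:t]).sum() / w0
        mu1 = (bins[t:] * hist[t:]).sum() / w1
        var = w0 * w1 * (mu0 - mu1) ** 2  # between-class variance
        if var > best_var:
            best_var, best_t = var, t
    return best_t

def binarize_touch_image(gradient, initial_threshold=10):
    """Two-stage binarization of a touch gradient image (uint8):
    a fixed coarse threshold produces the transition binary image,
    then Otsu runs on the histogram of the retained points only."""
    # Stage 1: transition binary image from the initial threshold.
    transition_mask = gradient > initial_threshold
    # Stage 2: histogram over the points that survived stage 1.
    hist = np.bincount(gradient[transition_mask].ravel(), minlength=256)[:256]
    t = otsu_threshold(hist.astype(np.float64))
    # Target binary image: 1 where the value clears the Otsu threshold.
    return (gradient > t).astype(np.uint8), t
```

Running Otsu only on the points that pass the coarse threshold keeps the large, quiet background from dominating the histogram, which is the apparent purpose of the transition image in the claim.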
2. The multi-touch point identification method according to claim 1, wherein the clustering the touch points according to the distance between the touch points to obtain a touch aggregation area comprises:
clustering the touch points based on a density-direct distance according to the distance between the touch points to obtain direct aggregation areas;
determining the distance between the direct aggregation areas;
and merging the direct aggregation areas based on a density-connection distance according to the distance between the direct aggregation areas to obtain the touch aggregation area.
3. The multi-touch point identification method according to claim 2, wherein the clustering the touch points based on the density-direct distance according to the distance between the touch points to obtain direct aggregation areas comprises:
judging two touch points whose distance is smaller than the density-direct distance to be density-direct points, and aggregating all touch points that have density-direct points into a density-direct area;
and dividing the density-direct area into a plurality of direct aggregation areas according to the touch characteristic value of each touch point.
4. The multi-touch point identification method according to claim 3, wherein the dividing the density-direct area into a plurality of direct aggregation areas according to the touch characteristic value of each touch point comprises:
selecting a core point in the density-direct area according to the touch characteristic values;
selecting the touch points that are density-direct to the core point;
separating the direct aggregation area formed by the core point and the selected touch points from the density-direct area to obtain a residual density-direct area;
and continuing to separate the residual density-direct area until the density-direct area is divided into a plurality of direct aggregation areas.
5. The multi-touch point identification method according to claim 4, wherein the merging the direct aggregation areas based on the density-connection distance according to the distance between the direct aggregation areas to obtain the touch aggregation area comprises:
determining the distance between the core points of two direct aggregation areas as the distance between the two direct aggregation areas;
and merging two direct aggregation areas if the distance between them is smaller than a preset merging threshold.
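The clustering recited in claims 2-5 resembles a density-based (DBSCAN-style) grouping: points within the density-direct distance join one area, each area gets a core point chosen by touch characteristic value, and areas whose core points are close are merged. A minimal sketch follows; the parameter names `direct_dist` and `merge_dist`, and the choice of "strongest characteristic value" as the core-point rule, are assumptions rather than the patented procedure.

```python
import math
from collections import defaultdict

def cluster_touch_points(points, values, direct_dist, merge_dist):
    """points: list of (x, y) touch points; values: touch characteristic
    value per point. Returns clusters as lists of point indices."""
    n = len(points)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    dist = lambda a, b: math.hypot(a[0] - b[0], a[1] - b[1])

    # Step 1: density-direct points (distance below direct_dist) share an area.
    for i in range(n):
        for j in range(i + 1, n):
            if dist(points[i], points[j]) < direct_dist:
                parent[find(i)] = find(j)

    groups = defaultdict(list)
    for i in range(n):
        groups[find(i)].append(i)
    clusters = list(groups.values())

    # Step 2: core point of each area = point with the strongest value.
    cores = [max(c, key=lambda i: values[i]) for c in clusters]

    # Step 3: merge areas whose core points are closer than merge_dist.
    m = len(clusters)
    cparent = list(range(m))

    def cfind(a):
        while cparent[a] != a:
            cparent[a] = cparent[cparent[a]]
            a = cparent[a]
        return a

    for a in range(m):
        for b in range(a + 1, m):
            if dist(points[cores[a]], points[cores[b]]) < merge_dist:
                cparent[cfind(a)] = cfind(b)

    merged = defaultdict(list)
    for a in range(m):
        merged[cfind(a)].extend(clusters[a])
    return list(merged.values())
```

With `direct_dist` set near a fingertip's contact radius, the step-1 areas separate distinct fingers, while the step-3 merge reunites areas split by noise within one touch.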
6. A multi-touch point identification device applied to a large screen, comprising:
the acquisition module is used for acquiring all touch points on the large screen;
the distance module is used for determining the distance between the touch points;
the aggregation module is used for clustering the touch points according to the distance between the touch points to obtain a touch aggregation area;
the identification module is used for determining the position of the touch identification point according to the position of the touch aggregation area;
the acquiring all touch points on the large screen comprises: collecting touch characteristic values of all points on the large screen, and generating a touch gradient image; binarizing the touch characteristic values of the points in the touch gradient image to obtain a target binary image; and taking the point positions whose touch characteristic value is a first target value in the target binary image as the touch points;
the binarizing the touch characteristic values of the points in the touch gradient image to obtain a target binary image comprises: thresholding the touch characteristic values of the points in the touch gradient image according to an initial threshold to obtain a transition binary image; selecting the point positions whose touch characteristic value is a second target value in the transition binary image and calculating a histogram over them; and performing Otsu binarization according to the histogram to obtain the target binary image;
the performing Otsu binarization according to the histogram to obtain the target binary image comprises: performing an Otsu's method calculation on the histogram to obtain an Otsu threshold; and binarizing the transition binary image according to the Otsu threshold to obtain the target binary image.
7. A computer readable medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the multi-touch point identification method according to any one of claims 1 to 5.
8. An electronic device, comprising:
one or more processors;
storage means for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the multi-touch point identification method of any of claims 1 to 5.
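Claim 1's final step maps each touch aggregation area to a single touch identification point. The claims do not fix a particular mapping; a common choice, shown here purely as an assumption, is the centroid of the area's points weighted by their touch characteristic values.

```python
def identification_point(points, values):
    """Weighted centroid of one touch aggregation area.
    points: list of (x, y); values: touch characteristic value per point.
    The weighting pulls the reported point toward the firmest contact."""
    total = float(sum(values))
    x = sum(p[0] * v for p, v in zip(points, values)) / total
    y = sum(p[1] * v for p, v in zip(points, values)) / total
    return (x, y)
```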
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202111455883.1A CN114253417B (en) | 2021-12-02 | 2021-12-02 | Multi-touch point identification method and device, computer readable medium and electronic equipment |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN114253417A CN114253417A (en) | 2022-03-29 |
| CN114253417B true CN114253417B (en) | 2024-02-02 |
Family
ID=80793700
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202111455883.1A Active CN114253417B (en) | 2021-12-02 | 2021-12-02 | Multi-touch point identification method and device, computer readable medium and electronic equipment |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN114253417B (en) |
Citations (12)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN101593058A (en) * | 2009-06-19 | 2009-12-02 | 友达光电股份有限公司 | Touch point coordinate detection method |
| CN102129332A (en) * | 2011-03-07 | 2011-07-20 | 广东威创视讯科技股份有限公司 | Detection method and device of touch points for image recognition |
| CN102419663A (en) * | 2011-08-03 | 2012-04-18 | 北京汇冠新技术股份有限公司 | Infrared touch screen multi-point identification method and system |
| WO2014067296A1 (en) * | 2012-11-05 | 2014-05-08 | 深圳市恩普电子技术有限公司 | Method for identifying, tracing and measuring internal and external membranes of vessel |
| CN107450840A (en) * | 2017-08-04 | 2017-12-08 | 歌尔科技有限公司 | The determination method, apparatus and electronic equipment of finger touch connected domain |
| CN107526482A (en) * | 2017-06-15 | 2017-12-29 | 北京仁光科技有限公司 | The system for controlling the touch-control of screen display content movement/switching |
| CN107967083A (en) * | 2017-12-18 | 2018-04-27 | 青岛海信电器股份有限公司 | The definite method and device of touch point |
| KR20190002328A (en) * | 2017-06-29 | 2019-01-08 | 삼성전자주식회사 | Method for separating text and figures in document images and apparatus thereof |
| CN109656457A (en) * | 2017-10-10 | 2019-04-19 | 北京仁光科技有限公司 | Refer to touch control method, device, equipment and computer readable storage medium more |
| CN109656393A (en) * | 2017-10-10 | 2019-04-19 | 北京仁光科技有限公司 | Refer to tracking, device, equipment and the computer readable storage medium of contact more |
| CN110162257A (en) * | 2018-02-13 | 2019-08-23 | 北京仁光科技有限公司 | Multiconductor touch control method, device, equipment and computer readable storage medium |
| CN111340815A (en) * | 2020-03-09 | 2020-06-26 | 电子科技大学 | Adaptive image segmentation method based on Otsu method and K mean value method |
Similar Documents
| Publication | Title |
|---|---|
| TWI619080B (en) | Method for calculating fingerprint overlapping region and electronic device | |
| CN111144215B (en) | Image processing methods, devices, electronic equipment and storage media | |
| CN110660066A (en) | Network training method, image processing method, network, terminal equipment and medium | |
| CN108830780B (en) | Image processing method and device, electronic device and storage medium | |
| CN111368587B (en) | Scene detection method, device, terminal equipment and computer readable storage medium | |
| CN111950723A (en) | Neural network model training method, image processing method, device and terminal equipment | |
| CN114764768A (en) | Defect detection and classification method and device, electronic equipment and storage medium | |
| CN110570390B (en) | Image detection method and device | |
| CN116363140B (en) | Method, system and device for detecting defects of medium borosilicate glass and storage medium | |
| CN116993653B (en) | Camera lens defect detection method, device, equipment, storage medium and product | |
| CN110910445B (en) | Object size detection method, device, detection equipment and storage medium | |
| CN114783021A (en) | Intelligent detection method, device, equipment and medium for wearing of mask | |
| CN114253417B (en) | Multi-touch point identification method and device, computer readable medium and electronic equipment | |
| CN110310341B (en) | Method, device, device and storage medium for generating default parameters in color algorithm | |
| CN107945186A (en) | Method, apparatus, computer-readable recording medium and the terminal device of segmentation figure picture | |
| TW201911230A (en) | Surveillance method, computing device, and non-transitory storage medium | |
| CN112629828A (en) | Optical information detection method, device and equipment | |
| JP6365117B2 (en) | Information processing apparatus, image determination method, and program | |
| CN118781623A (en) | Document image extraction method, device, storage medium and electronic device | |
| CN118823071A (en) | River video flow velocity identification method, device, storage medium and program product | |
| CN110543799A (en) | two-dimensional code processing method and device, storage medium and mobile terminal | |
| CN110827261A (en) | Image quality detection method and device, storage medium and electronic equipment | |
| CN109886865A (en) | Method, apparatus, computer equipment and the storage medium of automatic shield flame | |
| CN116580079A (en) | Method and device for detecting size of plate and electronic equipment | |
| CN114740284A (en) | Detection method and device for touch screen panel |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| PB01 | Publication | ||
| SE01 | Entry into force of request for substantive examination | ||
| GR01 | Patent grant | ||