CN118104229A - Projection equipment and display control method of projection image - Google Patents
Projection equipment and display control method of projection image
- Publication number
- CN118104229A (application CN202280063190.4A)
- Authority
- CN
- China
- Prior art keywords
- projection
- image
- area
- correction
- controller
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
Abstract
The present application relates to the technical field of display devices, and in particular to a projection device and a display control method for a projected image. It addresses the problems that non-standard assembly of the projection device's components, low parameter precision, and inaccurate included-angle data caused by the difficulty of measuring that angle lead the projection device to correct the projected image repeatedly, to take a long time to correct it, or to fail to correct it accurately. The projection device includes: a light engine; a binocular camera; and a controller configured to: after the projection device moves, sequentially acquire a first image and a second image projected by the light engine onto the projection surface, where the first image corresponds to a first image card, the second image corresponds to a second image card, and the two images are compared feature by feature to determine the mapping relationship between the projected image card and the projection surface under the world coordinate system; and, based on the mapping relationship, convert a captured image into a rectangular area under the world coordinate system and control the light engine to project the playing content into that rectangular area.
Description
Cross Reference to Related Applications
This application claims priority to Chinese patent application No. 202111355866.0, filed on November 16, 2021; Chinese patent application No. 202210399514.3, filed on April 15, 2022; and Chinese patent application No. 202210050600.3, filed on January 17, 2022, the entire contents of which are incorporated herein by reference.
The present application relates to the field of display devices, and in particular, to a projection device and a display control method for a projection image.
A deviation in projection angle or projection distance caused by moving the projector can prevent the projected image from being fully projected into the preset projection area, for example producing a trapezoidal image. The projector's projection image correction function can adjust the display position and display shape of the projected image so that it is displayed entirely within the preset projection area and its center coincides with the center of the projection area.
In some implementations of automatic projection image correction, the position of the projection surface under the camera coordinate system is first obtained with an RGBD (RGB-Depth) camera configured on the projector; the extrinsic parameters between the projector's light engine and the RGBD camera are then obtained through calibration, and the position of the projected image under the camera coordinate system is converted into the light engine coordinate system; finally, the included angle between the projector's light engine and the projection surface is measured and calculated so that the projected image can be corrected.
However, when there are errors in assembling components such as the projector's light engine, camera, and curtain, the accuracy of these parameters decreases; moreover, the calibration process is complex, and the calculated included angle between the light engine and the projection surface is sometimes inaccurate, causing the projector to correct the projected image repeatedly, to take a long time to correct it, or to fail to correct it accurately.
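For illustration only, the background pipeline above can be sketched as follows; the plane-fitting step, the extrinsic rotation R, and all function names are assumptions rather than anything specified by this application:

```python
# Hypothetical sketch of the RGBD-based background approach; R is an
# assumed calibrated extrinsic rotation from the camera frame to the
# light-engine frame.
import numpy as np

def plane_normal_in_engine_frame(points_cam: np.ndarray,
                                 R: np.ndarray) -> np.ndarray:
    """Fit a plane to 3D points measured in the camera frame, then rotate
    its normal into the light-engine frame."""
    centroid = points_cam.mean(axis=0)
    # The smallest singular vector of the centered points is the plane normal.
    _, _, vt = np.linalg.svd(points_cam - centroid)
    return R @ vt[-1]

def included_angle_deg(normal_engine: np.ndarray) -> float:
    """Angle between the engine's optical axis (z) and the projection surface."""
    z = np.array([0.0, 0.0, 1.0])
    cos_a = abs(normal_engine @ z) / np.linalg.norm(normal_engine)
    return 90.0 - np.degrees(np.arccos(cos_a))
```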
Disclosure of Invention
A first aspect of an embodiment of the present application provides a projection device, including: a light engine configured to project playing content onto a projection surface; a binocular camera, including a left camera and a right camera, configured to capture the image displayed on the projection surface; and a controller configured to: after detecting that the projection device has moved, control the binocular camera to sequentially acquire a first image and a second image projected onto the projection surface by the light engine, where the first image corresponds to the projection of a first image card, the second image corresponds to the projection of a second image card, and the first and second images are compared feature by feature to determine the homography between the projected image card and the projection surface under the world coordinate system; and obtain, based on the homography, a rectangular area into which a captured camera image is converted under the world coordinate system, and control the light engine to project the playing content into the rectangular area, where the rectangular area replaces the distorted projection area formed on the projection surface after the projection device moved.
A second aspect of an embodiment of the present application provides a display control method for a projection picture, the method including: after movement is detected, acquiring a first image and a second image sequentially projected onto the projection surface, where the first image corresponds to the projection of a first image card, the second image corresponds to the projection of a second image card, and the first and second images are compared feature by feature to determine the homography between the projected image card and the projection surface under the world coordinate system; and obtaining, based on the homography, a rectangular area into which a captured camera image is converted under the world coordinate system, and projecting the playing content into the rectangular area, where the rectangular area replaces the distorted projection area formed on the projection surface after the movement.
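As a non-authoritative sketch of the claimed feature-comparison step, one possible realization with OpenCV is shown below; the ORB detector and the RANSAC estimator are assumptions, since the text only requires corresponding feature comparison between the two captured card images:

```python
# Illustrative only: estimate the homography relating the two captured
# card images; ORB + RANSAC are assumed, not mandated by the application.
import cv2
import numpy as np

def card_homography(img1, img2):
    orb = cv2.ORB_create(500)
    k1, d1 = orb.detectAndCompute(img1, None)
    k2, d2 = orb.detectAndCompute(img2, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(d1, d2)
    src = np.float32([k1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([k2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
    return H  # maps points of the first capture onto the second
```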
FIG. 1A is a schematic illustration showing a projection apparatus according to an embodiment of the present application;
FIG. 1B is a schematic view of a projection apparatus according to an embodiment of the present application;
FIG. 2 is a schematic circuit diagram of a projection apparatus according to an embodiment of the application;
FIG. 3 is a schematic diagram of a projection apparatus according to an embodiment of the present application;
FIG. 4 is a schematic view of a projection apparatus according to another embodiment of the present application;
FIG. 5 is a schematic circuit diagram of a projection apparatus according to an embodiment of the present application;
FIG. 6A is a schematic diagram of a system frame for implementing display control of a projection device according to an embodiment of the present application;
FIG. 6B is a schematic diagram of signaling interaction timing for a projection device implementing the anti-eye-injection function according to another embodiment of the present application;
FIG. 6C is a schematic diagram illustrating signaling interaction timing for implementing a display correction function in a projection device according to another embodiment of the present application;
FIG. 6D is a flowchart illustrating an auto-focus algorithm implemented by a projection apparatus according to another embodiment of the present application;
FIG. 6E is a flowchart illustrating a projection device implementing a trapezoidal correction and obstacle avoidance algorithm according to another embodiment of the present application;
FIG. 6F is a flowchart illustrating a projection device implementing an in-curtain algorithm according to another embodiment of the present application;
FIG. 6G is a flowchart of a projection device implementing the anti-eye-injection algorithm according to an embodiment of the present application;
FIG. 6H is a schematic view of a projection apparatus according to another embodiment of the present application;
FIG. 6I is a schematic view of a projection apparatus according to another embodiment of the present application;
FIG. 6J is a flowchart illustrating an implementation of an automatic correction of a projected image by a projection device according to another embodiment of the present application;
FIG. 6K is a flowchart illustrating a projection apparatus implementing automatic correction of a projected image according to another embodiment of the present application;
FIG. 7 is a schematic view of a lens structure of a projection apparatus according to another embodiment of the present application;
FIG. 8 is a schematic diagram of a distance sensor and camera of a projection device according to another embodiment of the present application;
FIG. 9 is a schematic diagram of a distance sensor and camera according to another embodiment of the present application;
FIG. 10 is a schematic diagram of a projection device triggering image correction in accordance with another embodiment of the present application;
FIG. 11 is a diagram illustrating a projection device according to another embodiment of the present application;
FIG. 12 is a schematic diagram illustrating interactions between a controller, an optical engine and a projection surface according to another embodiment of the present application;
FIG. 13 is a schematic diagram of a projection device controlling an optical engine to project a corrected white card according to another embodiment of the present application;
FIG. 14 is a schematic diagram of the projection device controlling the optical engine to project a calibration chart card according to another embodiment of the application;
FIG. 15 is a schematic diagram of a calibration chart according to another embodiment of the present application;
FIG. 16 is a schematic diagram of another calibration chart according to another embodiment of the present application;
fig. 17 is a flowchart of a projection device triggering an eye-protection mode according to another embodiment of the present application.
To make the objects and embodiments of the present application clearer, exemplary embodiments of the present application are described in detail below with reference to the accompanying drawings in which they are illustrated. Obviously, the described exemplary embodiments are only some, rather than all, of the embodiments of the present application.
Fig. 1A is a schematic layout view of a projection apparatus according to an embodiment of the application.
In some embodiments, the present application provides a projection apparatus including a projection screen 1 and a projection device 2. The projection screen 1 is fixed at a first position, and the projection device 2 is placed at a second position such that its projected picture coincides with the projection screen 1; this placement is performed by a professional after-sales technician, i.e., the second position is the optimal placement position of the projection device 2.
FIG. 1B is a schematic diagram of an optical path of a projection apparatus according to an embodiment of the application.
An embodiment of the application provides a projection device including a laser light source 100, a light engine 200, a lens 300, and a projection medium 400. The laser light source 100 provides illumination for the light engine 200; the light engine 200 modulates the source beam and outputs it to the lens 300 for imaging, which projects it onto the projection medium 400 to form the projection picture.
In some embodiments, the laser light source of the projection device includes a laser assembly and an optical lens assembly, and the beam emitted by the laser assembly passes through the optical lens assembly to provide illumination for the light engine. The optical lens assembly requires a higher level of environmental cleanliness and a hermetic grade of sealing, whereas the chamber housing the laser assembly can be sealed to a lower, dust-proof grade to reduce sealing cost.
In some embodiments, the light engine 200 of the projection device may be implemented to include a blue light engine, a green light engine, a red light engine, a heat-dissipation system, a circuit control system, and the like. It should be noted that in some embodiments the light-emitting component of the projection device may also be implemented with an LED (Light-Emitting Diode) light source.
In some embodiments, the present application provides a projection device including a three-color light engine and a controller. The three-color light engine, which includes a blue light engine, a green light engine, and a red light engine, modulates the laser light that renders the pixels of the user interface. The controller is configured to: acquire the average gray value of the user interface; and, when the average gray value is greater than a first threshold and has remained so for longer than a time threshold, reduce the working current of the red light engine according to a preset gradient to reduce heating of the three-color light engine. By reducing the working current of the red light engine integrated in the three-color light engine, overheating of the red light engine, and thereby of the three-color light engine and of the projection device, can be controlled.
The light engine 200 may be implemented as a three-color light engine integrating a blue light engine, a green light engine, and a red light engine.
The following description takes as an example a light engine 200 that includes a blue light engine, a green light engine, and a red light engine.
In some embodiments, the optical system of the projection device is composed of a light source part and a light machine part, the light source part is used for providing illumination for the light machine, the light machine part is used for modulating the illumination beam provided by the light source, and finally the illumination beam is emitted through the lens to form a projection picture.
In some embodiments, the laser assembly includes a red laser module, a green laser module, and a blue laser module; each laser module is mounted to its corresponding mounting port through a sealing ring (of fluororubber or another sealing material) to form a dust-proof seal.
Fig. 2 is a schematic circuit diagram of a projection apparatus according to an embodiment of the application.
In some embodiments, the projection device provided by the present disclosure includes multiple sets of lasers; by placing a brightness sensor in the outgoing light path of the laser light source, the sensor can detect a first brightness value of the laser light source and send it to the display control circuit.
The display control circuit can obtain the second brightness value corresponding to each laser's driving current and determine that a laser has a COD (Catastrophic Optical Damage) fault when the difference between the laser's second brightness value and its first brightness value is greater than a difference threshold. The display control circuit can then adjust the current control signal of that laser's driving assembly until the difference is less than or equal to the difference threshold, thereby eliminating, for example, a COD fault of the blue laser. The projection device can thus clear a laser's COD fault promptly, reducing the laser damage rate and safeguarding the image display quality of the projection device.
In some embodiments, the projection device may include a display control circuit 10, a laser light source 20, at least one laser driving assembly 30, and at least one brightness sensor 40, and the laser light source 20 may include at least one laser in one-to-one correspondence with the at least one laser driving assembly 30. Here, "at least one" means one or more, and "a plurality" means two or more.
In some embodiments, the projection device includes a laser driving assembly 30 and a brightness sensor 40, and correspondingly, the laser light source 20 includes three lasers, which may be a blue laser 201, a red laser 202, and a green laser 203, respectively, in a one-to-one correspondence with the laser driving assembly 30. The blue laser 201 is used for emitting blue laser light, the red laser 202 is used for emitting red laser light, and the green laser 203 is used for emitting green laser light. In some embodiments, the laser driving assembly 30 may be implemented to include a plurality of sub-laser driving assemblies, each corresponding to a different color laser.
The display control circuit 10 is configured to output a primary-color enable signal and a primary-color current control signal to the laser driving assembly 30 to drive the lasers to emit light. Specifically, as shown in FIG. 2, the display control circuit 10 is connected to the laser driving assembly 30 and is configured to output at least one enable signal in one-to-one correspondence with the three primary colors of each frame in the multi-frame display image, transmitting each to the corresponding laser driving assembly 30, and likewise to output at least one current control signal in one-to-one correspondence with the three primary colors of each frame, transmitting each to the corresponding laser driving assembly 30. For example, the display control circuit 10 may be a microcontroller unit (MCU), also called a single-chip microcomputer, and the current control signal may be a pulse width modulation (PWM) signal.
In some embodiments, blue laser 201, red laser 202, and green laser 203 are each coupled to laser drive assembly 30. The laser driving assembly 30 may provide a corresponding driving current to the blue laser 201 in response to the blue PWM signal b_pwm and the enable signal b_en transmitted from the display control circuit 10. The blue laser 201 is configured to emit light when driven by the driving current.
The brightness sensor is arranged in the light-emitting path of the laser light source, and is usually arranged at one side of the light-emitting path, and does not shade the light path. As shown in fig. 2, at least one luminance sensor 40 is provided in the light-emitting path of the laser light source 20, and each of the luminance sensors is connected to the display control circuit 10 for detecting a first luminance value of one laser and transmitting the first luminance value to the display control circuit 10.
In some embodiments, the display control circuit 10 may obtain, from the correspondence, a second luminance value corresponding to a driving current of each laser, where the driving current is a current actual working current of the laser, and the second luminance value corresponding to the driving current is a luminance value that can be emitted when the laser is working normally under the driving of the driving current. The difference threshold may be a fixed value stored in advance in the display control circuit 10.
In some embodiments, the display control circuit 10 may reduce the duty cycle of the current control signal of the laser driving assembly 30 corresponding to the laser when adjusting the current control signal of the laser driving assembly 30 corresponding to the laser, thereby reducing the driving current of the laser.
In some embodiments, the brightness sensor 40 may detect a first brightness value of the blue laser 201 and send the first brightness value to the display control circuit 10. The display control circuit 10 may obtain the driving current of the blue laser 201, and obtain the second luminance value corresponding to the driving current from the correspondence relationship between the current and the luminance value. Then, it is detected whether the difference between the second luminance value and the first luminance value is greater than a difference threshold, and if the difference is greater than the difference threshold, it indicates that the blue laser 201 has a COD failure, and the display control circuit 10 may reduce the current control signal of the laser driving component 30 corresponding to the blue laser 201. The display control circuit 10 may then acquire the first luminance value of the blue laser 201 and the second luminance value corresponding to the driving current of the blue laser 201 again, and reduce the current control signal of the laser driving assembly 30 corresponding to the blue laser 201 again when the difference between the second luminance value and the first luminance value is greater than the difference threshold. And circulating until the difference value is smaller than or equal to the difference value threshold value. Thereby eliminating the COD failure of the blue laser 201 by reducing the drive current of the blue laser 201.
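A minimal sketch of this feedback loop follows, assuming hypothetical sensor and driver interfaces; it mirrors the measure, compare, and reduce cycle described above:

```python
# Sketch only: the read_brightness / expected_brightness / driver interfaces
# are hypothetical placeholders, not APIs from this application.
def eliminate_cod_fault(read_brightness, expected_brightness, driver,
                        diff_threshold: float, duty_step: float = 0.02):
    """Reduce the PWM duty cycle (and hence the drive current) until the
    measured brightness is within the threshold of the expected brightness."""
    while True:
        first = read_brightness()                       # measured first brightness
        second = expected_brightness(driver.current())  # from current/brightness table
        if second - first <= diff_threshold:
            return                                      # fault cleared (or absent)
        driver.set_duty(driver.duty() - duty_step)      # lower the drive current
```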
Fig. 3 is a schematic structural diagram of a projection apparatus according to an embodiment of the application.
In some embodiments, the laser light source 20 in the projection device may include a separately arranged blue laser 201, red laser 202, and green laser 203, in which case the projection device may also be called a three-color projection device. The blue laser 201, red laser 202, and green laser 203 are all MCL-packaged lasers, which are small and facilitate a compact arrangement of the optical paths.
In some embodiments, referring to fig. 3, the at least one luminance sensor 40 may include a first luminance sensor 401, a second luminance sensor 402, and a third luminance sensor 403, wherein the first luminance sensor 401 is a blue luminance sensor or a white luminance sensor, the second luminance sensor 402 is a red luminance sensor or a white luminance sensor, and the third luminance sensor 403 is a green luminance sensor or a white luminance sensor.
The display control circuit 10 is also configured to read the luminance value detected by the first luminance sensor 401 when controlling the blue laser 201 to emit blue laser light. And stops reading the luminance value detected by the first luminance sensor 401 when the blue laser 201 is controlled to be turned off.
The display control circuit 10 is further configured to read the luminance value detected by the second luminance sensor 402 when the red laser 202 is controlled to emit red laser light, and stop reading the luminance value detected by the second luminance sensor 402 when the red laser 202 is controlled to be turned off.
The display control circuit 10 is further configured to read the luminance value detected by the third luminance sensor 403 when the green laser 203 is controlled to emit green laser light, and stop reading the luminance value detected by the third luminance sensor 403 when the green laser 203 is controlled to be turned off.
Alternatively, a single brightness sensor may be used, disposed in the combined light path of the three-color laser.
Fig. 4 is a schematic structural diagram of a projection apparatus according to another embodiment of the present application.
In some embodiments, the projection device may further include a light pipe 110, which serves as a light-collecting element that receives and homogenizes the combined three-color laser light.
In some embodiments, the brightness sensor 40 may include a fourth brightness sensor 404, which may be a white-light brightness sensor. The fourth brightness sensor 404 is disposed in the light-emitting path of the light pipe 110, for example on the light-exit side of the light pipe, close to the light-exit surface.
The display control circuit 10 is further configured to read the luminance value detected by the fourth luminance sensor 404 when the blue laser 201, the red laser 202, and the green laser 203 are controlled to be turned on in a time-sharing manner, so as to ensure that the fourth luminance sensor 404 can detect the first luminance value of the blue laser 201, the first luminance value of the red laser 202, and the first luminance value of the green laser 203. And stops reading the luminance value detected by the fourth luminance sensor 404 when the blue laser 201, the red laser 202, and the green laser 203 are controlled to be turned off.
In some embodiments, the fourth brightness sensor 404 remains on throughout the projection of images by the projection device.
In some embodiments, referring to fig. 3 or 4, the projection device may further include a fourth dichroic plate 604, a fifth dichroic plate 605, a fifth mirror 904, a second lens assembly 90, a diffusion wheel 150, a TIR lens 120, a DMD 130, and a projection lens 140. The second lens assembly 90 includes a first lens 901, a second lens 902, and a third lens 903. The fourth dichroic plate 604 transmits blue laser light and reflects green laser light. The fifth dichroic sheet 605 transmits red laser light, and reflects green laser light and blue laser light.
Fig. 5 is a schematic circuit diagram of a projection apparatus according to an embodiment of the application.
In some embodiments, the laser drive assembly 30 may include a drive circuit 301, a switching circuit 302, and an amplification circuit 303. The driving circuit 301 may be a driving chip. The switching circuit 302 may be a metal-oxide-semiconductor (MOS) transistor.
The driving circuit 301 is connected to the switching circuit 302, the amplifying circuit 303, and the corresponding laser included in the laser light source 20, respectively. The driving circuit 301 is configured to output a driving current to the corresponding laser in the laser light source 20 through the VOUT terminal based on the current control signal sent by the display control circuit 10, and to transmit the received enable signal to the switching circuit 302 through the ENOUT terminal. Each laser may include n sub-lasers LD1 to LDn connected in series, where n is a positive integer greater than 0.
The switch circuit 302 is connected in series in the current path of the laser, and is used for controlling the current path to be conducted when the received enabling signal is at an effective potential.
The amplifying circuit 303 is connected to the detection node E in the current path of the laser light source 20 and the display control circuit 10, respectively, for converting the detected driving current of the laser assembly into a driving voltage, amplifying the driving voltage, and transmitting the amplified driving voltage to the display control circuit 10.
The display control circuit 10 is further configured to determine the laser's driving current from the amplified driving voltage and to obtain the second luminance value corresponding to that driving current.
In some embodiments, the amplifying circuit 303 may include: a first operational amplifier A1, a first resistor (also called a sampling power resistor) R1, a second resistor R2, a third resistor R3 and a fourth resistor R4.
In some embodiments, the display control circuit 10, the driving circuit 301, the switching circuit 302, and the amplifying circuit 303 form a closed loop that realizes feedback adjustment of the laser's driving current, so that the display control circuit 10 can promptly adjust the driving current, that is, the laser's actual light-emitting brightness, based on the difference between the laser's second and first brightness values, preventing the laser from remaining in COD failure for a long time and improving the accuracy of its light-emission control.
It should be noted that, referring to FIGS. 3 and 4, if the laser light source 20 includes one blue laser, one red laser, and one green laser, the blue laser 201 may be disposed at position L1, the red laser 202 at position L2, and the green laser 203 at position L3.
Referring to FIGS. 3 and 4, the laser light at position L1 is transmitted once through the fourth dichroic sheet 604 and reflected once by the fifth dichroic sheet 605 before entering the first lens 901. The light efficiency at position L1 is therefore P1 = Pt × Pf, where Pt denotes the transmittance of the fourth dichroic sheet and Pf the reflectance of the fifth dichroic sheet.
In some embodiments, among the three positions L1, L2, and L3, the laser light at position L3 has the highest optical efficiency and that at position L1 the lowest. The maximum optical power output by the blue laser 201 is Pb = 4.5 watts (W), by the red laser 202 Pr = 2.5 W, and by the green laser 203 Pg = 1.5 W; that is, the blue laser outputs the highest maximum optical power, the red laser the second highest, and the green laser the lowest. Accordingly, the green laser 203 is disposed at position L3, the red laser 202 at position L2, and the blue laser 201 at position L1; the green laser 203 thus occupies the optical path with the highest light efficiency, ensuring that the projection device obtains the highest overall light efficiency.
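A toy computation of this comparison follows; the Pt and Pf values and the assumed compositions of the L2 and L3 paths are illustrative choices that merely reproduce the stated ordering (L3 highest, L1 lowest):

```python
# Assumed values: dichroic reflectance Pf slightly above transmittance Pt.
Pt, Pf = 0.97, 0.99
P_L1 = Pt * Pf   # one transmission (4th dichroic) + one reflection (5th)
P_L2 = Pt        # assumed: one transmission through the 5th dichroic
P_L3 = Pf * Pf   # assumed: one reflection at each dichroic
print(P_L1, P_L2, P_L3)   # 0.9603 < 0.97 < 0.9801, so L3 > L2 > L1
```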
In some embodiments, the controller includes at least one of a central processing unit (CPU), a video processor, an audio processor, a graphics processing unit (GPU), RAM (Random Access Memory), ROM (Read-Only Memory), first to n-th input/output interfaces, a communication bus, and the like.
In some embodiments, after startup the projection device may directly enter the display interface of the most recently selected signal source, or the signal-source selection interface. The signal source may be a preset video-on-demand program, or at least one of an HDMI interface, a live TV interface, and the like; after the user selects a different signal source, the projector displays the content obtained from it.
The embodiments of the present application can be applied to various types of projection devices. Hereinafter, a projector is taken as an example to describe the projection device and the display control method for automatically correcting the projected image.
A projection device is a device capable of projecting images or video onto a screen. Through different interfaces it can be connected to a computer, a broadcast network, the Internet, a VCD (Video Compact Disc), a DVD (Digital Versatile Disc), a game console, a DV camcorder, and the like to play the corresponding video signals. Projection devices are widely used in homes, offices, schools, and entertainment venues.
In some embodiments, the camera configured on the projection device may be implemented as a 3-dimensional (3D) camera or as a binocular camera; when implemented as a binocular camera, it specifically includes a left camera and a right camera. The binocular camera can capture the image and playing content presented on the curtain corresponding to the projection device, i.e., on the projection surface, the image or content being projected by the laser assembly built into the projection device.
When the projection device is moved, its projection angle and its distance to the projection surface change, deforming the projected image into a trapezoid or another distorted shape; by coupling the included angle between the light engine and the projection surface with the correct display of the projected image, the controller can realize automatic trapezoid correction based on a deep-learning neural network. However, this correction method is slow, and reaching adequate precision requires a large amount of scene data for model training, so it is not suitable for scenarios in which a user's device must be corrected immediately and quickly.
In some embodiments, for projection image correction, an association among distance, horizontal included angle, and offset angle may be created in advance; the projection device controller then determines the current included angle between the light engine and the projection surface by acquiring the current distance from the light engine to the projection surface and consulting this association, so as to correct the projected image. Here the included angle is the angle between the central axis of the light engine and the projection surface. However, in some complex environments the pre-created association may not cover every case, and the projected image may then fail to be corrected.
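A minimal sketch of such a pre-created association follows, assuming a one-dimensional distance-to-angle table with linear interpolation; the actual association described above also involves the horizontal included angle and the offset angle:

```python
# Hypothetical calibration table; all values are made up for illustration.
import numpy as np

distances = np.array([1.0, 1.5, 2.0, 2.5, 3.0])  # engine-to-surface distance, m
angles    = np.array([2.1, 1.6, 1.2, 0.9, 0.7])  # included angle, degrees

def lookup_angle(measured_distance: float) -> float:
    """Interpolate the included angle from the current measured distance."""
    return float(np.interp(measured_distance, distances, angles))
```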
In some embodiments, the projection device provided by the present application combines the working mode with dynamic, scene-aware switching of the dimming mode, maximizing the advantages of the two dimming modes while mitigating their drawbacks. The projection device can monitor device movement in real time through its configured components and feed the monitoring result back to the controller in real time, so that the controller starts the image correction function immediately after the device moves and corrects the projected image at the first opportunity.
For example, through a gyroscope or a TOF (Time of Flight) sensor configured on the projection device, the controller receives their monitoring data to determine whether the projection device has moved; after determining that the projection device has moved, the controller activates the projected-picture auto-correction process and/or the obstacle-avoidance process to implement functions such as trapezoid correction of the projection area and projection obstacle avoidance.
The time-of-flight sensor realizes distance measurement and position-movement monitoring by modulating the frequency of its emitted pulses; its measurement accuracy does not degrade as the measured distance grows, and it has strong anti-interference capability. A minimal movement-detection sketch follows.
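A sketch under the assumption that movement manifests as a sustained change in the measured distance:

```python
# Sketch only: the threshold and the debouncing rule are assumptions.
def device_moved(tof_samples, threshold_mm: float = 15.0,
                 consecutive: int = 5) -> bool:
    """Return True when the distance deviates from its initial value by more
    than threshold_mm for `consecutive` successive samples, which filters
    out one-off sensor noise."""
    baseline, run = tof_samples[0], 0
    for d in tof_samples[1:]:
        run = run + 1 if abs(d - baseline) > threshold_mm else 0
        if run >= consecutive:
            return True
    return False
```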
In some embodiments, the laser light emitted by the projection device is reflected by the nanoscale micromirrors of the DMD (Digital Micromirror Device) chip; the optical lens is likewise a precision element, and when the image plane and the object plane are not parallel, the geometry of the image projected onto the screen is distorted.
Fig. 6A is a schematic diagram of a system frame for implementing display control of a projection device according to an embodiment of the application.
In some embodiments, the projection device provided by the present application has long-focus micro-projection characteristics and includes a controller that controls the display of the light engine's picture through preset algorithms, realizing functions such as automatic trapezoid correction of the display picture, automatic curtain entry, automatic obstacle avoidance, automatic focusing, and anti-eye-injection.
It can be understood that, with the geometry-correction-based display control method, the projection device allows flexible repositioning in long-focus micro-projection scenarios; each time the device is moved, for problems that may arise such as distortion of the projected picture, occlusion by foreign objects on the projection surface, and the projected picture straying from the curtain, the controller can control the projection device to perform automatic display correction so that normal display resumes automatically.
In some embodiments, the geometry-correction-based display control system includes an application service layer (APK Service: Android Application Package Service), a service layer, and an underlying algorithm library.
The application service layer realizes interaction between the projection device and the user; through the user interface, the user can configure the projection device's various parameters and display pictures, and by coordinating and invoking the algorithm services corresponding to the various functions, the controller can automatically correct the projection device's display picture when its display is abnormal.
The service layer may include the correction service, camera service, time-of-flight (TOF) service, and so on; upward, these correspond to the application service layer (APK Service) and implement the specific functions of the projection device's various configured services. Downward, the service layer connects to data acquisition services such as the algorithm library, the camera, and the time-of-flight sensor, encapsulating the complex logic of the bottom layer.
The underlying algorithm library provides the correction services and the control algorithms for the projection device's various functions; based on OpenCV, it performs the required mathematical operations and supplies the basic computing capability for the correction service. OpenCV is a cross-platform computer vision and machine learning software library released as open source under the BSD license and able to run in a variety of existing operating system environments.
In some embodiments, the projection device is configured with a gyroscope sensor; during device movement, the gyroscope sensor senses the position movement and actively collects movement data, then sends the collected data to the application service layer through the system framework layer, where it supports user-interface interaction and the application data needed during application interaction; the collected data can also be used for the controller's data calls when realizing the algorithm services.
In some embodiments, the projection device is configured with a time-of-flight (TOF) sensor; after the sensor acquires the corresponding data, the data is sent to the corresponding time-of-flight service in the service layer.
After the time-of-flight service acquires the data, it sends the collected data to the application service layer through the process communication framework, where the data is used for the controller's data calls, user interfaces, program applications, and the like in an interactive way.
In some embodiments, the projection device is configured with a camera for capturing images, which may be implemented as a binocular camera, or a depth camera, or a 3D camera, or the like;
The camera's captured data is sent to the camera service, which then sends the captured image data to the process communication framework and/or the projection device's correction service; the correction service receives the camera data sent by the camera service, and the controller calls the corresponding control algorithms in the algorithm library for the different functions to be realized.
In some embodiments, data interaction with the application service is performed through the process communication framework, and the calculation result is then fed back to the correction service through that framework; the correction service sends the obtained calculation result to the projection device's operating system to generate control signaling, which is sent to the light engine control driver to control the light engine's working state and realize automatic correction of the displayed image.
Fig. 6B is a schematic signaling interaction timing diagram of a projection device implementing an anti-eye function according to another embodiment of the present application.
In some embodiments, the projection device implements an eye-protection function to prevent a user who accidentally enters the path of the laser emitted by the projection device from suffering eyesight damage. When a user enters a preset specific unsafe area around the projection device, the controller controls the user interface to display a prompt reminding the user to leave the current area, and also reduces the display brightness of the user interface to prevent the laser from harming the user's eyesight.
In some embodiments, the controller automatically turns on the anti-eye-injection switch when the projection device is configured in child-viewing mode. In some embodiments, the controller controls the projection device to turn on the anti-eye-injection switch after receiving position-movement data sent by the gyroscope sensor, or foreign-object intrusion data collected by other sensors.
In some embodiments, when the data collected by the time-of-flight (TOF) sensor and the camera trigger any preset threshold condition, the controller controls the user interface to reduce its display brightness and display prompt information, and reduces the emission power, brightness, and intensity of the light engine.
In some embodiments, the projection device controller may control the correction service to send signaling to the time-of-flight sensor to query the current status of the projection device, and the controller receives the data fed back by the time-of-flight sensor.
The correction service sends the process communication framework signaling that notifies the algorithm service to start the anti-eye-injection flow, and the process communication framework invokes the service capability from the algorithm library to call the corresponding algorithm services; these may include, for example, a photographing detection algorithm, a screenshot algorithm, and a foreign-object detection algorithm.
Based on these algorithm services, the process communication framework returns the foreign-object detection result to the correction service; if the returned result reaches the preset threshold condition, the controller controls the user interface to display prompt information and reduce the display brightness, with the signaling sequence shown in FIG. 6B.
In some embodiments, with the anti-eye-injection function enabled, when a user enters the preset specific area the projection device automatically reduces the laser intensity emitted by the light engine, lowers the display brightness of the user interface, and displays a safety prompt. The projection device can control the anti-eye-injection function by the following method:
Based on the projection picture obtained by the camera, the projection area of the projection device is identified using an edge detection algorithm; when the projection area appears rectangular or nearly rectangular, the controller acquires the coordinate values of the four vertices of the rectangular projection area through a preset algorithm.
To detect foreign objects inside the projection area, the projection area can be rectified into a rectangle using a perspective transformation, and the difference between this rectangle and the projection screenshot computed to judge whether a foreign object is present in the display area; if one is found, the projection device automatically enables the anti-eye-injection function.
To detect foreign objects in a region outside the projection area, the camera content of the current frame can be differenced against that of the previous frame to judge whether a foreign object has entered that region; if so, the anti-eye-injection function is triggered.
Meanwhile, the projection device may also detect real-time depth changes in a specific area using a time-of-flight (ToF) camera or a time-of-flight sensor; if the depth value changes beyond the preset threshold, the projection device automatically triggers the anti-eye-injection function. A sketch of the two image-based checks follows.
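The two image-based checks can be sketched as follows; the thresholds, the grayscale conversion, and the corner ordering are assumptions:

```python
# Illustrative only; not the application's actual implementation.
import cv2
import numpy as np

def rectified_diff(frame, corners, screenshot, thresh=25):
    """Warp the photographed projection area onto the screenshot's rectangle
    and count differing pixels, which may indicate a foreign object."""
    h, w = screenshot.shape[:2]
    rect = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    M = cv2.getPerspectiveTransform(np.float32(corners), rect)
    warped = cv2.warpPerspective(frame, M, (w, h))
    diff = cv2.absdiff(cv2.cvtColor(warped, cv2.COLOR_BGR2GRAY),
                       cv2.cvtColor(screenshot, cv2.COLOR_BGR2GRAY))
    return int((diff > thresh).sum())

def inter_frame_diff(prev_frame, cur_frame, thresh=25):
    """Frame difference outside the projection area for intrusion detection."""
    return int((cv2.absdiff(prev_frame, cur_frame) > thresh).sum())
```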
In some embodiments, FIG. 6G shows a schematic flowchart of the projection device implementing the anti-eye-injection algorithm: the projection device determines whether the anti-eye-injection function needs to be turned on based on analysis of the collected time-of-flight data, screenshot data, and camera data. The projection device can realize this in the following three modes:
Mode one:
Step 6G00-1: acquire time-of-flight (TOF) data;
Step 6G00-2: perform depth-difference analysis on the acquired time-of-flight data;
Step 6G00-3: judge whether the depth difference is greater than a preset threshold X (implemented here as 0); if so, execute step 6G03;
Step 6G03: darken the picture and pop up a prompt.
If the depth difference is greater than the preset threshold X (implemented as 0), it can be determined that a foreign object is within the specific area of the projection device.
If a user is within that specific area, the user's eyesight is at risk of laser damage, and the projection device automatically enables the anti-eye-injection function to reduce the laser intensity emitted by the light engine, lower the display brightness of the user interface, and display a safety prompt.
Mode two:
Step 6G01-1: collect screenshot data;
Step 6G01-2: perform additive color mode (RGB) difference analysis on the collected screenshot data;
Step 6G01-3: judge whether the RGB difference is greater than a preset threshold Y; if so, execute step 6G03;
Step 6G03: darken the picture and pop up a prompt.
The projection device performs additive color mode (RGB) difference analysis on the collected screenshot data; when the difference is greater than the preset threshold Y, it can be determined that a foreign object is within the specific area of the projection device. If a user is within that specific area, the user's eyesight is at risk of laser damage, and the projection device automatically enables the anti-eye-injection function.
Mode three:
Step 6G02-1: collect camera data;
Step 6G02-2: acquire the projection coordinates from the collected camera data; whether the coordinates fall within the projection area or within the extended area, proceed to step 6G02-3;
Step 6G02-3: perform additive color mode (RGB) difference analysis on the collected camera data;
Step 6G02-4: judge whether the RGB difference is greater than a preset threshold Y; if so, execute step 6G03;
Step 6G03: darken the picture and pop up a prompt.
The projection device acquires the projection coordinates from the collected camera data, determines the projection area of the projection device from those coordinates, and performs additive color mode (RGB) difference analysis within the projection area; when the difference is greater than the preset threshold Y, it is determined that a foreign object is within the specific area of the projection device, and the projection device automatically enables the anti-eye-injection function.
If the acquired projection coordinates fall within the extended area, the controller can still perform additive color mode (RGB) difference analysis within the extended area; if the difference is greater than the preset threshold Y, it can likewise be determined that a foreign object is within the specific area, and the projection device automatically enables the anti-eye-injection function. A condensed sketch of the three trigger paths follows.
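Condensing the three modes, the trigger decision can be sketched as follows; the inputs and the default thresholds are hypothetical:

```python
# Sketch of the decision in FIG. 6G: any one mode exceeding its preset
# threshold leads to step 6G03 (darken the picture and pop up a prompt).
def should_dim(depth_diff: float, screenshot_rgb_diff: float,
               camera_rgb_diff: float, X: float = 0.0, Y: float = 25.0) -> bool:
    return (depth_diff > X              # mode one: ToF depth difference
            or screenshot_rgb_diff > Y  # mode two: screenshot RGB difference
            or camera_rgb_diff > Y)     # mode three: camera RGB difference
```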
Fig. 6C is a schematic signaling interaction timing diagram of a projection apparatus implementing a display screen correction function according to another embodiment of the present application.
In some embodiments, the projection device monitors device movement via a gyroscope or gyroscopic sensor. The correction service sends a signaling for inquiring the state of the equipment to the gyroscope, and receives a signaling which is fed back by the gyroscope and used for judging whether the equipment moves or not.
In some embodiments, the display correction policy may be configured as follows: when the gyroscope and the time-of-flight sensor report changes simultaneously, the projection device preferentially triggers trapezoid correction; while trapezoid correction is in progress, the controller does not respond to remote-control key commands, and the projection device can display a pure white card to support the correction.
The trapezoid correction algorithm can construct, based on the binocular camera, the transformation matrix between the projection surface and the light engine coordinate system under the world coordinate system, and, combined with the light engine's intrinsic parameters, calculate the homography (also called the mapping relationship) between the projected image and the played card; this homography is then used to realize arbitrary shape conversion between the projected image and the played card, as in the sketch below.
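A minimal sketch of applying such a homography follows, assuming H maps the played card's pixels to the coordinates of the projected image on the surface plane; rendering the card through the inverse of H pre-distorts it so that the physically projected result appears rectangular:

```python
# Sketch only: the meaning assigned to H here is an assumption.
import cv2
import numpy as np

def prewarp_card(card, H, out_size):
    """Pre-warp the played card with the inverse homography so the image
    that lands on the projection surface is rectangular."""
    return cv2.warpPerspective(card, np.linalg.inv(H), out_size)
```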
In some embodiments, the correction service sends the process communication framework signaling that notifies the algorithm service to start the trapezoid correction flow, and the process communication framework in turn sends a service-capability call signaling to the algorithm service to obtain the algorithms corresponding to that capability.
The algorithm service obtains and executes the photographing and picture-processing algorithm service and the obstacle-avoidance algorithm service, and returns the results to the process communication framework via signaling; in some embodiments, the process communication framework executes the algorithm and feeds the execution result back to the correction service, where the result may include feedback such as photographing success and obstacle-avoidance success.
In some embodiments, if an error occurs while the projection device executes an algorithm or transfers data, the correction service controls the user interface to display an error prompt and to perform trapezoidal correction and auto-focus on the image card again.
The projection device can identify the curtain through the automatic obstacle avoidance algorithm and, using a projective transformation, correct the projected image so that it is displayed within the curtain, aligned with the curtain edges. Through the auto-focus algorithm, the projection device acquires the distance between the optical machine and the projection surface with a time-of-flight (ToF) sensor, looks up the optimal image distance in a preset mapping table based on that distance, and evaluates the sharpness of the projected picture with an image algorithm, thereby fine-tuning the image distance.
In some embodiments, the automatic trapezoidal correction signaling sent by the correction service to the process communication framework may include other functional configuration instructions, such as control instructions for whether to perform synchronous obstacle avoidance, whether to enter the curtain, and the like.
The process communication framework sends service capability call signaling to the algorithm service, so that the algorithm service obtains and executes the auto-focus algorithm to adjust the focus distance between the device and the curtain; in some embodiments, after the auto-focus algorithm has been applied, the algorithm service may also obtain and execute the automatic curtain-entry algorithm, and this process may include the trapezoidal correction algorithm.
When the projection device performs automatic curtain entry, the algorithm service sets the eight coordinate positions between the projection device and the curtain; it then adjusts the focus distance between them through the auto-focus algorithm; finally, the correction result is fed back to the correction service, which controls the user interface to display it, as shown in fig. 6C.
In some embodiments, through the auto-focus algorithm the projection device may obtain the current object distance using its configured laser ranging, so as to calculate the initial focal length and the search range; the projection device then drives the camera to take a picture and evaluates sharpness with a corresponding algorithm.
Within the search range, the projection device searches for candidate optimal focal lengths with a search algorithm, repeating the photographing and sharpness evaluation steps, and finally finds the optimal focal length by sharpness comparison to complete auto-focus.
For example, after the projection device is started, the steps for implementing the auto-focus algorithm are shown in fig. 6D, and include the following steps:
Step 6D01, the user moves the device, and the projection device automatically completes correction and refocuses;
Step 6D02, the controller detects whether the auto-focus function is enabled; if not, the auto-focus service ends; if yes, step 6D03 is executed;
Step 6D03, the middleware acquires the time-of-flight (ToF) detection distance;
When the auto-focus function is enabled, the projection device uses the ToF detection distance acquired by the middleware in its calculation.
Step 6D04, query the mapping table with the distance to obtain a rough focal length;
Step 6D05, the middleware sets the focal length to the optical machine;
The controller queries the preset mapping table with the obtained distance to get the focal length of the projection device; the middleware then sets the acquired focal length to the optical machine of the projection device.
Step 6D06, the camera takes a photograph;
Step 6D07, judge from the evaluation function whether focusing is complete; if yes, the auto-focus flow ends; otherwise, execute step 6D08;
Step 6D08, the middleware fine-tunes the focal length (by a step size), and step 6D05 is executed again. After the optical machine emits laser at the set focal length, the camera executes a photographing instruction; the controller judges from the captured image and the evaluation function whether focusing of the projection device is complete; if the result meets the preset completion condition, the auto-focus flow is ended;
If the result does not meet the preset completion condition, the middleware fine-tunes the focal length parameter of the optical machine of the projection device, for example by a preset step size, and sets the adjusted focal length to the optical machine again; the photographing and sharpness evaluation steps are thus repeated until the optimal focal length is found by sharpness comparison and auto-focus is complete.
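The loop described above might be sketched as follows; the `tof`, `camera`, `engine`, and `mapping_table` objects are hypothetical placeholders for the middleware interfaces, and the Laplacian-variance metric is one common choice of evaluation function, not necessarily the one the device uses:

```python
import cv2

def sharpness(image) -> float:
    """Variance of the Laplacian: one common focus evaluation function."""
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    return cv2.Laplacian(gray, cv2.CV_64F).var()

def autofocus(tof, camera, engine, mapping_table, step=5, max_iters=20):
    """Coarse focus from the ToF distance, then hill-climb by sharpness."""
    distance = tof.read_distance()            # middleware ToF reading
    focus = mapping_table.lookup(distance)    # rough focal position
    best_focus, best_score = focus, -1.0
    for _ in range(max_iters):
        engine.set_focus(focus)               # set the optical machine
        score = sharpness(camera.capture())   # photograph and evaluate
        if score <= best_score:
            break                             # sharpness fell: stop search
        best_focus, best_score = focus, score
        focus += step                         # fine-tune by preset step
    engine.set_focus(best_focus)              # lock in the best focus
```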
In some embodiments, the projection device can project a 75-100 inch picture. After purchasing the laser projection device, the user obtains a suitable projection area A through installation and adjustment; projection area A is rectangular, as shown on the left side of fig. 6H.
After the projection device is moved from position A to position B, a deformed projection area is often produced, such as the trapezoidal projection area shown on the right side of fig. 6H. It will be appreciated that the placement of the projection device may be imperfect, and its position may be changed for cleaning or other reasons; with the screen fixed, the match between the projected picture and the screen is determined by the position of the projection device, so changing that position changes the projection location and causes the picture to no longer match the screen.
Therefore, when the position of the projection device changes, the user needs to adjust it back to the optimal placement to restore the relative relationship between the picture edge and the screen and ensure that the picture coincides with the screen. However, an ordinary user unfamiliar with projection equipment often ends up with a laser picture that is not a regular rectangle when adjusting manually; the display control method for automatic projection image correction provided by the present application corrects the projected image in software, as shown in fig. 6I.
Correction of the deformed projection area by the projection device is explained below. In some embodiments, the projection device provided by the present application implements the display correction function through a trapezoidal correction algorithm; as shown in fig. 6J, the present application provides a method for automatically correcting a projected image, including the following steps:
Step 6J01, monitor the position change between the projector and the projection surface in real time;
Step 6J02, the time-of-flight or gyroscope sensor detects a change;
Step 6J03, project a white image card;
Step 6J04, the binocular camera photographs and stores a picture;
Step 6J05, project a feature image card;
Step 6J06, the binocular camera photographs and stores a picture;
After the controller detects that the position of the projection device has changed, it controls the optical machine to project a first image card and a second image card onto the projection surface; while the cards are being projected, the controller controls the binocular camera to acquire a first image corresponding to the first card and a second image corresponding to the second card.
For example, the controller first controls the optical machine to project a white image card onto the projection surface and then controls the binocular camera to photograph the projection surface, obtaining a picture corresponding to the white card; here the white card is the first image card, which may also be implemented as a card of another color.
After the projection device stores the picture corresponding to the white card, the controller controls the optical machine to project the feature image card onto the projection surface; the feature card is the second image card, which may be implemented as a checkerboard card or another card with graphic and color features, as shown in fig. 6H.
Step 6J07, identify the feature points corresponding to the binocular camera;
In some embodiments, the controller may determine the homography between the projection card and the projection surface in the world coordinate system by comparing the corresponding features of the first image and the second image, thereby obtaining the association between the projection card and the projection-surface display used to correct the projected image.
Step 6J08, calculate the coordinates, in the camera coordinate system, of the projection-surface points corresponding to the camera feature points;
Step 6J09, obtain the coordinates of the projection-surface points in the world coordinate system;
Step 6J10, obtain the homography between the projected feature card and the projection-surface feature points;
Step 6J11, obtain the maximum rectangular region in the world coordinate system;
Step 6J12, obtain the region to be projected by the optical machine according to the homography between the projected feature card and the projection surface.
The controller obtains the coordinates of the corresponding feature points of the binocular camera in the image coordinate system through an image recognition algorithm; from these, it calculates the coordinates of the projection surface in the coordinate system of either camera, for example the left camera, and thereby obtains the coordinates of the projection-surface feature points in the world coordinate system.
For example, from the obtained three-dimensional coordinates of the discrete points in the left camera coordinate system, an overdetermined system of equations is solved to calculate the plane equation of the projection surface in the left camera coordinate system, expressed as:
z = a*x + b*y + c;
Then, based on the plane equation, the controller calculates the unit normal vector n of the projection surface in the left camera coordinate system. World coordinate system 1 is then defined: with the origin of the left camera coordinate system as its origin and a plane parallel to the projection surface as its XOY plane, a spatial rectangular coordinate system is established, so that the equation of the projection surface in world coordinate system 1 is expressed as:
z = c′;
where c′ is a constant, and the unit normal vector of the projection surface in world coordinate system 1 is expressed as:
m = (0, 0, 1)^T;
From the representations of the unit normal vector of the projection surface in the two coordinate systems, the rotation matrix between them can be obtained, expressed as:
m = R1 * n;
From the rotation matrix R1 and the coordinates of a projection-surface point P (x′, y′, z′) in the left camera coordinate system, the coordinates of P in world coordinate system 1 are obtained. World coordinate system 2 is then established with the projection of the origin of world coordinate system 1 onto the projection surface as its origin and the projection surface as its XOY plane; the coordinates of the projection point P in world coordinate system 2 are expressed as (x″, y″, 0), and the controller can acquire the coordinates of all projection feature points in world coordinate system 2.
A homography matrix is then calculated from the obtained coordinates of the projection feature points and the known feature points of the image card projected by the optical machine, giving the homography between the projected feature card and the projection-surface feature points, i.e., the homography between the projection card and the projection surface in the world coordinate system, which may also be called the mapping relation.
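The plane-fitting step above (solving the overdetermined system for z = a*x + b*y + c) might look like the following least-squares sketch; the point array is assumed to hold the triangulated feature points in the left camera coordinate system:

```python
import numpy as np

def fit_projection_plane(points: np.ndarray):
    """points: Nx3 (x, y, z) feature points in the left camera frame.

    Solves the overdetermined system z = a*x + b*y + c in the
    least-squares sense and returns the coefficients and unit normal.
    """
    A = np.column_stack([points[:, 0], points[:, 1], np.ones(len(points))])
    (a, b, c), *_ = np.linalg.lstsq(A, points[:, 2], rcond=None)
    n = np.array([a, b, -1.0])
    n /= np.linalg.norm(n)          # unit normal of the fitted plane
    return (a, b, c), n
```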
In some embodiments, before the projection device projects the playing content, if the controller detects an obstacle blocking the projection beam between the optical machine and the projection surface, the controller starts the obstacle avoidance process.
When the projection device is provided with an obstacle avoidance function switch, the controller determines whether the obstacle avoidance function is enabled; if not, no obstacle detection is performed. If the obstacle avoidance function is enabled, the controller calculates the homography matrix, i.e., homography relation, between either of the binocular cameras, for example the left camera, and the optical machine.
The projection device stores the feature points of the projected image card; after the feature points in the image of the feature card captured by the left camera are identified, they are compared with the stored card feature points, from which the homography matrix between the left camera and the optical machine is calculated.
For example, the controller converts the captured image obtained by the binocular camera to grayscale and then performs clustering based on a clustering algorithm.
In the clustering process, the controller clusters the grayscale image into a projection beam region and an obstacle region according to the clustering algorithm and the gray values: for example, the gray pixels within the projection region are clustered into two classes by gray level, of which the class with the larger gray values is necessarily the projection beam, and the class with the smaller gray values is determined to be the obstacle region.
It will be appreciated that the controller binarizes the projection beam region and the obstacle region, which facilitates extraction of any first closed contour larger than a preset obstacle avoidance area threshold; the first closed contour corresponds to an obstacle, such as a household item between the projection device and the curtain.
In the above process, the controller may further extract any second closed contour smaller than the preset obstacle avoidance area threshold; the second closed contour corresponds to a non-obstacle, such as a small stain on the wall or curtain. After identifying a second closed contour, the controller can fill it into the projection beam region to improve the efficiency of subsequent algorithm recognition.
The first closed contour allows the controller to steer the optical machine around it during projection, so that the final rectangular region generated on the projection surface accurately avoids the obstacle, realizing the projection obstacle avoidance function; meanwhile, identifying and filling the second closed contour during subsequent recognition does not affect the position of the final rectangular region. As shown in fig. 6K, a method for the projection device to automatically correct the projected image is provided, including the following steps:
Step 6K01, acquire the homography between the projected feature card and the projection-surface feature points;
Step 6K02, determine whether obstacle avoidance is enabled; if yes, execute step 6K03, otherwise execute step 6K07;
Step 6K03, acquire the homography matrix between the camera and the optical machine;
Step 6K04, detect obstacle contours;
Step 6K05, convert the picture into the optical machine coordinate system;
Step 6K06, convert the binarized picture with extracted obstacle contours into the world coordinate system;
Step 6K07, obtain the maximum rectangular region in the world coordinate system;
Step 6K08, obtain the region to be projected by the optical machine according to the homography between the projected feature card and the projection surface.
For example, the controller binarizes the two gray-value classes, configuring the larger gray value as 255 and the smaller as 0, and then extracts all closed contours contained in the binarized image. In this step, to improve the user experience and avoid treating very small objects, such as stains on a wall, as obstacles to be avoided when the projection device performs the obstacle avoidance function, a preset obstacle avoidance area threshold may be defined:
a contour whose area exceeds one percent of the projection region area is regarded as an obstacle, while a contour whose area is below one percent is not regarded as an obstacle and is not avoided. Meanwhile, the controller fills the gray values of regions whose contour area is below one percent to the maximum value 255, so that the projection device does not avoid such tiny contours in subsequent obstacle avoidance.
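A sketch of this contour filtering under the one-percent rule, using OpenCV; taking the projection-region area as the threshold base is an assumption:

```python
import cv2
import numpy as np

def split_obstacles(binary: np.ndarray, region_area: float):
    """binary: uint8 mask (255 = projection beam, 0 = candidate obstacle).

    Contours above 1% of the projection-region area are kept as obstacles;
    smaller ones are filled back to 255 so later steps ignore them.
    """
    # Invert so the candidate obstacle regions become white foreground.
    contours, _ = cv2.findContours(255 - binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    obstacles = []
    for c in contours:
        if cv2.contourArea(c) >= 0.01 * region_area:
            obstacles.append(c)                         # real obstacle
        else:
            cv2.drawContours(binary, [c], -1, 255, -1)  # fill tiny contour
    return obstacles, binary
```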
It should be noted that, in some embodiments, the projection device detects obstacle contours directly via binarization and then identifies obstacles with an empirical threshold. The recognition efficiency of that method degrades in complex environments: for example, when the projection beam falls on a black curtain region, too large an empirical threshold may cause a projection containing no obstacle to be misidentified as an obstacle, degrading the obstacle avoidance effect.
In some embodiments, the projection device controller may convert the captured image acquired by either of the binocular cameras into the world coordinate system based on the homography, i.e., mapping relation, acquired above, to obtain the final rectangular region.
After the left camera of the binocular camera takes a picture, the controller binarizes the grayscale of the captured image; the controller then converts the binary image into the optical machine coordinate system using the homography matrix between the left camera and the optical machine of the projection device.
It will be appreciated that after the homography transformation of the binarized image, the image of the projection region is retained and other irrelevant regions are removed, so the search for the obstacle avoidance region in subsequent steps need only consider the projection region.
The controller then converts the contour-identified binarized left-camera image into world coordinate system 2 through the homography matrix; from the resulting picture, the controller selects a rectangular region in world coordinate system 2 and controls the optical machine to project the playing content into it, replacing the deformed projection area formed on the projection surface after the device moved, as shown in fig. 6H.
It will be appreciated that for a common micro-projection device, given the complex usage environment, the optical machine and the projection wall or curtain cannot always be kept perpendicular, so the projected picture becomes trapezoidal; and when an obstacle occupies the projection area, the picture is blocked by it. The display control method for automatically correcting the projected picture described here realizes both automatic trapezoidal correction and automatic obstacle avoidance.
In some embodiments, the projection device controller may determine, based on the homography described above, the coordinates in the world coordinate system of the center point of the captured image corresponding to the projection card.
For example, with the projection card of the projection device's optical machine defined at a 3840 x 2160 resolution, the controller calculates the card center point (1920, 1080) and then, using the obtained homography matrix, computes the coordinates of the card center point in world coordinate system 2.
Using the card center-point coordinates as the reference, the controller expands a rectangle at the aspect ratio of the projection card until it reaches a gray-value change boundary, and takes the four vertices of the rectangle at that moment as the vertices of the final rectangular region.
For example, if the aspect ratio of the projected card is configured as 16:9, the selected rectangle grows in 16:9 proportion from the card center point in world coordinate system 2; the gray value of the projection region is set to 255, and that of any obstacle region to 0.
Thus, when points on the four sides of the rectangle reach a region with gray value 0, the controller stops the expansion, and the four vertices of the final rectangular region are acquired at that moment; these can be converted into the optical machine coordinate system to obtain the region to be projected by the optical machine of the projection device.
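The center-out rectangle growth might be sketched as follows, assuming a world-coordinate mask in which the projection region is 255 and obstacle regions are 0; the step size and integer 16:9 arithmetic are illustrative simplifications:

```python
import numpy as np

def expand_rectangle(mask: np.ndarray, cx: int, cy: int, step: int = 1):
    """Grow a 16:9 rectangle from (cx, cy) until it hits a 0-valued pixel."""
    h = step
    while True:
        w = h * 16 // 9
        x0, x1 = cx - w // 2, cx + w // 2
        y0, y1 = cy - h // 2, cy + h // 2
        if x0 < 0 or y0 < 0 or x1 >= mask.shape[1] or y1 >= mask.shape[0]:
            break                          # reached the mask boundary
        border = np.concatenate([mask[y0, x0:x1 + 1], mask[y1, x0:x1 + 1],
                                 mask[y0:y1 + 1, x0], mask[y0:y1 + 1, x1]])
        if (border == 0).any():
            break                          # touched an obstacle region
        h += step
    h -= step                              # back off to the last size that fit
    if h <= 0:
        return None                        # nothing fits at this center
    w = h * 16 // 9
    return (cx - w // 2, cy - h // 2, cx + w // 2, cy + h // 2)
```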
In some embodiments, the controller may instead obtain the final rectangular region by finding the maximum inscribed rectangle of the 255-valued region in the captured image, then correcting the width and height of that rectangle according to the aspect ratio of the projected card.
For example, the controller acquires the maximum inscribed rectangle within the 255-pixel region of the captured image and calculates its aspect ratio. If the aspect ratio of the maximum inscribed rectangle satisfies:
width/height > 16:9;
the height of the final rectangle is unchanged, and the controller corrects its width to:
width1 = height/9*16;
if the aspect ratio of the maximum inscribed rectangle satisfies:
width/height < 16:9;
the width of the final rectangle is unchanged, and the controller corrects its height to:
height1 = width/16*9;
Based on the corrected width and height, the controller can determine any vertex of the maximum inscribed rectangle so as to locate the rectangular region corresponding to the projection area; from the calculated width, height, and any one vertex of the maximum inscribed rectangle, the projection device obtains the final selected rectangle, and the controller converts its four vertices into the optical machine coordinate system to obtain the region to be projected by the optical machine.
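A sketch of the aspect-ratio correction just described, directly transcribing the two formulas above:

```python
def correct_aspect(width: float, height: float):
    """Clamp the maximum inscribed rectangle to 16:9, per the formulas above."""
    if width / height > 16 / 9:
        width = height / 9 * 16     # too wide: keep height, shorten width
    elif width / height < 16 / 9:
        height = width / 16 * 9     # too tall: keep width, shorten height
    return width, height
```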
In some embodiments, the controller may first obtain, based on a calibration algorithm, two sets of extrinsic parameters, i.e., rotation and translation matrices: between the two cameras, and between the camera and the optical machine. A specific checkerboard card is then played through the optical machine of the projection device, the depth values of the projected checkerboard corner points are calculated, and the xyz coordinate values are solved via the translation relation between the binocular cameras and the similar-triangle principle. A projection plane is then fitted from these xyz values, and the rotation and translation relations between the projection plane and the camera coordinate system are obtained, which may specifically include the pitch relation (Pitch) and the yaw relation (Yaw).
The gyroscope configured in the projection device provides the roll (Roll) parameter value to complete the rotation matrix, from which the extrinsic parameters from the projection surface in the world coordinate system to the optical machine coordinate system are finally calculated.
Combining the R and T values between the camera and the optical machine calculated in the preceding steps yields the transformation between the projection world coordinate system and the optical machine coordinate system; combining the optical machine intrinsic parameters yields the homography matrix from projection-surface points to optical machine card points.
Finally, a rectangle is selected on the projection surface, and the corresponding coordinates on the optical machine image card are solved inversely via the homography; these are the correction coordinates, which are set to the optical machine to realize trapezoidal correction.
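The inverse mapping from the selected projection-surface rectangle back to card coordinates might be sketched as follows, assuming the card-to-surface homography has already been computed:

```python
import cv2
import numpy as np

def correction_coordinates(rect_pts: np.ndarray, H_card_to_surface: np.ndarray):
    """rect_pts: 4x2 rectangle vertices chosen on the projection surface.

    Maps them back through the inverse homography to card coordinates;
    these are the correction coordinates set to the optical machine.
    """
    pts = rect_pts.reshape(-1, 1, 2).astype(np.float32)
    H_inv = np.linalg.inv(H_card_to_surface)
    return cv2.perspectiveTransform(pts, H_inv).reshape(-1, 2)
```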
As shown in fig. 6E, a schematic flow chart of a projection device implementing a trapezoidal correction and obstacle avoidance algorithm is provided, including the following steps:
Step 6E01, the projection device controller obtains the depth values of the points corresponding to the photo pixels, i.e., the coordinates of the projection points in the camera coordinate system;
Step 6E02, the middleware obtains the relation between the optical machine coordinate system and the camera coordinate system from the depth values;
Step 6E03, the controller then calculates the coordinate values of the projection points in the optical machine coordinate system;
Step 6E04, fit a plane from the coordinate values and obtain the included angle between the projection surface and the optical machine;
Step 6E05, obtain the corresponding coordinates of the projection points in the world coordinate system of the projection surface according to the angle relation;
Step 6E06, calculate the homography matrix from the card coordinates in the optical machine coordinate system and the coordinates of the corresponding points on the projection surface;
Step 6E07, the controller determines from the acquired data whether an obstacle exists; if no obstacle exists, execute step 6E08; otherwise, execute step 6E09;
Step 6E08, take arbitrary rectangular coordinates on the projection plane in the world coordinate system and calculate the region to be projected by the optical machine according to the homography relation;
Step 6E09, when an obstacle exists, the controller may, for example, acquire the feature points of a two-dimensional code;
Step 6E10, then obtain the coordinates of the two-dimensional code on the prefabricated image card;
Step 6E11, acquire the homography relation between the camera photo and the image card;
Step 6E12, convert the acquired obstacle coordinates onto the image card, i.e., acquire the coordinates of the obstacle occlusion on the card;
Step 6E13, from the coordinates of the obstacle occlusion region on the card in the optical machine coordinate system, obtain the coordinates of the occluded region on the projection surface through homography matrix conversion;
Step 6E14, take rectangular coordinates on the projection surface in the world coordinate system that avoid the obstacle, and obtain the region to be projected by the optical machine according to the homography relation.
It will be appreciated that, during the rectangle-selection step of the trapezoidal correction algorithm flow, the obstacle avoidance algorithm uses the OpenCV algorithm library to extract foreign-object contours and avoids the obstacles when selecting the rectangle, realizing the projection obstacle avoidance function.
In some embodiments, as shown in fig. 6F, a flowchart of an implementation of an in-curtain algorithm for a projection device includes the following steps:
Step 6F01, the middleware acquires the two-dimensional code image card photographed by the camera;
Step 6F02, identify the two-dimensional code feature points and obtain their coordinates in the camera coordinate system;
Step 6F03, the controller further obtains the coordinates of the preset image card in the optical machine coordinate system;
Step 6F04, solve the homography between the camera plane and the optical machine plane;
Step 6F05, the controller identifies, based on the homography, the four vertex coordinates of the curtain photographed by the camera;
Step 6F06, according to the homography matrix, acquire the range within which the optical machine should project the image card so that the projection falls within the curtain.
It will be appreciated that in some embodiments the curtain-entry algorithm may identify and extract the largest black closed rectangular outline based on the algorithm library and determine whether it has a 16:9 size. A specific image card is projected, a photo is captured with the camera, and several corner points are extracted from the photo to calculate the homography between the projection surface (curtain) and the card played by the optical machine; the four vertices of the curtain are then converted through the homography into the optical machine pixel coordinate system, so that the calculation and comparison can be completed.
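A sketch of the curtain-outline search under these assumptions (a fixed darkness threshold, an axis-aligned bounding box as the rectangle test, and a 5% aspect tolerance), none of which are specified by the embodiments:

```python
import cv2
import numpy as np

def find_curtain(photo_bgr: np.ndarray, dark_thresh: int = 60, tol: float = 0.05):
    """Return the four curtain vertices if a roughly 16:9 dark outline is found."""
    gray = cv2.cvtColor(photo_bgr, cv2.COLOR_BGR2GRAY)
    # THRESH_BINARY_INV turns dark curtain pixels into white foreground.
    _, binary = cv2.threshold(gray, dark_thresh, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    x, y, w, h = cv2.boundingRect(largest)
    if abs(w / h - 16 / 9) > tol * (16 / 9):
        return None                 # outline is not close enough to 16:9
    return np.float32([[x, y], [x + w, y], [x + w, y + h], [x, y + h]])
```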
The long-focus micro-projection television is flexible to move, and the projected picture may be distorted after each displacement; in addition, foreign objects may occlude the projection surface, or the picture may fall outside the curtain. For these problems, the projection device provided by the present application can complete correction automatically, realizing functions such as automatic trapezoidal correction, automatic curtain entry, automatic obstacle avoidance, automatic focusing, and eye protection.
Based on the above description of the display control scheme for implementing automatic correction of the projection picture by the projection device and the related drawings, the present application also provides a display control method for implementing automatic correction of the projection picture, which is described in detail in the display control scheme for implementing automatic correction of the projection picture by the projection device, and will not be described herein.
The beneficial effects of the embodiments of the present application are that the corresponding first and second images can be obtained by projecting the first and second image cards; by comparing the feature points of the first and second images, the homography between the projection card and the projection surface can be determined, the captured image converted into the world coordinate system based on that homography, and the corrected rectangular projection region determined, realizing automatic correction of the deformed projection area produced after the projection device changes position.
The present application provides a display control method for automatically correcting the projected image based on a trigger-correction method, applicable to a projection device including a projection screen 1 and an apparatus 2 for projection, to solve the problem of false triggering of the image correction and eye protection functions in scenarios where the projection device triggers image correction and the eye protection mode by default, for example while the projection device is being moved.
Fig. 7 is a schematic view of the lens structure of the apparatus 2 for projection in some embodiments. To support the auto-focus process of the apparatus 2 for projection, as shown in fig. 7, the lens 300 of the projection device may further include an optical assembly 310 and a driving motor 320. The optical assembly 310 is a lens group formed by one or more lenses, which refracts the light emitted by the optical machine 200 so that it is transmitted to the projection surface to form the projected content image.
The optical assembly 310 may include a barrel and a plurality of lenses disposed within the barrel. Depending on whether the lens position is movable, the lenses in the optical assembly 310 may be divided into a moving lens 311 and a fixed lens 312, and the distance between the moving lens 311 and the fixed lens 312 is adjusted by changing the position of the moving lens 311, thereby changing the overall focal length of the optical assembly 310. Therefore, the driving motor 320 can drive the moving lens 311 to move by connecting with the moving lens 311 in the optical assembly 310, so as to realize an auto-focusing function.
It should be noted that, in some embodiments of the present application, the focusing process changes the position of the moving lens 311 via the driving motor 320, thereby adjusting the distance between the moving lens 311 and the fixed lens 312, i.e., the image plane position. By the imaging principle of the lens group in the optical assembly 310, what is adjusted is strictly the image distance; but in terms of the overall structure of the optical assembly 310, adjusting the position of the moving lens 311 is equivalent to adjusting the overall focal length of the optical assembly 310.
When the projection device is at different distances from the projection surface, the lens needs different focal lengths to render a clear image on the projection surface. The distance between the projection device and the projection surface varies with where the user places the device, so different focal lengths are needed; accordingly, to accommodate different usage scenarios, the projection device needs to adjust the focal length of the optical assembly 310.
Fig. 8 is a schematic diagram of the distance sensor and camera configuration in some embodiments. As shown in fig. 8, the apparatus 2 for projection may further have a built-in or external camera 700, which can photograph the picture projected by the projection device to obtain a projected content image. The projection device determines whether the current lens focal length is appropriate by detecting the sharpness of the projected content image, and adjusts the focal length when it is not. When auto-focusing based on the projected content images captured by the camera 700, the projection device can repeatedly adjust the lens position and take pictures, finding the focus position by comparing the sharpness of pictures at successive positions, so as to move the moving lens 311 in the optical assembly to a suitable position. For example, the controller 500 may control the driving motor 320 to move the moving lens 311 gradually from the focusing start position to the focusing end position, continuously acquiring projected content images through the camera 700 during this time. It then determines the position with the highest sharpness by performing sharpness detection on the series of projected content images, and finally controls the driving motor 320 to adjust the moving lens 311 from the focusing end position back to the position with the highest sharpness, completing auto-focus.
Fig. 9 is a schematic diagram of some embodiments in which the position of the projection device is changed. The projection device is initially in the position a and can project to a suitable projection area, which is rectangular and can normally completely and accurately cover a corresponding rectangular curtain. When the projection device moves from the position a to the position B, a deformed projection image is often generated, for example, a trapezoidal image appears, which causes a problem that the projection image does not coincide with the rectangular curtain.
In general, a projection apparatus has an image correction function. When the projection image projected by the projection device in the projection plane has a deformed image such as a trapezoid, the projection device can automatically correct the projection image to obtain an image with a regular shape, which can be a rectangular image, so that the image correction is realized.
Specifically, when receiving the image correction instruction, the projection device may turn on the image correction function to correct the projection image. The image correction instruction refers to a control instruction for triggering the projection device to automatically perform image correction.
In some embodiments, the image correction instruction may be an instruction that is actively entered by a user. For example, the projection device may project an image on the projection surface after the power of the projection device is turned on. At this time, the user can press an image correction switch preset in the projection device or an image correction key on a remote controller matched with the projection device, so that the projection device starts an image correction function and automatically performs image correction on the projection image.
In some embodiments, the image correction instructions may also be automatically generated according to a control program built into the projection device.
The projection device can actively generate an image correction instruction after being started. For example, when the projection device detects a first video signal input after power-on, an image correction instruction may be generated, triggering the image correction function.
Or the projection equipment can automatically generate an image correction instruction in the working process. In consideration of the fact that a user may actively move the projection device or inadvertently touch the projection device during use of the projection device, the placement posture or the setting position of the projection device may be changed, and the projection image may also become a trapezoidal image. At this time, in order to secure viewing experience of the user, the projection apparatus may automatically perform image correction.
Specifically, the projection device may detect its own situation in real time. When the projection device detects that the self-placement posture or the setting position is changed, an image correction instruction can be generated, so that an image correction function is triggered.
In some embodiments, the projection device can monitor its own movement in real time through its configured components and feed the monitoring result back to the projection device controller in real time, so that the image correction function is enabled and the projected image is corrected as soon as the device has moved. For example, with the acceleration sensor configured in the projection device, the controller receives monitoring data from the acceleration sensor to determine whether the device is moving.
After determining that the projection device has moved, the controller may generate an image correction instruction and enable the image correction function.
In some embodiments, the camera captures images in real time while the projection device performs image correction. If a person is detected entering the image, the projection device automatically triggers the eye protection mode: to reduce damage to human eyes from the light source, the projection device reduces the image brightness by adjusting the light source brightness, achieving the eye protection effect.
In general, during movement of the projection device, the device acquires acceleration data in real time from the gyro sensor and determines from changes in that data whether it has been moved; if so, the image correction function is activated once the device returns to a stationary state.
However, due to hardware limitations of the gyro sensor, the acquired acceleration data may be abnormal, for example indicating that the projection device is stationary while it is actually moving; image correction and the eye protection function are then triggered erroneously while the projection device is moving.
To this end, the present application proposes a projection device that may comprise an optical machine, a light source, an acceleration sensor, a camera and a controller. The optical machine projects the preset playing content onto the projection surface, which may be a wall or a curtain; image correction is performed using the images captured by the camera; the light source provides illumination for the optical machine; and the acceleration sensor is used to determine whether the projection device has moved. This solves the problem of false triggering of image correction and the eye protection function while the projection device is moving.
It should be noted that the projection device triggers the image correction function when it receives an image correction instruction. Such an instruction may be generated, among other ways, by the user operating a preset image correction switch or automatically based on changes in the gyro sensor. The embodiments of the present application are therefore described for two states: the image correction switch on, and the image correction switch off. It will be appreciated that when the switch is on, the projection device may automatically enable the image correction function based on gyro sensor changes; when the switch is off, the projection device does not enable the image correction function.
In some embodiments, fig. 10 is a schematic diagram of the projection device triggering image correction in an embodiment of the present application. Referring to fig. 10, the controller acquires the acceleration data collected by the acceleration sensor. If the acceleration data is below the preset acceleration threshold, the projection device is judged to be stationary, and the controller controls the optical machine to project the correction card onto the projection surface, determines the region to be projected on the projection surface based on the correction card, and projects the playing content into that region.
The controller monitors and receives in real time the acceleration data collected by the gyro sensor, i.e., the acceleration sensor. The acceleration data covers three axes (X, Y and Z). Since the data represents acceleration acquired along the three axis directions, a change in the data on any one axis indicates that the projection device has been displaced in that direction; whether the device has moved can therefore be determined from the data on all three axes. Next, the acceleration data in the three axis directions are converted into angular velocity data, from which the movement angles of the projection device about the three axes are determined. The controller thus calculates the movement angles in the three axis directions from the previously acquired and currently acquired acceleration data. If the movement angles in all three axis directions are below the preset acceleration threshold, the projection device is judged to be stationary, and the optical machine can be controlled to project the correction card onto the projection surface to trigger the image correction process. It should be noted that the embodiments of the present application do not limit the value of the preset acceleration threshold; one skilled in the art may choose it according to actual requirements, for example 0.1, 0.2 or 0.3, without exceeding the protection scope of the embodiments of the present application.
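A minimal sketch of this stillness test; a real implementation would convert the samples to per-axis movement angles as described above, whereas this simplification compares successive raw samples against the example threshold:

```python
ACCEL_THRESHOLD = 0.1  # example value from the text

def is_static(prev_sample, curr_sample) -> bool:
    """prev_sample, curr_sample: (x, y, z) acceleration readings.

    The device is treated as stationary only if the change on every
    axis stays below the preset threshold.
    """
    return all(abs(c - p) < ACCEL_THRESHOLD
               for p, c in zip(prev_sample, curr_sample))
```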
In some embodiments, fig. 11 is a schematic diagram of determining whether the projection device has moved according to an embodiment of the present application, including the following steps:
Step 1101, determine whether the acceleration data is below the preset acceleration threshold; if not, execute step 1102; if yes, execute step 1104;
Step 1102, assign the data tag corresponding to the acceleration data as the second state identifier;
Step 1103, which characterizes the projection device as being in a moving state;
Step 1104, assign the data tag corresponding to the acceleration data as the first state identifier;
Step 1105, acquire the acceleration data at the first moment;
Step 1106, detect whether the data tag corresponding to the first-moment acceleration data is the first state identifier; if not, execute step 1107; if yes, execute step 1108;
Step 1107, which characterizes the projection device as being in a moving state;
Step 1108, acquire the acceleration data at the second moment;
Step 1109, detect whether the second-moment acceleration data is below the preset acceleration threshold; if not, execute step 1110; if yes, execute step 1111;
Step 1110, assign the data tag corresponding to the second-moment acceleration data as the second state identifier;
Step 1111, trigger the image auto-correction process.
Referring to fig. 11, the controller is further configured to: if the acceleration data is smaller than a preset acceleration threshold value, the data label corresponding to the acceleration data is assigned to be a first state identifier, and the first state identifier is used for representing that the projection equipment is in a static state. If the acceleration data is greater than the preset acceleration threshold value, the data tag is assigned to a second state identifier, and the second state identifier is used for representing that the projection device is in a moving state.
For example, with a preset acceleration threshold of 0.1: when the acceleration data exceeds 0.1, the current data tag is assigned the second state identifier, for example movingType is moving and STATICTYPE is false; when the acceleration data is below 0.1, the current data tag is assigned the first state identifier, for example movingType is static and STATICTYPE is true.
To determine more accurately whether the projection device is truly stationary, the controller is further configured, when the acceleration data is below the preset acceleration threshold, to: acquire the acceleration data at a first moment after a preset interval; detect the data tag corresponding to that data; if the tag is the first state identifier, acquire the acceleration data at a second moment, separated from the first moment by the preset interval; and calculate the movement angle of the projection device from the second-moment acceleration data, so as to control the optical machine to project the correction card onto the projection surface according to that angle.
For example, with a preset interval of 1 s: when the acceleration data is below 0.1, the acceleration data at the first moment is acquired again after 1 s, and its data tag is judged. Two cases may then occur.
First case: if the data tag of the first-moment acceleration data is the second state identifier (movingType is moving, STATICTYPE is false), the acceleration data at the second moment is acquired after a further 1 s interval. Its data tag is judged again; if it is still the second state identifier, the projection device is judged to be moving, and the subsequent image correction process is not triggered.
Second case: if the data tag of the first-moment acceleration data is the first state identifier (movingType is static, STATICTYPE is true), the acceleration data at the second moment is acquired after a further 1 s interval. Its data tag is judged again; if it is still the first state identifier, the projection device is judged to be stationary, and the subsequent image correction process is triggered.
It should be noted that, since the acceleration data is collected by the gyro sensor in real time and transmitted to the controller, the process of acquiring acceleration data at different moments and judging the corresponding data tags may be repeated several times, so that whether the projection device is truly stationary can be judged accurately. The embodiments of the present application do not specifically limit the number of repetitions of this acquisition and judgment process; it may be set according to the state of the specific projection device, all within the protection scope of the embodiments of the present application.
In some embodiments, after the step of acquiring the acceleration data collected by the acceleration sensor, the controller is configured, if the acceleration data exceeds the preset acceleration threshold, to: acquire the acceleration data at a third moment, separated from the original acquisition by the preset interval; and, if the third-moment acceleration data is below the preset acceleration threshold, calculate the movement angle of the projection device and control the optical machine to project the correction card onto the projection surface according to that angle.
If the acceleration data acquired by the controller exceeds 0.1, its data tag is the second state identifier, and the projection device is considered to be moving. Because the acceleration sensor collects and sends data in real time, the controller continues to acquire and check acceleration data after the device enters the moving state. It therefore acquires the acceleration data at the third moment after 1 s: if it still exceeds 0.1, the projection device is judged to be still moving; if it is below 0.1, acceleration data at further moments are acquired and their data tags judged repeatedly to confirm that the projection device is stationary. Once stationary, the controller calculates the movement angle of the projection device from the acceleration data and triggers the image correction process, which is not repeated here.
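The double-confirmation logic across moments might be sketched as follows; `read_acceleration` is a hypothetical callable returning an (x, y, z) sample, and the interval and threshold use the example values from the text:

```python
import time

ACCEL_THRESHOLD = 0.1   # example threshold from the text
INTERVAL_S = 1.0        # example preset interval from the text

def confirm_static(read_acceleration) -> bool:
    """Trigger correction only if two samples, one interval apart, are static."""
    def static(sample):
        return all(abs(v) < ACCEL_THRESHOLD for v in sample)

    first = read_acceleration()
    time.sleep(INTERVAL_S)       # wait the preset interval between samples
    second = read_acceleration()
    return static(first) and static(second)
```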
In one implementation, the image correction process specifically includes: the controller controls the optical machine to project the correction card onto the projection surface, determines the region to be projected on the projection surface based on the correction card, and finally projects the playing content into that region. The correction card includes a correction white card and a correction chart card.
The image correction process is specifically described below with reference to fig. 12.
Referring to fig. 12, after receiving an input image correction instruction, the controller controls the optical machine to project the correction white card onto the projection surface and extracts the first correction coordinates from it. The controller then controls the optical machine to project the correction chart card onto the projection surface and extracts the second correction coordinates from it. From the first and second correction coordinates, the controller obtains the angular relation between the coordinates, which is the projection relation between the playing content projected by the optical machine and the projection surface. Finally, the controller updates the first correction coordinates based on the angular relation, so that the region to be projected is determined from the updated first correction coordinates.
After the controller receives the input image correction instruction, referring to fig. 13, it controls the optical machine to project the correction white card onto the projection surface and extracts the first correction coordinates, which include the four vertex coordinates of the white card. Referring to fig. 14, the controller then controls the optical machine to project the correction chart card onto the projection surface and extracts the second correction coordinates, which include the four vertex coordinates of the chart card. The controller compares each vertex of the white card with the corresponding vertex of the chart card (upper-left with upper-left, lower-left with lower-left, upper-right with upper-right, and lower-right with lower-right), calculates the angular relation between the coordinates from the comparison, and updates the four vertex coordinates of the white card according to that relation. The region to be projected can then be determined from the updated white-card vertex coordinates, completing the image correction; finally, the optical machine is controlled to project the playing content into the region to be projected for the user to watch.
In one implementation, the projection device may acquire the angle relation between the first correction coordinates and the second correction coordinates through a preset correction algorithm. This angle relation is the projection relation of the projection device when projecting an image onto the projection surface, specifically the mapping relation between the play content projected by the optical machine of the projection device and the projection surface.
In one implementation, the projection device may convert the first correction coordinates into the second correction coordinates through a preset correction algorithm, thereby completing the update of the first correction coordinates.
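The embodiments do not disclose the preset correction algorithm itself. As a minimal sketch, the angle relation between the two sets of four vertex coordinates can be modeled as a perspective (homography) transform, here with OpenCV; all coordinate values shown are hypothetical placeholders for the values extracted from the captured images:

```python
import cv2
import numpy as np

# Hypothetical vertex coordinates (top-left, top-right, bottom-right,
# bottom-left), standing in for the values extracted from the captures.
first_coords = np.float32([[100, 80], [1180, 90], [1170, 640], [110, 630]])   # white card
second_coords = np.float32([[102, 85], [1175, 95], [1165, 635], [112, 628]])  # chart card

# Model the angle relation as the homography mapping the white-card
# vertices onto the chart-card vertices.
H = cv2.getPerspectiveTransform(first_coords, second_coords)

# Updating the first correction coordinates = mapping them through H;
# the updated vertices then delimit the area to be projected.
updated = cv2.perspectiveTransform(first_coords.reshape(-1, 1, 2), H).reshape(-1, 2)
```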
To obtain the first correction coordinates and the second correction coordinates, after the controller controls the optical machine to project the correction white card and the correction chart card onto the projection surface, the controller can also control the camera to photograph the correction white card and the correction chart card displayed on the projection surface, obtaining a correction white card image and a correction chart card image. The first correction coordinates and the second correction coordinates can then be acquired from the correction white card image and the correction chart card image captured by the camera.
The first correction coordinates and the second correction coordinates are determined in the image coordinate system corresponding to the correction white card image and the correction chart card image. The image coordinate system takes the center of the image as the origin of coordinates, with the X and Y axes parallel to the two sides of the image. Alternatively, for a projection area preset by the user, the center point of the projection area may be set as the origin, with the horizontal direction as the X axis and the vertical direction as the Y axis.
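For illustration, converting a pixel coordinate (origin at the top-left of the capture) into the centered image coordinate system described above might look as follows; the upward-positive Y axis is an assumption, since the embodiments do not fix the axis orientation:

```python
def to_centered_coords(px, py, width, height):
    """Convert a pixel coordinate (origin at the top-left corner) into the
    image coordinate system above: origin at the image center, X and Y axes
    parallel to the image sides (upward-positive Y is an assumption)."""
    return px - width / 2.0, (height / 2.0) - py

# The center pixel of a 1280x720 capture maps to the origin:
assert to_centered_coords(640, 360, 1280, 720) == (0.0, 0.0)
```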
It should be noted that the first correction coordinates and the second correction coordinates include, but are not limited to, four vertex coordinates; the number and positions of the coordinates can be adjusted according to the correction white card, the correction chart card, and the projection device.
In some embodiments, the correction chart card may include patterns and color features preset by the user. FIG. 15 is a schematic diagram of a correction chart card in some embodiments. The correction chart card may include three correction areas: a first correction area, a second correction area, and a third correction area, arranged one after another along the direction from the center of the correction chart card to its edge. The first correction area is located in the middle area of the correction chart card, such as area A in the figure, and may be a checkerboard pattern or a circular ring pattern. The second correction area is located between the first correction area and the third correction area, such as the four B areas in the figure, comprising correction areas B1 to B4. The second correction areas may all have the same area, which may be smaller than that of the first correction area. Each second correction area includes a pattern preset by the user, and the patterns of the second correction areas may be the same or different, which is not limited in the embodiments of the present application. The third correction area is located in the edge area of the correction chart card, such as the 16 C areas in the figure, comprising correction areas C1 to C16. The third correction areas may all have the same area, smaller than that of the second correction areas. Each third correction area may include a pattern preset by the user, and the patterns of the third correction areas may be the same or different.
Further, the second correction coordinates may be the four vertex coordinates of all the patterns in the three correction areas, or the four vertex coordinates of any one or more patterns in any of the correction areas.
In some embodiments, as shown in fig. 16, the patterns in the correction chart card may also be provided as circular patterns, including circular ring patterns. It should be noted that the correction chart card may also be configured as a combination of the two types of patterns, or as other patterns capable of supporting image correction.
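For illustration only, the region layout of the correction chart card in fig. 15 could be represented as a small lookup table; the field names and the helper below are hypothetical:

```python
# Hypothetical lookup table for the chart-card layout of fig. 15:
# one central area A, four middle areas B1-B4, sixteen edge areas C1-C16.
CHART_CARD_REGIONS = {
    "A": {"count": 1,  "zone": "center", "pattern": "checkerboard-or-ring"},
    "B": {"count": 4,  "zone": "middle", "pattern": "user-preset"},
    "C": {"count": 16, "zone": "edge",   "pattern": "user-preset"},
}

def region_names():
    """Enumerate A, B1..B4, C1..C16 when selecting which patterns supply
    the second correction coordinates."""
    names = []
    for key, spec in CHART_CARD_REGIONS.items():
        if spec["count"] == 1:
            names.append(key)
        else:
            names.extend(f"{key}{i}" for i in range(1, spec["count"] + 1))
    return names
```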
In some embodiments, the controller sequentially controls the optical machine to project the correction white card and the correction chart card onto the projection surface, triggering the auto-focusing process after each projection. For convenience of description, the auto-focusing performed after the correction white card is projected onto the projection surface is referred to as the first focusing, and the auto-focusing performed after the correction chart card is projected onto the projection surface is referred to as the second focusing.
The projection device provided by the embodiments of the application further comprises a lens, where the lens comprises an optical assembly and a driving motor connected to the optical assembly. The driving motor adjusts the focal length of the optical assembly to realize the first focusing process and the second focusing process. When executing the first and second focusing processes, the controller controls the lens to adopt a contrast focusing method and controls the driving motor to drive the optical assembly to move step by step, calculating a contrast value for each position of the lens as it moves. A target position is then determined according to the contrast values, where the target position is the position whose image has the highest contrast value. Finally, the driving motor is controlled to move the optical assembly to the target position.
In some implementations, the projection device may also obtain the current object distance by means of its built-in laser ranging, and use an auto-focus algorithm to calculate an initial focal length and a search range from that distance. The projection device then drives the camera to take pictures and evaluates their definition with a corresponding algorithm. In this way, the projection device searches for the best possible focal length within the above search range based on a search algorithm, repeating the photographing and definition-evaluation steps until the optimal focal length is found by definition comparison, thereby completing automatic focusing.
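As a minimal sketch of this contrast focusing method, the following code evaluates definition with the variance of the Laplacian (one common contrast metric; the embodiments do not name the exact metric) and sweeps the drive motor across a search range seeded by the laser-ranged object distance. move_motor_to and capture_frame are hypothetical device interfaces, and the numeric defaults are placeholders:

```python
import cv2
import numpy as np

def sharpness(frame_bgr):
    """Definition score of a captured frame: variance of the Laplacian."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    return cv2.Laplacian(gray, cv2.CV_64F).var()

def autofocus(move_motor_to, capture_frame, initial_pos, span=40.0, step=2.0):
    """Step the drive motor across [initial_pos - span/2, initial_pos + span/2]
    (initial_pos would be seeded from the laser-ranged object distance via the
    lens model) and settle at the position whose frame scores highest."""
    best_pos, best_score = initial_pos, -1.0
    for pos in np.arange(initial_pos - span / 2, initial_pos + span / 2 + step, step):
        move_motor_to(pos)                  # drive the optical assembly stepwise
        score = sharpness(capture_frame())  # photograph and evaluate definition
        if score > best_score:
            best_pos, best_score = pos, score
    move_motor_to(best_pos)                 # target position: highest contrast
    return best_pos
```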
In some embodiments, after the projection device is started and the user changes its position, the projection device detects whether the auto-focus function is on before triggering image correction. When the auto-focus function is not on, the controller does not trigger the auto-focusing process. When the auto-focus function is on, the projection device automatically triggers the first focusing, the second focusing, and the focusing performed after image correction is completed.
In some embodiments, the controller may also trigger the eye-protection mode after performing the image correction process. Triggering the eye-protection mode reduces the display brightness of the user interface, preventing the risk of vision damage if a user accidentally enters the range of the laser beam emitted by the projection device. When a user enters the projection area of the projection device, the controller can also control the user interface to display corresponding prompt information to remind the user to leave the current area.
As shown in fig. 17, a flowchart of the projection device triggering an eye-protection mode is provided, which includes the following steps:
Step 1701, judging whether the child viewing mode switch is turned on; if yes, executing step 1702, otherwise ending the trigger;
Step 1702, judging whether the anti-eye switch is turned on; if yes, executing step 1703, otherwise ending the trigger;
Step 1703, judging whether image correction is in progress; if not, executing step 1704, otherwise ending the trigger;
Step 1704, judging whether auto-focusing is in progress; if not, executing step 1705, otherwise ending the trigger;
Step 1705, triggering entry into the eye-protection mode. In one implementation, referring to fig. 17, after performing the image correction process the controller detects whether the child viewing mode is on, for example by detecting whether the child viewing mode switch is turned on. With the child viewing mode switch turned on, the controller detects whether the anti-eye switch is turned on. If the anti-eye switch is in the on state, then when a user enters the projection area the projection device triggers entry into the eye-protection mode: the laser intensity emitted by the light source is automatically reduced, the display brightness of the user interface is lowered, and safety prompt information is displayed.
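For illustration, the decision chain of steps 1701 to 1705 can be sketched as follows; dev is a hypothetical object exposing the switches, progress flags, and actions named in the flowchart:

```python
def maybe_enter_eye_protection(dev):
    """Mirror of steps 1701-1705 in fig. 17."""
    if not dev.child_viewing_mode_on:      # step 1701
        return False
    if not dev.anti_eye_switch_on:         # step 1702
        return False
    if dev.correction_in_progress:         # step 1703
        return False
    if dev.autofocus_in_progress:          # step 1704
        return False
    dev.reduce_laser_intensity()           # step 1705: enter eye-protection mode
    dev.dim_user_interface()
    dev.show_safety_prompt()
    return True
```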
In some embodiments, the controller may also turn on the anti-eye switch automatically: after receiving the acceleration data sent by the acceleration sensor, or data collected by other sensors, the controller controls the projection device to turn on the anti-eye switch.
In one implementation, when the controller triggers the eye-protection function after performing the image correction process, the controller is further configured to: acquire a preset number of frames of images of the same content through the camera; calculate the similarity value between corresponding images within the preset frames; and if the similarity value is greater than the similarity threshold, adjust the brightness of the light source to a preset brightness, where the preset brightness is lower than the brightness in normal projection.
For example, the controller acquires two frames of images and calculates the similarity value between the image of the current frame and the image of the previous frame. If the similarity value is smaller than the similarity threshold, the two frames are judged to be identical and no person has entered the projection area. If the similarity value is greater than the similarity threshold, the two frames are judged to be different and a person has entered the projection area. Thus, when it is detected that a person has entered the projection area, the brightness of the light source is adjusted to the preset brightness, achieving the purpose of protecting human eyes by turning the light source brightness down. It should be noted that the application does not limit how entry of a person into the projection area is detected; comparing whether the preset frames of images are identical is only taken as an example.
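For illustration only, the frame comparison can be sketched with a mean-absolute-difference score that plays the role of the similarity value above (it grows as the frames diverge, matching the convention that a value above the threshold means a person has entered); both the metric and the threshold value are assumptions:

```python
import cv2
import numpy as np

SIMILARITY_THRESHOLD = 0.02  # hypothetical value; the embodiments leave it open

def person_entered(prev_bgr, curr_bgr):
    """Return True when the current frame diverges from the previous one
    beyond the threshold, i.e. someone has entered the projection area."""
    prev = cv2.cvtColor(prev_bgr, cv2.COLOR_BGR2GRAY).astype(np.float32) / 255.0
    curr = cv2.cvtColor(curr_bgr, cv2.COLOR_BGR2GRAY).astype(np.float32) / 255.0
    score = float(np.mean(np.abs(curr - prev)))  # grows as the frames diverge
    return score > SIMILARITY_THRESHOLD
```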
It should be noted that the image correction process, the auto-focusing process, and the eye-protection mode cannot be triggered simultaneously: while image correction or auto-focusing is being performed, the eye-protection mode must not be triggered, lest the correction result be affected. Thus, if the corresponding images in the preset frames are not identical, with continued reference to fig. 17, the controller is further configured to: detect the correction progress of the image; if no correction is in progress, acquire the acceleration data at the current moment; and adjust the brightness of the light source to the preset brightness according to the acceleration data at the current moment.
Thus, after detecting that a person has entered the projection area, the controller first checks the correction progress of the image. If no image correction is in progress, it acquires and examines the acceleration data at the current moment and judges from it whether the projection device is in a moving state. If the projection device is in a static state, the brightness of the light source is adjusted to the preset brightness, that is, the eye-protection mode is triggered.
In some embodiments, if no correction is in progress, the controller is further configured to: detect the focusing progress of the image; if no focusing is in progress, acquire and examine the acceleration data at the current moment; judge from the acceleration data at the current moment whether the projection device is in a moving state; and if the projection device is in a static state, trigger the eye-protection mode.
When detecting whether the auto-focus function is on, the controller may detect whether the auto-focus switch is turned on. If the auto-focus switch is not turned on, the eye-protection mode is triggered directly. If the auto-focus switch is turned on, the controller must judge the focusing progress of the image at the current moment, and triggers the eye-protection mode only if no focusing is in progress.
In the embodiments of the application, with the image correction switch in its default on state, the projection device can trigger the image correction process based on changes reported by the acceleration sensor, and can enter the eye-protection mode after the image correction process is completed. By examining the acceleration data collected by the acceleration sensor, whether the projection device is in a static state is judged more accurately, avoiding false triggering of the image correction process or of the eye-protection mode while the projection device is moving. Further, the image correction process and the auto-focusing process are prevented from being triggered while entering the eye-protection mode, improving the user experience.
In some embodiments, when the image correction switch of the projection device is in the off state, the controller does not trigger the image correction process. In this case, when a person is detected entering the projection area with the anti-eye switch turned on, the projection device automatically enters the eye-protection mode, that is, it reduces the laser intensity emitted by the light source, lowers the display brightness of the user interface, and displays safety prompt information. It should be noted that the specific steps for triggering the eye-protection mode are as described in the above embodiments and are not repeated here.
Thus, when the image correction switch is in its default off state, the projection device checks the state of the auto-focus switch and triggers the eye-protection mode directly if the auto-focus switch is also off. Meanwhile, while the eye-protection mode is being triggered, the correction progress and the focusing progress are checked so that the image correction process and the auto-focusing process are not triggered at the same time as entry into the eye-protection mode, improving the user experience.
According to the above technical solution, the projection device provided by some embodiments of the application acquires the acceleration data collected by the acceleration sensor in the scenario where image correction and the eye-protection mode may be triggered by default. If the acceleration data is smaller than a preset acceleration threshold, the optical machine is controlled to project the correction card onto the projection surface; the area to be projected is determined in the projection surface based on the correction card, and the play content is projected to the area to be projected. Image correction is thereby triggered only after the projection device is determined to be stationary following a move. The camera then acquires a preset number of frames of images of the same content, and the brightness of the light source is adjusted according to these images, that is, the eye-protection function is triggered. In this way, after the image correction process is triggered, the eye-protection mode is triggered based on the preset frames of images having the same content. This solves the problem of falsely triggering image correction and the eye-protection function while the projection device is moving, and improves the user experience.
In some embodiments, the present application further provides a trigger correction method applied to a projection device, where the projection device comprises a light source, an acceleration sensor, an optical machine, a camera, and a controller; the trigger correction method comprises the following steps:
acquiring acceleration data collected by the acceleration sensor; if the acceleration data is smaller than a preset acceleration threshold, controlling the optical machine to project the correction chart card onto the projection surface; determining the area to be projected in the projection surface based on the correction chart card, and projecting the play content to the area to be projected; acquiring a preset number of frames of images of the same content through the camera; and adjusting the brightness of the light source to a preset brightness, wherein the preset brightness is lower than the brightness in normal projection.
In some embodiments, a method comprises: if the acceleration data is smaller than a preset acceleration threshold value, assigning a data tag corresponding to the acceleration data as a first state identifier, wherein the first state identifier is used for representing that the projection equipment is in a static state; if the acceleration data is greater than the preset acceleration threshold value, the data tag is assigned to a second state identifier, and the second state identifier is used for representing that the projection device is in a moving state.
In some embodiments, if the acceleration data is less than a preset acceleration threshold, the method comprises: acquiring acceleration data at a first moment based on a preset duration; detecting a data tag corresponding to acceleration data at a first moment; if the data tag value is the first state identification, acquiring acceleration data at a second moment; the interval between the second moment and the first moment is preset; and calculating the movement angle of the projection equipment according to the acceleration data at the second moment, so as to control the optical machine to project the correction chart card to the projection surface according to the movement angle.
In some embodiments, if the acceleration data is greater than a preset acceleration threshold, the method includes: after the step of acquiring the acceleration data acquired by the acceleration sensor, acquiring the acceleration data at a third moment based on a preset time length; the third moment is spaced from the moment of acquiring the acceleration data by a preset time length; and if the acceleration data at the third moment is smaller than the preset acceleration threshold value, calculating the movement angle of the projection equipment, and controlling the optical machine to project the correction chart card to the projection surface according to the movement angle.
In some embodiments, the method comprises: in the step where the acceleration data is smaller than the preset acceleration threshold, acquiring the most recently collected acceleration data; calculating the movement angles of the projection device in the three axis coordinate directions according to the acceleration data; and if the movement angles in the three axis coordinate directions are all smaller than the preset threshold, controlling the optical machine to project the correction chart card onto the projection surface.
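For illustration, the movement angles in the three axis coordinate directions can be derived from a static gravity reading with the standard accelerometer tilt formulas; the embodiments do not disclose the exact calculation, and rotation about the gravity axis is not observable from the accelerometer alone:

```python
import math

def movement_angles(ax, ay, az):
    """Tilt angles (degrees) about the three axis coordinate directions from a
    static gravity reading (ax, ay, az in g). Rotation about the gravity axis
    cannot be recovered from the accelerometer alone, so it is reported as 0."""
    roll = math.degrees(math.atan2(ay, az))
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    return roll, pitch, 0.0
```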
In some embodiments, the method comprises: in the step of adjusting the brightness of the light source to the preset brightness, calculating the similarity value between corresponding images within the preset frames; and if the similarity value is greater than the similarity threshold, adjusting the brightness of the light source to the preset brightness.
In some embodiments, if the corresponding images in the preset frames are not the same, the method comprises: detecting the correction progress of the image; if no correction is in progress, acquiring the acceleration data at the current moment; and adjusting the brightness of the light source to the preset brightness according to the acceleration data at the current moment.
In some embodiments, the correction card comprises a correction white card and a correction chart card, and the method comprises: in the step of determining the area to be projected in the projection surface based on the correction card, sequentially controlling the optical machine to project the correction white card and the correction chart card onto the projection surface; extracting the first correction coordinates in the correction white card and the second correction coordinates in the correction chart card respectively; acquiring the angle relation between the first correction coordinates and the second correction coordinates, wherein the angle relation is the projection relation between the play content projected by the optical machine and the projection surface; and updating the first correction coordinates based on the angle relation, so that the area to be projected is determined from the updated first correction coordinates.
In some embodiments, the projection device further comprises a lens comprising an optical assembly and a driving motor, the driving motor being connected to the optical assembly to adjust its focal length; the method comprises: after the step of controlling the optical machine to project the correction white card, controlling the driving motor to drive the optical assembly to move step by step; calculating a contrast value for each position of the lens as it moves; determining a target position according to the contrast values, the target position being the position corresponding to the image with the highest contrast value; and controlling the driving motor to move the optical assembly to the target position.
The same and similar parts of the embodiments in this specification may be referred to one another and are not repeated here.
Claims (19)
- A projection device, comprising:
  an optical machine, configured to project play content onto a projection surface;
  a camera, configured to acquire a display image of the projection surface; and
  a controller, configured to:
  after detecting that the projection device has moved, control the camera to successively acquire a first image and a second image projected onto the projection surface by the optical machine, wherein the first image corresponds to a first image card projected by the optical machine, the second image corresponds to a second image card projected by the optical machine, and the first image and the second image are used for corresponding feature comparison to determine a mapping relation between the projected image card and the projection surface in a world coordinate system; and
  acquire a rectangular area into which the camera-captured image is converted in the world coordinate system based on the mapping relation, and control the optical machine to project the play content to the rectangular area, the rectangular area replacing the malformed projection area formed on the projection surface after the projection device has moved.
- The projection device of claim 1, wherein before controlling the optical machine to project the play content to the rectangular area, the controller is further configured to:
  perform graying processing on the captured image;
  cluster the grayed captured image into a projection beam area and an obstacle area based on a clustering algorithm and gray values; and
  binarize the projection beam area and the obstacle area to extract a first closed contour larger than a preset obstacle-avoidance area threshold, wherein the first closed contour corresponds to an obstacle and is avoided by the rectangular area generated on the projection surface, so as to realize projection obstacle avoidance.
- The projection device of claim 2, wherein before controlling the optical machine to project the play content to the rectangular area, the controller is further configured to:
  binarize the projection beam area and the obstacle area to extract a second closed contour smaller than the preset obstacle-avoidance area threshold, wherein the second closed contour corresponds to a non-obstacle and is filled into the projection beam area, so that the position of the rectangular area generated on the projection surface is not affected by the second closed contour.
- The projection device of claim 1, wherein acquiring, based on the mapping relation, the rectangular area into which the camera-captured image is converted in the world coordinate system specifically comprises the controller being configured to:
  determine, based on the mapping relation, the center point coordinates of the captured image corresponding to the projected image card in the world coordinate system; and
  perform rectangular expansion from the center point coordinates according to the aspect ratio of the projected image card until a gray-value change boundary is reached, and take the four vertices of the rectangle at that moment as the vertices of the rectangular area.
- The projection device of claim 1, wherein acquiring, based on the mapping relation, the rectangular area into which the camera-captured image is converted in the world coordinate system specifically comprises the controller being configured to:
  acquire the maximum inscribed rectangle with a gray value of 255 in the captured image;
  correct the width value and the height value of the maximum inscribed rectangle according to the aspect ratio of the projected image card; and
  obtain any vertex of the maximum inscribed rectangle based on the width value and the height value, so as to position the rectangular area.
- The projection device of claim 1, wherein the controller is further configured to:
  after the step of controlling the optical machine to project the play content to the rectangular area, identify the projection area of the projection device using an edge detection algorithm based on the display image of the projection surface acquired by the camera; and
  when the projection area is displayed as a rectangle or a rectangle-like shape, acquire the coordinate values of the four vertices of the rectangular projection area through a preset algorithm.
- The projection device of claim 6, wherein the controller is further configured to:
  correct the projection area into a rectangle using a perspective transformation method, and calculate the difference between the rectangle and the captured projection picture to judge whether a foreign object exists in the display area.
- The projection device of claim 6, wherein the controller is further configured to:
  when foreign-object detection is performed on an area outside the projection area, take the difference between the camera content of the current frame and that of the previous frame to judge whether a foreign object has entered the area outside the projection range; and
  if a foreign object is judged to have entered, automatically trigger the anti-eye function of the projection device.
- The projection device of claim 6, wherein the controller is further configured to:
  detect real-time depth changes of a specific area with a time-of-flight camera or a time-of-flight sensor; and
  if the depth value changes beyond a preset threshold, automatically trigger the anti-eye function of the projection device.
- The projection device of claim 6, wherein the controller is further configured to:
  analyze and judge whether the eye-shot prevention function needs to be started based on collected time-of-flight data, screenshot data, and camera data.
- The projection device of claim 6, wherein the controller is further configured to:
  if the user is located in a specific area where the user's vision is at risk of laser damage, automatically start the eye-shot prevention function, so as to reduce the laser intensity emitted by the optical machine, lower the display brightness of the user interface, and display safety prompt information.
- The projection device of claim 6, wherein the controller is further configured to:
  monitor movement of the device by means of a gyroscope or a gyroscope sensor; and
  send signaling to the gyroscope to query the device state, and receive from the gyroscope signaling used to judge whether the device has moved.
- The projection device of claim 6, wherein the controller is further configured to:
  after the gyroscope data has been stable for a preset duration, trigger trapezoidal correction, and refrain from responding to instructions sent by remote-control keys while trapezoidal correction is in progress.
- The projection device of claim 6, wherein the controller is further configured to:
  identify the curtain through an automatic obstacle-avoidance algorithm, and correct the play content onto the curtain for display using projective transformation, so as to achieve alignment with the edges of the curtain.
- A display control method for a projected image of a projection device, the method comprising:
  after detecting movement of the projection device, successively acquiring a first image and a second image projected onto a projection surface, wherein the first image corresponds to the projection of a first image card, the second image corresponds to the projection of a second image card, and the first image and the second image are used for corresponding feature comparison to determine a mapping relation between the projected image card and the projection surface in a world coordinate system, the projection device comprising an optical machine for projecting play content onto the projection surface and a camera for acquiring a display image of the projection surface; and
  acquiring a rectangular area into which the camera-captured image is converted in the world coordinate system based on the mapping relation, and projecting the play content to the rectangular area, the rectangular area being used to replace the malformed projection area formed on the projection surface after the movement.
- The display control method of a projected image according to claim 15, wherein before projecting the play content to the rectangular area, the method further comprises:
  performing graying processing on the captured image;
  clustering the grayed captured image into a projection beam area and an obstacle area based on a clustering algorithm and gray values; and
  binarizing the projection beam area and the obstacle area to extract a first closed contour larger than a preset obstacle-avoidance area threshold, wherein the first closed contour corresponds to an obstacle and is avoided by the rectangular area generated on the projection surface, so as to realize projection obstacle avoidance.
- The display control method of a projected image according to claim 16, wherein before projecting the play content to the rectangular area, the method further comprises:
  binarizing the projection beam area and the obstacle area to extract a second closed contour smaller than the preset obstacle-avoidance area threshold, wherein the second closed contour corresponds to a non-obstacle and is filled into the projection beam area, so that the position of the rectangular area generated on the projection surface is not affected by the second closed contour.
- The display control method of a projected image according to claim 15, wherein acquiring, based on the mapping relation, the rectangular area into which the camera-captured image is converted in the world coordinate system specifically comprises:
  determining, based on the mapping relation, the center point coordinates of the captured image corresponding to the projected image card in the world coordinate system; and
  performing rectangular expansion from the center point coordinates according to the aspect ratio of the projected image card until a gray-value change boundary is reached, and acquiring the four vertices of the rectangle at that moment as the vertices of the rectangular area.
- The display control method of a projected image according to claim 15, wherein acquiring, based on the mapping relation, the rectangular area into which the camera-captured image is converted in the world coordinate system specifically comprises:
  acquiring the maximum inscribed rectangle with a gray value of 255 in the captured image;
  correcting the width value and the height value of the maximum inscribed rectangle according to the aspect ratio of the projected image card; and
  obtaining any vertex of the maximum inscribed rectangle based on the width value and the height value, so as to position the rectangular area.
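For illustration only, the obstacle-avoidance pipeline of claims 2-3 and 16-17 might be sketched as follows, with k-means as the assumed clustering algorithm and 500 px² as an assumed obstacle-avoidance area threshold (the claims name neither):

```python
import cv2
import numpy as np

def obstacle_contours(shot_bgr, area_threshold=500.0):
    """Gray the captured shot, split it into a projection-beam cluster and an
    obstacle cluster by gray value, binarize, and keep closed contours above
    the area threshold as obstacles (the first closed contours of claim 2)."""
    gray = cv2.cvtColor(shot_bgr, cv2.COLOR_BGR2GRAY)
    samples = gray.reshape(-1, 1).astype(np.float32)
    # Two clusters: the bright projection beam vs. darker obstacles.
    _, labels, centers = cv2.kmeans(
        samples, 2, None,
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 10, 1.0),
        3, cv2.KMEANS_PP_CENTERS)
    beam_label = int(np.argmax(centers))  # the brighter center is the beam
    binary = (labels.reshape(gray.shape) != beam_label).astype(np.uint8) * 255
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    # Contours at or above the threshold are obstacles the rectangular area
    # must avoid; smaller second closed contours would, per claim 3, be
    # filled back into the projection beam area instead.
    return [c for c in contours if cv2.contourArea(c) >= area_threshold]
```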
Applications Claiming Priority (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111355866 | 2021-11-16 | ||
CN2021113558660 | 2021-11-16 | ||
CN2022100506003 | 2022-01-17 | ||
CN202210050600.3A CN114205570B (en) | 2021-11-16 | 2022-01-17 | Projection equipment and display control method for automatically correcting projection image |
CN2022103995143 | 2022-04-15 | ||
CN202210399514.3A CN114866751B (en) | 2022-04-15 | 2022-04-15 | Projection equipment and trigger correction method |
PCT/CN2022/122816 WO2023087951A1 (en) | 2021-11-16 | 2022-09-29 | Projection device, and display control method for projected image |
Publications (1)
Publication Number | Publication Date |
---|---|
CN118104229A true CN118104229A (en) | 2024-05-28 |
Family
Family ID: 86396219
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202280063190.4A Pending CN118104229A (en) | 2021-11-16 | 2022-09-29 | Projection equipment and display control method of projection image |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN118104229A (en) |
WO (1) | WO2023087951A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN119323567B * | 2024-12-17 | 2025-09-12 | Ningbo Sunny Opotech Co., Ltd. | Module lens dirt detection and glue line detection method
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101916175B (en) * | 2010-08-20 | 2012-05-02 | 浙江大学 | Intelligent projection method adaptive to projection surface |
KR101820905B1 (en) * | 2016-12-16 | 2018-01-22 | 씨제이씨지브이 주식회사 | An image-based projection area automatic correction method photographed by a photographing apparatus and a system therefor |
CN110336987B (en) * | 2019-04-03 | 2021-10-08 | 北京小鸟听听科技有限公司 | Projector distortion correction method and device and projector |
CN110677634B (en) * | 2019-11-27 | 2021-06-29 | 成都极米科技股份有限公司 | Trapezoidal correction method, device and system for projector and readable storage medium |
CN112584113B (en) * | 2020-12-02 | 2022-08-30 | 深圳市当智科技有限公司 | Wide-screen projection method and system based on mapping correction and readable storage medium |
CN114466173B (en) * | 2021-11-16 | 2024-12-20 | 海信视像科技股份有限公司 | Projection equipment and projection display control method for automatically putting into screen area |
CN114866751B (en) * | 2022-04-15 | 2024-08-20 | 海信视像科技股份有限公司 | Projection equipment and trigger correction method |
- 2022-09-29: WO application PCT/CN2022/122816, published as WO2023087951A1 (not active; ceased)
- 2022-09-29: CN application CN202280063190.4A, published as CN118104229A (active; pending)
Also Published As
Publication number | Publication date |
---|---|
WO2023087951A1 (en) | 2023-05-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN114205570B (en) | Projection equipment and display control method for automatically correcting projection image | |
CN118541967A (en) | Projection device and correction method | |
CN114866751B (en) | Projection equipment and trigger correction method | |
CN115002433B (en) | Projection equipment and ROI feature area selection method | |
US20240305754A1 (en) | Projection device and obstacle avoidance projection method | |
CN115002432B (en) | Projection equipment and obstacle avoidance projection method | |
CN116320335A (en) | Projection equipment and method for adjusting projection picture size | |
CN118476210A (en) | Projection equipment and display control method | |
CN114885142B (en) | Projection equipment and method for adjusting projection brightness | |
CN114760454B (en) | Projection device and trigger correction method | |
CN115883803A (en) | Projection equipment and projection screen correction method | |
CN115243021A (en) | Projection equipment and obstacle avoidance projection method | |
CN114928728B (en) | Projection apparatus and foreign matter detection method | |
CN118104229A (en) | Projection equipment and display control method of projection image | |
JP2012181264A (en) | Projection device, projection method, and program | |
US20170214894A1 (en) | Projector and method of controlling projector | |
CN120034632A (en) | Projection device and projection obstacle avoidance method | |
CN119788824A (en) | Projection apparatus and projection control method | |
CN119788823A (en) | Projection apparatus and projection control method | |
CN118158367A (en) | Projection equipment and projection picture curtain entering method | |
JP2017152765A (en) | Projector and projector control method | |
CN119854464A (en) | Projection equipment and projection method | |
CN118383027A (en) | Laser projection device and method for correcting projected image |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |