
WO2018233533A1 - Online integration of augmented reality editing devices and systems - Google Patents

Online integration of augmented reality editing devices and systems

Info

Publication number
WO2018233533A1
Authority
WO
WIPO (PCT)
Prior art keywords
unit
cloud server
editing
file
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/CN2018/091180
Other languages
French (fr)
Chinese (zh)
Inventor
卢俊谚
蔡雅雯
卢博爵
黄柏元
费祥霆
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Publication of WO2018233533A1 publication Critical patent/WO2018233533A1/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/01Protocols
    • H04L67/131Protocols for games, networked simulations or virtual reality

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Processing Or Creating Images (AREA)

Abstract

An editing device and system for online integration of augmented reality. The editing device can acquire and edit an AR temporary file online in real time. The editing system of the present invention comprises a cloud server, an editor end, and a mobile device end. The editor end can select an AR temporary file and an interactive object on the cloud server and, once editing is complete, store the result as an AR target file, which it then uploads to the cloud server. The mobile device end acquires the AR target file from the cloud server and displays it. In addition, the mobile device end can capture new content, generate a new AR temporary file from the captured content, and deliver the new AR temporary file to the cloud server for storage.

Description

Online integration of augmented reality editing devices and systems

This application claims priority to Chinese Patent Application No. CN201710487618.9, entitled "Online integration of augmented reality editing devices and systems", filed on June 23, 2017, the entire contents of which are incorporated herein by reference.

Technical field

The present invention relates to a media editing apparatus and system, and in particular to an editing apparatus and system for online integration of augmented reality.

Background

As the computing power of mobile devices has grown, handsets have become thin, light and compact, and can capture and play back images or video. In the prior art, however, a still image or a dynamic movie can only show the scene as it was recorded and cannot interact with the user in any way; in other words, the user can only passively watch the image or movie. To make such images interactive, some vendors have begun adding augmented reality (AR) functions. AR is a technique that calculates the position and angle of the camera image in real time and superimposes corresponding images, video, sound or other multimedia objects. The goal of this technique is to overlay virtual objects onto real-world imagery on the display screen so that the user can interact with the virtual objects.

In the prior art, AR virtual objects or images must be developed jointly by 3D artists and specialized engineers. AR technology therefore requires high development costs and a large amount of manpower, which has kept the entry threshold for AR from coming down and has hindered the further development of the technology.

Summary of the invention

The present invention provides an editing device for online integration of augmented reality. The editor end of the editing device can obtain AR temporary files online in real time and can apply an AR temporary file to a captured environment video. The editor end can also upload the finished AR target file to the cloud server, where it becomes a new AR temporary file.

The editing device for online integration of augmented reality of the present invention comprises a communication unit, a display unit, an object operating unit and a processing unit. The communication unit is connected to a cloud server over a network and can obtain AR target files or interactive objects from the cloud server. The display unit displays the environment image, the AR temporary files, the AR target files or the interactive objects. The object operating unit controls the placement position and display angle of an interactive object in the display unit. The processing unit is electrically connected to the communication unit, the display unit and the object operating unit. Based on the identification information, the processing unit drives the communication unit to obtain from the cloud server the index information of the AR temporary files corresponding to that identification information, and downloads any AR target file according to the index information. Based on the interactive object and the AR temporary files generated at each step of editing the selected AR target file, the processing unit can select any AR temporary file and generate an AR target file, and it sends the generated AR target file to the cloud server through the communication unit.
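
The relationship between interactive objects, AR temporary files and AR target files described above can be pictured with a minimal data-model sketch. It is not taken from the patent itself; the class names, fields and the record() helper are illustrative assumptions.

```python
# A minimal sketch of the data objects described above (names are
# illustrative assumptions, not the patent's own identifiers).
from dataclasses import dataclass, field
from typing import Any, Dict, List


@dataclass
class InteractiveObject:
    """A virtual object (image, picture or 3D model) placed in a scene."""
    object_id: str
    position: tuple = (0.0, 0.0, 0.0)   # placement position in the display unit
    rotation: float = 0.0               # display angle


@dataclass
class ARTempFile:
    """Records every editing step applied to an interactive object."""
    temp_id: str
    steps: List[Dict[str, Any]] = field(default_factory=list)

    def record(self, operation: str, **params) -> None:
        # e.g. record("move", dx=1.0) or record("zoom", factor=2.0)
        self.steps.append({"op": operation, **params})


@dataclass
class ARTargetFile:
    """The finished result: interactive objects plus the selected temp files."""
    target_id: str
    objects: List[InteractiveObject]
    temp_files: List[ARTempFile]
    owner_id: str = ""                  # identification information of the editor


car = InteractiveObject("car")
log = ARTempFile("temp-1")
log.record("move", dx=5.0)
target = ARTargetFile("target-car", objects=[car], temp_files=[log], owner_id="user-001")
```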

The present invention also provides an editing system for online integration of augmented reality. The editing system comprises a cloud server, at least one editor end and a mobile device end.

The cloud server records a plurality of AR target files, a plurality of interactive objects or a user database, and the user database records the identification information of each user. The editor end is connected to the cloud server over a network and comprises a first communication unit, a first storage unit, a first display unit, a first processing unit and an object operating unit. The first processing unit is electrically connected to the first communication unit, the first storage unit, the first display unit and the object operating unit, and the first communication unit obtains from the cloud server the index information of the AR target files corresponding to the identification information. The first storage unit stores an object editing interface, at least one AR temporary file, the interactive objects or the AR target files. The first display unit is used to edit the object editing interface and the selected interactive object. The object operating unit controls the placement position and display angle of the interactive object in the first display unit, and an AR temporary file corresponding to that interactive object is generated during the editing process. The first processing unit generates an AR target file from the interactive object and the selected AR temporary file, and sends the generated AR target file to the cloud server through the first communication unit.

The mobile device end comprises a second processing unit, a second communication unit, a second display unit, an image selection unit and a second storage unit. The second processing unit is electrically connected to the second communication unit, the second display unit, the image selection unit and the second storage unit. The second communication unit is connected to the cloud server over a network. The image selection unit captures a target to be identified, and the second processing unit sends the target to be identified to the cloud server. The cloud server determines whether the target to be identified is one of the plurality of interactive objects; the cloud server then sends the AR target file corresponding to the target to be identified to the mobile device end, which plays the received AR target file on the second display unit.

With the editing device and system for online integration of augmented reality provided by the present invention, the editor end can obtain different AR temporary files online, and after editing is finished it can upload the AR target file to the cloud server. Different editor ends can trade or exchange the materials contained in AR temporary files, such as interactive objects or scenes. The mobile device end can download existing AR target files and can also send AR target files to other users through community sharing. In addition, the mobile device end can capture different objects or images so that the captured target is converted into the content of an AR temporary file.

Brief description of the drawings

The present invention is described in further detail below with reference to the drawings and specific embodiments.

FIG. 1 is a schematic diagram of the system architecture of the present invention.

FIG. 2 is a schematic diagram of the editing process of an AR temporary file of the present invention.

FIG. 3 is a schematic diagram of selecting an interactive object in the object editing interface of the present invention.

FIG. 4 is a schematic diagram of selecting an AR temporary file in the object editing interface of the present invention.

FIG. 5 is a schematic diagram of the operation of the second image recognition program of the present invention.

FIG. 6 is a schematic diagram of the identification code input operation of the present invention.

FIG. 7 is a schematic diagram of the community sharing operation of the present invention.

In the drawings:

Editing system 100; cloud server 110; AR temporary file 111; AR target file 112; user database 114; identification information 115; index information 116; editor end 120; first communication unit 121; first storage unit 122; first display unit 123; first processing unit 124; object operating unit 125; interactive object 126; object editing interface 127; mobile device end 130; second processing unit 131; second communication unit 132; second display unit 133; image selection unit 134; input unit 135; second storage unit 136; first image recognition program 141; second image recognition program 142; community sharing program 143; simple editing program 144; target to be identified 151; community sharing button 152; community icon 153.

Detailed description

FIG. 1 is a schematic diagram of the architecture and composition of the system of the present invention. The editing system 100 for online integration of augmented reality of the present invention comprises a cloud server 110, at least one editor end 120 and a mobile device end 130. Both the editor end 120 and the mobile device end 130 can be connected to the cloud server 110 over a network. The present invention consists of two parts, an editing unit and an execution unit: the editing unit comprises the editor end 120, used to create or edit augmented reality content, and the execution unit comprises the mobile device end 130, used to execute or display augmented reality content.

The cloud server 110 records information including a plurality of AR target files 112, a plurality of interactive objects 126 or a user database 114. An AR temporary file 111 records the temporary data of each step in the user's editing process, the steps being the editing of an interactive object 126. The user can select an interactive object 126 and at least one AR temporary file 111 to generate an AR target file 112. The user database 114 records the identification information 115 of each user, and each user may own a plurality of AR temporary files 111 or AR target files 112. A user who does not meet the set permissions can download an AR target file 112 but can only view the downloaded copy; such a user cannot view the AR target file 112 on the cloud server 110, nor view or edit the AR temporary files 111 to which that AR target file 112 belongs. A user who meets the viewing permissions can, after downloading an AR target file 112, open it and also edit its AR temporary files 111.

The editor end 120 comprises a first communication unit 121, a first storage unit 122, a first display unit 123, a first processing unit 124 and an object operating unit 125. The first processing unit 124 is electrically connected to the first communication unit 121, the first storage unit 122, the first display unit 123 and the object operating unit 125. The mobile device end 130 comprises a second processing unit 131, a second communication unit 132, a second display unit 133, an image selection unit 134, an input unit 135 and a second storage unit 136. The second processing unit 131 is electrically connected to the second communication unit 132, the second display unit 133, the image selection unit 134, the input unit 135 and the second storage unit 136. The editor end 120 may be, but is not limited to, a personal computer, a notebook computer, a smart phone, a tablet computer or a wearable device.

The first communication unit 121 is connected to the cloud server 110 over a network and can transmit the identification information 115 and AR target files 112 to the cloud server 110. The first storage unit 122 stores the object editing interface 127, interactive objects 126, AR target files 112 or the identification information 115. The first display unit 123 is used to make selections in the object editing interface 127 and to edit the selected interactive object 126. Through the object operating unit 125, the user can select buttons, image objects, lists or video objects in the object editing interface 127, and can drag or rotate an interactive object 126 to move it to any position in the object editing interface 127. The object operating unit 125 may be, but is not limited to, a keyboard, a mouse or a touch screen.

When the editor end 120 connects to the cloud server 110, the cloud server 110 performs a confirmation step on the user's identification information 115. After the verification of the identification information 115 is completed, the cloud server 110 sends the corresponding index information 116 to the user according to the identification information 115. The index information 116 records the AR target files 112 that the user may select. In general, the content stored in the cloud server 110 includes public index information 116 and index information 116 for personal use. After the user selects an AR target file 112, the editor end 120 downloads the selected AR target file 112 and displays its scene content in the object editing interface 127. In general, the scene of an AR temporary file 111 may be, but is not limited to, a still image, a festive greeting card, a postcard or a dynamic movie.
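
As an illustration of the handshake described above (verification of the identification information 115 followed by delivery of the index information 116), the following sketch assumes a simple in-memory user database; the function names and data layout are hypothetical.

```python
# Hypothetical sketch of the connection handshake: the cloud server verifies
# the user's identification information and answers with index information
# listing the AR target files that the user may open.
from typing import Dict, List, Optional

USER_DATABASE: Dict[str, Dict] = {
    "user-001": {"targets": ["target-postcard", "target-greeting-card"]},
}
PUBLIC_INDEX: List[str] = ["target-demo"]


def verify_identification(identification: str) -> bool:
    return identification in USER_DATABASE


def get_index_information(identification: str) -> Optional[List[str]]:
    """Return the index information: public entries plus personal entries."""
    if not verify_identification(identification):
        return None
    return PUBLIC_INDEX + USER_DATABASE[identification]["targets"]


print(get_index_information("user-001"))
# ['target-demo', 'target-postcard', 'target-greeting-card']
```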

The object editing interface 127 in this embodiment includes at least an AR target file list, an interactive object list, an editing area, location information, an event list and an action list. The user can select the object to be edited from the AR target file list and the interactive object list. Once selected, the related scene or interactive object 126 of the AR target file 112 is downloaded and displayed in the editing area of the object editing interface 127. The user can select a model from any interactive object list and edit the selected interactive object 126 in the editing area. The user can select at least one interactive object 126 from the interactive object list of the object editing interface 127, or download a new interactive object 126 from an external source; for example, the user may load any video, picture or 3D model from the hard disk or optical disc of the editor end 120 as a new interactive object 126.

The user can edit the scenes or objects in an AR target file 112 through the object editing interface 127. FIG. 2 is a schematic diagram of the editing process of an AR target file 112 of the present invention. In the present invention the editing process of the AR target file 112 is described using a travelling car as an example, but is not limited thereto.

First, the user selects at least one interactive object 126 in the object editing interface 127 and moves the interactive object 126 into the scene of the AR target file 112 through the object operating unit 125. Meanwhile, while the user is editing the interactive object 126, the first processing unit 124 records each step of the editing of the interactive object 126 and stores it in an AR temporary file 111, as shown in FIG. 2 and FIG. 3.

In FIG. 2, the user drags the interactive object 126 from the left side of the object editing interface 127 into the scene. For example, when the interactive object 126 is selected or operated on, the first processing unit 124 begins to create a corresponding AR temporary file 111. For operations that move the interactive object 126 within the scene, scale the image, deform the image, set an object action, change the display menu or add or delete images, the first processing unit 124 records the operation in the AR temporary file, as shown in FIG. 4. The first processing unit 124 records each editing pass over the interactive object 126 as a new AR temporary file 111. Further, the user can select at least one AR temporary file 111 from the current plurality of AR temporary files 111 and associate the selected AR temporary file 111 with the interactive object 126 to generate a new AR target file 112. The user may also select a plurality of AR temporary files 111 and associate them with the interactive object 126 to generate an AR target file 112. In addition, the editor end 120 can set the action triggered when the interactive object 126 is clicked or moved, such as, but not limited to, showing an animation, playing a sound, playing a movie or opening a new window.
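
The recording behaviour described here — one AR temporary file 111 per editing pass, later combined with the interactive object 126 into an AR target file 112 — could look roughly like the sketch below; the dictionaries and helper functions are assumptions, not the patent's implementation.

```python
# Illustrative only: each editing pass over an interactive object is logged
# as its own AR temporary file, and selected temporary files are folded into
# an AR target file (function and field names are assumptions).
temp_files = []


def record_edit_session(object_id, operations):
    """Log one editing pass over an interactive object as a new temp file."""
    temp = {"temp_id": f"temp-{len(temp_files) + 1}",
            "object_id": object_id,
            "steps": list(operations)}
    temp_files.append(temp)
    return temp


def generate_target_file(object_id, selected_temp_ids, trigger=None):
    selected = [t for t in temp_files if t["temp_id"] in selected_temp_ids]
    return {"target_id": f"target-{object_id}",
            "object_id": object_id,
            "temp_files": selected,
            "trigger": trigger}        # e.g. play a sound when clicked


record_edit_session("car", [("move", 5, 0), ("zoom", 1.5)])
record_edit_session("car", [("set_action", "drive"), ("rotate", 90)])
target = generate_target_file("car", ["temp-1", "temp-2"], trigger="play_sound")
print(target["target_id"], len(target["temp_files"]))   # target-car 2
```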

After the editing of the interactive object 126 and the AR temporary files 111 is completed, the first processing unit 124 generates the result as an AR target file 112. In addition, the editor end 120 can set access permissions for the AR target file 112; the access permission may be public use or may be limited to specific users. The permission settings include, but are not limited to, the access permission for the AR target file 112 carried in the identification information.

After setting the identification information 115, the editor end 120 stores the interactive object 126 and the selected AR temporary files 111 as an AR target file 112 and uploads the AR target file 112 to the cloud server 110. The cloud server 110 updates the index information and records the AR target file 112. Finally, the editor end 120 uploads the AR target file 112 to the cloud server 110 through the first communication unit 121, and the cloud server 110 synchronously updates the user's identification information 115.
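
A rough sketch of this upload step, under the assumption that the cloud server 110 keeps its target files, index information and user records in simple keyed stores; none of these names come from the patent.

```python
# Assumed sketch: the cloud server stores the AR target file, updates the
# index information and keeps the user's identification record in sync.
cloud_store = {"targets": {}, "index": {}, "users": {}}


def upload_target_file(identification, target_file, access="private"):
    target_id = target_file["target_id"]
    target_file["access"] = access              # public, or restricted to users
    cloud_store["targets"][target_id] = target_file
    cloud_store["index"].setdefault(identification, []).append(target_id)
    user = cloud_store["users"].setdefault(identification, {"targets": []})
    user["targets"].append(target_id)           # synchronise identification info
    return target_id


upload_target_file("user-001", {"target_id": "target-car"}, access="public")
print(cloud_store["index"]["user-001"])          # ['target-car']
```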

The mobile device end 130 is connected to the cloud server 110 over a network through the second communication unit 132, which transmits the index information 116, the identification information 115 and the AR target files 112. The second display unit 133 plays the captured images or the AR target file 112. The image selection unit 134 captures an environment image, and the environment image includes a target to be identified 151. The input unit 135 controls the operation of the mobile device end 130 and the interactive input to the AR target file 112. The input unit 135 may be implemented as a touch screen, or as buttons, a joystick or the like.

The target to be identified 151 of the present invention may be a QR code (Quick Response code), another graphic or physical object preset by the cloud server 110, or a graphic uploaded by the user. The second storage unit 136 stores a first image recognition program 141, a second image recognition program 142, a community sharing program 143 and a simple editing program 144.

When the second processing unit 131 runs the first image recognition program 141, the image selection unit 134 starts capturing. While the image selection unit 134 is capturing, the second processing unit 131 determines in real time whether the captured target to be identified 151 carries identification information 115; the identification information 115 is recognized automatically by the second processing unit 131 during the capture of the target to be identified 151. When the second processing unit 131 detects that the target to be identified 151 carries identification information 115, the second processing unit 131 downloads the corresponding AR target file 112 from the cloud server 110 according to the index information 116. After the mobile device end 130 has downloaded the AR target file 112, the second display unit 133 plays it.
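
One way to picture the first image recognition program 141 is the sketch below: frames are scanned until identification information is found, and the matching AR target file is then fetched and handed to the display unit. The recogniser stub and the data layout are assumptions.

```python
# A hypothetical sketch of the first image-recognition flow: while the image
# selection unit is capturing, each frame is checked for identification
# information; once found, the matching AR target file is fetched and played.
CLOUD_TARGETS = {"id-115": {"target_id": "target-car", "scene": "postcard"}}


def extract_identification(frame: str):
    """Stand-in for the real recogniser (e.g. decoding a QR code)."""
    return frame if frame.startswith("id-") else None


def first_image_recognition(frames):
    for frame in frames:
        identification = extract_identification(frame)
        if identification and identification in CLOUD_TARGETS:
            target = CLOUD_TARGETS[identification]      # download from cloud
            return f"playing {target['target_id']}"     # on the second display unit
    return "no identification information found"


print(first_image_recognition(["noise", "id-115"]))      # playing target-car
```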

The user can operate the interactive object 126 of the AR target file 112 through the input unit 135, taking the above-mentioned car (i.e. the interactive object 126) as an example. When the image selection unit 134 captures a target to be identified 151 carrying identification information 115, as shown in FIG. 5, the second processing unit 131 obtains the related AR target file 112 from the cloud server 110 based on the identification information 115, and the second display unit 133 then plays the AR target file 112.

In addition, the corresponding AR target file 112 can also be downloaded through the second image recognition program 142. When the second processing unit 131 runs the second image recognition program 142, the image selection unit 134 starts capturing. During capture, the image selection unit 134 captures the target to be identified 151, and a related icon appears in the second display unit 133, as shown in FIG. 6. When the user taps the icon, an input box appears. After the user enters the correct identification code, the second processing unit 131 downloads the user information belonging to that identification code from the cloud server, scans the corresponding recognition image to obtain the corresponding AR target file 112, and links the corresponding AR target file 112 to the related activity or page. The second processing unit 131 can then determine from the activity or page whether a corresponding interactive object 126 exists. As mentioned above, the identification information 115 can define access permissions for the interactive object 126; to keep content non-public, the second image recognition program 142 also protects the access permissions of the target to be identified 151.
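
The identification-code flow of the second image recognition program 142 might be sketched as follows; the code table, page links and return strings are illustrative assumptions used only to show the access-control idea.

```python
# Illustrative sketch: the user types an identification code, the code gates
# access, and only then is the matching AR target file linked to the related
# activity or page (all names are assumptions).
PROTECTED_TARGETS = {
    "code-1234": {"target_id": "target-private-card", "page": "/event/42"},
}


def second_image_recognition(captured_target: str, entered_code: str) -> str:
    record = PROTECTED_TARGETS.get(entered_code)
    if record is None:
        return "access denied: wrong identification code"
    # In the described system the cloud server would also rescan the captured
    # recognition image before releasing the target file.
    return f"link {record['target_id']} to {record['page']} (from {captured_target})"


print(second_image_recognition("poster.jpg", "code-1234"))
```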

The operation of the community sharing program 143 builds on the first image recognition program 141 and the second image recognition program 142 described above. After completing either image recognition procedure, the user can add the AR target file 112 to a management page, where the AR target files 112 are arranged in order, as shown in FIG. 7. Through the input unit 135, the user can select any AR target file 112 and the community sharing button 152. When the user taps the community sharing button 152, the second processing unit 131 runs the community sharing program 143 and displays the corresponding community icons 153 on the second display unit 133. The user can select any community icon 153, and the second processing unit 131 then forwards the index information 116 of the AR target file 112 to the selected community.
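
The forwarding step of the community sharing program 143 can be sketched as a single lookup-and-send, assuming the index information 116 maps each AR target file 112 to a shareable entry; the community name and the example URL below are placeholders, not values from the patent.

```python
# A minimal sketch of the community-sharing step: selecting a target file and
# a community icon forwards that file's index information to the community.
def share_to_community(target_id: str, index_info: dict, community: str) -> dict:
    entry = index_info.get(target_id)
    if entry is None:
        raise KeyError(f"{target_id} is not in the user's index information")
    return {"community": community, "target_id": target_id, "link": entry}


index_info = {"target-car": "https://cloud.example/targets/target-car"}
print(share_to_community("target-car", index_info, "community-A"))
```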

When the second processing unit 131 runs the simple editing program 144, the user can capture an environment image or an image object. After capturing is completed, the second processing unit 131 uploads the captured result to the cloud server 110 as an AR temporary file 111. In other words, the user can shoot any object or image with the mobile device end 130 as material for an AR temporary file 111. In addition, the mobile device end 130 can also upload an externally downloaded movie to the cloud server 110 and store it as a new AR temporary file 111.
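
Finally, the simple editing program 144 essentially wraps whatever the device captured (or a downloaded movie) as a new AR temporary file 111 and ships it to the cloud server 110; a minimal sketch under that assumption, with illustrative names:

```python
# Sketch of the simple editing program: captured media or a downloaded movie
# is wrapped as a new AR temporary file and sent to the cloud server for
# later use as editing material (names are illustrative).
cloud_temp_files = []


def upload_as_temp_file(owner: str, media_path: str, media_kind: str) -> dict:
    temp = {"temp_id": f"temp-{len(cloud_temp_files) + 1}",
            "owner": owner,
            "source": media_path,        # captured image, object or movie
            "kind": media_kind}          # e.g. "image" or "movie"
    cloud_temp_files.append(temp)        # stored on the cloud server
    return temp


upload_as_temp_file("user-001", "capture_001.jpg", "image")
upload_as_temp_file("user-001", "downloaded_clip.mp4", "movie")
print(len(cloud_temp_files))             # 2
```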

The embodiments of the present invention are as described above. However, the embodiments of the present invention are not limited thereto; based on the above disclosure and using ordinary technical knowledge and conventional means in the art, various other modifications, substitutions and alterations may be made without departing from the basic technical idea of the present invention, and all such changes fall within the scope of protection of the present invention.

Claims (10)

一种线上整合扩增实境的编辑装置,其特征在于:该装置包括:An editing device for online integration of augmented reality, characterized in that the device comprises: 通讯单元,通过网络与云端伺服器连接,用于获取AR目标档案或互动物件;The communication unit is connected to the cloud server through the network, and is used to obtain an AR target file or an interactive object; 显示单元,用于显示环境影像、AR暂存文件、所述AR目标档案或所述互动物件;a display unit, configured to display an environment image, an AR temporary file, the AR target file, or the interactive object; 物件操作单元,用于控制所述互动物件在所述显示单元中的摆放位置与显示角度;An object operating unit, configured to control a position and a display angle of the interactive object in the display unit; 处理单元,所述处理单元与所述通讯单元、所述显示单元与所述物件操作单元电性连接,所述处理单元根据识别资讯,驱动所述通讯单元从所述云端伺服器下载与识别资讯相对应的AR暂存文件的索引资讯,并根据所述索引资讯下载任一所述AR目标档案;a processing unit, the processing unit is electrically connected to the communication unit, the display unit, and the object operation unit, and the processing unit drives the communication unit to download and identify information from the cloud server according to the identification information. Corresponding index information of the AR temporary storage file, and downloading any of the AR target files according to the index information; 其中,所述处理单元根据所述互动物件及在AR目标档案的编辑过程中所生成各步骤的AR暂存文件,选择任一所述AR暂存文件并生成所述AR目标档案,所述处理单元通过所述通讯单元将生成的AR目标档案发送至云端伺服器。The processing unit selects any of the AR temporary storage files and generates the AR target file according to the interactive object and the AR temporary storage file generated in each step of the editing process of the AR target file, and the processing is performed. The unit sends the generated AR target file to the cloud server through the communication unit. 根据权利要求1所述的线上整合扩增实境的编辑装置,其特征在于:还包括储存单元,储存单元用于存储所述AR暂存文件、所述互动物件、物件编辑界面或所述AR目标档案。The apparatus for editing an online integrated augmented reality according to claim 1, further comprising: a storage unit, wherein the storage unit is configured to store the AR temporary storage file, the interactive object, the object editing interface, or the AR target file. 根据权利要求2所述的线上整合扩增实境的编辑装置,其特征在于:所述AR暂存文件用于记录互动物件在所述物件编辑界面中的移动、影像缩放、影像变形、设定物件动作、变更显示选项或增删影像。The online integrated augmented reality editing device according to claim 2, wherein the AR temporary storage file is used for recording movement, image scaling, image deformation, and setting of the interactive object in the object editing interface. Object action, change display options, or add or delete images. 根据权利要求3所述的线上整合扩增实境的编辑装置,其特征在于:所述处理单元运行所述物件编辑界面,由所述物件操作单元控制所述物件编辑界面,用于编辑所述AR暂存文件在所述环境影像中的位置、播放位置、或播放时间。The online integrated augmented reality editing apparatus according to claim 3, wherein the processing unit runs the object editing interface, and the object editing unit controls the object editing interface for editing the same. The position, play position, or play time of the AR temporary file in the environment image. 根据权利要求1所述的线上整合扩增实境的编辑装置,其特征在于:所述编辑装置为个人电脑、笔记型电脑、智能手机、平板电脑或穿戴装置。The online integrated augmented reality editing device according to claim 1, wherein the editing device is a personal computer, a notebook computer, a smart phone, a tablet computer or a wearable device. 根据权利要求1所述的线上整合扩增实境的编辑装置,其特征在于:所述处理单元将所述识别资讯加入所述生成的AR目标档案,并将其传送至所述云端伺服器,所述云端伺服器将接收到的AR目标档案归类为新的AR暂存文件。The online integrated augmented reality editing apparatus according to claim 1, wherein said processing unit adds said identification information to said generated AR target file and transmits it to said cloud server The cloud server classifies the received AR target file as a new AR temporary file. 
7. An editing system for online integration of augmented reality, characterized in that the system comprises: a cloud server, used to record a plurality of AR target files, a plurality of interactive objects, or a user database, wherein the user database records identification information of each user; at least one editor terminal, connected to the cloud server through a network, wherein the editor terminal comprises a first communication unit, a first storage unit, a first display unit, a first processing unit, and an object operation unit, the first processing unit is electrically connected to the first communication unit, the first storage unit, the first display unit, and the object operation unit, the first communication unit obtains from the cloud server index information of the AR target files corresponding to the identification information, the first storage unit is used to store an object editing interface, at least one AR temporary file, the interactive objects, or the AR target files, the first display unit is used to edit the object editing interface and the selected interactive object, the object operation unit is used to control the placement position and display angle of the interactive object on the first display unit and to generate, during the editing process, an AR temporary file corresponding to the interactive object, and the first processing unit generates the AR target file according to the interactive object and the selected AR temporary file and sends the generated AR target file to the cloud server through the first communication unit; and a mobile device terminal, comprising a second processing unit, a second communication unit, a second display unit, an image capture unit, and a second storage unit, wherein the second processing unit is electrically connected to the second communication unit, the second display unit, the image capture unit, and the second storage unit, the second communication unit is connected to the cloud server through the network, the image capture unit is used to capture a target to be identified, the second processing unit sends the target to be identified to the cloud server, the cloud server identifies whether the target to be identified is one of the plurality of interactive objects, the cloud server sends the AR target file corresponding to the target to be identified to the mobile device terminal, and the received AR target file is played on the second display unit.

8. The editing system for online integration of augmented reality according to claim 7, wherein the first processing unit adds the identification information to the generated AR target file and sends it to the cloud server.
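For the system claims, a further sketch (reusing the ARTargetFile type from the previous one) illustrates the mobile-device recognition flow of claim 7: a captured image of the target to be identified is sent to the cloud server, which decides whether it matches one of the registered interactive objects and, if so, returns the corresponding AR target file for playback. The recognition interface and its behaviour are placeholders; the claims leave the matching method unspecified.

```typescript
// Illustrative sketch of the recognition flow in claim 7 (mobile device side).
// ARTargetFile is the type from the previous sketch; the recognition logic is a
// stand-in, since the patent does not define how targets are matched.

type CapturedImage = Uint8Array;                   // frame from the image capture unit

interface RecognitionServer {
  // Returns the AR target file bound to the recognized interactive object, or null.
  recognize(image: CapturedImage): Promise<ARTargetFile | null>;
}

class MobileDevice {
  constructor(private cloud: RecognitionServer,
              private display: (file: ARTargetFile) => void) {}

  // Second processing unit: forward the captured target to the cloud server and,
  // if it matches one of the registered interactive objects, play the result.
  async scanAndPlay(image: CapturedImage): Promise<boolean> {
    const target = await this.cloud.recognize(image);
    if (!target) return false;                     // not one of the interactive objects
    this.display(target);                          // second display unit plays the file
    return true;
  }
}
```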
9. The editing system for online integration of augmented reality according to claim 7, wherein an identification code is input at the mobile device terminal, and the second processing unit identifies the target to be identified according to the identification code and, according to the requirement of the target to be identified, downloads from the cloud server the AR target file corresponding to the target to be identified.

10. The editing system for online integration of augmented reality according to claim 7, wherein the mobile device terminal runs a community sharing program, the second communication unit connects to the cloud server, and after a social networking website is selected, the second processing unit generates an invitation notification and sends it to the selected social networking website.
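Claims 9 and 10 add an identification-code lookup and a community-sharing step. The short sketch below (again reusing the earlier types) shows one plausible shape for both; the code-to-file mapping, the listed social networking sites, and the invitation payload are all illustrative assumptions rather than anything specified in the claims.

```typescript
// Minimal sketch for claims 9 and 10: identification-code lookup and sharing.

interface CodeLookupServer {
  downloadByCode(code: string): Promise<ARTargetFile | null>;
}

async function fetchByIdentificationCode(cloud: CodeLookupServer, code: string) {
  // The mobile device resolves the target to be identified by its code instead of
  // by image recognition, then downloads the matching AR target file.
  return cloud.downloadByCode(code.trim());
}

function buildInvitation(site: "facebook" | "twitter" | "line", fileId: string): string {
  // The second processing unit generates an invitation notification for the
  // selected social networking website; the payload format is illustrative only.
  return JSON.stringify({ site, fileId, message: "Come view my AR creation!" });
}
```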
PCT/CN2018/091180 2017-06-23 2018-06-14 Online integration of augmented reality editing devices and systems Ceased WO2018233533A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710487618.9 2017-06-23
CN201710487618.9A CN109117034A (en) 2017-06-23 2017-06-23 Editing device and system for on-line integration augmented reality

Publications (1)

Publication Number Publication Date
WO2018233533A1 (en)

Family

ID=64732107

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/091180 Ceased WO2018233533A1 (en) 2017-06-23 2018-06-14 Online integration of augmented reality editing devices and systems

Country Status (2)

Country Link
CN (1) CN109117034A (en)
WO (1) WO2018233533A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111476911B (en) * 2020-04-08 2023-07-25 Oppo广东移动通信有限公司 Virtual image realization method, device, storage medium and terminal equipment

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102958114B (en) * 2011-08-27 2017-10-03 中兴通讯股份有限公司 Methods to access augmented reality user context
KR101984915B1 (en) * 2012-12-03 2019-09-03 삼성전자주식회사 Supporting Portable Device for operating an Augmented reality contents and system, and Operating Method thereof
CN103150658B (en) * 2013-04-02 2016-05-18 武汉友睿科技有限公司 A kind of reality of intended for end consumers strengthens custom-built system and method

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102467343A (en) * 2010-11-03 2012-05-23 Lg电子株式会社 Mobile terminal and method for controlling the same
CN102739872A (en) * 2012-07-13 2012-10-17 苏州梦想人软件科技有限公司 Mobile terminal, and augmented reality method used for mobile terminal
CN105989623A (en) * 2015-02-12 2016-10-05 上海交通大学 Implementation method of augmented reality application based on handheld mobile equipment
CN106033333A (en) * 2015-03-10 2016-10-19 沈阳中云普华科技有限公司 A visual augmented reality scene making system and method

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11222478B1 (en) 2020-04-10 2022-01-11 Design Interactive, Inc. System and method for automated transformation of multimedia content into a unitary augmented reality module
CN116070360A (en) * 2021-11-03 2023-05-05 财团法人资讯工业策进会 Synchronization management server, synchronization management system, and synchronization management method

Also Published As

Publication number Publication date
CN109117034A (en) 2019-01-01

Similar Documents

Publication Publication Date Title
US12087330B2 (en) Masking in video stream
US9704281B2 (en) Systems and methods for creation and sharing of selectively animated digital photos
WO2022205798A1 (en) Multimedia information editing method and apparatus therefor
CN107111437B (en) digital media message generation
US20180308524A1 (en) System and method for preparing and capturing a video file embedded with an image file
CN107005458B (en) Unscripted digital media message generation method, apparatus, electronic device, and readable medium
KR20160098949A (en) Apparatus and method for generating a video, and computer program for executing the method
CN110636365A (en) Method and device for adding video characters
KR101123370B1 (en) service method and apparatus for object-based contents for portable device
CN111314204A (en) Interaction method, device, terminal and storage medium
US11503148B2 (en) Asynchronous short video communication platform based on animated still images and audio
WO2023143531A1 (en) Photographing method and apparatus, and electronic device
WO2018233533A1 (en) Online integration of augmented reality editing devices and systems
US20140282000A1 (en) Animated character conversation generator
TWI652600B (en) Online integration of augmented reality editing devices and systems
WO2025077527A1 (en) Media content generation method and apparatus, and electronic device and readable storage medium
US12452521B2 (en) Providing a template for media content generation
US20240137599A1 (en) Terminal and non-transitory computer-readable medium
WO2024041564A1 (en) Video recording method and apparatus, electronic device and storage medium
TWM560053U (en) Editing device for integrating augmented reality online
JP4326753B2 (en) Video information indexing support system, program, and storage medium
CN116156312B (en) File sharing method and device, electronic equipment and readable storage medium
CN114979050B (en) Voice generation method, voice generation device and electronic equipment
CA3124259C (en) Asynchronous short video communication platform based on animated still images and audio
US20260032333A1 (en) Providing a template for media content generation

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18820120

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 16.03.2020)

122 Ep: pct application non-entry in european phase

Ref document number: 18820120

Country of ref document: EP

Kind code of ref document: A1