
CN108984904B - Home design method based on deep neural network - Google Patents

Home design method based on deep neural network

Info

Publication number
CN108984904B
CN108984904B (application CN201810781492.0A)
Authority
CN
China
Prior art keywords
home
sequence
model
input
subsequence
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201810781492.0A
Other languages
Chinese (zh)
Other versions
CN108984904A (en)
Inventor
陈宇峰
李博
吴丹
霍盼盼
陶泽綦
白学营
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Institute of Technology BIT
Original Assignee
Beijing Institute of Technology BIT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Institute of Technology BIT filed Critical Beijing Institute of Technology BIT
Priority to CN201810781492.0A priority Critical patent/CN108984904B/en
Publication of CN108984904A publication Critical patent/CN108984904A/en
Application granted granted Critical
Publication of CN108984904B publication Critical patent/CN108984904B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00 - Computer-aided design [CAD]
    • G06F30/10 - Geometric CAD
    • G06F30/13 - Architectural design, e.g. computer-aided architectural design [CAAD] related to design of buildings, bridges, landscapes, production plants or roads
    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/02 - Neural networks
    • G06N3/04 - Architecture, e.g. interconnection topology
    • G06N3/044 - Recurrent networks, e.g. Hopfield networks
    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/02 - Neural networks
    • G06N3/04 - Architecture, e.g. interconnection topology
    • G06N3/045 - Combinations of networks
    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/02 - Neural networks
    • G06N3/08 - Learning methods
    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F2111/00 - Details relating to CAD techniques
    • G06F2111/08 - Probabilistic or stochastic CAD

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Computational Linguistics (AREA)
  • Health & Medical Sciences (AREA)
  • Data Mining & Analysis (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Biophysics (AREA)
  • Geometry (AREA)
  • Computer Hardware Design (AREA)
  • Civil Engineering (AREA)
  • Architecture (AREA)
  • Structural Engineering (AREA)
  • Computational Mathematics (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Pure & Applied Mathematics (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Machine Translation (AREA)

Abstract

The invention relates to a home design method based on a deep neural network and belongs to the technical field of deep neural networks and home design. First, home furnishing layouts are collected and the furniture is labeled, and sequence data are generated from the labeled placement order. A home prediction model is then designed to extract structural features between furniture tokens, learn the sequence data, and predict the placement order. A bidirectional hierarchical home prediction model is built with parameter constraints on the relative sizes of the furniture objects, and hierarchical recursive multi-round prediction is performed so that the results better match real home design practice. Finally, home prediction models for specific styles are designed according to the demand for different design styles in practical applications. A three-dimensional engine is used to render the home scenes, visually demonstrating the effectiveness of the model and its advantage in style learning.

Description

A home design method based on a deep neural network

Technical Field

The invention relates to a home design method based on a deep neural network and belongs to the technical field of deep learning.

Background

Deep neural networks (DNNs) underlie many modern AI applications. Unlike traditional machine learning algorithms, a DNN learns features from data autonomously, without human intervention, which is more intelligent and closer to the way humans perceive the world. In many fields the accuracy of DNNs has already surpassed that of humans. As an umbrella term, DNN covers four main families of algorithms: convolutional neural networks, deep stacked autoencoder networks, recurrent neural networks, and generative adversarial networks. Convolutional neural networks deepen the model in space, while recurrent neural networks deepen it in time.

Recurrent neural networks (RNNs) are mainly applied to natural language processing. Their distinguishing feature is that contextual information influences the output: the current output y_t depends not only on the current input x_t but also on the hidden-layer state h_{t-1} of the previous time step. A new hidden state h_t is computed from h_{t-1} and x_t; it determines the output at the current step and influences the next step. Because of this structure, backpropagation in an RNN depends not only on the current layer but also on several preceding steps of the network; this propagation algorithm is called backpropagation through time (BPTT).

Most current research on home design methods trains Gaussian models or Bayesian networks to extract structural features between objects. These approaches are based on intelligent computing methods, leave room for improvement in model simplification, and cannot learn design styles. Algorithmic work can be roughly divided into three categories: furniture layout optimization, scene modeling and reconstruction, and scene synthesis. These methods have drawbacks: modeling and reconstruction require specific captured data (such as Kinect data), and scene synthesis must extract global features while constantly controlling the similarity between the generated scene and the prototype.

Summary of the Invention

The purpose of the present invention is to address the technical shortcomings of existing home prediction models, namely that there is still room for improvement in model simplification and that design styles cannot be learned, by proposing a home design method based on a deep neural network.

The core idea of the invention is as follows: first, home furnishing layouts are collected and the furniture sequences are labeled, and sequence data are generated from the labeled placement order; next, a sequential home prediction model is designed that extracts structural features between furniture tokens, learns the sequence data, and predicts the placement order; parameter constraints limiting the relative sizes of the furniture objects are then introduced to build a bidirectional hierarchical home prediction model, in which the parameter constraints of each layer are given in advance and beam search is used to filter parameters in the prediction stage, i.e., hierarchical recursive multi-round prediction is performed so that the results better match real home design; finally, prediction and display models for specific styles are designed to meet the demand for different design styles in practical applications, and a three-dimensional engine is used to render the home scenes, visually demonstrating the effectiveness of the model and its advantage in style learning.

The object of the present invention is achieved by the following technical solution:

A home design method based on a deep neural network comprises a sequential home prediction model and a bidirectional hierarchical home prediction model; the implementation is divided into a data preprocessing stage, a training stage, a prediction stage, and a model drawing stage.

The data preprocessing stage of the sequential home prediction model comprises the following steps:

Step 1. Each group of collected home placement data is manually labeled in order to obtain a home sequence; the sequence is divided into an input sequence and an output sequence, generating a complete home data set.

Step 2. Create the corresponding home data set vocabulary; the vocabulary mainly contains special tokens and furniture tokens.

The training stage of the sequential home prediction model comprises the following steps:

Step 3. Sequence encoding: the input sequence of step 1 is fed into a recurrent neural network model, and a state sequence is computed that retains the structural information of the input sequence largely intact.

Step 4. Sequence decoding: the state sequence obtained in step 3 is weighted and summed to obtain the current-state semantic vector, which determines where in the state sequence the model should pay more attention, so that the structural information of the home is propagated through the whole network; the semantic vector is fed into the recurrent neural network model to compute the probability distribution of the output sequence, yielding a sequential home sequence prediction model based on a recurrent neural network.

The prediction stage of the sequential home prediction model is specifically:

Step 5. A partial home sequence is input into the model obtained in step 4 to generate a complete home sequence.

The data preprocessing stage of the bidirectional hierarchical home prediction model comprises the following steps:

Step 6. Size parameters are added to the home data set of step 1 to create a home data set with size parameters.

Step 7. The common sizes of all furniture are parameterized and added to the vocabulary of step 2 to create a hierarchical home model vocabulary, which mainly contains special tokens and furniture tokens.

Step 8. A complete home sequence is defined as a set of subsequences D, where D = {U_1, U_2, …, U_{2k-1}, U_{2k}}. Each subsequence U_m contains several home model vocabulary items, i.e., U_m = {w_{m,1}, w_{m,2}, w_{m,3}, …, w_{m,n}}, where w_{m,n} denotes the token at position n of the m-th subsequence U_m and expresses either the content meaning of a furniture token as in step 7 or the current behavior state represented by a special token. In the home sequence, U_{2i-1} denotes an input subsequence and U_{2i} an output subsequence, where i = 1, 2, …, k.
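As a small illustration (the furniture names below are hypothetical examples, not taken from the patent's data set), the collection D can be represented as a list of subsequences and paired into input/output rounds:

```python
# Hypothetical sketch of the subsequence collection D = {U_1, ..., U_2k} (here k = 2).
# Odd positions hold input subsequences, even positions hold output subsequences.
D = [
    ["sofa", "television"],      # U_1: input subsequence
    ["divan", "flowerpot"],      # U_2: output subsequence
    ["chair", "lounge"],         # U_3: input subsequence
    ["flowerpot", "chair"],      # U_4: output subsequence
]

# Pair them as (U_{2i-1}, U_{2i}) for i = 1..k.
pairs = [(D[2 * i], D[2 * i + 1]) for i in range(len(D) // 2)]
```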

The training stage of the bidirectional hierarchical home prediction model, i.e., the modeling and implementation of the hierarchical recursive structure, comprises the following steps:

Step 9. For the home sequence set of step 8, set the number of recursions to k and initialize i = 1.

Step 10. Sequence encoding: the input subsequence U_{2i-1} of step 8 is fed into the recurrent neural network, and the encoding vector corresponding to the input subsequence is computed.

Step 11. The encoding vector output by step 10 is fed into the recurrent neural network, and the context vector corresponding to that input subsequence is computed.

Step 12. Sequence decoding: the context vector output by step 11 is concatenated with the output subsequence U_{2i} of step 8 as the input vector of the recurrent neural network, and the output subsequence is predicted, producing the token probability distribution of that output subsequence.

Step 13. Set i = i + 1; the output subsequence of step 12 and the next input subsequence U_{2i-1} are combined as the new input subsequence, and steps 10 to 12 are executed iteratively until all subsequences have been traversed at i = k, yielding a hierarchical recursive home prediction model based on a recurrent neural network.
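A minimal Python sketch of the recursion in steps 9 through 13; `encode` and `decode` are hypothetical stand-ins for the recurrent network modules described above, not the patent's implementation:

```python
def hierarchical_predict(D, k, encode, decode):
    """Sketch of steps 9-13: for i = 1..k, encode the input subsequence U_{2i-1}
    (combined with the previous round's prediction), decode the output subsequence,
    and carry the prediction into the next round."""
    carry, outputs = [], []
    for i in range(1, k + 1):
        u_in = carry + D[2 * i - 2]      # U_{2i-1} combined with the previous output
        state = encode(u_in)             # step 10: encoding vector / step 11: context
        u_out = decode(state)            # step 12: predicted output subsequence
        outputs.append(u_out)
        carry = u_out                    # step 13: feed the prediction forward
    return outputs
```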

The prediction stage of the bidirectional hierarchical home prediction model comprises the following steps:

Step 14. Based on the home prediction model of step 13, a home subsequence is input and the corresponding output subsequence is predicted.

Step 15. Beam search is used to filter and rank the candidate predictions; among all sequences satisfying the constraints, the sequence with the highest probability score is selected as the target output and, together with the input, forms the home prediction sequence of the current layer; the complete home sequence is finally generated.

The model drawing stage is specifically: scene modeling is performed with three-dimensional software, the models corresponding to the predicted home sequence are retrieved from the model library and drawn in the scene; data sets of different styles are also prepared, and furniture of the corresponding style is retrieved from the model library according to the predicted sequence and drawn.

The model drawing stage is the same for the sequential home prediction model and the bidirectional hierarchical home prediction model.

Beneficial Effects

Compared with the prior art, the home design method based on a deep neural network of the present invention has the following beneficial effects:

1. The invention effectively extracts dependency structure features and predicts the home placement order;

2. It effectively predicts the structural relations and size parameters between furniture tokens according to the parameter constraints;

3. It makes home design more intelligent and provides three-dimensional visualization.

Brief Description of the Drawings

FIG. 1 is a flowchart of the sequential home prediction model in the home design method based on a deep neural network of the present invention and its embodiment;

FIG. 2 is a flowchart of the bidirectional hierarchical home prediction model in the home design method based on a deep neural network of the present invention and its embodiment.

Detailed Description

Embodiments of the method of the present invention are described in detail below with reference to the accompanying drawings, but the embodiments of the invention are not limited thereto. The drawings, which form a part of this application, provide a further understanding of the invention; the exemplary embodiments and their descriptions are used to explain the invention and do not unduly limit it.

Embodiment 1

In this home design method based on a deep neural network, a sequential home prediction model based on dependency features is first proposed; the model is designed and used to extract structural features between furniture tokens and to learn to predict the placement sequence. The implementation steps are as follows:

Step 1: Labeling and organizing the home data set

Home furnishing layouts are collected from design drawings and floor plans on professional websites. The selection criteria are that the space is relatively complete and the furnishings correspond to a medium-sized apartment. The furniture involved is labeled; items that are relatively large and stand alone are preferred, and an English word denotes the category of each object. Because the furniture is placed along the walls around the room, the placement can be turned into a sequence in clockwise order. A corner token is inserted into the sequence to indicate that the furniture output for the first wall ends and that for the second wall begins, acting as a separator for furniture groups in a given direction. In the data set, every set of sequence data consists of four groups of furniture objects so that the feature information remains complete. The data set is organized into corresponding "input" and "output" files, generating .enc (input, encoder) and .dec (output, decoder) files.
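A sketch of how such parallel .enc/.dec files could be written; the sample sequences and file names are assumptions for illustration only:

```python
# Hypothetical sketch: write labeled clockwise home sequences into
# parallel .enc (input) and .dec (output) files, one example per line.
samples = [
    # (input half of the sequence, output half of the sequence)
    (["sofa", "television", "corner", "chair"],
     ["divan", "corner", "flowerpot", "corner", "lounge"]),
    (["divan", "flowerpot", "corner", "chair"],
     ["television", "corner", "sofa", "corner", "chair"]),
]

with open("train.enc", "w", encoding="utf-8") as enc, \
     open("train.dec", "w", encoding="utf-8") as dec:
    for inp, out in samples:
        enc.write(" ".join(inp) + "\n")
        dec.write(" ".join(out) + "\n")
```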

Step 2: Vocabulary creation

The vocabulary file lists the furniture categories involved; it is used to convert inputs and outputs into corresponding ids. In the purely sequential home prediction model, the vocabulary contains 32 words (28 furniture categories and 4 special symbols). Its contents are: _PAD _GO _EOS _UNK chair divan lounge sofa flowerpot …. The first four entries are special tokens used for padding and for marking input and output: _GO marks the start of an input, _EOS marks its end, _UNK marks characters not present in the vocabulary, and _PAD pads sequences so that all sequences in the same batch have the same length.

The inputs and outputs are converted into ids files: each line of the .enc file is one input and each line of the .dec file is one output. Each id in a line refers to the word at the corresponding position in the vocabulary.
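A minimal sketch of the token-to-id conversion under the vocabulary layout described above (the furniture list is truncated and the helper name is hypothetical):

```python
# Hypothetical sketch: map token sequences to ids using the 4 special tokens
# described above followed by the furniture categories.
special = ["_PAD", "_GO", "_EOS", "_UNK"]
furniture = ["chair", "divan", "lounge", "sofa", "flowerpot"]  # truncated example
vocab = {tok: idx for idx, tok in enumerate(special + furniture)}

UNK_ID = vocab["_UNK"]

def to_ids(line: str) -> list[int]:
    """Convert one whitespace-separated token line into vocabulary ids."""
    return [vocab.get(tok, UNK_ID) for tok in line.split()]

print(to_ids("sofa flowerpot tvstand"))  # unknown tokens map to _UNK
```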

Step 3: Training the sequential home prediction model

The model has three modes: train, test, and serve. The train mode is used for training, the test mode for testing, and the serve mode for verification. The encoder of the model uses an LSTM; the decoder searches for and outputs the optimal predicted sequence during decoding; an attention mechanism is added to the model, and the encoder is not directly tied to the attention part. Ordinary encoder-decoder models are designed for natural language processing tasks; in home design, the goal is rather to predict dependency structures, symmetric structures, and the global layout distribution. The model flow is shown in FIG. 1.

The encoder uses an LSTM so that it learns to search for subsequence representations corresponding to the source sequence. It reads the input sequence (x_1, …, x_Tx) in order and computes the current hidden-state sequence (s_1, …, s_t). For each word x_t the hidden state s_t is obtained; it contains the semantic information of the preceding words and retains the structural information of the input sequence largely intact. The input gate i processes the input parameters, the forget gate f sets the weight of selective forgetting and discards part of the information, and the output gate o processes the output parameters; the state of the memory cell c_t is then computed as shown in formula (1):

c_t = f ⊙ c_{t-1} + i ⊙ tanh(U x_t + W s_{t-1})    (1)

where x_t is the network input at time t, s_{t-1} is the hidden-layer state at time t-1, U and W are weight matrices of the LSTM model, f and i are the forget gate and the input gate, ⊙ denotes element-wise multiplication, and tanh is the activation function.

The activation value of the LSTM hidden layer at time t is computed as shown in formula (2):

s_t = o ⊙ tanh(c_t)    (2)

where o is the output gate, ⊙ denotes element-wise multiplication, and tanh is the activation function.
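A NumPy sketch of one LSTM step under the reconstruction of formulas (1) and (2) above; the gate parameterization, the omission of bias terms, and the tensor shapes are assumptions rather than the patent's exact implementation:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, s_prev, c_prev, U, W):
    """One LSTM step following formulas (1) and (2).
    U, W are dicts of weight matrices for the input, forget, output gates
    and the candidate cell; biases are omitted for brevity."""
    i = sigmoid(U["i"] @ x_t + W["i"] @ s_prev)       # input gate
    f = sigmoid(U["f"] @ x_t + W["f"] @ s_prev)       # forget gate
    o = sigmoid(U["o"] @ x_t + W["o"] @ s_prev)       # output gate
    c_t = f * c_prev + i * np.tanh(U["c"] @ x_t + W["c"] @ s_prev)  # formula (1)
    s_t = o * np.tanh(c_t)                            # formula (2)
    return s_t, c_t

# Tiny usage example with random weights (hidden size 4, input size 3).
rng = np.random.default_rng(0)
U = {k: rng.normal(size=(4, 3)) for k in "ifoc"}
W = {k: rng.normal(size=(4, 4)) for k in "ifoc"}
s, c = lstm_step(rng.normal(size=3), np.zeros(4), np.zeros(4), U, W)
```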

The hidden-layer state information of each input obtained by the encoder is used to predict the target output. Each predicted conditional probability is computed as shown in formula (3):

p(y_i | y_1, …, y_{i-1}, x) = g(y_{i-1}, s_i, c_i)    (3)

where s_i is the hidden-layer state of the RNN at time i, g is the softmax activation function, and c_i is the semantic vector.

c_i is determined by the hidden-state sequence (h_1, …, h_Tx) computed in the encoder. Each h_i contains the complete input-sequence information, with an attention probability added at the i-th position of the input sequence; the semantic vector c_i is obtained as a weighted sum of the h_j. The weight α_ij, or its associated score e_ij, reflects the importance of h_j and provides key information for the next hidden state s_i and output y_i. This is the mechanism of the attention model: based on this information, the decoder can decide which position of the source sentence needs more attention.
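A small NumPy sketch of the attention-weighted semantic vector c_i described above; the dot-product form of the score e_ij is an assumed simplification, since the patent does not fix the scoring function:

```python
import numpy as np

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def attention_context(s_prev, H):
    """Compute c_i = sum_j alpha_ij * h_j for one decoder step.
    s_prev : previous decoder hidden state, shape (d,)
    H      : encoder hidden states h_1..h_Tx, shape (Tx, d)
    The score e_ij is taken here as a simple dot product."""
    e = H @ s_prev            # e_ij, shape (Tx,)
    alpha = softmax(e)        # attention weights alpha_ij
    c = alpha @ H             # semantic vector c_i, shape (d,)
    return c, alpha

# Usage: 5 encoder states of dimension 8.
rng = np.random.default_rng(1)
c_i, alpha = attention_context(rng.normal(size=8), rng.normal(size=(5, 8)))
```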

Embodiment 2

In this home design method based on a deep neural network, parameter constraints limiting the relative sizes of the furniture objects are further introduced to build a bidirectional hierarchical home prediction model. For the prediction task of each layer, the parameter constraints of that layer are given in advance, and beam search is used to filter parameters in the prediction stage, completing the design of the home sequence. The model flow is shown in FIG. 2. The implementation steps are as follows:

Step 1: Building the hierarchical phrase-level home data set

A number is appended after each furniture token; the number represents the size of the item, here its length relative to the wall it is placed against, and several commonly used sizes are defined for each furniture category. Hierarchical input-output pairs are then constructed: because size parameters are introduced, each wall must be predicted separately, rather than a single input-output pair constituting a complete sequence as in the sequential model. The hierarchical model requires four rounds of input and predicted output to produce a complete scheme, so the data set takes the form of an array of sequence pairs.

To build the array of sequence pairs, a set of 8 elements {s1, s2, s3, s4, s5, s6, s7, s8} is introduced, representing four rounds of "dialogue" between the input x and the output y. To capture the interactive prediction process, a special end-of-sequence marker is appended to every training sequence, and markers are added to indicate whether the current segment belongs to the input or the output stage. The end of each sequence is marked with </s> to denote the end of the current prediction, the end of every fourth round (a complete home sequence) is marked with </d>, <x> marks the input stage, and <y> marks the output stage.
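For illustration only, one training example could be serialized with the markers quoted above as follows; the furniture tokens and size parameters are made-up examples, and the exact placement of </d> relative to the final </s> is an assumption:

```python
# Hypothetical sketch: serialize four input/output rounds with the markers
# </s>, </d>, <x>, <y> described above.
rounds = [
    (["sofa", "(2.0)", "television", "(1.2)"], ["divan", "(1.5)", "flowerpot", "(0.3)"]),
    (["chair", "(0.5)", "lounge", "(0.8)"],    ["flowerpot", "(0.4)", "chair", "(0.5)"]),
    (["divan", "(1.5)"],                        ["television", "(1.2)", "sofa", "(2.0)"]),
    (["flowerpot", "(0.3)"],                    ["chair", "(0.5)", "lounge", "(0.8)"]),
]

tokens = []
for r, (x_seq, y_seq) in enumerate(rounds, start=1):
    tokens += ["<x>"] + x_seq + ["</s>"]    # input stage of round r
    tokens += ["<y>"] + y_seq + ["</s>"]    # output stage of round r
    if r == len(rounds):
        tokens.append("</d>")               # end of the complete home sequence

print(" ".join(tokens))
```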

Step 2: Creating the hierarchical home model vocabulary

The format of the hierarchical home model vocabulary is similar to that of the sequential model; its role is to convert furniture tokens into their corresponding ids. The difference is that parameter constraints are introduced, so the sequences are phrase-level: the common sizes of all furniture are collected, parameterized, and added to the vocabulary. The vocabulary also contains the additional special tokens; the format is shown in Table 1:

Table 1. Vocabulary contents

Vocabulary entry   Meaning
_PAD               sequence padding
_GO                start of input
_EOS               end of input
_UNK               unknown character
</s>               end of subsequence prediction
</d>               end of the fourth prediction round
chair              furniture token: chair
divan              furniture token: divan (couch)
lounge             furniture token: lounge chair
…                  furniture tokens
(0.3)              parameter: 0.3 m
(0.4)              parameter: 0.4 m
(0.5)              parameter: 0.5 m
(0.8)              parameter: 0.8 m
…                  parameters

The vocabulary contains 28 furniture categories, 30 description parameters, and 6 special tokens. It is split into two corresponding tables for input and output, whose contents are essentially the same.

Step 3: Training the bidirectional hierarchical home prediction model

The model consists of three recurrent neural network modules: Encoder, Decoder, and Context. The Encoder module maps each subsequence into a subsequence vector, which is the hidden state obtained after the last token of the subsequence has been processed. The Context module learns and memorizes high-level information by iteratively processing each subsequence vector, thereby tracking the preceding semantic information. The Decoder module uses the hidden state of the Context module to produce the probability distribution of the tokens of the next subsequence.
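A compact PyTorch sketch of this Encoder/Context/Decoder arrangement; the GRU cells, layer sizes, and teacher-forced decoding are illustrative assumptions, not choices fixed by the patent:

```python
import torch
import torch.nn as nn

class HierarchicalHomeModel(nn.Module):
    """Sketch of the Encoder / Context / Decoder structure described above."""
    def __init__(self, vocab_size, emb=64, hid=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb)
        self.encoder = nn.GRU(emb, hid, batch_first=True)   # maps a subsequence to a vector
        self.context = nn.GRUCell(hid, hid)                  # tracks the high-level state
        self.decoder = nn.GRU(emb, hid, batch_first=True)    # predicts the next subsequence
        self.out = nn.Linear(hid, vocab_size)

    def forward(self, input_subseq, output_subseq, ctx_state):
        """input_subseq, output_subseq: LongTensor (batch, len); ctx_state: (batch, hid)."""
        # Encoder: last hidden state summarizes the input subsequence.
        _, h_enc = self.encoder(self.embed(input_subseq))
        # Context: update the high-level state with the subsequence vector.
        ctx_state = self.context(h_enc.squeeze(0), ctx_state)
        # Decoder: condition on the context state, read the (teacher-forced) output subsequence.
        dec_out, _ = self.decoder(self.embed(output_subseq), ctx_state.unsqueeze(0))
        logits = self.out(dec_out)        # (batch, len, vocab) token distributions
        return logits, ctx_state

# Usage sketch: one round with batch size 2 and a vocabulary of 64 tokens.
model = HierarchicalHomeModel(vocab_size=64)
x = torch.randint(0, 64, (2, 5))
y = torch.randint(0, 64, (2, 6))
logits, ctx = model(x, y, torch.zeros(2, 128))
```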

A complete home sequence (four groups of inputs and predictions) is viewed as a set D of eight subsequences, where D = {U_1, U_2, U_3, U_4, U_5, U_6, U_7, U_8}, involving two interacting sides, the input side and the output side. Each U_m contains several furniture tokens, i.e., U_m = {w_{m,1}, w_{m,2}, w_{m,3}, …, w_{m,n}}, where w_{m,n} is a random variable taking values in the vocabulary and denoting position n in the m-th subsequence. The token w_{m,n} expresses both the meaning of the furniture content and the current behavior state, i.e., whether it is the input or the prediction stage and whether the subsequence or the current prediction round is ending. The prediction model parameterizes the probability distribution P with parameters θ over the set of all possible complete home sequences of arbitrary length. The probability P of one prediction round can be decomposed as shown in formula (4):

P_θ(U_1, …, U_8) = ∏_{m=1}^{8} P_θ(U_m | U_{<m})    (4)

Decomposing further at token granularity gives formula (5):

P_θ(U_1, …, U_8) = ∏_{m=1}^{8} ∏_{n} P_θ(w_{m,n} | w_{m,<n}, U_{<m})    (5)

where U_{<m} = {U_1, …, U_{m-1}} and w_{m,<n} = {w_{m,1}, …, w_{m,n-1}}, i.e., the tokens before position n in subsequence U_m. The prediction follows a fixed four-round hierarchical recursion, whereas in open dialogue the number of turns is not fixed. Sampling from the model proceeds as in a standard dialogue model: one token at a time is sampled from the conditional distribution P_θ(w_{m,n} | w_{m,<n}, U_{<m}) generated conditioned on the previously sampled tokens, and a standard n-gram model is used to compute the joint probability of each complete furniture prediction.

In the training stage, since every decoder output has a known correct answer, no extra processing of the parameters is needed. In the testing stage, beam search is used to filter and rank the candidates; among all sequences satisfying the constraints, the sequence with the highest probability score is selected as the target output and, together with the input, forms the home design sequence of the current layer.
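A simplified sketch of beam search with a size constraint for one layer; the next-token interface, beam width, and the wall-length check are assumptions, since the patent only states that candidates violating the constraints are discarded and the highest-scoring sequence is kept:

```python
import math

def beam_search(step_probs, wall_length, sizes, beam_width=3, eos="_EOS", max_len=20):
    """step_probs(prefix) -> {token: probability} for the next token (assumed interface).
    Candidates whose accumulated furniture sizes exceed wall_length are pruned."""
    beams = [([], 0.0)]                                   # (token prefix, log-probability)
    finished = []
    for _ in range(max_len):
        candidates = []
        for prefix, score in beams:
            for tok, p in step_probs(prefix).items():
                new_prefix = prefix + [tok]
                used = sum(sizes.get(t, 0.0) for t in new_prefix)
                if used > wall_length:                    # parameter (size) constraint
                    continue
                cand = (new_prefix, score + math.log(max(p, 1e-12)))
                (finished if tok == eos else candidates).append(cand)
        if not candidates:
            break
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_width]
    best = max(finished + beams, key=lambda c: c[1])      # highest-scoring feasible sequence
    return best[0]
```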

The home schemes designed by the preceding models are then visualized in three dimensions to demonstrate the advantage of the deep-learning-based home model in learning styles. Unity3D is used for scene modeling: the predicted home sequence scheme is matched to the corresponding models in the model library and drawn in the Unity scene; modeling the scene through three-dimensional rendering makes the result more intuitive and vivid. According to the predicted sequence, the corresponding Chinese-style or European-style furniture is retrieved from the model library; the rendered result is natural and elegant with clear stylistic characteristics.

When the sequential home prediction model has been trained for about 60K steps, its perplexity stabilizes at about 1.21. Intuitively, this means that on average, when predicting the next furniture token, the model regards roughly one to two words as equally plausible choices, indicating that it can generally find a suitable next prediction. The model extracts the structural relations between furniture tokens, and the predicted home design schemes show reasonable scene plausibility.

In the bidirectional hierarchical home prediction model, after 50 epochs of training the perplexity stabilizes at about 2.52. The model can effectively predict the structural relations and size parameters between furniture tokens according to the parameter constraints, and the results are practically reasonable. A group of living-room design schemes predicted by the model is listed in table form; when the subsequences of the living-room prediction are concatenated in order, symmetric structures such as "flowerpot, television, flowerpot" appear, and "divan" is properly placed facing "television". Regarding the parameter constraints, the total length of the subsequence corresponding to each filtered parameter group never exceeds the length of the wall it is attached to, and, consistent with prior knowledge, the size parameters of different furniture types vary within an appropriate range; for example, "flowerpot" is never associated with the parameter "3.0".
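For reference, the perplexity values quoted here can be computed as the exponential of the average negative log-likelihood of the predicted tokens; a minimal sketch with made-up probabilities:

```python
import math

def perplexity(token_probs):
    """Perplexity = exp of the average negative log-likelihood of the predicted tokens."""
    nll = [-math.log(p) for p in token_probs]
    return math.exp(sum(nll) / len(nll))

# Made-up probabilities assigned by a model to the correct next furniture tokens.
print(perplexity([0.9, 0.7, 0.85, 0.95, 0.8]))   # a value close to 1 means confident predictions
```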

Table 1. Living-room prediction schemes


The above describes only preferred embodiments of the present invention, and the invention should not be limited to what is disclosed in the embodiments and the drawings. All equivalents or modifications made without departing from the spirit disclosed by the invention fall within the scope of protection of the invention.

Claims (1)

1. A home design method based on a deep neural network is characterized by comprising the following steps: the method comprises a sequential structure home prediction model and a bidirectional hierarchical home prediction model, and the implementation method comprises a data preprocessing stage, a training stage, a prediction stage and a model drawing stage;
the data preprocessing stage of the sequential structure home prediction model comprises the following steps:
step 1, manually labeling each group of collected home furnishing data in sequence to obtain a home furnishing sequence, wherein the sequence is divided into an input sequence and an output sequence to generate a complete home furnishing data set;
step 2, creating a corresponding home data set vocabulary table, wherein the vocabulary table comprises special marks and home marks;
the training stage of the sequential structure home prediction model comprises the following steps:
step 3, a sequence coding process, specifically: taking the input sequence of step 1 as the input of a recurrent neural network model and calculating a state sequence, wherein the state sequence retains the structural information of the input sequence largely intact;
step 4, sequence decoding process, which specifically comprises: weighting and summing the state sequences obtained in the step (3) to obtain a current state semantic vector, wherein the semantic vector is used for judging the position of the state sequence where the model needs to put more attention so that the structural information of the home is transmitted in all the neural network models, the current state semantic vector is used as the input of the cyclic neural network model, the probability distribution of an output sequence is obtained through calculation, and a home sequence prediction model with a sequential structure based on the cyclic neural network is obtained;
the prediction stage of the sequential structure household prediction model specifically comprises the following steps:
step 5, inputting a group of partial home sequence into the model obtained in the step 4, and generating and obtaining a complete home sequence;
the data preprocessing stage of the bidirectional layered home prediction model comprises the following steps:
step 6, adding size parameters into the home data set in the step 1, and creating a home data set with the size parameters;
step 7, parameterizing all household conventional sizes, expanding the parameterized household conventional sizes into the vocabulary in the step 2, and creating a hierarchical household model vocabulary, wherein the vocabulary comprises special marks and household marks;
step 8, a complete home sequence is defined as a set D of subsequences, wherein D = {U_1, U_2, …, U_{2k-1}, U_{2k}}; each subsequence U_m comprises a plurality of home model vocabulary items, i.e., U_m = {w_{m,1}, w_{m,2}, w_{m,3}, …, w_{m,n}}, where w_{m,n} denotes the word at position n in the m-th subsequence U_m and represents the content meaning of a home mark or the current behavior state represented by a special mark as in step 7; in the home sequence, U_{2i-1} represents an input subsequence and U_{2i} represents an output subsequence, wherein i = 1, 2, …, k;
the training phase of the bidirectional layered home prediction model and the modeling and implementation process of the layered recursive structure comprise the following steps:
step 9, setting the recursion times as k and setting the initial value of i as 1 in the home sequence set shown in the step 8;
step 10, the sequence coding process specifically comprises: taking the input subsequence U_{2i-1} of step 8 as the input of the recurrent neural network and calculating the coding vector corresponding to the input subsequence;
step 11, taking the coding vector corresponding to the input subsequence output in step 10 as the input of the recurrent neural network and calculating the context vector corresponding to that input subsequence;
step 12, the sequence decoding process specifically comprises: concatenating the context vector output in step 11 with the output subsequence U_{2i} of step 8 as the input vector of the recurrent neural network, and then predicting the output subsequence to generate the vocabulary probability distribution corresponding to the output subsequence;
step 13, setting i = i + 1, combining the output subsequence of step 12 and the next input subsequence U_{2i-1} as a new input subsequence, and iteratively executing steps 10 to 12 until all subsequences are traversed at i = k, so as to obtain a hierarchical recursive home prediction model based on the recurrent neural network;
the prediction stage of the bidirectional layered home prediction model comprises the following steps:
step 14, inputting the home subsequence based on the home prediction model in the step 13, and predicting a corresponding output subsequence;
step 15, screening and sequencing the prediction alternatives by using cluster searching, selecting one sequence with the highest probability score from all sequences meeting constraint conditions as target output, forming a current layered home prediction sequence with the target output and input, and finally predicting to generate a complete home sequence;
a model drawing stage, which specifically comprises the following steps: carrying out scene modeling by using three-dimensional software, retrieving a corresponding model from a model base for the predicted home sequence scheme, drawing in a scene, preparing data sets of different styles and types, and retrieving the home with the corresponding style from the model base according to the predicted sequence for drawing;
the model drawing stages of the sequential structure home prediction model and the bidirectional hierarchical home prediction model are the same.
CN201810781492.0A 2018-07-17 2018-07-17 Home design method based on deep neural network Expired - Fee Related CN108984904B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810781492.0A CN108984904B (en) 2018-07-17 2018-07-17 Home design method based on deep neural network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810781492.0A CN108984904B (en) 2018-07-17 2018-07-17 Home design method based on deep neural network

Publications (2)

Publication Number Publication Date
CN108984904A CN108984904A (en) 2018-12-11
CN108984904B true CN108984904B (en) 2022-09-20

Family

ID=64549284

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810781492.0A Expired - Fee Related CN108984904B (en) 2018-07-17 2018-07-17 Home design method based on deep neural network

Country Status (1)

Country Link
CN (1) CN108984904B (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109871604B (en) * 2019-01-31 2023-05-16 浙江工商大学 Indoor functional area division method based on deep adversarial network model
CN110135032B (en) * 2019-04-30 2020-12-18 厦门大学 A method and device for auxiliary clothing generation based on adversarial generative network
CN110457650B (en) * 2019-07-05 2023-12-22 泰康保险集团股份有限公司 Method, device, medium and electronic equipment for generating livability design
CN110826135A (en) * 2019-11-05 2020-02-21 广东博智林机器人有限公司 Home arrangement method and device, neural network construction method and storage medium
CN110751721B (en) * 2019-12-24 2020-10-30 广东博智林机器人有限公司 Furniture layout drawing generation method and device, computer equipment and storage medium
CN110874496A (en) * 2020-01-20 2020-03-10 广东博智林机器人有限公司 Building placement method and device based on reinforcement learning, storage medium and computer equipment
CN111523169B (en) * 2020-04-24 2023-06-13 广东博智林机器人有限公司 Decoration scheme generation method and device, electronic equipment and storage medium
WO2021217340A1 (en) * 2020-04-27 2021-11-04 Li Jianjun Ai-based automatic design method and apparatus for universal smart home scheme
CN111680421A (en) * 2020-06-05 2020-09-18 广东博智林机器人有限公司 Home decoration design method and device, electronic equipment and storage medium
CN112131629A (en) * 2020-08-12 2020-12-25 南京维狸家智能科技有限公司 An artificial intelligence-based home scene layout method
CN113256157A (en) * 2021-06-17 2021-08-13 平安科技(深圳)有限公司 Conference flow generation method and device, computer equipment and storage medium
CN114462207B (en) * 2022-01-07 2023-03-14 广州极点三维信息科技有限公司 Matching method, system, equipment and medium for home decoration template

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107241358A (en) * 2017-08-02 2017-10-10 重庆邮电大学 A kind of smart home intrusion detection method based on deep learning
CN107396322A (en) * 2017-08-28 2017-11-24 电子科技大学 Indoor orientation method based on route matching Yu coding and decoding Recognition with Recurrent Neural Network
CN108153158A (en) * 2017-12-19 2018-06-12 美的集团股份有限公司 Switching method, device, storage medium and the server of household scene

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8253731B2 (en) * 2006-11-27 2012-08-28 Designin Corporation Systems, methods, and computer program products for home and landscape design

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107241358A (en) * 2017-08-02 2017-10-10 重庆邮电大学 A kind of smart home intrusion detection method based on deep learning
CN107396322A (en) * 2017-08-28 2017-11-24 电子科技大学 Indoor orientation method based on route matching Yu coding and decoding Recognition with Recurrent Neural Network
CN108153158A (en) * 2017-12-19 2018-06-12 美的集团股份有限公司 Switching method, device, storage medium and the server of household scene

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Implementation of multi-label annotation of furniture in indoor images; Ma Tianyao; Computer Knowledge and Technology (电脑知识与技术); 2017-12-15 (No. 35); full text *

Also Published As

Publication number Publication date
CN108984904A (en) 2018-12-11

Similar Documents

Publication Publication Date Title
CN108984904B (en) Home design method based on deep neural network
CN108875807B (en) An image description method based on multi-attention and multi-scale
CN117571014B (en) A visual language navigation method combining image description and text generation
CN109522403B (en) A Method of Abstract Text Generation Based on Fusion Coding
CN112650886B (en) Cross-modal video time retrieval method based on cross-modal dynamic convolution network
CN111611274A (en) A database query optimization method and system
CN111753207B (en) A Review-Based Neural Graph Collaborative Filtering Method
CN112465929B (en) Image generation method based on improved graph convolution network
CN116643989A (en) A Defect Prediction Method Using Graph Structure for Deep Semantic Understanding
CN114168769B (en) Visual question-answering method based on GAT relation reasoning
CN111400494B (en) A sentiment analysis method based on GCN-Attention
CN110706303A (en) GANs-based face image generation method
CN115422388B (en) A visual dialogue method and system
CN113792594A (en) A method and device for locating language fragments in video based on contrastive learning
CN112347756A (en) A method and system for reasoning reading comprehension based on serialized evidence extraction
CN118136155A (en) Drug target affinity prediction method based on multi-modal information fusion and interaction
CN115204171A (en) Document-level event extraction method and system based on hypergraph neural network
CN118550907A (en) A self-supervised trajectory completion method for general scenarios
CN117458440A (en) Method and system for predicting generated power load based on association feature fusion
CN115017805A (en) Method and system for planning optimal path of nuclear retired field based on bidirectional A-x algorithm
CN119557844A (en) A decoupling method, representation method and representation system for cross-modal data
CN119557955B (en) A method and system for generating architectural planning images based on potential diffusion model
CN116524070A (en) Scene picture editing method and system based on text
CN119514643A (en) A method for adversarial generation of hierarchical road network topology based on pseudo-node attention
CN119261926A (en) A method and system for autonomous driving trajectory prediction based on streaming Transformer

Legal Events

PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee (granted publication date: 20220920)