
WO2018198320A1 - Wearable terminal display system, wearable terminal display method and wearable terminal display program - Google Patents


Info

Publication number
WO2018198320A1
WO2018198320A1 PCT/JP2017/016942 JP2017016942W WO2018198320A1 WO 2018198320 A1 WO2018198320 A1 WO 2018198320A1 JP 2017016942 W JP2017016942 W JP 2017016942W WO 2018198320 A1 WO2018198320 A1 WO 2018198320A1
Authority
WO
WIPO (PCT)
Prior art keywords
wearable terminal
restaurant
menu
display
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2017/016942
Other languages
English (en)
Japanese (ja)
Inventor
Shunji Sugaya (菅谷 俊二)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Optim Corp
Original Assignee
Optim Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Optim Corp filed Critical Optim Corp
Priority to PCT/JP2017/016942 priority Critical patent/WO2018198320A1/fr
Publication of WO2018198320A1 publication Critical patent/WO2018198320A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services
    • G06Q50/12Hotels or restaurants
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics

Definitions

  • the present invention relates to a wearable terminal display system, a wearable terminal display method, and a program for displaying a collected restaurant menu as augmented reality on the display board of a wearable terminal, for a restaurant that is visible through the display board.
  • Patent Document 1 provides an electronic menu system that facilitates menu selection according to one's own health condition.
  • however, because Patent Document 1 is a system that replaces a restaurant's tabletop terminal, it has the problem that one cannot grasp what menu a restaurant offers before entering it.
  • accordingly, an object of the present invention is to provide a wearable terminal display system, a wearable terminal display method, and a program that identify a restaurant from an image visible to a wearable terminal and display a menu collected for that restaurant as augmented reality on the display board of the wearable terminal.
  • the present invention provides the following solutions.
  • the invention according to the first feature provides a wearable terminal display system for displaying a restaurant menu on the display board of a wearable terminal, comprising: image acquisition means for acquiring an image of a restaurant that enters the field of view of the wearable terminal; identifying means for analyzing the image to identify the restaurant; collecting means for collecting a menu of the restaurant; and menu display means for displaying the menu as augmented reality on the display board of the wearable terminal, for the restaurant visible through the display board.
  • the invention according to the second feature provides a wearable terminal display method for displaying a restaurant menu on the display board of a wearable terminal, comprising: an image acquisition step of acquiring an image of a restaurant that enters the field of view of the wearable terminal; an identifying step of analyzing the image to identify the restaurant; a collecting step of collecting a menu of the restaurant; and a menu display step of displaying the menu as augmented reality on the display board of the wearable terminal, for the restaurant visible through the display board.
  • the invention according to the third feature provides a program that causes a computer to execute: an image acquisition step of acquiring an image of a restaurant that enters the field of view of a wearable terminal; an identifying step of analyzing the image to identify the restaurant; a collecting step of collecting a menu of the restaurant; and a menu display step of displaying the menu as augmented reality on the display board of the wearable terminal, for the restaurant visible through the display board.
  • according to the present invention, a restaurant's menu can be displayed on the display board of a wearable terminal before entering the restaurant.
  • FIG. 1 is a schematic diagram of a wearable terminal display system.
  • FIG. 2 is an example in which a restaurant menu is collected and displayed on the display board of the wearable terminal.
  • the wearable terminal display system of the present invention is a system that displays a collected menu as augmented reality on the display board of a wearable terminal, for a restaurant that is visible through the display board.
  • a wearable terminal is a terminal with a field of view, such as smart glasses or a head-mounted display.
  • FIG. 1 is a schematic diagram of a wearable terminal display system which is a preferred embodiment of the present invention.
  • the wearable terminal display system includes image acquisition means, identification means, collection means, and menu display means that are realized by a control unit reading a predetermined program.
  • although not shown, determination means, changing means, detection means, action result display means, position/direction acquisition means, estimation means, guideline display means, and selection reception means may likewise be provided. These may be application-based, cloud-based, or of another type.
  • Each means described above may be realized by a single computer or may be realized by two or more computers (for example, a server and a terminal).
  • the image acquisition means acquires an image of a restaurant that enters the field of view of the wearable terminal. The image may be captured with the camera of the wearable terminal, or with any other device capable of capturing such an image.
  • the image may be a moving image or a still image. To display the menu in real time, a real-time image is preferable.
  • the identifying means identifies the restaurant by analyzing the image. For example, it determines whether the restaurant is McDonald's, Yoshinoya, or Gusto. The restaurant can be identified from its storefront, store name, sign color, logo mark, and so on. When identifying every restaurant in the image would take too long, only the restaurant at the center of the field of view of the wearable terminal may be identified. Identifying only the restaurant at the center of the field of view greatly reduces the time required for identification.
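As one way to realize the center-of-view narrowing described above, the captured frame could simply be cropped to its central region before any analysis runs. The sketch below is a minimal illustration; the nested-list frame layout and the one-third crop ratio are assumptions for the example, not details from the publication.

```python
def crop_center(frame, frac=1 / 3):
    """Return the central `frac`-sized region of a frame.

    `frame` is a 2D list of pixel values (rows of columns). Only the
    restaurant near the wearer's line of sight survives the crop, so
    the later image analysis has far fewer pixels to examine.
    """
    h, w = len(frame), len(frame[0])
    ch, cw = max(1, round(h * frac)), max(1, round(w * frac))
    top, left = (h - ch) // 2, (w - cw) // 2
    return [row[left:left + cw] for row in frame[top:top + ch]]

# A 9x9 dummy frame whose "pixels" record their own coordinates:
frame = [[(r, c) for c in range(9)] for r in range(9)]
center = crop_center(frame)  # the 3x3 block around the middle
```

Cropping before identification trades coverage for speed, which matches the trade-off the text describes.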
  • the accuracy of the image analysis may be improved by machine learning. For example, machine learning is performed using past images of restaurants as training data.
  • the collecting means collects a menu corresponding to the restaurant.
  • Menus corresponding to restaurants may be collected with reference to a database in which menus are registered in advance.
  • menus may be collected by accessing Web contents previously linked to restaurants.
  • for example, a menu can be collected from Web content by assigning a URL that links the restaurant to that menu.
  • menus may also be collected from Web content found by searching for the restaurant on the Internet. For example, menus are sometimes posted on a restaurant's homepage and can be collected via an Internet search.
  • menus may be collected from SNS (social networking service) or word-of-mouth sites.
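The collection order suggested above (a pre-registered database first, then linked or searched Web content) could be sketched as a simple fallback chain. Everything in this snippet is invented for illustration; the publication does not name any database schema or data source.

```python
# Hypothetical menu sources for the collecting means.
MENU_DB = {  # database in which menus are registered in advance
    "Yoshinoya": ["Gyudon (regular)", "Gyudon (large)"],
}

WEB_CONTENT = {  # menus found on homepages, linked URLs, SNS, review sites
    "McDonald's": ["Big Mac", "McFries"],
}

def collect_menu(restaurant: str) -> list[str]:
    """Collect the menu for an identified restaurant.

    Prefer the registered database; fall back to collected Web
    content; return an empty list when nothing is known.
    """
    if restaurant in MENU_DB:
        return MENU_DB[restaurant]
    return WEB_CONTENT.get(restaurant, [])

menu = collect_menu("Yoshinoya")
```

A real implementation would query a datastore and fetch live Web pages; the fallback ordering is the point of the sketch.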
  • the menu display means displays the menu as an augmented reality on the display board of the wearable terminal with respect to the restaurant that can be seen through the display board.
  • for example, a menu drawn with broken lines is displayed as augmented reality on the display board of the wearable terminal, for a restaurant drawn with solid lines that is visible through the display board.
  • the solid lines represent the real object;
  • the broken lines represent the augmented reality.
  • the determination means determines whether the displayed menu has been browsed. Whether the menu has been browsed may be determined by acquiring an image of the viewer browsing and analyzing that image. It may also be determined from sensor information of the wearable terminal, or from sensor information of a device worn by the viewer.
  • the changing means changes the menu to a browsed state when it is determined to have been browsed, and changes the degree of attention so that the menu will be browsed when it is determined not to have been browsed. In this way, it can be grasped visually which menus have and have not been browsed. For example, a check may be placed in a check box of a browsed menu, or a stamp may be pressed on it.
  • the attention level may be changed by changing the menu color or size, or by pressing a stamp so that the menu stands out.
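The two-way state change above (mark browsed items; raise the attention level of unbrowsed ones) can be captured in a small state object. The class name, fields, and the numeric `emphasis` weight are assumptions made for this sketch.

```python
from dataclasses import dataclass

@dataclass
class MenuOverlay:
    """State of one menu item overlaid on the display board."""
    name: str
    browsed: bool = False
    emphasis: int = 1  # rendering weight: larger -> drawn more conspicuously

def update_after_determination(item: MenuOverlay, was_browsed: bool) -> MenuOverlay:
    # Browsed: mark it (rendered e.g. as a checked box or a stamp).
    # Not browsed: raise the attention level (color, size, stamp)
    # so the item stands out until it is looked at.
    if was_browsed:
        item.browsed = True
    else:
        item.emphasis += 1
    return item

item = update_after_determination(MenuOverlay("Beef bowl"), was_browsed=False)
```

A renderer would then map `browsed` to a check mark and `emphasis` to color or size, as the text suggests.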
  • the detecting means detects an action for the displayed menu.
  • the action is, for example, a gesture, a hand movement, a line-of-sight movement, or the like.
  • an action on a menu can be detected by acquiring an image of the viewer browsing and analyzing that image. It may also be detected from sensor information of the wearable terminal, or from sensor information of a device worn by the viewer.
  • the action result display means displays a result corresponding to the action as an augmented reality on the display board of the wearable terminal for a restaurant that can be seen through the display board.
  • the display of the menu may be deleted when an action for deleting the menu is detected.
  • the link may be opened.
  • other actions are possible.
  • the position/direction acquisition means acquires the terminal position and the imaging direction of the wearable terminal.
  • the terminal position can be acquired from the GPS (Global Positioning System) of the wearable terminal.
  • the imaging direction can be acquired from a geomagnetic sensor or an acceleration sensor of the wearable terminal when imaging is performed with the wearable terminal. It may also be acquired from other sources.
  • the estimation means estimates the target position of the restaurant based on the terminal position and the imaging direction. If the terminal position and the imaging direction are known, the store position of the imaged restaurant can be estimated.
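The estimation step above amounts to projecting from the terminal position along the compass bearing of the imaging direction. The sketch below uses a small-distance equirectangular approximation, which is adequate at street scale; the sighting distance `distance_m` is an assumed extra input (it might come from image scale or a depth sensor, which the publication does not specify).

```python
import math

EARTH_RADIUS_M = 6_371_000

def estimate_target_position(lat, lon, bearing_deg, distance_m):
    """Estimate the store position from the terminal position and
    imaging direction (bearing: 0 = north, 90 = east, in degrees)."""
    bearing = math.radians(bearing_deg)
    # Offsets in radians of latitude/longitude for a short hop.
    dlat = distance_m * math.cos(bearing) / EARTH_RADIUS_M
    dlon = distance_m * math.sin(bearing) / (
        EARTH_RADIUS_M * math.cos(math.radians(lat)))
    return lat + math.degrees(dlat), lon + math.degrees(dlon)

# Facing due east near Tokyo Station, a storefront roughly 50 m away:
store_lat, store_lon = estimate_target_position(35.6812, 139.7671, 90.0, 50.0)
```

For the distances involved here the approximation error is negligible compared with GPS error.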
  • the identifying means may identify a restaurant from the target position and the image analysis. Even among restaurants of the same chain, the menu sometimes differs from store to store. For example, if the store can be identified as the Tokyo Minato Ward store, a menu limited to that store can also be displayed.
  • the guideline display means displays a guideline for imaging the restaurant as an augmented reality on the display board of the wearable terminal.
  • guidelines such as a frame or a cross may be displayed. Image analysis is facilitated by having images taken along the guidelines.
  • the image acquisition means may acquire an image captured along the guideline.
  • a restaurant can be identified efficiently by acquiring and analyzing only the images taken along the guidelines.
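One hedged way to "acquire only the images taken along the guidelines" is to keep a shot only when the detected storefront roughly fills the on-screen guideline frame. The bounding-box representation, the guideline rectangle, and the tolerance below are all invented for this sketch.

```python
def within_guideline(box, guide, tol=0.15):
    """Return True if a detected storefront bounding box roughly
    matches the on-screen guideline frame, so only well-aimed shots
    are passed on to image analysis.

    Boxes are (left, top, right, bottom) in screen fractions 0..1.
    """
    return all(abs(b - g) <= tol for b, g in zip(box, guide))

GUIDE = (0.25, 0.25, 0.75, 0.75)  # a centered frame-type guideline

shots = {
    "aimed": (0.27, 0.24, 0.74, 0.78),  # storefront fills the frame
    "askew": (0.05, 0.40, 0.55, 0.95),  # off to one side
}
kept = [name for name, box in shots.items() if within_guideline(box, GUIDE)]
```

Discarding misaligned frames up front is what makes the downstream identification efficient, as the text notes.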
  • the selection accepting means accepts selection of a selection target among restaurants visible through the display board of the wearable terminal. For example, selection may be accepted when the viewer looks at a restaurant visible through the display board for a fixed period of time, when the viewer touches the restaurant visible through the display board, or when a cursor is placed on the restaurant. The selection may be detected by, for example, a line-of-sight sensor, a motion sensor, or an acceleration sensor.
  • the menu display means may display the menu as augmented reality on the display board of the wearable terminal only for the selected target visible through the display board. Because the menu is displayed as augmented reality only for the selected target, it can be grasped precisely. If menus were displayed for every identified restaurant, the screen of the display board could become cluttered.
  • the wearable terminal display method of the present invention is a method of displaying a collected menu as an augmented reality on a display board of a wearable terminal for a restaurant that can be seen through the display board.
  • the wearable terminal display method includes an image acquisition step, a specification step, a collection step, and a menu display step. Although not shown, similarly, a determination step, a change step, a detection step, an action result display step, a position / direction acquisition step, an estimation step, a guideline display step, and a selection reception step may be provided.
  • in the image acquisition step, an image of a restaurant that enters the field of view of the wearable terminal is acquired. The image may be captured with the camera of the wearable terminal, or with any other device capable of capturing such an image.
  • the image may be a moving image or a still image. To display the menu in real time, a real-time image is preferable.
  • in the identifying step, the restaurant is identified by analyzing the image. For example, it is determined whether the restaurant is McDonald's, Yoshinoya, or Gusto. The restaurant can be identified from its storefront, store name, sign color, logo mark, and so on. When identifying every restaurant in the image would take too long, only the restaurant at the center of the field of view of the wearable terminal may be identified. Identifying only the restaurant at the center of the field of view greatly reduces the time required for identification.
  • the accuracy of the image analysis may be improved by machine learning. For example, machine learning is performed using past images of restaurants as training data.
  • a menu corresponding to the restaurant is collected.
  • Menus corresponding to restaurants may be collected with reference to a database in which menus are registered in advance.
  • menus may be collected by accessing Web contents previously linked to restaurants.
  • for example, a menu can be collected from Web content by assigning a URL that links the restaurant to that menu.
  • menus may also be collected from Web content found by searching for the restaurant on the Internet. For example, menus are sometimes posted on a restaurant's homepage and can be collected via an Internet search.
  • menus may be collected from SNS (social networking service) or word-of-mouth sites.
  • the menu is displayed on the display board of the wearable terminal as an augmented reality for a restaurant that can be seen through the display board.
  • for example, a menu drawn with broken lines is displayed as augmented reality on the display board of the wearable terminal, for a restaurant drawn with solid lines that is visible through the display board.
  • the solid lines represent the real object;
  • the broken lines represent the augmented reality.
  • in the determination step, it is determined whether the displayed menu has been browsed. Whether the menu has been browsed may be determined by acquiring an image of the viewer browsing and analyzing that image. It may also be determined from sensor information of the wearable terminal, or from sensor information of a device worn by the viewer.
  • in the changing step, when it is determined that the menu has been browsed, the menu is changed to a browsed state; when it is determined that it has not been browsed, the degree of attention is changed so that the menu will be browsed.
  • in this way, it can be grasped visually which menus have and have not been browsed. For example, a check may be placed in a check box of a browsed menu, or a stamp may be pressed on it.
  • the attention level may be changed by changing the menu color or size, or by pressing a stamp so that the menu stands out.
  • the detecting step detects an action for the displayed menu.
  • the action is, for example, a gesture, a hand movement, a line-of-sight movement, or the like.
  • an action on a menu can be detected by acquiring an image of the viewer browsing and analyzing that image. It may also be detected from sensor information of the wearable terminal, or from sensor information of a device worn by the viewer.
  • the result corresponding to the action is displayed as augmented reality on the display board of the wearable terminal with respect to the restaurant that can be seen through the display board.
  • the display of the menu may be deleted when an action for deleting the menu is detected.
  • the link may be opened.
  • other actions are possible.
  • in the position/direction acquisition step, the terminal position and the imaging direction of the wearable terminal are acquired.
  • the terminal position can be acquired from the GPS (Global Positioning System) of the wearable terminal.
  • the imaging direction can be acquired from a geomagnetic sensor or an acceleration sensor of the wearable terminal when imaging is performed with the wearable terminal. It may also be acquired from other sources.
  • the estimation step estimates the target position of the restaurant based on the terminal position and the imaging direction. If the terminal position and the imaging direction are known, the store position of the imaged restaurant can be estimated.
  • the identifying step may identify a restaurant from the target position and the image analysis. Even among restaurants of the same chain, the menu sometimes differs from store to store. For example, if the store can be identified as the Tokyo Minato Ward store, a menu limited to that store can also be displayed.
  • the guideline for imaging the restaurant is displayed as augmented reality on the display board of the wearable terminal.
  • guidelines such as a frame or a cross may be displayed. Image analysis is facilitated by having images taken along the guidelines.
  • an image captured along the guideline may be acquired.
  • a restaurant can be identified efficiently by acquiring and analyzing only the images taken along the guidelines.
  • the selection accepting step accepts selection of a selection target among restaurants visible through the display board of the wearable terminal. For example, selection may be accepted when the viewer looks at a restaurant visible through the display board for a fixed period of time, when the viewer touches the restaurant visible through the display board, or when a cursor is placed on the restaurant. The selection may be detected by, for example, a line-of-sight sensor, a motion sensor, or an acceleration sensor.
  • in the menu display step, the menu may be displayed as augmented reality on the display board of the wearable terminal only for the selected target visible through the display board. Because the menu is displayed as augmented reality only for the selected target, it can be grasped precisely. If menus were displayed for every identified restaurant, the screen of the display board could become cluttered.
  • the means and functions described above are realized by a computer (including a CPU, an information processing apparatus, and various terminals) reading and executing a predetermined program.
  • the program may be, for example, an application installed on a computer, or may be provided in the form of SaaS (software as a service) from a computer via a network. It may also be provided in a form recorded on a computer-readable recording medium such as a flexible disk, a CD (CD-ROM or the like), or a DVD (DVD-ROM, DVD-RAM, or the like).
  • the computer reads the program from the recording medium, transfers it to the internal storage device or the external storage device, stores it, and executes it.
  • the program may be recorded in advance in a storage device (recording medium) such as a magnetic disk, an optical disk, or a magneto-optical disk, and provided from the storage device to a computer via a communication line.
  • for the machine learning, a nearest neighbor method, a naive Bayes method, a decision tree, a support vector machine, reinforcement learning, or the like may be used.
  • deep learning, in which feature quantities for learning are generated using a neural network, may also be used.
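Of the methods listed above, the nearest neighbor method is the simplest to illustrate. The sketch below classifies a restaurant by the nearest training example in a toy feature space; the feature vectors (meant to evoke sign-color statistics) and labels are invented for the example, not taken from the publication.

```python
import math

# Toy training set: (feature vector, label). In practice the features
# would be extracted from past restaurant images (sign color, logo
# shape, storefront layout, ...); these numbers are illustrative only.
TRAIN = [
    ((0.9, 0.1, 0.0), "McDonald's"),  # red/yellow signage
    ((0.9, 0.5, 0.1), "Yoshinoya"),   # orange signage
    ((0.2, 0.3, 0.8), "Gusto"),       # blue signage
]

def classify_1nn(features):
    """Label a query by its single nearest training example
    (1-nearest-neighbor, Euclidean distance)."""
    return min(TRAIN, key=lambda t: math.dist(features, t[0]))[1]

label = classify_1nn((0.85, 0.15, 0.05))
```

A production system would use many more examples and learned features (e.g. a neural network embedding, per the deep learning remark), but the decision rule is the same.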

Landscapes

  • Business, Economics & Management (AREA)
  • Tourism & Hospitality (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • Marketing (AREA)
  • General Business, Economics & Management (AREA)
  • Human Resources & Organizations (AREA)
  • Economics (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The problem addressed by the present invention is to identify a restaurant from an image in the field of view of a wearable terminal and to display menus collected for that restaurant as augmented reality on the display board of the wearable terminal. The solution is a wearable terminal display system for displaying a restaurant menu on the display board of a wearable terminal, comprising: image acquisition means for acquiring an image of a restaurant that enters the field of view of the wearable terminal; identifying means for analyzing the image to identify the restaurant; collecting means for collecting menus of the restaurant; and menu display means for displaying, as augmented reality on the display board of the wearable terminal, a menu of the restaurant visible through the display board.
PCT/JP2017/016942 2017-04-28 2017-04-28 Wearable terminal display system, wearable terminal display method and wearable terminal display program Ceased WO2018198320A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/016942 WO2018198320A1 (fr) 2017-04-28 2017-04-28 Wearable terminal display system, wearable terminal display method and wearable terminal display program


Publications (1)

Publication Number Publication Date
WO2018198320A1 true WO2018198320A1 (fr) 2018-11-01

Family

ID=63920176

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/016942 Ceased WO2018198320A1 (fr) 2017-04-28 2017-04-28 Wearable terminal display system, wearable terminal display method and wearable terminal display program

Country Status (1)

Country Link
WO (1) WO2018198320A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2021089458A (ja) * 2019-12-02 2021-06-10 Recruit Co., Ltd. Queue management system, user device, and program

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003132068A (ja) * 2001-10-22 2003-05-09 Nec Corp Navigation system and navigation terminal
JP2012216135A (ja) * 2011-04-01 2012-11-08 Olympus Corp Image generation system, program, and information storage medium
JP2014504413A (ja) * 2010-12-16 2014-02-20 Microsoft Corp Augmented reality display content based on comprehension and intent
JP2015176516A (ja) * 2014-03-18 2015-10-05 Fujitsu Ltd Display device, display control program, and display control method
JP2016039599A (ja) * 2014-08-11 2016-03-22 Seiko Epson Corp Head-mounted display device, information system, method of controlling head-mounted display device, and computer program



Similar Documents

Publication Publication Date Title
US11630861B2 (en) Method and apparatus for video searching, terminal and storage medium
TWI418763B (zh) Mobile imaging device as navigator
US9183583B2 (en) Augmented reality recommendations
US20180247361A1 (en) Information processing apparatus, information processing method, wearable terminal, and program
JP2010061218A (ja) Web advertisement effectiveness measurement device, web advertisement effectiveness measurement method, and program
Anagnostopoulos et al. Gaze-Informed location-based services
US20140330814A1 (en) Method, client of retrieving information and computer storage medium
CN106203292A (zh) Method, apparatus and mobile terminal for augmented reality processing of an image
WO2014176938A1 (fr) Method and apparatus for information retrieval
JP6887198B2 (ja) Wearable terminal display system, wearable terminal display method, and program
CN112634469B (zh) Method and apparatus for processing an image
US9911237B1 (en) Image processing techniques for self-captured images
JP2017204134A (ja) Attribute estimation device, attribute estimation method, and program
WO2018198320A1 (fr) Wearable terminal display system, wearable terminal display method, and program
WO2019003359A1 (fr) Wearable terminal display system, wearable terminal display method, and program
TWI661351B (zh) Digital content system combined with map services and digital content generation method
WO2019021446A1 (fr) Wearable terminal display system, wearable terminal display method, and program
WO2018216220A1 (fr) Wearable terminal display system, wearable terminal display method, and program
WO2018216221A1 (fr) Wearable terminal display system, wearable terminal display method, and program
WO2019021447A1 (fr) Wearable terminal display system, wearable terminal display method, and program
JP6762470B2 (ja) Wearable terminal display system, wearable terminal display method, and program
JP5541868B2 (ja) Image retrieval command system and operation control method therefor
CN103870601A (zh) Method and device for identifying and highlighting webpage content
JP6343412B1 (ja) Map-linked sensor information display system, map-linked sensor information display method, and program
CN112035692B (zh) Picture information search method and apparatus, computer system, and readable storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17906802

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17906802

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP