HK1175339A - Imaging device, information acquisition system, and program
Description
Technical Field
The present invention relates to an imaging apparatus having a function of acquiring positional information of an object when the object is imaged and associating the object with that positional information, and to an information acquisition system and a program for acquiring information about the object using the positional information.
The present application claims priority based on Japanese Patent Application No. 2010-025998, filed on February 8, 2010, the contents of which are incorporated herein.
Background
Conventionally, there has been known a portable information device such as an imaging apparatus (for example, a digital camera) having an imaging function, which detects positional information where the portable information device is located and processes captured image data in accordance with the detected positional information (see, for example, patent document 1).
[Patent Document 1] Japanese Patent Application Laid-Open No. 2004-15187
Disclosure of Invention
However, with conventional techniques, including that of Patent Document 1, the user cannot easily obtain information about the object (for example, a building) that the user is shooting.
An aspect of the present invention is to provide an imaging apparatus, an information acquisition system, and a program that can enrich the user's activities.
An imaging device according to an aspect of the present invention includes: an imaging unit that images an object; a position information acquiring unit that acquires position information of a photographing position; a control unit for acquiring information on the subject based on the position information and displaying the image data of the subject and the information on the subject on a display unit; and a holding control unit that outputs a holding control signal for holding the image data of the subject and the information on the subject to the control unit.
An imaging device according to an aspect of the present invention includes: a latitude and longitude detecting unit that detects latitude and longitude information of its own position; an azimuth angle detection unit that detects the azimuth angle of the captured image data; a control unit that, using the latitude and longitude information and the azimuth angle, acquires AR information attached to buildings within a latitude and longitude range in the azimuth direction from the position indicated by the latitude and longitude information, and displays the AR information on a display unit; and a holding control unit that, upon detecting an operation for storing the AR information in a storage unit, outputs to the control unit a holding control signal for storing the AR information and the image data in the storage unit.
An information acquisition system according to an aspect of the present invention includes an imaging device and an information retrieval system; the imaging device is any one of the above imaging devices.
An information acquisition system according to an aspect of the present invention is constituted by an imaging device and an information retrieval system that, based on latitude and longitude information and an azimuth angle transmitted from the imaging device, extracts buildings located within a latitude and longitude range in the azimuth direction from the position indicated by the latitude and longitude information, and transmits information attached to the extracted buildings to the imaging device. The imaging device includes: a latitude and longitude detecting unit that detects latitude and longitude information of its own position; an azimuth angle detection unit that detects the azimuth angle of the captured image data; a control unit that, using the latitude and longitude information and the azimuth angle, acquires the AR information attached to buildings within the latitude and longitude range in the azimuth direction and displays it on a display unit; and a holding control unit that, upon detecting an operation for storing the AR information in a storage unit, outputs to the control unit a holding control signal for storing the AR information and the image data in the storage unit. The information retrieval system includes: a database storing map data that associates the building identification number of each building with the latitude and longitude information of that building, and a building table that associates each building identification number with the AR information of the building indicated by that number; and an information search server that, based on the latitude and longitude information and the azimuth angle transmitted from the imaging device, searches the map data for the building identification numbers of buildings located within the latitude and longitude range in the azimuth direction, reads from the building table the AR information attached to the buildings indicated by the found building identification numbers, and transmits the read AR information of those buildings to the imaging device.
A program according to an aspect of the present invention is a program for causing a computer to execute a function of any one of the above-described imaging devices, the program causing the computer to execute: inputting position information of a photographing position for photographing an object; acquiring information related to the subject based on the position information; displaying the image data of the subject and information related to the subject on a display unit; and outputting a hold control signal for holding the image data of the subject and the information on the subject to a control unit.
A program according to an aspect of the present invention is a program for causing a computer to execute the functions of any one of the above-described imaging devices, the program causing the computer to execute: inputting the latitude and longitude information of its own position detected by the latitude and longitude detecting unit; inputting the azimuth angle of the captured image data detected by the azimuth angle detecting unit; acquiring, based on the latitude and longitude information and the azimuth angle, AR information attached to buildings within a latitude and longitude range in the azimuth direction, and displaying it on a display unit; and, upon detecting an operation for storing the AR information in a storage unit, outputting to the control unit a holding control signal for storing the AR information and the image data in the storage unit.
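The program aspects above amount to a short processing loop. The following is a minimal illustrative sketch in Python; every helper here is a hypothetical stand-in for the units described in the text, not an actual camera API.

```python
# Minimal sketch of the claimed program flow; all helpers are hypothetical
# stand-ins for the units described above, not a real device interface.

def get_position():
    # latitude/longitude detecting unit (stub values for illustration)
    return (35.6586, 139.7454)

def get_azimuth():
    # azimuth angle detection unit (stub), degrees clockwise from north
    return 90.0

def fetch_ar_info(position, azimuth):
    # control unit: acquire AR information for buildings in the
    # latitude/longitude range in the azimuth direction (stubbed result)
    return [{"building": "Restaurant A", "info": "French restaurant"}]

def main():
    position = get_position()
    azimuth = get_azimuth()
    ar_info = fetch_ar_info(position, azimuth)
    print("display:", ar_info)  # display unit: image data plus AR markers
    # holding control unit: on a hold operation, store the AR information
    # together with the image data
    held = {"image": "frame.jpg", "ar_info": ar_info}
    print("stored:", held)

if __name__ == "__main__":
    main()
```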
According to an aspect of the present invention, an imaging apparatus, an information acquisition system, and a program that can enrich the user's activities are provided.
Drawings
Fig. 1 is a block diagram showing an example of the configuration of an information acquisition system according to an embodiment of the present invention.
FIG. 2 is a conceptual diagram illustrating an example of a building table stored in the database of FIG. 1.
Fig. 3 is a flowchart showing the flow of the information search processing for an image being captured by the digital camera.
FIG. 4 is a conceptual diagram of a display image displayed on a digital camera.
FIG. 5 is a conceptual diagram of a display image displayed on a digital camera.
FIG. 6 is a conceptual diagram of a display image displayed on a digital camera.
Fig. 7 is a flowchart showing the flow of the information search processing for images captured by the digital camera and read from the storage unit.
Fig. 8 is a flowchart showing a process flow of retrieving information from store information input to the digital camera.
Fig. 9 is a block diagram showing an information acquisition system having a function of acquiring and storing AR (Augmented Reality) information.
Fig. 10 is a diagram showing a composite image on a display unit of a real image of a building and a marker (virtual description) image in which each piece of AR information of the building is described.
Fig. 11 is a diagram showing an AR information table stored in the AR information storage unit in fig. 9.
Fig. 12 is a diagram showing a search range for searching for AR information of a structure.
Fig. 13 is a flowchart showing an example of the operation of the information retrieval system in fig. 9.
[Description of Main Element Symbols]
1: digital camera; 2: information retrieval system; 3: wireless base station; 4: information communication network;
11: control unit; 12: transmitting/receiving unit; 13: imaging unit; 14: GPS; 15: storage unit; 16: orientation sensor;
17: display unit; 18: timer; 19: navigation unit; 21: information search server; 22: database;
30: AR information storage unit; 31: holding control unit
Detailed Description
Hereinafter, an image capturing apparatus and an information acquisition system according to an embodiment of the present invention will be described with reference to the drawings. Fig. 1 is a block diagram showing an example of the configuration of the imaging device and the information acquisition system according to the present embodiment.
In fig. 1, the information acquisition system is composed of a digital camera 1, which is an example of the imaging device, and an information search system 2. The digital camera 1 performs data communication with a wireless base station 3 by wireless communication, and transmits and receives data to and from the information search system 2 (or an information search server 21 described later) via the wireless base station 3 and an information communication network 4 such as the internet. The imaging device of the present embodiment is a mobile terminal having an imaging function, such as the above-described digital camera 1 or a camera-equipped mobile phone. For example, the digital camera 1 has a browsing function and includes a control unit 11, a transmitting/receiving unit 12, an imaging unit 13, a GPS (Global Positioning System) 14, a storage unit 15, an orientation sensor 16 (azimuth angle detection unit), a display unit 17, and a timer 18. The wireless communication in the present embodiment includes communication that uses radio waves, light, sound waves, or the like as a transmission path, without using a wire.
The imaging unit 13 includes a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) image sensor, a lens, and the like, and outputs image data obtained by imaging an object (the subject to be imaged) to the control unit 11. In the following description, the subject of the present embodiment is, as an example, a building (a commercial building such as a shop, a public building such as a school or a hospital, a factory, a house such as an apartment, or an artificial structure such as a tower, a bridge, a dam, an amusement park, or a bronze statue). When a control signal instructing photographing (for example, a control signal output from an unillustrated peripheral circuit that has detected that the photographing button B1 was pressed) or a control signal instructing an information search (for example, a control signal output from an unillustrated peripheral circuit that has detected that the user touched and selected an image (icon) for search display or the like) is input, the GPS 14 (position information acquiring unit, latitude and longitude detecting unit) acquires information on the latitude and longitude of the digital camera 1 (latitude and longitude information, position information) and outputs it to the control unit 11. The position information acquiring unit (latitude and longitude detecting unit) of the present embodiment may have another configuration as long as it can acquire the position of the digital camera 1; for example, the position information of the digital camera 1 may be acquired by using the wireless base station 3.
The azimuth sensor 16 (azimuth angle detection unit, azimuth information acquiring unit) is constituted by an electronic compass or the like. When a control signal instructing photographing (for example, a control signal output from an unillustrated peripheral circuit that has detected that the photographing button B1 was pressed) or a control signal instructing an information search (for example, a control signal output from an unillustrated peripheral circuit that has detected that the user touched and selected an image (icon) for search display or the like) is input, the azimuth sensor 16 detects the azimuth of the direction in which, for example, the CCD and the lens of the imaging unit 13 are arranged in series (that is, the optical axis direction), and outputs the detected azimuth to the control unit 11 as azimuth information. The azimuth sensor 16 may instead detect the azimuth of the direction of the subject as viewed from the user (that is, the shooting direction) and output it to the control unit 11 as azimuth information. Here, when the azimuth angle is used as the azimuth information, it is expressed, for example, in units of degrees, minutes, and seconds, with the latitude and longitude of the position information as the center point, north as the reference (0 degrees), and measured clockwise through east (90 degrees), south (180 degrees), and west (270 degrees).
When a control signal indicating that the user has pressed the photographing button B1 or the like is input from an unillustrated peripheral circuit, the control unit 11 gives image identification information to the captured image data, and writes the captured image data into the storage unit 15 in photographing order, in association with that image identification information, together with the time data acquired from the timer 18, the position information, and the azimuth information.
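As a concrete illustration of the association performed here, one record written to the storage unit 15 might look like the following Python sketch; the field names and format are hypothetical, since the patent does not specify a storage layout.

```python
from dataclasses import dataclass

@dataclass
class CapturedImageRecord:
    """One entry written to the storage unit 15 per shot (illustrative)."""
    image_id: str        # image identification information given by the control unit 11
    image_path: str      # the captured image data
    captured_at: str     # time data acquired from the timer 18
    latitude: float      # position information acquired from the GPS 14
    longitude: float
    azimuth_deg: float   # azimuth information from the azimuth sensor 16,
                         # degrees clockwise from north (north = 0, east = 90)

record = CapturedImageRecord("IMG0001", "/dcim/0001.jpg",
                             "2010-02-08T18:30:00", 35.6586, 139.7454, 90.0)
print(record)
```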
The display unit 17 displays the image data captured by the imaging unit 13 or the image data read from the storage unit 15 by the control unit 11 when the user selects an image to be displayed.
The display unit 17 displays the video data received from the information search system 2, the character data input by the user, and the like under the control of the control unit 11 as described later.
The transmitting/receiving unit 12 transmits/receives data such as image data, text data, or control signals to/from the information search system 2 via the information communication network 4 by performing wireless communication with the wireless base station 3.
Next, the information search system 2 includes an information search server 21 and a database 22 as shown in fig. 1. The database 22 may be provided in a storage medium (for example, a memory or an HDD) in the information search server 21, or may be provided in an external storage medium or a storage medium of another terminal.
As shown in the building table of fig. 2, the database 22 stores, for each building: building identification information for identifying the building, the building name, building information (information such as the address, telephone number, type, and peripheral image data centered on the building), position information such as the latitude and longitude of the building, a description of the building (in the case of a store, information provided by the store), and posting information (comments such as evaluations by visiting users, image data posted by users, and the like).
The database 22 also stores map data in which the registered buildings are arranged on a two-dimensional plane whose coordinate axes are latitude and longitude. In the map data, each building is placed, by its building identification information, at the position of its corresponding latitude and longitude.
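A minimal sketch of how the building table of fig. 2 and the map data might be modeled follows; the actual schema is not disclosed in the patent, so all names and values here are hypothetical.

```python
# Hypothetical in-memory model of the database 22: a building table keyed by
# building identification information, plus map data placing each building at
# its latitude/longitude on a two-dimensional plane.

building_table = {
    "B001": {
        "name": "Restaurant A",
        "info": {"address": "1-2-3 Example St.", "phone": "03-0000-0000",
                 "type": "French", "peripheral_image": "b001_area.jpg"},
        "latlon": (35.6590, 139.7460),
        "description": "In-store description text",
        "posts": ["Great dinner!", "photo_by_user.jpg"],
    },
}

# Map data: building identification numbers arranged by latitude/longitude.
map_data = [("B001", 35.6590, 139.7460)]
```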
The information search server 21 searches for a building located at the closest distance in the direction indicated by the azimuth information from the latitude and longitude position indicated by the position information using the inputted position information and azimuth information, and acquires building identification information of the searched building.
The information search server 21 searches the building table for a building corresponding to the building identification information of the searched building, and transmits information (building name, building information, etc.) of the searched building to the digital camera 1. The information search server 21 may selectively transmit each piece of information of the building to be transmitted to the digital camera 1, as necessary, for example, in accordance with the transmission/reception data capacity. At this time, the information search server 21 may transmit the remaining information among the pieces of information of the building to the digital camera 1 again by a predetermined operation (for example, a request from the digital camera 1) or the like.
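The search in the two preceding paragraphs can be sketched as follows: from the camera's latitude/longitude, find the nearest registered building lying in the direction indicated by the azimuth information. The angular tolerance and the planar distance approximation are assumptions for illustration; the patent does not specify a matching rule.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial bearing from point 1 to point 2, degrees clockwise from north."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return (math.degrees(math.atan2(y, x)) + 360.0) % 360.0

def find_nearest_in_direction(cam_lat, cam_lon, azimuth, map_data, tol=15.0):
    """Return the identification number of the closest building whose bearing
    from the camera is within +/- tol degrees of the azimuth (tol assumed)."""
    best, best_dist = None, float("inf")
    for bid, lat, lon in map_data:
        diff = abs((bearing_deg(cam_lat, cam_lon, lat, lon) - azimuth + 180) % 360 - 180)
        dist = math.hypot(lat - cam_lat, lon - cam_lon)  # planar approx., short range
        if diff <= tol and dist < best_dist:
            best, best_dist = bid, dist
    return best

map_data = [("B001", 35.6590, 139.7460)]
print(find_nearest_in_direction(35.6586, 139.7454, 50.0, map_data))  # -> "B001"
```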
<Information search for an image being captured by the digital camera 1>
Next, the operation of the present embodiment will be described with reference to fig. 1, fig. 3, and figs. 4 to 6. Fig. 3 is a flowchart showing an operation example of the information search processing of the present embodiment in the case where the digital camera 1 is pointed at a building and an information search is performed during imaging. Figs. 4 to 6 show the screen displayed on the display unit 17 of the digital camera 1 and the arrangement of the photographing button B1 of the digital camera.
In the following description, as an example, a state will be described in which the user searches for a restaurant at which to have dinner while walking in a downtown area.
While in the downtown area, the user points the lens of the digital camera 1 at a restaurant with an appealing appearance (step S1), and when the user finds a restaurant whose information the user wants, that is, as shown in fig. 4, the user touches (presses) the browsing image I1 provided in the image display field MP of the digital camera 1, thereby selecting the browsing image I1 (step S2).
At this time, in the imaging state, the control unit 11 displays the image data of the restaurant being captured, input from the imaging unit 13, in the image display field MP via the display unit 17, as shown in fig. 4. The image display field MP (image display unit) in the present embodiment is configured as a touch panel (of, for example, a pressure-sensitive type, an electrostatic type, or a sound wave type).
In step S2, the user may instead press the photographing button B1 first, and then touch the browsing image I1 while viewing the image display field MP in which the captured restaurant image is displayed.
By the touch on the browsing image I1, a signal indicating that the browsing image I1 has been selected is output from an unillustrated peripheral circuit to the control unit 11. The control unit 11 thereby detects that the browsing image I1 has been selected, gives image identification information to the restaurant image data being captured by the imaging unit 13, and writes and stores the image data and the imaging time in the storage unit 15 in association with the image identification information (step S3).
When the browsing image I1 is touched, the control unit 11 also writes and stores the position information acquired by the GPS 14 and the azimuth information acquired by the azimuth sensor 16 in the storage unit 15 as related information, in association with the image identification information (step S4).
Next, after writing the image data and the related information into the storage unit 15, the control unit 11 transmits to the information search system 2, via the transmitting/receiving unit 12, a search request signal including camera identification information for identifying the digital camera 1 (user identification information given to the user, or identification information given to the digital camera 1), the position information, and the azimuth information (step S5). The control unit 11 may instead transmit the search request signal to the information search system 2 via the transmitting/receiving unit 12 without writing the image data, the related information, and the like into the storage unit 15 in steps S3 to S5.
Next, in the information search system 2, upon receiving the search request signal from the digital camera 1, the information search server 21 extracts the building to be searched for from the map data in the database 22, based on the position information and the azimuth information included in the search request signal (step S6). Then, using the building identification information of the extracted building, the information search server 21 transmits each piece of information in the building table of fig. 2, including the building identification information, to the digital camera 1 as search result information (step S7). At this time, the information search server 21, for example, reads the network address of the digital camera 1 added when the search request signal was transmitted from the digital camera 1 via the transmitting/receiving unit 12, and transmits the search result information to that address.
In step S6, the information search server 21 may extract the building to be searched for from the map data in the database 22, based on the position information and the azimuth information included in the search request signal, only when it detects that the camera identification information included in the search request signal is registered in the user registration table stored in the database 22. In this case, when it detects that the camera identification information included in the search request signal is not registered in the user registration table, the information search server 21 may, for example, transmit information requesting user registration to the digital camera 1 based on the camera identification information.
Next, in the digital camera 1, when the control unit 11 receives the search result information from the information search system 2 via the transmitting/receiving unit 12, it displays the building information in the information display field SP via the display unit 17, as shown in fig. 5 (step S8). For example, when the building is a restaurant, information on what kind of restaurant it is (Chinese, Japanese, French, Italian, and the like), its telephone number, its address, and so on are displayed. The information display field SP shown in fig. 5 is only an example of the present embodiment, and the information display field SP may be displayed superimposed on a part of the restaurant image. The information display field SP is a part of the image display field MP, and is displayed when, for example, the browsing image I1 or a CM image I2 described later is touched and selected.
When the user touches the CM image I2, the peripheral circuit outputs a signal indicating that the CM image I2 is selected to the control unit 11.
The control unit 11 thereby detects that the CM image I2 has been selected by receiving the signal indicating its selection, and displays the advertisement information of the restaurant (menu, number of seats, in-store video, today's chef's recommendations, and the like) included in the search result information in the information display field SP via the display unit 17. When the advertisement information includes video data such as in-store video, the control unit 11 displays that video data in the information display field SP.
When the user touches posting browsing image I3, the peripheral circuit outputs a signal indicating that posting browsing image I3 has been selected to control unit 11.
The control unit 11 thereby detects that the posting browsing image I3 has been selected by receiving the signal indicating its selection, and displays the posting information (comments) written by other users, included in the search result information, in the information display field SP via the display unit 17. When there are a plurality of pieces of posting information, the control unit 11 displays them in the information display field SP in, for example, order of posting date.
When the posting information also includes image data, such as a photograph of each food taken by another user or image data of the inside of a store, the control unit 11 displays the image data in the information display field SP via the display unit 17.
Next, in fig. 5, when the control unit 11 detects a signal indicating that the information display field SP has been touched so as to slide, for example, from its left end toward the right, it ends the display of the information display field SP and displays the image of the restaurant in the image display field MP via the display unit 17, as shown in fig. 4. Also, for example, when a signal indicating that the posting browsing image I3 has been selected is detected while the posting information is being displayed in the information display field SP, the control unit 11 may end the display of the information display field SP or of the posting information.
The control unit 11 may display the advertisement information together with the building information when the advertisement information (or the posting information) is displayed after the building information is displayed in the information display field SP, or may display the advertisement information after the building information is deleted from the information display field SP.
When detecting a signal indicating that the in-store image displayed on the information display field SP has been touched, the control unit 11 displays an enlarged image or a reduced image of the in-store image on the information display field SP (or the image display field MP).
When the user touches the mail image I4, the peripheral circuit outputs a signal indicating that the mail image I4 is selected to the control unit 11.
The control unit 11 thereby detects that the mail image I4 has been selected by receiving the signal indicating its selection, and displays the input field IP shown in fig. 6 on a part of the image display field MP via the display unit 17. The input field IP is composed of a character field T into which the user inputs information and the like, and a touch-screen keyboard section K. The input field IP is used by the user to input the posting information described later, the store name, the year, and the like.
The user writes posting information, such as the food the user ate, the atmosphere in the store, or the quality of service of the store staff, into the character field T of the input field IP using the touch-screen keyboard section K. Then, when the user touches the mail image I4 again, the control unit 11 detects that the mail image I4 has been selected, and transmits the character information written in the character field T and the image data displayed in the image display field MP, as posting information, to the information search system 2 via the transmitting/receiving unit 12 together with the camera identification information and the building identification information.
Upon receiving the posting information from the digital camera 1, the information search server 21 writes the newly input posting information into the posting column of the building table of the database 22, in correspondence with the building identification information. Before writing the posting information written in the character field T into the posting column of the building table of the database 22, the information search server 21 may first detect whether the camera identification information included in the received signal is registered in the user registration table of the database 22, and write the posting information only when that camera identification information is found to be registered.
Next, the control unit 11 detects whether or not the end image E has been selected (step S9). Here, when the user touches the end image E, the peripheral circuit outputs a signal indicating that the end image E has been selected to the control unit 11.
Next, the control unit 11 detects that the end image E has been selected by receiving a signal indicating that the end image E has been selected, and ends the information search process.
On the other hand, when no signal indicating that the end image E has been selected is input, that is, when selection of the end image E is not detected, the control unit 11 returns the process to step S1 and continues the imaging processing and the information search processing of the digital camera 1.
When the browsing image I1 is selected, the control unit 11 may include the captured image data in the search request signal transmitted to the information search system 2. In that case, the information search server 21 may compare, by image recognition, the peripheral image data of the building information found in the building table (based on the building identification information of the building extracted from the map data) with the transmitted image data: it extracts a feature amount from the image data, compares it with the feature amount of the stored peripheral image data, and detects whether the similarity is equal to or greater than a set value, thereby determining whether the peripheral image data corresponds to the photographed building.
At this time, when the comparison result indicates that the building in the peripheral image data and the building in the image data are not the same, the information search server 21 may extract the next closest building from the map data using the position information and the azimuth information, compare the feature amounts again, and evaluate the similarity between the peripheral image data of the newly extracted building and the image data being captured.
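The comparison described in the two preceding paragraphs can be sketched as follows. The feature extractor and the similarity measure are assumptions chosen for brevity; the patent only requires that some similarity value be compared against a set threshold.

```python
import numpy as np

def extract_features(image: np.ndarray) -> np.ndarray:
    """Toy feature extractor: a normalized grayscale histogram. A real
    system would use a more robust image-recognition feature."""
    hist, _ = np.histogram(image, bins=64, range=(0, 255))
    return hist / max(hist.sum(), 1)

def is_same_building(captured: np.ndarray, peripheral: np.ndarray,
                     threshold: float = 0.9) -> bool:
    """Compare feature vectors; treat the buildings as matching when the
    similarity is equal to or greater than the set value (threshold assumed)."""
    a, b = extract_features(captured), extract_features(peripheral)
    similarity = float(np.dot(a, b) /
                       (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))
    return similarity >= threshold

img = np.random.randint(0, 256, (64, 64))
print(is_same_building(img, img))  # identical images -> True
```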
The browsing image I1 of the present embodiment may instead be provided as a browsing button (not shown) on the digital camera 1, similarly to the photographing button B1. In that case, the function of the browsing button is the same as that of the browsing image I1. The photographing button B1 of the present embodiment may also be made to function as the browsing button by operating an unillustrated switching unit, in which case neither the browsing image I1 nor the browsing button is needed.
<Information search for an image captured by the digital camera 1 and read from the storage unit 15>
Next, the operation of the present embodiment will be described with reference to fig. 1, fig. 7, and figs. 4 to 6. Fig. 7 is a flowchart showing an operation example of the information search processing of the present embodiment in the case where, after the digital camera 1 has photographed a building, the image data stored in the storage unit 15 is read out and information is searched for based on that image data.
The following describes an example of a state in which the user, while walking around a downtown area, captures image data of a plurality of restaurants, and then searches among them for a restaurant at which to have dinner.
While in the downtown area, the user points the digital camera 1 at a restaurant with an appealing appearance (step S11), and when the user finds a restaurant whose information the user may later want to retrieve, presses the photographing button B1 (shutter) provided on the digital camera 1 (step S12). Here, the control unit 11 displays the image data of the restaurant captured by the imaging unit 13 in the image display field MP of the display unit 17, as shown in fig. 4.
Next, when the user presses the photographing button B1, the peripheral circuit outputs a signal indicating that the photographing button has been pressed to the control unit 11.
Next, the control unit 11 detects that the photographing button B1 has been pressed by receiving the signal indicating the press, gives image identification information to the restaurant image data being captured by the imaging unit 13, and writes and stores the image data and the photographing time in the storage unit 15 in association with the image identification information (step S13).
When the photographing button B1 is pressed, the control unit 11 also writes and stores the position information acquired by the GPS 14 and the azimuth information acquired by the azimuth sensor 16 in the storage unit 15 in association with the image identification information (step S14).
Next, when the user wants to collect information about the photographed restaurant, the user touches the readout image R for the photographed image. The peripheral circuit then outputs a signal indicating that the readout image R has been selected to the control unit 11. Here, the control unit 11 determines whether an image is stored in the storage unit 15 according to whether the signal indicating that the readout image R has been selected has been input (step S15).
At this time, when the signal indicating that the readout image R has been selected is output, the control unit 11 detects that the readout image R has been selected and starts the information search, and therefore the process proceeds to step S16. The readout image R may be a readout button (not shown). In this case, the readout button may be provided on the digital camera 1 body as in the case of the photographing button B1.
On the other hand, when the signal indicating that the readout image R has been selected is not input, or when no image is stored in the storage unit 15, the control unit 11 returns the process to step S11 to perform new imaging processing.
Next, when the user touches the readout image R and the signal indicating that the readout image R has been selected is input from the peripheral circuit, the control unit 11 sequentially reads out the image data from the storage unit 15 in time-series order of shooting (step S16), and displays the image data as photo images in the image display field MP via the display unit 17, a preset number at a time (or one piece of image data at a time).
When the photo images do not all fit on one page (or when they are displayed one piece of image data at a time), touching the image displayed in the image display field MP so as to slide it in a predetermined direction displays the photo images of the previous page or the next page. At this time, the control unit 11 detects the signal indicating that the image displayed in the image display field MP has been touched so as to slide in the predetermined direction, and displays the photo images of the previous or next page in the image display field MP via the display unit 17.
Next, when the user selects a restaurant image of interest from the photo images by touching it and further touches the browsing image I1, the peripheral circuit outputs the image identification information of the selected image and a signal indicating that the browsing image I1 has been selected to the control unit 11.
When the signal indicating that the browsing image I1 has been pressed is input, the control unit 11 thereby reads out the position information and the azimuth information corresponding to the image identification information of the selected image data from the storage unit 15, and transmits a search request signal including the camera identification information, the position information, and the azimuth information to the information search system 2 via the transmitting/receiving unit 12 (step S17).
The following processing of steps S6 to S8 is the same as the processing of steps S6 to S8 in fig. 3, and therefore, the description thereof is omitted. Next, the control unit 11 detects whether or not the end image E has been selected (step S18). At this time, when the user touches the end image E and a signal indicating that the end image E has been selected is input from the peripheral circuit, the control unit 11 ends the browsing process. On the other hand, when the signal indicating that the end image E has been selected is not input, the control unit 11 returns the process to step S16 to continue the video selection process from the photo video.
<Information search based on store information input into the digital camera 1>
Next, the operation of the present embodiment will be described with reference to fig. 1, fig. 8, and figs. 4 to 6. Fig. 8 is a flowchart showing an operation example of the information search processing of the present embodiment in the case where the user inputs a store name into the digital camera 1 and searches for information on, for example, a restaurant with that store name, from the building table of fig. 2 stored in the database 22.
In the following description, as an example, a state will be described in which the user, out at night, wants to check information about a restaurant that the user heard about from a friend. When the user selects the search image I5 by touching it, the peripheral circuit outputs a signal indicating that the search image I5 has been selected to the control unit 11.
In this way, the control unit 11 detects the fact that the search image I5 has been selected by receiving a signal indicating that the search image I5 has been selected, and displays the input field IP shown in fig. 6 in a part of the video display field MP via the display unit 17.
Next, the user writes the shop name of the restaurant to be searched in the character field T of the input field IP using the touch screen type keyboard section K (step S21), and the user touches the search image I5 again (step S22).
The peripheral circuit then outputs the signal indicating that the search image I5 has been selected, together with the character data of the store name input in the character field T, to the control unit 11. Next, the control unit 11 detects that the search image I5 has been selected by the input of that signal, reads the character data of the store name input in the character field T, and transmits it, together with the camera identification information of the digital camera 1 itself, to the information search system 2 via the transmitting/receiving unit 12 as a search request signal (step S23).
Next, upon receiving the search request signal, the information search server 21 reads out the information of the building (store) corresponding to the store name (each piece of information in the building table of fig. 2) from the building table in the database 22 (step S24).
After reading the store information, the server 21 transmits the acquired store information to the digital camera 1 (step S25).
When the store information is received by the transmitting/receiving unit 12, the control unit 11 displays the image data of the store's surroundings included in the building information in the image display field MP of fig. 5, and the building information in the information display field SP, via the display unit 17 (step S26). For example, information indicating what kind of restaurant it is (Chinese, Japanese, French, Italian, and the like) and its telephone number are displayed.
When the user selects the CM image I2 by touching it, the control unit 11 detects that the CM image I2 has been selected, and displays the advertisement information of the restaurant (menu, today's chef's recommendations, and the like) included in the search result information in the information display field SP via the display unit 17.
When the user touches and selects the posting browsing image I3, the peripheral circuit outputs a signal indicating that the posting browsing image I3 has been selected to the control unit 11.
The control unit 11 thereby detects that the posting browsing image I3 has been selected, and displays the posting information written by other users, included in the search result information, in the information display field SP via the display unit 17.
When there are a plurality of pieces of posting information, the control unit 11 displays them in sequence in the information display field SP via the display unit 17. When the posting information also includes video data, the control unit 11 displays that video data in the information display field SP via the display unit 17.
Next, the control unit 11 detects whether or not the end image E has been selected (step S27). The detection of the selection of the end image E is the same as the processing of step S9 (in fig. 3) in < information retrieval of a video being captured by the digital camera 1 >, and therefore, the description thereof is omitted.
At this time, when a signal indicating that the user has selected the end image E by touching the end image E is input from the peripheral circuit, the control unit 11 ends the browsing process. On the other hand, when the signal indicating that the end image E has been selected is not input, the control unit 11 returns the process to step S21 to continue the search for the store information.
<Charging processing for stores registered in the database 22>
The building table of the database 22 may store data of discount coupons for each store (including restaurants and companies) when the building is a store.
In that case, the information search server 21 adds the discount coupon information to the search result information and transmits it to the digital camera 1.
When the user eats or shops using the discount coupon information, the information search server 21 detects whether the user has eaten or shopped based on whether the discount coupon information was used. For example, when the discount coupon information is displayed as a barcode (including a two-dimensional barcode) in the information display field SP of the digital camera 1, the barcode is read by a reader at the store, and the camera identification information transmitted from the digital camera 1, the amount spent on shopping (or eating), and use information indicating that the discount coupon was used are transmitted to the information search system 2.
Thus, upon receiving the use information, the information search server 21 reads out from the database 22 the history information corresponding to the camera identification information attached to the use information, and writes into the history table stored in the database 22, in association with the camera identification information, the stage at which the information was used (for example, whether the store was used after an information search) together with the use history of the discount coupon information.
The barcode includes the building identification information indicating the store. The database 22 is provided with a charge table that stores, corresponding to each piece of building identification information, a charge history and an accumulated charge value for each store.
The information search server 21 then writes the charge corresponding to the amount spent, together with the stage of information use, into the charge table as a history in correspondence with the building identification information, and adds the new charge to the accumulated value to update it.
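A sketch of the charge-table update just described follows; the table layout and the fee rule are hypothetical, since the patent does not specify how the charge is derived from the amount spent.

```python
# Hypothetical charge table keyed by building identification information.
charge_table = {}

def record_charge(building_id: str, amount_spent: float, usage_stage: str,
                  fee_rate: float = 0.05):
    """Write the charge and the stage of information use as history, and add
    the new charge to the accumulated value (fee_rate is an assumption)."""
    entry = charge_table.setdefault(building_id, {"history": [], "total": 0.0})
    charge = amount_spent * fee_rate
    entry["history"].append({"stage": usage_stage, "charge": charge})
    entry["total"] += charge

record_charge("B001", 5000.0, "store used after information search")
print(charge_table["B001"]["total"])  # accumulated charge value -> 250.0
```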
<Point addition processing for the user>
Each time the user uses the discount coupon information at a store, the information search server 21 accumulates points corresponding to the amount spent (for example, calculated by multiplying the amount spent by a point coefficient) in the user registration table of the database 22, in association with the camera identification information.
The information search server 21 also accumulates a preset number of points in the user registration table, in association with the camera identification information, for a user who has transmitted posting information. The points may be used together with the discount coupon information in place of money when paying a price.
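Point accrual as described above might look like the following; the point coefficient and the posting bonus are illustrative values, not figures from the patent.

```python
# Hypothetical user registration table keyed by camera identification information.
user_registration = {"CAM123": {"points": 0.0}}

def add_usage_points(camera_id: str, amount_spent: float, coefficient: float = 0.01):
    """Points proportional to the amount spent (coefficient is an assumption)."""
    user_registration[camera_id]["points"] += amount_spent * coefficient

def add_posting_points(camera_id: str, bonus: float = 10.0):
    """A preset number of points for a user who transmitted posting information."""
    user_registration[camera_id]["points"] += bonus

add_usage_points("CAM123", 5000.0)   # e.g. after a discounted meal
add_posting_points("CAM123")         # e.g. after writing a review
print(user_registration["CAM123"]["points"])  # -> 60.0
```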
<Presentation order of posting information>
Here, in the initial stage, the information search server 21 may transmit only the nickname and the title of each piece of posting information, together with its posting identification information, to the digital camera 1. The control unit 11 then first displays only the plurality of nicknames and titles in the information display field SP. The user touches and selects the nickname and title of the posting information of interest from among those displayed. The peripheral circuit thereby outputs a signal indicating that a nickname and title have been selected, together with the posting identification information of the selected posting information, to the control unit 11.
In this way, the control unit 11 transmits the posting identification information and the posting information transmission request to the information retrieval system 2.
The information search server 21 then transmits the character data and the image data (the entire posting information) corresponding to that posting identification information to the digital camera 1. Here, the information search server 21 gives each piece of posting information the posting identification information for identification, and writes and stores the posting identification information in the user registration table in association with the camera identification information.
In the present embodiment, the control unit 11 may receive the text data and the image data from the information search system 2 via the transmission/reception unit 12, and then display the text data in the information display field SP and the image data in the image display field MP.
The information search server 21 searches the user registration table for the posting identification information corresponding to the posting information that has been referred to, and increments the reference count corresponding to that posting identification information.
Then, at the initial stage of displaying the nicknames and titles from which the user selects posting information, the information search server 21 transmits a display table in which the nicknames and titles of posting information with higher reference counts are ranked toward the top of the information display field SP, in descending order of the number of times the posting information has been selected.
In this way, the control unit 11 sequentially displays the nickname and the title in the information display field SP in accordance with the display table.
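The display-order logic above reduces to sorting posting entries by their reference counts; a minimal sketch follows, with field names assumed for illustration.

```python
# Hypothetical posting entries; ref_count is the number of times each piece
# of posting information has been selected and referred to.
postings = [
    {"post_id": "P1", "nickname": "taro", "title": "Nice pasta", "ref_count": 12},
    {"post_id": "P2", "nickname": "hana", "title": "Slow service", "ref_count": 3},
    {"post_id": "P3", "nickname": "ken", "title": "Great value", "ref_count": 25},
]

def build_display_table(postings):
    """Rank nicknames and titles so that frequently referenced posting
    information appears at the top of the information display field SP."""
    ranked = sorted(postings, key=lambda p: p["ref_count"], reverse=True)
    return [(p["nickname"], p["title"]) for p in ranked]

def register_reference(postings, post_id):
    """Increment the reference count when a posting is selected and viewed."""
    for p in postings:
        if p["post_id"] == post_id:
            p["ref_count"] += 1

print(build_display_table(postings))  # ken, taro, hana
```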
<Display processing of past image data>
In the present embodiment, the database 22 may also include past image tables in which image data of the buildings and scenery photographed at each latitude and longitude in each year is stored in association with that latitude and longitude.
When the user selects the past image I6 by touching it, the peripheral circuit outputs a signal indicating that the past image I6 has been selected to the control unit 11.
The control unit 11 thereby detects that the past image I6 has been selected by receiving the signal indicating its selection, and displays the input field IP shown in fig. 6 on a part of the image display field MP via the display unit 17.
Next, the user writes a year (for example, a year in the Western calendar) into the character field T using the touch-screen keyboard section K, and then touches the past image I6 (or the browsing image I1).
The peripheral circuit then outputs the signal indicating that the past image I6 has been selected, together with the character data of the year, to the control unit 11. When the control unit 11 detects that the past image I6 has been selected, it reads the character data (the year) written in the character field T.
After reading the year data, the control unit 11 transmits a past image search request, together with the read year, the position information, the azimuth information, and the camera identification information, to the information search system 2.
Next, in the information search system 2, the information search server 21 selects the past image table corresponding to the latitude and longitude, based on the position information received from the digital camera 1 (that is, the position of the building displayed in the image display field MP). The information search server 21 then reads out from the selected past image table the image data corresponding to the azimuth information and the year, and transmits the read image data to the digital camera 1. The digital camera 1 displays the image data corresponding to the year received from the information search server 21 in the image display field MP via the display unit 17. When the read image data has additional information, the information search server 21 transmits the additional information to the digital camera 1 together with the read image data.
The user can thus obtain an image of the building as it was in the specified past year, or, if no building existed then, an image of the scenery alone.
Here, when the information search server 21 cannot find a past image table for the corresponding latitude and longitude, it searches for the past image table of the latitude and longitude that is located in the direction of the azimuth information and is closest to that latitude and longitude, and extracts the image data from that table.
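The fallback lookup just described can be sketched as follows; the table layout and the nearest-match rule are assumptions, and the direction filtering by azimuth is omitted for brevity.

```python
import math

# Hypothetical past image tables: (lat, lon) -> {year: image file}.
past_image_tables = {
    (35.6586, 139.7454): {1965: "tower_1965.jpg", 1990: "tower_1990.jpg"},
    (35.6600, 139.7470): {1965: "street_1965.jpg"},
}

def lookup_past_image(lat, lon, year):
    """Use the exact table if present; otherwise fall back to the nearest
    registered latitude/longitude (azimuth filtering omitted for brevity)."""
    table = past_image_tables.get((lat, lon))
    if table is None:
        nearest = min(past_image_tables,
                      key=lambda k: math.hypot(k[0] - lat, k[1] - lon))
        table = past_image_tables[nearest]
    return table.get(year)  # None if no image exists for that year

print(lookup_past_image(35.6586, 139.7454, 1965))  # exact hit
print(lookup_past_image(35.6590, 139.7450, 1965))  # nearest-table fallback
```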
In addition, if the year being searched for is an era from which no photographs exist, image data of landscape pictures drawn in that era, or CG (computer graphics) imagery created by imagination, may be stored in the past image table in association with the year, instead of photographic image data.
<Processing for displaying AR (Augmented Reality) information>
As described above, in the configuration of fig. 1, the control unit 11 gives image identification information to the image data obtained by imaging the subject with the imaging unit 13, and writes the image data into the storage unit 15 in imaging order, together with the latitude and longitude information (position information) indicating the position of the digital camera 1 itself, obtained from the GPS 14, and the azimuth information indicating the azimuth angle of the optical axis direction (imaging direction) of the digital camera 1, obtained from the azimuth sensor 16 (azimuth angle detection unit).
In the configuration of fig. 1, as shown in fig. 9, a holding control unit 31 and an AR information storage unit 30 are newly added as functions for acquiring and storing AR information. Fig. 9 is a block diagram showing an example of the configuration of an information acquisition system having a function of acquiring and storing AR information.
The AR information of the present embodiment is the various information shown in the building table of fig. 2 (information related to the subject). For example, the AR information includes building identification information for identifying a building, the building name, building information (information such as the address, telephone number, type, and peripheral image data centered on the building), position information such as the latitude and longitude of the building, a description of the building (in the case of a store, information provided by the store), and posting information (comments such as evaluations by visiting users, and image data posted by users). Here, the building identification information may be a URL indicating the location where the building's information is stored.
The control unit 11 may be configured so that the user can arbitrarily set which pieces of the AR information are to be displayed as markers (augmented reality objects) superimposed on the image data of the through-image (the image of the subject being captured).
Here, the through-image is the image data formed on the imaging element of the imaging unit 13 and continuously output to the control unit 11 as image data, which the control unit 11 sequentially displays on the display unit 17.
The holding control unit 31 detects an operation requesting the holding of AR information. In other words, the holding control unit 31 detects the holding timing at which the image data captured by the imaging unit 13 on an element such as a CCD and displayed on the display unit 17, together with the AR information within that image data (within the angle of view of the image), is stored in the AR information storage unit 30. The AR information storage unit 30 may also be provided in the information search system 2.
Here, the control unit 11 stores in advance, in the storage unit 15, a table showing the correspondence between the focal length, magnification, and the like, and the angle of view, and obtains the angle-of-view information of the image from the focal length, magnification, and the like at the time the image data is acquired.
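A sketch of such a correspondence table follows; the focal lengths and view angles are illustrative values for a 35 mm-equivalent lens, not figures from the patent.

```python
# Hypothetical focal-length -> horizontal angle-of-view table, stored in the
# storage unit 15 in advance (values illustrative, 35 mm-equivalent lens).
ANGLE_OF_VIEW_TABLE = {24: 73.7, 35: 54.4, 50: 39.6, 85: 23.9, 135: 15.2}

def angle_of_view(focal_length_mm: float, zoom_magnification: float = 1.0) -> float:
    """Look up the view angle for the nearest tabulated effective focal length."""
    effective = focal_length_mm * zoom_magnification
    nearest = min(ANGLE_OF_VIEW_TABLE, key=lambda f: abs(f - effective))
    return ANGLE_OF_VIEW_TABLE[nearest]

print(angle_of_view(35))       # 54.4 degrees
print(angle_of_view(35, 2.0))  # zoomed: nearest entry is 85 mm -> 23.9 degrees
```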
Fig. 10 is a diagram showing a composite image (AR image), on the display unit 17, of a real image (through-image) of buildings T1 to T4 and images of augmented reality objects TB1, TB2, TB3, and TB4 (for example, markers in which the AR information of buildings T1 to T4 is described).
The digital camera 1 of the present embodiment has an AR information acquisition mode for acquiring AR information and a normal shooting mode. The digital camera 1 is set to either the AR information acquisition mode or the normal shooting mode when the user presses a button for the AR information acquisition mode (AR information acquisition button ARB1) provided on the housing of the digital camera, or touches an image for the AR information acquisition mode (AR information acquisition image ARI1) displayed on the display unit 17; the control unit 11 detects the button press from a signal from a switch, or the touch on the image via the touch sensor of the display unit 17, and controls the mode accordingly.
In the AR information acquisition mode, the digital camera 1 of the present embodiment further has a mode in which the AR information is displayed as markers (augmented reality objects combined with the image data) (AR display mode) and a mode in which it is not displayed (only the image data is displayed) (AR non-display mode). When the user presses a display/non-display button (AR switching button ARB2) provided on the housing of the digital camera 1, a signal is generated from a switch, or the user touches the display/non-display image (AR switching image ARI2) displayed on the display unit 17; the control unit 11 detects the button press or the image touch via a sensor and controls display or non-display accordingly.
Here, the display unit 17 has a transparent touch sensor provided on its display element; when an image is touched, the touch sensor at that location is touched, and the coordinate values of the touched region are transmitted to the control unit 11 as a detection signal. Since the coordinates of the touch sensor are matched with the coordinates of the display element, the control unit 11 can identify the image being displayed at the touched position from the coordinate values of the detection signal (and, for example, start the application program corresponding to that image).
Each time the azimuth angle (azimuth information) of the digital camera 1 changes, the control unit 11 transmits the latitude and longitude information (position information), the azimuth information, and the angle-of-view information of the captured image data displayed on the display unit 17 at the time of holding, together with an AR information acquisition request, to the information search system 2 via the transmitting/receiving unit 12. Alternatively, the control unit 11 may transmit the position information and the AR information acquisition request to the information search system 2 via the transmitting/receiving unit 12 periodically or on the user's instruction, to acquire the AR information. The control unit 11 may also transmit the image data held by the holding control unit 31 and the AR information acquisition request to the information search system 2 via the transmitting/receiving unit 12, periodically or on the user's instruction, to acquire the AR information.
Here, with regard to changes in the azimuth angle, the control unit 11 detects the azimuth angle at a predetermined cycle, determines that the azimuth angle has changed when it differs by more than a predetermined angle from the azimuth angle stored in the azimuth angle storage unit of the storage unit 15, and newly stores the azimuth angle at that time in the azimuth angle storage unit.
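A minimal sketch of this azimuth-change check, assuming a 10-degree threshold (the text says only "a predetermined angle") and a dictionary standing in for the azimuth angle storage unit:

```python
# Hedged sketch of the azimuth-change detection: the current azimuth is
# sampled at a fixed cycle and compared with the stored azimuth; when the
# difference exceeds the threshold, the stored value is replaced and a
# change is reported. Threshold and wrap-around handling are assumptions.

AZIMUTH_THRESHOLD_DEG = 10.0  # assumed "predetermined angle"

def azimuth_changed(current_deg, state):
    """state acts as the azimuth angle storage unit of the storage unit 15."""
    stored = state.get("azimuth")
    if stored is None:
        state["azimuth"] = current_deg
        return False
    # Smallest signed difference on a 0-360 degree circle.
    diff = abs((current_deg - stored + 180.0) % 360.0 - 180.0)
    if diff > AZIMUTH_THRESHOLD_DEG:
        state["azimuth"] = current_deg  # newly store the azimuth at that time
        return True
    return False

state = {}
for sample in (90.0, 93.0, 105.0):        # readings from the azimuth sensor
    print(azimuth_changed(sample, state))  # False, False, True
```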
Upon receiving the AR information acquisition request from the digital camera 1, the information search server 21 obtains, from the angle of view information added to the request together with the latitude and longitude information and the azimuth information, a search distance, that is, the distance from the coordinate position indicated by the received latitude and longitude information within which AR information is to be searched.
The information search server 21 obtains this search distance by reading, from a search distance table in the database 22 in which angle of view information and the search distance corresponding to each angle of view are stored, the search distance corresponding to the received angle of view information.
Fig. 12 is a diagram showing a search range for searching for AR information of a structure.
Next, the information search server 21 determines the search range shown in fig. 12 based on the coordinate position of the latitude and longitude information, the azimuth angle of the optical axis of the lens of the digital camera 1, and the search distance, and reads the building identification information of the structures located within the search range from the map data in the database 22.
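The server-side computation described in the last few paragraphs can be sketched as follows: the search distance is looked up from a view-angle table, and a building is inside the search range when it lies within that distance and within half the angle of view on either side of the optical-axis azimuth. The flat-earth approximation, the table values, and the sector shape of the range are all assumptions; the patent does not fix the exact geometry.

```python
# Illustrative sketch of the search-range test. All numeric values and the
# small-range flat-earth geometry are assumptions for illustration.
import math

SEARCH_DISTANCE_TABLE = {30.0: 1000.0, 60.0: 500.0}  # view angle deg -> metres

def in_search_range(cam_lat, cam_lon, azimuth_deg, view_angle_deg,
                    bldg_lat, bldg_lon):
    dist = SEARCH_DISTANCE_TABLE[view_angle_deg]
    # Approximate metres north/east of the camera position.
    north = (bldg_lat - cam_lat) * 111_320.0
    east = (bldg_lon - cam_lon) * 111_320.0 * math.cos(math.radians(cam_lat))
    if math.hypot(north, east) > dist:
        return False                      # beyond the search distance
    bearing = math.degrees(math.atan2(east, north)) % 360.0
    off_axis = abs((bearing - azimuth_deg + 180.0) % 360.0 - 180.0)
    return off_axis <= view_angle_deg / 2.0  # inside the view-angle sector

# A building about 440 m almost due north of the camera, camera facing north:
print(in_search_range(35.681, 139.767, 0.0, 30.0, 35.685, 139.767))  # True
```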
The information search server 21 reads out the AR information of the building from the building table in the database 22 based on the building identification information that has been read out.
The information search server 21 transmits the read AR information to the digital camera 1 together with the corresponding building identification information.
With this, the control unit 11 superimposes the marker images (augmented reality objects) of the AR information on the image data (through image), aligning the coordinate values derived from the latitude and longitude information included in the AR information of each building with the coordinate values derived from the latitude and longitude information of the digital camera 1 itself, and displays the composite image in the state shown in fig. 10 on the display unit 17.
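A hedged sketch of this marker placement: the bearing from the camera to the building is derived from the two latitude/longitude pairs, compared with the optical-axis azimuth, and mapped linearly across the screen width. The linear mapping and the screen size are illustrative assumptions, not the patent's stated method.

```python
# Minimal sketch: horizontal screen position of an AR marker. The linear
# angle-to-pixel mapping and screen width are assumptions for illustration.
import math

SCREEN_WIDTH_PX = 640

def marker_x(cam_lat, cam_lon, cam_azimuth_deg, view_angle_deg,
             bldg_lat, bldg_lon):
    north = (bldg_lat - cam_lat) * 111_320.0
    east = (bldg_lon - cam_lon) * 111_320.0 * math.cos(math.radians(cam_lat))
    bearing = math.degrees(math.atan2(east, north)) % 360.0
    # Signed offset from the optical axis, in degrees.
    off = (bearing - cam_azimuth_deg + 180.0) % 360.0 - 180.0
    if abs(off) > view_angle_deg / 2.0:
        return None  # the building is outside the displayed field of view
    # Map [-view/2, +view/2] onto [0, SCREEN_WIDTH_PX].
    return round((off / view_angle_deg + 0.5) * SCREEN_WIDTH_PX)

# A building about 5 degrees right of the optical axis, 30-degree view angle:
print(marker_x(35.681, 139.767, 0.0, 30.0, 35.685, 139.76743))  # ~427
```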
Upon receiving the holding control signal from the holding control unit 31, the control unit 11 records, in the AR information storage unit 30 for each piece of image identification information of the image data at the time of holding, the latitude and longitude information, azimuth information, and angle of view information at the time of holding, the AR information transmitted from the information retrieval system 2 in response to the AR information acquisition request, and the building identification information identifying the building of that AR information.
Fig. 11 is a diagram showing the configuration of the AR information table stored in the AR information storage unit 30 in fig. 9. The AR information table is stored in the AR information storage unit 30 for each piece of image identification information.
The control unit 11 generates the above-described image identification information and, when the holding control signal is input, writes and stores in the AR information storage unit 30, as an AR information table, the building identification information (including the URL indicating the location where the AR information of the building is stored), the AR information of the building (including its latitude and longitude information), and the latitude and longitude information, azimuth information, and angle of view information of the digital camera 1 itself at the time of holding.
When the holding control signal is input, the control unit 11 also writes and stores in the storage unit 15 the image data captured by the imaging element of the imaging unit 13 at that point in time, in association with the image identification information given to that image data.
Here, as means for detecting the operation instructing that the AR information be held, the holding control unit 31 includes, for example, means for detecting the signal from the switch generated by pressing the AR information acquisition button ARB1 provided on the housing of the digital camera 1, or means for detecting, via the touch sensor, the touching of the AR information acquisition image ARI1 displayed on the display unit 17.
Next, when the AR information acquisition button is pressed, or when the AR information acquisition image is touched, the holding control unit 31 detects that a signal for recording the image data and the AR information has been input, and outputs a holding control signal for performing holding control to the control unit 11. In this way, the holding control unit 31 detects, through an action (operation, manipulation, movement) of the digital camera 1 by the user, that a signal for recording the image data and the AR information has been input, and outputs a holding control signal for performing holding control to the control unit 11.
Alternatively, the holding control unit 31 may use an acceleration sensor to detect, as the operation for holding the AR information, that the digital camera 1 has been moved abruptly (for example, a movement such as swinging the digital camera 1 downward to make the screen easier to see), and may set the detection time as the holding time.
In the acceleration detection using the acceleration sensor, the holding control unit 31 outputs the holding control signal to the control unit 11 as a control signal instructing holding when acceleration data equal to or greater than a predetermined threshold value is supplied from the acceleration sensor. In this way, the holding control unit 31 detects, through an operation of the terminal (for example, the digital camera 1), that a signal for recording the image data and the AR information has been input, and outputs a holding control signal for performing holding control to the control unit 11.
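A minimal sketch of this acceleration-based trigger; the threshold value and the callback name are assumptions not given in the text.

```python
# Hedged sketch of the acceleration-based hold trigger: when the magnitude
# reported by the acceleration sensor meets or exceeds a preset threshold,
# a holding control signal is emitted toward the control unit 11.

ACCEL_THRESHOLD = 15.0  # m/s^2, assumed "predetermined threshold value"

def on_acceleration_sample(accel, emit_holding_control_signal):
    """Called for every sample supplied by the acceleration sensor."""
    if accel >= ACCEL_THRESHOLD:
        emit_holding_control_signal()  # holding control unit 31 -> control unit 11

on_acceleration_sample(18.2, lambda: print("holding control signal output"))
```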
As the operation for holding the AR information, the holding control unit 31 may instead detect a change in the image pattern of the image data, for example that a detected image feature point has moved by more than a predetermined distance within a predetermined detection range, or that the contrast of the image data output from the imaging element has changed by more than a predetermined difference, and set the detection time as the holding time.
In the detection of a change in the image pattern, the holding control unit 31 outputs the holding control signal to the control unit 11 as a control signal for performing holding control when it detects a change in the image pattern equal to or greater than a predetermined threshold value.
Alternatively, the holding control unit 31 may use an elevation angle sensor to detect, as the operation for holding the AR information, a sharp change in the angle of the digital camera 1 (for example, a movement such as swinging the digital camera 1 downward for easier visibility of the screen), and set the detection time as the holding time.
In the detection of elevation angle information using the elevation angle sensor, the holding control unit 31 outputs the holding control signal to the control unit 11 as a control signal for performing holding control when a change in elevation angle equal to or greater than a predetermined threshold value is supplied from the elevation angle sensor. In this way, the holding control unit 31 detects, through an operation of the terminal (for example, the digital camera 1), that a signal for recording the image data and the AR information has been input, and outputs a holding control signal for performing holding control to the control unit 11.
Further, when holding is instructed by the above-described operations using the acceleration sensor, the change in the image pattern, or the elevation angle sensor, the control unit 11 must acquire the AR information corresponding to the image data at the time the holding control signal is received. It is therefore necessary to store, in a buffer unit of the AR information storage unit 30 for a predetermined time, for example 1 second, the image data acquired each time acquisition of AR information is requested, the building identification information corresponding to that image data, and the AR information, latitude and longitude information, azimuth information, and angle of view information of the building.
Accordingly, the control unit 11 stores (caches), in the buffer unit of the AR information storage unit 30 for the predetermined time, for example 1 second, the image data corresponding to each AR information acquisition request, the building identification information corresponding to that image data, and the AR information, latitude and longitude information, azimuth information, and angle of view information of the building.
Then, when the holding control signal is input, the control unit 11 takes the data stored in the buffer unit for the AR information acquisition request issued immediately before, or within the predetermined time before, the holding control signal, that is, the image data, the building identification information corresponding to the image data, and the AR information, latitude and longitude information, azimuth information, and angle of view information of the building acquired from the information retrieval system 2, and writes and stores them in the AR information storage unit 30 as an AR information table associated with the image identification information, with time information added.
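The buffering and commit behaviour described in the last three paragraphs can be sketched as follows; the entry fields and the 1-second window follow the text, while the data structures and names are assumptions.

```python
# Illustrative sketch: every AR acquisition result is cached with its
# acquisition time, and when the holding control signal arrives the entry
# from within the last second is committed to the AR information table.
import time

BUFFER_SECONDS = 1.0
_buffer = []          # buffer unit of the AR information storage unit 30
ar_info_tables = {}   # committed tables, keyed by image identification info

def cache_acquisition(image_id, image_data, building_id, ar_info,
                      lat_lon, azimuth, view_angle):
    _buffer.append({"time": time.time(), "image_id": image_id,
                    "image_data": image_data, "building_id": building_id,
                    "ar_info": ar_info, "lat_lon": lat_lon,
                    "azimuth": azimuth, "view_angle": view_angle})

def on_holding_control_signal():
    now = time.time()
    recent = [e for e in _buffer if now - e["time"] <= BUFFER_SECONDS]
    if recent:
        entry = recent[-1]      # the acquisition immediately before the signal
        entry["hold_time"] = now  # time information added to the table
        ar_info_tables[entry["image_id"]] = entry

cache_acquisition("IMG0001", b"...", "B042", {"name": "Tokyo Sta."},
                  (35.681, 139.767), 0.0, 30.0)
on_holding_control_signal()
print(sorted(ar_info_tables))  # ['IMG0001']
```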
The control unit 11 writes the image data and the image identification information into the storage unit 15 and stores them.
The control unit 11 and the holding control unit 31 may be configured to perform the processing of acquiring and holding the AR information even when the AR non-display mode, in which the AR information is not displayed, is set.
By the above-described processing, when displaying the AR information of a held image, the control unit 11, in response to an instruction signal generated by pressing the AR display confirmation button (AR switching button ARB2, provided on the housing of the digital camera 1) or by touching the AR display confirmation image (AR switching image ARI2, displayed on the display unit 17), compares the time information of the AR information tables stored in the AR information storage unit 30 with the time of its internal clock, selects and reads the AR information table with the latest time from the AR information storage unit 30, and displays the image data together with the markers of the AR information on the display unit 17, in the same way as when the AR information is transmitted from the information retrieval system 2. The control unit 11 may periodically acquire the AR information table of the latest time from the information search server 21, or may acquire it from the information search server 21 in response to the instruction signal. In the AR information acquisition mode, the control unit 11 may display the AR information of the image to be held upon receiving the holding control signal at the time of holding.
The control unit 11 may also be configured to read the image data corresponding to the image identification information stored in the AR information storage unit 30 from the storage unit 15 and display thumbnail images of the read image data on the display unit 17, and to let the user select image data by touching a thumbnail image displayed on the display unit 17. The control unit 11 then reads the AR information table for the image identification information associated with the selected thumbnail image from the AR information storage unit 30, and displays the corresponding composite image on the display unit 17.
At this time, as described above, the user can select display/non-display of the AR information.
Here, the control unit 11 stores in the storage unit 15, as the AR information, all of the information described in the building table: the building identification information identifying the building, the building name, the building information, position information such as the latitude and longitude of the building, the description of the building, and the posted information. Therefore, even in a place where radio communication with the radio base station 3 is not possible, that is, in an offline state, the control unit 11 can add the markers of the AR information to the held image data associated with the AR information storage unit 30 and display them on the display unit 17 as shown in fig. 10.
Alternatively, as long as the digital camera 1 is not used offline, the control unit 11 may be configured to store, as the AR information table, only the latitude and longitude information, azimuth information, and angle of view information of the digital camera 1 necessary for an AR information acquisition request, and to issue the AR information acquisition request to the information retrieval system 2 again when the image data is selected.
With the above configuration, even if the user does not point the shooting direction of the digital camera 1 toward the direction for which the AR information is to be acquired, for example even in a state where the digital camera 1 is placed on a table and the user is looking at the screen of the display unit 17, the user can confirm the AR information while viewing the image data of the direction for which the AR information was acquired.
When the AR information acquisition mode is set, the control unit 11, upon receiving the holding control signal from the holding control unit 31, displays a user ID entry field together with an image of keys showing the letters of the alphabet and numerals, to prompt the user to input a user ID.
When the user touches the image of the keys showing the letters and numerals, the control unit 11 detects the corresponding character data, writes it into an internal buffer, and displays the character data of the internal buffer in the user ID entry field.
Next, when the control unit 11 detects that the enter key in the key image has been touched, it fixes the character string in the internal buffer as the user ID.
The control unit 11 then adds the user ID stored in the internal buffer to the AR information table and stores it.
When the user selects image data in order to refer to its AR information, the control unit 11 displays the user ID entry field, together with a prompt urging input of the user ID and the image of the keys showing the letters and numerals.
When the control unit 11 detects that the image of the keys has been touched, it detects the corresponding character data, writes it into the internal buffer, and displays the character data of the internal buffer in the user ID entry field.
Then, upon detecting that the enter key in the key image has been touched, the control unit 11 compares the user ID corresponding to the character string in the internal buffer with the user ID added to the AR information table corresponding to the image identification information of the selected image data.
The control unit 11 displays on the display unit 17 a composite image (AR image) of the image data and the AR information when the user ID input by the user is equal to the user ID added to the AR information table, and displays only the image data when the user IDs differ.
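A minimal sketch of this user ID check, with an assumed table layout and function name:

```python
# Hedged sketch: the entered ID is compared with the ID stored in the AR
# information table, and the composite AR image is shown only on a match.

def display_for(entered_user_id, ar_info_table):
    stored_id = ar_info_table.get("user_id")
    if stored_id is not None and entered_user_id == stored_id:
        return "AR image (image data + AR information markers)"
    return "image data only"

table = {"image_id": "IMG0001", "user_id": "alice01"}
print(display_for("alice01", table))  # composite AR image
print(display_for("bob99", table))    # image data only
```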
With this configuration, the digital camera 1 can specify a user (or a user ID) for each image and display AR information.
In order to display only the AR information registered or acquired by the user himself or herself, without displaying AR information registered in the information retrieval system 2 or acquired by other users, the control unit 11 stores the users of the digital camera 1 in advance as a user list, like an address book, in the storage unit 15 or the database 22; when the image for referring to the AR information (AR switching image ARI2) is touched, the control unit 11 first displays a list of the user names (or a list of the user IDs) acquired from the user list on the display unit 17.
Next, when the user touches his or her own user ID in the list, the control unit 11 may be configured to read from the AR information storage unit 30 the AR information tables to which the same user ID as the touched one is added, read from the storage unit 15 the image data whose image identification information matches that of those AR information tables, and display the image data as thumbnail images on the display unit 17. While the list is displayed, the control unit 11 may likewise be configured, in response to a signal selecting another user, to read from the AR information storage unit 30 the AR information tables to which that user's ID is added, read from the storage unit 15 the image data whose image identification information matches that of those tables, and display the image data as thumbnail images on the display unit 17.
When storing the AR information table for each piece of image identification information in the AR information storage unit 30, the control unit 11 may be configured to add, in addition to the AR information received from the information retrieval system 2, parameters associated with the AR information or the shooting date (time information) as AR-related information, and store them in association with each piece of building identification information.
In this way, when the user wishes to display on the display unit 17, for example, image data, AR information, and AR-related information from 10 years ago, the user touches the AR date search image (AR date search image ARI3) displayed on the display unit 17; the control unit 11 detects the touch on the AR date search image via the sensor, and displays a date input field on the display unit 17 together with the image of the keys showing the letters and numerals.
Next, after the user inputs the date into the input field using the image of the keys, the control unit 11 searches the AR information storage unit 30 for the AR information tables to which the same date is added, and extracts them.
Next, the control unit 11 reads the image identification information corresponding to the extracted AR information tables, reads the image data of that image identification information from the storage unit 15, and displays the image data as thumbnail images on the display unit 17.
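The date search amounts to filtering the stored AR information tables by the attached date and collecting the matching image identification information for thumbnail display. A small sketch, with assumed field names and date format:

```python
# Hedged sketch of the date search over held AR information tables.
held_tables = {
    "IMG0001": {"date": "2000-05-01", "ar_info": {"name": "Old tower"}},
    "IMG0002": {"date": "2010-02-08", "ar_info": {"name": "New tower"}},
}

def search_by_date(entered_date):
    """Return the image identification information of tables whose stored
    date equals the entered date."""
    return [image_id for image_id, tbl in held_tables.items()
            if tbl["date"] == entered_date]

print(search_by_date("2000-05-01"))  # ['IMG0001'] -> shown as thumbnails
```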
When the user touches a thumbnail image displayed on the display unit 17, the control unit 11 detects the touch via the sensor of the display unit 17 and identifies the image identification information of the image data selected by the touch.
Next, the control unit 11 reads the image data corresponding to the image identification information of the selected image data from the storage unit 15 and displays it on the display unit 17.
Further, the control unit 11 reads the AR information table corresponding to the image identification information, and displays on the display unit 17 a composite image (AR image) in which the AR information and the AR-related information are superimposed on the previously displayed image data.
With the above configuration, the user can view on the display unit 17 the image data from 10 years ago and the AR information, from 10 years ago, of the structures appearing in that image data.
When the image for displaying the present AR information (AR switching image ARI2) displayed on the display unit 17 is touched, the control unit 11 detects the touch from the detection signal of the touch sensor, and extracts from the AR information storage unit 30 the AR information table of the image identification information of the displayed image data.
Next, the control unit 11 adds the latitude and longitude information, azimuth information, and angle of view information recorded in that AR information table when the AR information was acquired at the past time (in the present embodiment, 10 years ago) to an AR information search request, and transmits it to the information retrieval system 2. The control unit 11 may instead acquire from the information retrieval system 2 all the AR information for the period from the present back to that past time (10 years in the present embodiment).
Upon receiving the AR information search request from the digital camera 1, the information search server 21 extracts from the map data in the database 22 the building identification information of the structures in the search range obtained from the latitude and longitude information, azimuth information, and angle of view information added to the request.
Next, the information search server 21 reads out the AR information of the building corresponding to the building identification information from the building table of the database 22, and transmits the AR information to the digital camera 1.
In this way, the control unit 11 displays the AR information of the buildings currently located at the position, superimposed on the past image data.
With this configuration, the transition of buildings from the past to the present can be easily confirmed.
When AR information from the time of shooting of the subject (for example, 10 years ago) is stored chronologically in the database 22 of the information retrieval system 2, the control unit 11 may acquire AR information based on current or past time information (for example, the current time, 10 years ago, and so on) and display it on the display unit 17. The user can thus view the AR information corresponding to the image data in time series, tracing back from the present to the past.
In the above-described configuration in which a user ID is set for each piece of image identification information of the image data, a configuration that allows a plurality of user IDs to be added makes it possible to limit the display of the AR information to a group of friends (users). Further, a user ID may be set for each piece of building identification information so that the AR information is managed per building. In this case, a mode for writing the user ID is set, and the control unit 11 prompts for input of a user ID for each building.
When no user ID is set, the display is open, that is, viewing is not restricted to particular users, and anyone (any user) can view it.
The control unit 11 displays the user ID entry field and the image of the keys showing the letters and numerals on the display unit 17 only when a user ID is added to the AR information table corresponding to the selected image data.
As described above, there are the following five combinations for displaying AR information using user IDs.
(1) Composite image of current (or past) image data and the AR information of the structures in that image data, with a user ID added: display limited to the specified user.
(2) Composite image of current (or past) image data and the AR information of the structures in that image data, without a user ID: open display viewable by any user.
(3) Composite image of current (or past) image data and the AR information of the structures currently located in the search range of that image data, with a user ID added: display limited to the specified user.
(4) Composite image of current (or past) image data and the AR information of the structures currently located in the search range of that image data, without a user ID: open display viewable by any user.
(5) Display of the current (or past) image data only (AR information not displayed).
The control unit 11 can switch among these AR information displays using the user ID in response to a signal based on selection of the AR user switching image ARI4.
Next, the operation of the information retrieval system of fig. 9, which is another embodiment of the present invention, will be described with reference to fig. 13. Fig. 13 is a flowchart showing an example of the operation of the information search system according to another embodiment of the present invention.
When the user touches the AR information acquisition mode image (AR information acquisition image ARI1) displayed on the display unit 17, the control unit 11 detects the touch and sets the digital camera 1 to the AR information acquisition mode.
The control unit 11 causes the display unit 17 to display the image data continuously supplied from the imaging unit 13 (step S31).
Next, the control unit 11 reads latitude and longitude information (position information) indicating the coordinate values of the latitude and longitude of its own position from the GPS 14, and azimuth information indicating the azimuth of the optical axis of the digital camera 1 from the azimuth sensor 16 (step S32).
At this time, the control unit 11 reads, from the table stored in the storage unit 15 that associates combinations of focal length and magnification with angle of view information, the angle of view information of the digital camera 1 to be used when requesting acquisition of AR information from the information retrieval system 2.
After obtaining the latitude and longitude information, azimuth information, and angle of view information, the control unit 11 adds the camera identification information of the digital camera 1, the latitude and longitude information, the azimuth information, and the angle of view information to an AR information acquisition request, and transmits it to the information retrieval system 2 (step S33).
The information search server 21 determines the search range for structures from the latitude and longitude information, azimuth information, and angle of view information supplied from the digital camera 1, searches the map data in the database 22 for structures within that search range, and reads the AR information from the building table using the building identification information of the structures found (step S34).
Next, the information search server 21 transmits the read AR information together with the building identification information of the AR information to the digital camera 1 corresponding to the camera identification information.
Next, the control unit 11 calculates the coordinate position of each building in the display coordinates of the image data on the display unit 17, based on its own latitude and longitude information, the azimuth information, the angle of view information, the search distance (transmitted from the information retrieval system 2 together with the AR information), and the latitude and longitude information of each building included in the supplied AR information, displays the image data, and then displays the AR information of each building in the image data as markers (step S35).
Next, the control unit 11 determines whether the azimuth angle supplied from the azimuth sensor 16 has changed by more than a predetermined angle (step S36). That is, the control unit 11 obtains the difference between the azimuth angle supplied from the azimuth sensor 16 and the azimuth angle stored in the azimuth angle storage unit, and compares the difference with a preset angle (angle threshold value).
At this time, the control unit 11 determines that the azimuth angle has changed when the difference exceeds (or is equal to or greater than) the angle threshold value, and returns the process to step S31; it determines that the azimuth angle has not changed when the difference does not exceed (or is less than) the angle threshold value, and advances the process to step S37.
Next, the control unit 11 determines whether or not the holding control signal has been supplied from the holding control unit 31 (step S37).
At this time, the control unit 11 advances the process to step S38 when the holding control signal has been input, and returns the process to step S36 when it has not.
Here, the holding control unit 31, for example, compares the acceleration data from the acceleration sensor with a preset acceleration threshold value, and when the acceleration data supplied from the acceleration sensor exceeds (or is equal to or greater than) the threshold value, generates a holding control signal and transmits it to the control unit 11. On the other hand, when the acceleration data supplied from the acceleration sensor is equal to or less than (or less than) the threshold value, the holding control unit 31 does not transmit the holding control signal to the control unit 11.
Next, the control unit 11 writes into the storage unit 15 and stores the latitude and longitude information (position information) indicating the coordinate values of the latitude and longitude obtained from the GPS 14, and the azimuth information indicating the azimuth of the optical axis of the digital camera 1 obtained from the azimuth sensor 16 (step S38). The control unit 11 then returns the process to step S36.
In the AR information acquisition mode, the processing in the flowchart of fig. 13 is repeated regardless of whether the AR information is displayed or not.
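As a summary, the loop of steps S31 to S38 can be condensed into the following sketch. The camera and search_system objects and all of their methods are illustrative stand-ins for the sensors, the table lookup, and the network exchange; only the control flow follows the flowchart of fig. 13.

```python
def ar_acquisition_loop(camera, search_system, angle_threshold_deg):
    """Condensed control flow of fig. 13; every method call is a stand-in."""
    while camera.in_ar_information_acquisition_mode():
        frame = camera.capture()            # S31: display the through image
        lat_lon = camera.gps_position()     # S32: latitude/longitude
        azimuth = camera.azimuth()          # S32: optical-axis azimuth
        view_angle = camera.view_angle()    # from the focal-length table
        ar_info = search_system.request_ar_information(  # S33/S34
            camera.camera_id(), lat_lon, azimuth, view_angle)
        camera.display_with_markers(frame, ar_info)      # S35: overlay markers
        while camera.in_ar_information_acquisition_mode():
            if abs(camera.azimuth() - azimuth) > angle_threshold_deg:
                break                       # S36: azimuth changed -> back to S31
            if camera.holding_control_signal_input():    # S37
                camera.store_position(lat_lon, azimuth)  # S38, then back to S36
```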
As described above, according to the present embodiment, even after the imaging direction of the digital camera 1 has been changed, that is, turned to a direction different from the one in which the AR information was acquired, and the display unit 17 has been moved to a position where the user can view it most easily, the AR information can be confirmed within the image data as it was when the AR information was acquired.
For example, the AR information of the buildings appearing in image data taken from the north exit of Tokyo Station can be held, and the AR information of those buildings can later be confirmed at a restaurant table. The user can hold the AR information of buildings lying, say, to the north of the current position, and confirm the AR information of the buildings in the held image data while the digital camera 1 faces a different direction (for example, toward the ground or toward the east). The digital camera 1 according to the present embodiment can perform the above-described processing even when it is offline, without communicating with the wireless base station 3.
Next, the guidance function of the digital camera 1 will be explained. Assume that the user has set the AR information acquisition mode and that the image data and the AR information are displayed as markers. The same processing can also be performed in the AR information confirmation mode, after the AR information has been held.
When the guidance image displayed on the display unit 17 is touched, the control unit 11 detects the touch via the touch sensor and starts the guidance program (guidance unit).
The control unit 11 then displays on the display unit 17 a prompt urging the user to touch the marker of the structure to which guidance is desired.
When the user touches the marker of the structure, the control unit 11 detects the building identification information corresponding to the touched marker, and reads from the AR information storage unit 30 the AR information table corresponding to the image identification information of the image data being displayed.
Next, based on the AR information table, the control unit 11 reads the latitude and longitude information of the structure from the AR information corresponding to the detected building identification information, adds the latitude and longitude information of the current position of the digital camera 1 and the latitude and longitude information of the structure to a guidance request, and transmits it to the information retrieval system 2.
Upon receiving the guidance request, the information search server 21 searches the map data stored in the database 22 for the shortest route (or a plurality of routes) between the attached latitude and longitude information of the digital camera 1 and the latitude and longitude information of the structure.
That is, the information search server 21 extracts, from the map data, an intersection closest to the coordinate value at which the digital camera 1 is located as indicated by the latitude and longitude information, as a start intersection.
Similarly, the information search server 21 extracts, from the map data, an intersection closest to the coordinate value at which the building indicated by the latitude and longitude information is located, as the end intersection.
Next, the information search server 21 generates combinations of road routes connecting the start intersection to the end intersection based on the road network of the map data, and extracts the combination of road routes with the smallest total distance by using a graph-theoretic algorithm that efficiently solves the well-known shortest path problem, such as the Dijkstra method.
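Dijkstra's method itself is standard; the following compact implementation illustrates the shortest-path step, with a made-up road network standing in for the map data in the database 22 (nodes are intersections, edge weights are road-route distances in metres).

```python
# Compact Dijkstra shortest-path example; the road network is illustrative.
import heapq

def dijkstra(graph, start, goal):
    """graph: {node: [(neighbour, distance), ...]}. Returns (distance, path)."""
    queue = [(0.0, start, [start])]
    visited = set()
    while queue:
        dist, node, path = heapq.heappop(queue)
        if node == goal:
            return dist, path
        if node in visited:
            continue
        visited.add(node)
        for nxt, w in graph.get(node, []):
            if nxt not in visited:
                heapq.heappush(queue, (dist + w, nxt, path + [nxt]))
    return float("inf"), []

roads = {"start_x": [("a", 200.0), ("b", 500.0)],
         "a": [("end_x", 400.0)],
         "b": [("end_x", 50.0)]}
print(dijkstra(roads, "start_x", "end_x"))  # (550.0, ['start_x', 'b', 'end_x'])
```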
Next, the information search server 21 transmits image data showing a map of the road route from the start intersection to the end intersection to the digital camera 1 that made the guidance request. Here, the information search server 21 generates the map image data with the guidance route (the road route from the start intersection to the end intersection) displayed in a color different from that of the other roads.
The control unit 11 resizes the received map image data to a size that can be displayed on its own display unit 17, and displays the resized map image data on the display unit 17.
Next, the control unit 11 superimposes a marker indicating its own position on the map image data displayed on the display unit 17, at the point corresponding to the latitude and longitude acquired by the GPS 14. The user can therefore receive guidance from the start intersection to the end intersection while watching the movement of the marker.
The digital camera 1 may also transmit the image data and the AR information of the structures in the image data to the digital camera of another user, sharing the image data and the information with that user.
Further, the digital camera 1 may upload the image data and the AR information of the structures in the image data to a web page of an SNS (Social Networking Service) or the like, sharing the information with a plurality of other digital cameras. In this case, by adding a user ID to the AR information table, only users who know the user ID can view or add to the AR information corresponding to the image data on the SNS.
Here, the SNS server has an AR information common database storing image data and AR information.
A user can register, from the digital camera 1, the image data he or she wishes to register and the AR information corresponding to that image data, with the user information attached.
Here, when the user touches the registration image displayed on the display unit 17, the control unit 11 detects the touch via the touch sensor and displays the registration screen on the display unit 17. The registration screen includes thumbnail images of the image data corresponding to the AR information acquired by the user and the image of the keys showing the letters and numerals.
On this registration screen, the user selects the image data to be registered from the thumbnail images, inputs the user ID and the user's own name (or nickname) for that image data, and touches the enter image, whereupon the control unit 11 accesses the SNS server and transmits the image data and the AR information together with a registration request.
After receiving the registration request, the SNS server creates a file under the user's name (or nickname), and writes and stores the image data and AR information for which registration was requested in the AR information common database. This file has the same configuration as the AR information table of fig. 11.
Next, when the user touches the AR information reference image displayed on the display unit 17 of the digital camera 1, the control unit 11 detects the touch via the touch sensor and displays the reference screen on the display unit 17. The reference screen includes the image of the keys showing the letters and numerals.
On this reference screen, the user inputs the name (or nickname) of the user to be referred to and the user ID by touching the image of the keys, and touches the enter image, whereupon the control unit 11 accesses the SNS server and transmits the name (or nickname) and the user ID together with a reference request for the image data and the AR information.
Upon receiving the reference request, the SNS server searches the AR information common database by the name (or nickname) attached to the reference request, and extracts the file of the same name (or nickname).
Then, the SNS server reads from the AR information in the file the image identification information to which the same user ID as the received one is added, together with the image data corresponding to that image identification information, and transmits the read AR information and image data to the digital camera 1.
The control unit 11 displays the received image data and AR information on the display unit 17 as described above.
Next, when the user touches the AR information change image displayed on the display unit 17 of the digital camera 1, the control unit 11 detects the touch via the touch sensor and displays the change screen on the display unit 17. The change screen includes thumbnail images of the image data corresponding to the read AR information, an information entry field, and the image of the keys showing the letters and numerals.
Then, on this change screen, when the user selects from the thumbnail images the image data whose AR information is to be changed, the control unit 11 detects the selection via the touch sensor, displays the selected image data on the display unit 17, and superimposes the markers of the AR information on the displayed image data at the position of each building.
Next, the user touches the marker to select the AR information to be changed, inputs the added or edited character string and the user ID into the information entry field by touching the image of the keys, and touches the enter key (enter image); the control unit 11 then takes the character string in the information entry field as the new AR information of the building, and transmits the image identification information and the building identification information together with the changed AR information to the SNS server as a change request including the user name and the user ID.
After receiving the change request, the SNS server searches the AR information common database by the name (or nickname) attached to the change request, and extracts the file of the same name (or nickname).
Next, the SNS server determines whether the user ID attached to the image identification information matches the user ID attached to the received change request; when they do not match, it makes no change and notifies the digital camera 1 that the change is not possible.
On the other hand, when the user ID attached to the AR information matches the user ID attached to the received change request, the SNS server changes the AR information of the building identification information in that image identification information.
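The SNS server's authorization check reduces to a user ID comparison before the AR information is rewritten. A hedged sketch with an assumed database layout:

```python
# Hypothetical sketch of the SNS server's change check: the change is
# applied only when the stored user ID matches the ID attached to the
# change request. All names and the record layout are assumptions.

ar_common_db = {("alice", "IMG0001"): {"user_id": "alice01",
                                       "B042": "old AR text"}}

def handle_change_request(name, image_id, building_id, new_ar, user_id):
    record = ar_common_db.get((name, image_id))
    if record is None or record["user_id"] != user_id:
        return "change not possible"   # notified back to the digital camera
    record[building_id] = new_ar       # change the building's AR information
    return "changed"

print(handle_change_request("alice", "IMG0001", "B042", "new text", "alice01"))
print(handle_change_request("alice", "IMG0001", "B042", "x", "bob99"))
```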
Alternatively, a program for realizing the AR information holding function of the control unit 11 and the holding control unit 31 of the digital camera 1 shown in fig. 9 may be recorded on a computer-readable recording medium, and the program recorded on the recording medium may be read into a computer system and executed to perform the holding control of the AR information.
Alternatively, a program for realizing the AR information search function of the information search server 21 in fig. 9 may be recorded on a computer-readable recording medium, and the program recorded on the recording medium may be read into a computer system and executed to perform search control of AR information.
In addition, the term "computer system" as used here includes an OS and hardware such as peripheral devices.
The term "computer system" also includes a web page providing environment (or display environment) as long as it uses the WWW system.
The term "computer-readable recording medium" refers to a transportable medium such as a floppy disk, a magneto-optical disk, a ROM, or a CD-ROM, or a storage device such as a hard disk incorporated in a computer system. Further, the "computer-readable recording medium" includes those that dynamically hold programs for a short time via a communication line when the programs are transmitted via a network such as the internet or a communication line such as a telephone line, or those that hold programs for a certain time such as a volatile memory (RAM) in a server or a computer system as a client in such a case. The program may be a program for realizing a part of the above functions, or a program recorded in a computer system may be combined with the above functions.
While the embodiments of the present invention have been described above with reference to the drawings, the specific configuration is not limited to these embodiments, and design changes are included without departing from the scope of the present invention.
Claims (16)
1. An imaging device is characterized by comprising:
an imaging unit that images an object;
a position information acquiring unit that acquires position information of a photographing position;
a control unit for acquiring information on the subject based on the position information and displaying the image data of the subject and the information on the subject on a display unit; and
and a holding control unit for outputting a holding control signal for holding the image data of the subject and the information on the subject to the control unit.
2. The imaging apparatus according to claim 1, wherein the control section displays the image data of the subject and the information related to the subject on the display section based on the hold control signal received from the hold control section.
3. The imaging apparatus according to claim 1 or 2, wherein the control section is capable of switching between display and non-display of the information on the subject on the display section.
4. The imaging apparatus according to any one of claims 1 to 3, wherein the control section displays the held image data of the object and the held information related to the object on the display section when receiving the holding control signal during imaging of the object.
5. The imaging apparatus according to any one of claims 1 to 4, wherein the control section transmits the held information on the subject and the image data of the subject to another terminal based on the holding control signal.
6. The imaging apparatus according to any one of claims 1 to 5, wherein the information relating to the subject is AR information.
7. An information acquisition system is characterized by comprising an imaging device and an information retrieval system;
the imaging device being the imaging device according to any one of claims 1 to 6.
8. An imaging device is characterized by comprising:
a latitude and longitude detecting unit that detects latitude and longitude information of its own position;
an azimuth angle detection unit that detects an azimuth angle in the captured image data;
a control unit that acquires and displays on a display unit, by using the latitude and longitude information and the azimuth, AR information to which a building in a latitude and longitude range in the azimuth direction is attached, among the latitude and longitude information; and
and a holding control unit for outputting a holding control signal for storing the AR information and the image data in the storage unit to the control unit after detecting the operation of storing the AR information in the storage unit.
9. The imaging apparatus according to claim 8, wherein, when the hold control signal is input from the hold control unit, the control unit stores in the storage unit the latitude and longitude information and the azimuth of its own position at the time the hold control signal is input, together with image identification information for identifying the image data.
10. The imaging apparatus according to claim 8, wherein, when the hold control signal is input from the hold control unit, the control unit stores in the storage unit the image data displayed on the display unit at the time the hold control signal is input and the AR information corresponding to the image data, together with image identification information for identifying the image data.
11. The imaging apparatus according to any one of claims 8 to 10, wherein the storage unit is configured to store, for each of the image data, image identification information for identifying the image data, building identification information for identifying a building in the image data, building information for the building indicated by the building identification information, and a user ID added to the image identification information.
12. The imaging apparatus according to any one of claims 8 to 11, wherein the control unit reads the AR information and the image data from the storage unit and displays them on the display unit.
13. The imaging apparatus according to any one of claims 8 to 12, wherein the control unit transmits the AR information added in the imaging state to an information retrieval system.
14. An information acquisition system comprising an imaging device and an information retrieval system, wherein the information retrieval system extracts a structure in a latitude and longitude range in the azimuth direction from latitude and longitude information and an azimuth transmitted from the imaging device, and transmits information attached to the extracted structure to the imaging device;
the imaging device includes:
a latitude and longitude detecting unit that detects latitude and longitude information of its own position;
an azimuth angle detection unit that detects an azimuth angle in the captured image data;
a control unit that acquires and displays on a display unit, by using the latitude and longitude information and the azimuth, AR information to which a building in a latitude and longitude range in the azimuth direction is attached, among the latitude and longitude information; and
a holding control unit for outputting a holding control signal for storing the AR information and the image data in the storage unit to the control unit after detecting an operation for storing the AR information in the storage unit;
the information retrieval system includes:
a database storing map data corresponding to a building identification number of a building and latitude and longitude information of the building, and a building table corresponding to the AR information of the building displayed by the building identification number and the building identification number; and
an information search server for searching the map data for a structure identification number of a structure located in a latitude and longitude range of the azimuth direction in the latitude and longitude information, based on the latitude and longitude information and the azimuth transmitted from the image pickup device, reading the AR information attached to the structure displayed by the structure identification number from the structure table, based on the searched structure identification number, and transmitting the AR information of the read structure to the image pickup device.
15. A program for causing a computer to execute the functions of the photographing apparatus according to any one of claims 1 to 6, characterized by causing the computer to execute:
inputting position information of a photographing position for photographing an object;
acquiring information related to the subject based on the position information;
displaying the image data of the subject and information related to the subject on a display unit; and
and outputting a hold control signal for holding the image data of the subject and the information on the subject to a control unit.
16. A program for causing a computer to execute the functions of the photographing apparatus according to any one of claims 8 to 13, characterized by causing the computer to execute:
inputting latitude and longitude information of the position detected by the latitude and longitude detecting unit;
inputting an azimuth angle in the captured image data detected by the azimuth angle detecting unit;
acquiring and displaying on a display unit, based on the latitude and longitude information and the azimuth, AR information to which a building in a latitude and longitude range in the azimuth direction is attached; and
and outputting a holding control signal for storing the AR information and the image data in a storage unit to the control unit after detecting an operation for storing the AR information in the storage unit.
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
| --- | --- | --- | --- |
| JP2010-025998 | 2010-02-08 | | |
Related Parent Applications (1)
| Application Number | Title | Priority Date | Filing Date |
| --- | --- | --- | --- |
| HK18101947.2A Division HK1242876B (en) | | 2010-02-08 | 2013-02-26 |
Related Child Applications (1)
| Application Number | Title | Priority Date | Filing Date |
| --- | --- | --- | --- |
| HK18101947.2A Addition HK1242876B (en) | | 2010-02-08 | 2013-02-26 |
Publications (2)
| Publication Number | Publication Date |
| --- | --- |
| HK1175339A | 2013-06-28 |
| HK1175339B | 2018-03-23 |