
US20190080011A1 - Information display device, information display method, program and information display system - Google Patents

Information display device, information display method, program and information display system

Info

Publication number
US20190080011A1
Authority
US
United States
Prior art keywords
information
search
user
tag
search object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/081,868
Inventor
Tetsuya Mitsui
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Pioneer Corp
Original Assignee
Pioneer Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pioneer Corp
Assigned to PIONEER CORPORATION. Assignment of assignors interest (see document for details). Assignors: MITSUI, TETSUYA
Publication of US20190080011A1
Status: Abandoned

Classifications

    • G06F16/907 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/909 Retrieval characterised by using metadata, using geographical or spatial information, e.g. location
    • G06F16/9535 Search customisation based on user profiles and personalisation
    • G06F16/24573 Query processing with adaptation to user needs using data annotations, e.g. user-defined metadata
    • G06F16/9035 Filtering based on additional data, e.g. user or group profiles
    • G06F16/9038 Presentation of query results
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3679 Retrieval, searching and output of POI information, e.g. hotels, restaurants, shops, filling stations, parking facilities
    • G01C21/3682 Output of POI information on a road map
    • G06F17/30867, G06F17/30525, G06F17/30991, G06F17/30997 (legacy codes)

Definitions

  • FIGS. 7A and 7B illustrate display examples of search results including tag information according to a modified example.
  • In the display examples described so far, the search results are displayed in order of the distance from the current position, i.e., from the nearest one to the farthest one, regardless of the presence or absence of the tag information.
  • In this modified example, the user can select one of two display modes: a display mode in which the distance from the current position is preferential (hereinafter referred to as "distance priority mode") and a display mode in which the tag information is preferential (hereinafter referred to as "tag priority mode").
  • FIG. 7A is an example of the distance priority mode, wherein the search results are displayed in order from the one nearest to the current position.
  • When the user presses the switch button 43, the display is switched to the tag priority mode shown in FIG. 7B. In the tag priority mode, the search results including the tag information are displayed with higher priority.
  • When the user presses the switch button 43 labeled "DISTANCE PRIORITY", the display is switched back to the distance priority mode shown in FIG. 7A.
  • In this way, the user can select his or her preferred display mode; a minimal sketch of the two sort orders follows below.
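  • Switching between the two modes is essentially a change of sort key over the search results. The following is a minimal sketch, assuming each result record carries a distance and an optional tag; the field names are assumptions made for illustration:

```python
def sort_results(results, mode="distance"):
    """Order the search results for display (illustrative; field names are assumptions)."""
    if mode == "tag":
        # tag priority mode: entries carrying tag information first, nearest first within each group
        return sorted(results, key=lambda r: (r.get("tag") is None, r["distance_km"]))
    # distance priority mode: nearest to farthest, regardless of the presence of tag information
    return sorted(results, key=lambda r: r["distance_km"])
```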
  • In a modified example, the terminal device 20 may be provided with the search unit 11, the search DB 12, the tag recommending unit 13 and the user attribute DB 14, which are provided in the server 10 in the above embodiment; in that case, the terminal device 20 itself determines the tag information and presents it to the user.
  • In another modified example, the server 10 may be provided with the search unit 11 and the search DB 12, while the terminal device 20 is provided with the tag recommending unit 13 and the user attribute DB 14. In this case, the server 10 performs the search based on the search keywords and transmits the search results to the terminal device 20, and the terminal device 20 determines the tag information with the tag recommending unit 13 based on the received search results and presents it to the user.
  • In the display example of the search results shown in FIG. 3A, the corresponding tag contents are displayed when the user touches the tag icon 42. Instead, when the tag icon 42 is touched, a new search using the tag contents as the search keywords may be performed. For example, when the user touches the tag icon 42 corresponding to "Toilet", a new search for facilities using the search keyword "Toilet" is performed. Thus, it becomes possible to present the results of a search mainly using the tag information "Toilet", in which the user is interested; a minimal sketch follows below.
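  • In this modified example, touching the tag icon simply re-issues the search with the tag as the keyword. A minimal sketch; the "server" and "ui" objects and their methods are placeholder interfaces assumed for illustration only:

```python
def on_tag_icon_touched(tag_name, context, server, ui):
    """Modified behaviour: use the touched tag (e.g. "Toilet") as the keyword of a new search."""
    new_results = server.search(tag_name, context)   # new facility search mainly using the tag
    for entry in new_results:
        ui.render_result_bar(entry["name"], tag=entry.get("tag"))
```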
  • In the above embodiment, the tag information is information relevant to the equipment belonging to the facility. However, the tag information may instead be information relevant to events or services performed in the facility, for example.
  • In the above embodiment, the present invention is applied to the search for facilities in the car navigation device. However, the present invention may also be applied to a general Web search. For example, when a user watching a drama currently being broadcast makes a search with the search keyword "Actor" and the search results include that drama, a tag indicating the actor appearing in the drama may be added to the search results.
  • This invention can be used for a system displaying information on a terminal device.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Library & Information Science (AREA)
  • Computational Linguistics (AREA)
  • Automation & Control Theory (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
  • Navigation (AREA)

Abstract

The information display system receives a search condition for searching for facilities from a user, and obtains an attribute and/or a current situation of the user. Next, the information display system searches for facilities based on the search condition, an attribute and/or a current situation of the user, and outputs facility information relevant to the facilities. Then, the information display system obtains additional information for the facilities, and displays the additional image indicating the additional information, together with the facility information.

Description

    TECHNICAL FIELD
  • The present invention relates to a technique of displaying information relevant to facilities.
  • BACKGROUND TECHNIQUE
  • There is known a technique of estimating and recommending the information that a user wants when the user makes a search using search keywords on an onboard device such as a car navigation device. For example, in the system disclosed in Patent Reference-1, a service candidate providing server estimates service candidates based on characteristics of the user such as an attribute, tastes and a history, and transmits the result to the onboard device. The onboard device selects among the service candidates in consideration of the user's situation and displays the result of the selection.
  • PRIOR ART REFERENCES
  • Patent References
  • Patent Reference-1: Japanese Patent Application Laid-Open under No. 2005-208943
  • SUMMARY OF THE INVENTION
  • Problem to be Solved by the Invention
  • In the technique of Patent Reference-1, the search results are selected based on the user's situation. Therefore, when the search results based on the situation are not needed, it is not possible to provide the search result that the user wants. Also, when the search is made without consideration of the user's situation, the user actually needs to access the details of the respective search results in the list of the search results so as to know whether or not the search results include information that the user explicitly or implicitly wants.
  • The above is an example of the problem to be solved by the present invention. It is an object of the present invention to provide an information display technique by which a user can obtain the information that he or she wants merely by looking at the displayed search results.
  • Means for Solving the Problem
  • An invention described in claims is an information display device comprising: a receiving unit configured to receive a search condition from a user; a search object obtaining unit configured to obtain search object information indicating a search object searched based on the search condition; an additional information obtaining unit configured to obtain additional information with respect to the search object, based on the search condition, an attribute and/or a current situation of the user; and a display unit configured to display an additional image indicating the additional information, together with the search object information.
  • Another invention described in claims is an information display method executed by an information display device, comprising: a receiving process configured to receive a search condition from a user; a search condition obtaining process configured to obtain search object information indicating a search object searched based on the search condition; an additional information obtaining process configured to obtain additional information with respect to the search object, based on the search condition, an attribute and/or a current situation of the user; and a display process configured to display an additional image indicating the additional information, together with the search object information.
  • Still another invention described in claims is a program executed by an information display device including a computer, the program causing the computer to function as: a receiving unit configured to receive a search condition from a user; a search object obtaining unit configured to obtain search object information indicating a search object searched based on the search condition; an additional information obtaining unit configured to obtain additional information with respect to the search object, based on the search condition, an attribute and/or a current situation of the user; and a display unit configured to display an additional image indicating the additional information, together with the search object information.
  • Still another invention described in claims is an information display system comprising: a receiving unit configured to receive a search condition from a user; a first obtaining unit configured to obtain an attribute and/or a current situation of the user; a second obtaining unit configured to obtain search object information indicating a search object searched based on the search condition; a third obtaining unit configured to obtain additional information with respect to the search object, based on the search condition, an attribute and/or a current situation of the user; and a display unit configured to display an additional image indicating the additional information, together with the search object information.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 illustrates a whole configuration of an information display system according to an embodiment of the present invention.
  • FIG. 2 illustrates an example of context.
  • FIGS. 3A and 3B illustrate display examples of search results.
  • FIG. 4 is a flowchart of search processing.
  • FIG. 5 illustrates an example of configuration of a tag recommending unit.
  • FIG. 6 is a flowchart of learning processing of a classifier.
  • FIGS. 7A and 7B illustrate display examples of search results according to a modified example.
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • According to one aspect of the present invention, there is provided an information display device comprising: a receiving unit configured to receive a search condition from a user; a search object obtaining unit configured to obtain search object information indicating a search object searched based on the search condition; an additional information obtaining unit configured to obtain additional information with respect to the search object, based on the search condition, an attribute and/or a current situation of the user; and a display unit configured to display an additional image indicating the additional information, together with the search object information.
  • The above information display device receives a search condition for searching for facilities from a user, and obtains search object information indicating a search object searched based on the search condition. Also, the information display device obtains additional information with respect to the search object, based on the search condition, an attribute and/or a current situation of the user. Then, the information display device displays an additional image indicating the additional information, together with the search object information. Thus, it becomes possible to determine appropriate additional information based on the attribute and/or the current situation of the user and display the additional image indicating the additional information.
  • In one mode of the above information display device, the search object information includes facility information relevant to a facility, and the display unit preferentially displays the facility information of a facility having the additional information, out of a plurality of facility information, in accordance with a designation by the user. In this mode, the additional information can be positively presented to the user.
  • In another mode of the above information display device, the search object information includes facility information relevant to a facility, the additional information is information relevant to equipment existing in the facility, and the display unit displays detailed image information indicating a position of the equipment in the facility when the user selects the additional information. In this mode, when the user is interested in the additional information, it becomes possible to present more detailed information.
  • In still another mode of the above information display device, the search object obtaining unit obtains the facility information obtained by a new search based on the search condition including the equipment corresponding to the additional image when the user selects the additional information, and the display unit displays the facility information obtained by the new search. In this mode, when the user is interested in the additional information, it becomes possible to present search results of the search using the additional information as the search condition.
  • In still another mode of the above information display device, the additional information obtaining unit obtains the additional information based on the attribute and/or the current situation of the user obtained, without receiving an input by the user.
  • According to another aspect of the present invention, there is provided an information display method executed by an information display device, comprising: a receiving process configured to receive a search condition from a user; a search condition obtaining process configured to obtain search object information indicating a search object searched based on the search condition; an additional information obtaining process configured to obtain additional information with respect to the search object, based on the search condition, an attribute and/or a current situation of the user; and a display process configured to display an additional image indicating the additional information, together with the search object information. By this method, it becomes possible to determine appropriate additional information based on the attribute and/or the current situation of the user and display the additional image indicating the additional information.
  • According to another aspect of the present invention, there is provided a program executed by an information display device including a computer, the program causing the computer to function as: a receiving unit configured to receive a search condition from a user; a search object obtaining unit configured to obtain search object information indicating a search object searched based on the search condition; an additional information obtaining unit configured to obtain additional information with respect to the search object, based on the search condition, an attribute and/or a current situation of the user; and a display unit configured to display an additional image indicating the additional information, together with the search object information. By executing this program, it becomes possible to determine appropriate additional information based on the attribute and/or the current situation of the user and display the additional image indicating the additional information. This program may be handled in a manner stored in a storage medium.
  • According to still another aspect of the present invention, there is provided an information display system comprising: a receiving unit configured to receive a search condition from a user; a first obtaining unit configured to obtain an attribute and/or a current situation of the user; a second obtaining unit configured to obtain search object information indicating a search object searched based on the search condition; a third obtaining unit configured to obtain additional information with respect to the search object, based on the search condition, an attribute and/or a current situation of the user; and a display unit configured to display an additional image indicating the additional information, together with the search object information.
  • The above information display system receives a search condition for searching for facilities from a user, and obtains an attribute and/or a current situation of the user. Next, the information display system obtains search object information indicating a search object searched based on the search condition. Also, the information display system obtains additional information with respect to the search object, based on the search condition, an attribute and/or a current situation of the user. Then, the information display system displays an additional image indicating the additional information, together with the search object information. Thus, it becomes possible to determine appropriate additional information based on the attribute and/or the current situation of the user and display the additional image indicating the additional information.
  • EMBODIMENTS
  • Preferred embodiments of the present invention will be described below with reference to the accompanied drawings.
  • [Whole Configuration]
  • FIG. 1 illustrates a whole configuration of an information display system 1 according to an embodiment of the present invention. The information display system 1 includes a server 10 and a terminal device 20. The terminal device 20 may be, for example, a smartphone 20 a or a car navigation device 20 b mounted on a vehicle. The terminal device 20 communicates with the server 10 by wireless communication. The server 10 includes a search unit 11, a search database ("database" will be simply expressed as "DB") 12, a tag recommending unit 13 and a user attribute DB 14.
  • First, an outline of the operation of the information display system 1 will be described. The information display system 1 is characterized by displaying additional information relevant to facilities (hereinafter referred to as "tag information") together with the search results when a user searches for facilities. Basically, the tag information is information relevant to equipment existing in the facility. Specifically, when a user inputs search keywords and instructs a search, the server 10 searches for facilities based on the inputted search keywords and obtains one or more search results. Additionally, based on the search keywords, the obtained search results, the context and the user attribute, the server 10 determines the tag information to be recommended to the user in association with the facilities obtained as the search results, and presents the tag information to the user together with the search results. It is noted that "context" is information indicating the current situation of the user.
  • As concrete examples: in a first example, when the user makes a search by inputting the search keywords "shopping mall", the server 10 obtains the facility information of plural shopping malls as the search results. In addition, the server 10 determines from the context and/or the user attribute that the user has a baby, and adds the tag information indicating a nursing room to the shopping malls having a nursing room when it presents the list of the plural shopping malls obtained as the search results.
  • In a second example, when the user makes a search by inputting the search keywords "convenience store", the server 10 obtains the facility information of plural convenience stores as the search results. In addition, the server 10 determines from the context that the user has been driving the vehicle continuously for more than two hours and that a female passenger is in the vehicle, and adds the tag information indicating a toilet to the convenience stores having a toilet when it presents the list of the plural convenience stores obtained as the search results.
  • In a third example, when the user makes a search by inputting the search keywords "facility with a parking lot", the server 10 obtains the facility information of plural facilities with a parking lot as the search results. In addition, the server 10 determines from the context that rain is highly likely at the time the user will reach the destination, and adds the tag information indicating a roofed parking lot to the facilities having a roofed parking lot when it presents the list of the plural facilities obtained as the search results. A simplified sketch of this kind of keyword- and context-dependent tag selection is given below.
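  • All three examples follow one pattern: the search keywords, the context and the user attribute together determine a piece of equipment worth highlighting. In the embodiment this mapping is produced by a learned classifier (described later with FIG. 5); the hand-written rules below are only a simplified illustration of that mapping, and every key name and threshold is an assumption:

```python
from typing import Optional

def pick_tag(search_keywords: str, context: dict, user_attribute: dict) -> Optional[str]:
    """Toy, rule-based stand-in for the tag selection of the three examples above."""
    if "shopping mall" in search_keywords and user_attribute.get("has_baby"):
        return "Nursing room"          # first example
    if ("convenience store" in search_keywords
            and context.get("continuous_driving_minutes", 0) > 120
            and "female passenger" in context.get("passengers", [])):
        return "Toilet"                # second example
    if ("parking lot" in search_keywords
            and context.get("weather_at_destination") == "rain"):
        return "Roofed parking lot"    # third example
    return None                        # nothing worth recommending
```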
  • Next, specific processing by the information display system 1 will be described with reference to FIG. 1. When the user inputs the search keywords and instructs the terminal device 20 to perform a search, the search keywords 31 and the context 32 are transmitted from the terminal device 20 to the server 10.
  • FIG. 2 illustrates an example of the context 32. The context is obtained by the terminal device 20 and is stored in the terminal device 20. In the example of FIG. 2, the context 32 includes "Continuous driving time", "Member of passengers", "Weather in destination at expected arrival time", "Current time", "Current position" and "During route guidance or not". "Continuous driving time" is obtained by a control unit of the vehicle or the navigation device 20 b, and "Member of passengers" is inputted by the user or obtained by a sensor or a camera installed in the vehicle. "Weather in destination at expected arrival time" is obtained by the terminal device 20 from an external server via the Internet. "Current time" may be obtained from an internal clock of the terminal device 20, and "Current position" may be obtained by the GPS installed in the terminal device 20. "During route guidance or not" may be obtained from the navigation application operating in the terminal device 20. A minimal data-structure sketch of such a context follows below.
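  • The context items of FIG. 2 map naturally onto a small record that the terminal device could fill in and transmit together with the search keywords. This is only a sketch; the field names, types and example values are assumptions for illustration, not taken from the patent:

```python
from dataclasses import dataclass, asdict
from typing import Optional

@dataclass
class Context:
    """Current situation of the user, as collected by the terminal device 20 (field names assumed)."""
    continuous_driving_minutes: int          # from the vehicle control unit or navigation device 20b
    passengers: list[str]                    # entered by the user or detected by an in-vehicle sensor/camera
    weather_at_destination: Optional[str]    # fetched from an external weather server via the Internet
    current_time: str                        # from the terminal's internal clock
    current_position: tuple[float, float]    # (latitude, longitude) from the terminal's GPS
    during_route_guidance: bool              # reported by the navigation application

# Example roughly matching the second example (long drive, female passenger on board)
ctx = Context(
    continuous_driving_minutes=135,
    passengers=["driver", "female passenger"],
    weather_at_destination="clear",
    current_time="17:42",
    current_position=(35.68, 139.77),
    during_route_guidance=True,
)
payload = asdict(ctx)  # what the terminal might transmit to the server as the context 32
```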
  • Returning to FIG. 1, the search unit 11 makes a search using the search keywords 31 obtained from the terminal device 20 by referring to the search DB 12, and obtains one or more pieces of facility information as the search results 34. Then, the search unit 11 sends the search results 34 thus obtained, the search keywords 31 and the context 32 to the tag recommending unit 13. The tag recommending unit 13 obtains the user attribute 33 from the user attribute DB 14 in addition to the search keywords 31, the context 32 and the search results 34, determines the tag information 35 to be recommended to the user based on that information, and sends the tag information 35 to the search unit 11. As described later, the tag information 35 includes tag contents 35 x. Then, the search unit 11 generates the search results 36 including the tag information 35 from the search results 34 and the tag information 35, and transmits them to the terminal device 20. This server-side flow is sketched below.
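  • The server-side flow just described can be summarized in a short sketch in which the tag recommending unit (FIG. 5, sketched further below) is treated as a black box. All function and field names are assumptions made for illustration:

```python
def handle_search_request(search_keywords, context, user_id,
                          search_unit, tag_recommender, user_attribute_db):
    """Server-side handling of one search request (illustrative sketch of FIG. 1).

    'search_unit', 'tag_recommender' and 'user_attribute_db' stand in for the
    search unit 11, the tag recommending unit 13 and the user attribute DB 14.
    """
    # Search unit 11: query the search DB 12 and obtain the search results 34
    # (modelled here as a list of facility records, each with a POI id).
    facilities = search_unit.search(search_keywords)

    # Tag recommending unit 13: combine keywords 31, context 32 and user attribute 33
    # to decide the tag information 35 and its tag contents 35x per facility.
    user_attribute = user_attribute_db.get(user_id, {})
    tag_name, contents_by_poi = tag_recommender.recommend(
        search_keywords, context, user_attribute,
        [f["poi_id"] for f in facilities])

    # Merge: attach the tag (and its contents) to the facilities that have it,
    # producing the search results 36 returned to the terminal device 20.
    results_with_tags = []
    for facility in facilities:
        entry = dict(facility)
        if tag_name is not None and facility["poi_id"] in contents_by_poi:
            entry["tag"] = tag_name
            entry["tag_contents"] = contents_by_poi[facility["poi_id"]]
        results_with_tags.append(entry)
    return results_with_tags
```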
  • The terminal device 20 receives the search results 36 including the tag information 35 and displays them on a display unit. FIG. 3A is a display example of the search results, corresponding to the above-described second example. The user inputs the search keywords "convenience store" and makes a search, and the search result bars 41 indicating four convenience stores are displayed as the search results. Also, since "Toilet" is determined as the tag information to be recommended to the user, the tag icons 42 indicating the toilet are displayed in the search result bars 41 of the convenience stores having a toilet, among the four convenience stores.
  • As described above, in this embodiment, the search results are displayed based on the search keywords inputted by the user, and if there is tag information 35 to be recommended to the user in consideration of the context at that time and/or the user attribute, the tag icon 42 indicating the tag information 35 is displayed. Therefore, the basic search results are displayed first, and the tag information may be naturally presented to the user.
  • In the display example shown in FIG. 3A, if the user touches the tag icon 42, the tag contents 35 x shown in FIG. 3B are displayed. The tag contents 35 x are information indicating the details of the tag information 35 that the tag icon 42 indicates. In the example of FIG. 3B, the store map including the toilet is displayed as the tag contents 35 x corresponding to the tag icon 42 of the toilet. By displaying the tag contents 35 x when the user selects the tag information 35 in this way, more detailed information may be presented to the user if the user is interested in the tag information 35.
  • In the above description, the search results 34 are examples of the facility information of the present invention, the context 32 is an example of the current situation of the present invention, the tag information 35 is an example of the additional information of the present invention, and the tag icon 42 is an example of the additional image in the present invention.
  • Next, search processing by the information display system 1 will be described. FIG. 4 is a flowchart of the search processing. This processing is executed by the server 10 and the terminal device 20.
  • First, the terminal device 20 determines whether or not the user inputs the search keywords 31 (step S10). When the search keywords 31 are inputted (step S10: Yes), the terminal device 20 transmits the search keywords 31 and the context 32 at that time to the server 10 (step S11).
  • The server 10 receives the search keywords 31 and the context 32, and searches for the facilities based on the search keywords 31 (step S12). If there exist search results (step S13: Yes), the server 10 determines the tag information 35 to be recommended, based on the search keywords 31, the search results 34, the context 32 and/or the user attribute 33 (step S14). On the other hand, if there exists no search result (step S13: No), the server 10 does not determine the tag information 35. Then, the server 10 transmits the search results 36 to the terminal device 20 (step S15). At that time, if the tag information 35 is determined in step S14, the tag information 35 is included in the search results 36 and transmitted to the terminal device 20. If there exists no search result in step S13, a message indicating that there exists no search result is transmitted to the terminal device 20.
  • When receiving the search results 36, the terminal device 20 displays the search result bars 41 and the tag icons 42 corresponding to the tag information 35 as shown in the example of FIG. 3A (step S16). Next, the terminal device 20 determines whether or not the tag icon 42 is touched by the user (step S17). If the tag icon 42 is not touched (step S17: No), the processing ends. On the other hand, if the tag icon 42 is touched (step S17: Yes), the terminal device 20 displays the tag contents 35 x corresponding to the tag icon 42 as shown in the example of FIG. 3B (step S18). Then, the processing ends.
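  • The terminal-device side of the flowchart (steps S10, S11 and S16 to S18) might be organized as follows. The "server" and "ui" objects and their methods are placeholders assumed for illustration only, complementing the server-side sketch given earlier:

```python
def terminal_search_flow(keywords, current_context, server, ui):
    """Terminal device 20 side of FIG. 4 (steps S10-S11 and S16-S18), illustrative only."""
    if not keywords:                                     # step S10: wait until keywords are input
        return
    results = server.search(keywords, current_context)   # step S11: send keywords 31 + context 32

    if not results:                                      # no search result: show the server's message
        ui.show_message("No facilities were found.")
        return

    for entry in results:                                # step S16: search result bars 41 (+ tag icons 42)
        ui.render_result_bar(entry["name"], tag=entry.get("tag"))

    touched = ui.wait_for_tag_icon_touch()               # step S17: did the user touch a tag icon?
    if touched is not None:                              # step S18: show tag contents 35x (e.g. store map)
        ui.show_tag_contents(touched["tag_contents"])
```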
  • [Tag Recommending Unit]
  • Next, the tag recommending unit 13 in the server 10 will be described in detail. As described above, the tag recommending unit 13 determines the tag information 35 based on the search keywords 31, the context 32, the user attribute 33 and the search results 34. FIG. 5 illustrates an example of a configuration of the tag recommending unit 13. The tag recommending unit 13 includes a vector conversion unit 13 a, a classifier 13 b and a tag DB 13 c.
  • The vector conversion unit 13 a generates a vector based on the inputted search keywords 31, the context 32 and the user attribute 33, and supplies it to the classifier 13 b. This vector is matrix data indicating the combination of the search keywords 31, the context 32 and the user attribute 33. The classifier 13 b determines plural candidates of the tag information 35 based on the inputted vector, and selects the most appropriate tag information 35 from those candidates and supplies it to the tag DB 13 c. The tag DB 13 c stores tag contents 35 x corresponding to the facility, for each POI (Point Of Interest) of plural facilities. The tag recommending unit 13 reads out the tag contents 35 x from the tag DB 13 c based on the search results 34 and the tag information 35 determined by the classifier 13 b, and outputs it with the tag information 35.
  • For example, in the above-described second example, the vector conversion unit 13 a generates the vector based on the search keywords "convenience store", the context and the user attribute, and supplies it to the classifier 13 b. The classifier 13 b determines the tag information "Toilet" which best fits the vector, and supplies it to the tag DB 13 c. Also, the POIs indicating plural convenience stores are inputted to the tag DB 13 c as the search results 34. The tag DB 13 c outputs the tag contents 35 x (the store maps in this example) for one or plural convenience stores having the tag information "Toilet".
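  • A minimal sketch of the pipeline of FIG. 5, assuming a simple set-of-features encoding for the vector conversion unit 13 a, a weighted-scoring classifier for the classifier 13 b and an in-memory dictionary for the tag DB 13 c; the function names, weight table and POI data are hypothetical.

      # Hypothetical sketch of the tag recommending unit 13 of FIG. 5.
      # The feature encoding, weight table and POI data are illustrative only.

      # 13a: vector conversion unit - encode the combination of the search
      # keywords, the context and the user attribute as a set of feature keys.
      def to_vector(keywords, context, user_attribute):
          return {f"kw:{keywords}", f"ctx:{context}", f"attr:{user_attribute}"}

      # 13b: classifier - score candidate tags against the feature vector.
      TAG_WEIGHTS = {
          "Toilet":  {"kw:convenience store": 1.0, "ctx:driving": 0.5},
          "Parking": {"kw:restaurant": 1.0},
      }

      def classify(vector):
          scores = {tag: sum(w for feat, w in feats.items() if feat in vector)
                    for tag, feats in TAG_WEIGHTS.items()}
          return max(scores, key=scores.get)

      # 13c: tag DB - tag contents (e.g. a store map) stored per POI.
      TAG_DB = {
          "A-SHOP":              {"Toilet": "store map of A-SHOP"},
          "C-CONVENIENCE STORE": {"Toilet": "store map of C-CONVENIENCE STORE"},
      }

      def recommend(keywords, context, user_attribute, search_results):
          tag = classify(to_vector(keywords, context, user_attribute))
          contents = {poi: TAG_DB[poi][tag]
                      for poi in search_results if tag in TAG_DB.get(poi, {})}
          return tag, contents

      print(recommend("convenience store", "driving", "male/40s",
                      ["A-SHOP", "B-MART", "C-CONVENIENCE STORE"]))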
  • Next, learning processing of the classifier 13 b in the tag recommending unit 13 will be described. The classifier 13 b performs learning according to whether or not the tag information 35 presented to the user is actually selected by the user. When the search results recommending "Toilet" as the tag information 35 are displayed as shown in FIG. 3A, if the user presses the search result bars 41 of "A-SHOP" or "C-CONVENIENCE STORE" or the tag icons 42, it is presumed that the user is interested in the tag information "Toilet". Namely, the tag information "Toilet" recommended by the tag recommending unit 13 is likely to be what the user wanted, and recommending the tag information "Toilet" was probably right. Therefore, the classifier 13 b weights the combination of the search keywords, the context and the user attribute at that time such that the tag information "Toilet" becomes more appropriate. Thus, if the same combination is inputted thereafter, "Toilet" is determined as the tag information with higher probability.
  • On the other hand, if the user presses neither the search result bars 41 of "A-SHOP" or "C-CONVENIENCE STORE" nor the tag icons 42, it is presumed that the user is not interested in the tag information "Toilet". Namely, the recommended tag information "Toilet" is likely not what the user wanted, and recommending the tag information "Toilet" was probably wrong. Therefore, the classifier 13 b weights the combination of the search keywords, the context and the user attribute at that time such that the tag information "Toilet" becomes less appropriate. Thus, if the same combination is inputted thereafter, "Toilet" is determined as the tag information with lower probability. In this way, by executing the learning of the classifier 13 b based on the presence/absence of the user's selection of the presented tag information, it becomes possible to present more appropriate tag information to the user.
  • FIG. 6 is a flowchart of the learning processing of the classifier 13 b. This processing is mainly executed by the tag recommending unit 13 in the server 10. First, the user's selection result, in response to the search results displayed in the terminal device 20 as shown in FIG. 3A, is transmitted from the terminal device 20 to the server 10. The tag recommending unit 13 determines whether or not the tag icon 42 or the search result bar 41 including the tag icon 42 (hereinafter referred to as "search result bar 41 with tag") is selected (step S20).
  • If the tag icon 42 or the search result bar 41 with tag is selected (step S20: Yes), the tag recommending unit 13 puts a right flag to the combination of the search keywords, the context and the user attribute at that time (step S21). On the other hand, if the tag icon 42 or the search result bar 41 with tag is not selected (step S20: No), the tag recommending unit 13 puts a wrong flag to the combination of the search keywords, the context and the user attribute at that time (step S22). Then, the tag recommending unit 13 performs the learning of the classifier 13 b based on the result of updating the flag in step S21 or S22 (step S23). Specifically, the tag recommending unit 13 makes the weight of the tag information larger for the combination to which the right flag is put, and makes the weight of the tag information smaller for the combination to which the wrong flag is put. Thus, the learning of the classifier 13 b is performed based on the user's selection of the presented tag information.
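  • The weighting of steps S21 to S23 can be illustrated by the following sketch, which simply raises or lowers the weights tying the observed combination of features to the recommended tag; the additive update rule and the step size are assumptions, not the claimed learning method.

      # Hypothetical sketch of the learning of the classifier 13 b (FIG. 6).
      # The additive update and the step size 0.1 are illustrative assumptions.

      from collections import defaultdict

      WEIGHTS = defaultdict(float)   # (feature, tag) -> weight

      def learn(keywords, context, user_attribute, tag, selected, step=0.1):
          # Steps S20-S23: raise the weight if the presented tag was selected
          # (right flag), lower it otherwise (wrong flag).
          delta = step if selected else -step
          for feature in (f"kw:{keywords}", f"ctx:{context}", f"attr:{user_attribute}"):
              WEIGHTS[(feature, tag)] += delta

      # The user touched the tag icon or a tagged search result bar -> right flag
      learn("convenience store", "driving", "male/40s", "Toilet", selected=True)
      # The user ignored the recommendation -> wrong flag
      learn("convenience store", "at home", "male/40s", "Toilet", selected=False)
      print(dict(WEIGHTS))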
  • MODIFIED EXAMPLES
  • The following modified examples may be suitably applied in combination.
  • Modified Example 1
  • FIGS. 7A and 7B illustrate other examples of the search results including tag information. In the example of FIG. 3A, the search results are displayed in the order of the distance from the current position, i.e., from the nearest one to the farthest one, regardless of the presence/absence of the tag information. In contrast, in the example of FIG. 7A, the user can select either the display mode in which the distance from the current position is preferential (hereinafter referred to as "distance priority mode") or the display mode in which the tag information is preferential (hereinafter referred to as "tag priority mode"). Specifically, FIG. 7A is an example of the distance priority mode, wherein the search results are displayed in order from the one nearest to the current position. If the user presses the switch button 43 labeled "TAG PRIORITY", the display is switched to the tag priority mode shown in FIG. 7B. As shown in FIG. 7B, in the tag priority mode, the search results including the tag information are displayed with higher priority. In FIG. 7B, if the user presses the switch button 43 labeled "DISTANCE PRIORITY", the display is switched to the distance priority mode shown in FIG. 7A. Thus, the user can select his or her favorite display mode.
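  • A minimal sketch of the two display modes, assuming each search result record carries a distance and an optional tag; the record layout and field names are hypothetical.

      # Hypothetical sketch of the two display modes of FIGS. 7A/7B.
      # The result records and field names are illustrative assumptions.

      results = [
          {"name": "A-SHOP",              "distance_m": 300, "tag": "Toilet"},
          {"name": "B-MART",              "distance_m": 150, "tag": None},
          {"name": "C-CONVENIENCE STORE", "distance_m": 600, "tag": "Toilet"},
      ]

      def sort_results(results, mode):
          if mode == "distance":   # FIG. 7A: nearest first, tags ignored
              return sorted(results, key=lambda r: r["distance_m"])
          if mode == "tag":        # FIG. 7B: results with tag information first
              return sorted(results, key=lambda r: (r["tag"] is None, r["distance_m"]))
          raise ValueError(mode)

      print([r["name"] for r in sort_results(results, "distance")])
      print([r["name"] for r in sort_results(results, "tag")])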
  • Modified Example 2
  • While the determination of the tag information and/or the supply of the tag contents are performed on the server 10 side in the above embodiment, that processing may be performed on the terminal device 20 side. Specifically, the terminal device 20 may be provided with the search unit 11, the search DB 12, the tag recommending unit 13 and the user attribute DB 14, which are provided in the server 10 in the above embodiment, and may determine the tag information and present it to the user. In another example, the server 10 may be provided with the search unit 11 and the search DB 12, and the terminal device 20 may be provided with the tag recommending unit 13 and the user attribute DB 14. In this case, the server 10 performs the search based on the search keywords and transmits the search results to the terminal device 20. The terminal device 20 determines the tag information by the tag recommending unit 13 based on the received search results, and presents the tag information to the user.
  • Modified Example 3
  • In the above embodiment, when the user touches the tag icon 42 in the display example of the search results shown in FIG. 3A, the corresponding tag contents are displayed. Instead, when the tag icon 42 is touched, a new search using the tag contents as the search keywords may be performed. For example, when the user touches the tag icon 42 corresponding to "Toilet", a new search for facilities using the search keyword "Toilet" is performed. Thus, it becomes possible to present the results of the search mainly using the tag information "Toilet", in which the user is interested.
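  • A short sketch of this behavior, assuming the terminal can re-invoke the facility search with an arbitrary keyword; the callback name and sample data are hypothetical.

      # Hypothetical sketch of Modified Example 3: touching a tag icon triggers
      # a new facility search that uses the tag itself as the search keyword.

      def on_tag_icon_touched(tag, search):
          # `search` is any callable that maps a keyword to facility results.
          return search(tag)   # e.g. a new search for facilities having a toilet

      facilities_by_keyword = {"Toilet": ["A-SHOP", "C-CONVENIENCE STORE", "Station restroom"]}
      print(on_tag_icon_touched("Toilet", lambda kw: facilities_by_keyword.get(kw, [])))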
  • Modified Example 4
  • In the above embodiment, the tag information is information relevant to the equipment belonging to the facility. Instead, the tag information may be information relevant to events or services performed in the facility, for example.
  • Modified Example 5
  • In the above example, the present invention is applied to the search for facilities in the car navigation device. Instead, the present invention may be applied to a general Web search. For example, when a user watching a drama currently being broadcast (drama name: "◯◯") makes a search with the search keyword "Actor" and the details of the search results include the drama "◯◯", a tag "XX (the actor appearing in the drama ◯◯)" may be added to the search results.
  • INDUSTRIAL APPLICABILITY
  • This invention can be used for a system displaying information on a terminal device.
  • BRIEF DESCRIPTION OF REFERENCE NUMBERS
      • 1 Information display system
      • 10 Server
      • 11 Search unit
      • 12 Search DB
      • 13 Tag recommending unit
      • 14 User attribute DB
      • 20 Terminal device
      • 20 a Smartphone
      • 20 b Car navigation device

Claims (9)

1. An information display device comprising:
a receiving unit configured to receive a search condition from a user;
a search object obtaining unit configured to obtain search object information indicating a search object searched based on the search condition;
an additional information obtaining unit configured to obtain additional information with respect to the search object, based on the search condition, an attribute and/or a current situation of the user; and
a display unit configured to display an additional image indicating the additional information, together with the search object information.
2. The information display device according to claim 1,
wherein the search object information includes facility information relevant to a facility, and
wherein the display unit preferentially displays the facility information of a facility having the additional information, out of a plurality of facility information, in accordance with a designation by the user.
3. The information display device according to claim 1,
wherein the search object information includes facility information relevant to a facility,
wherein the additional information is information relevant to equipment existing in the facility, and
wherein the display unit displays detailed image information indicating a position of the equipment in the facility when the user selects the additional information.
4. The information display device according to claim 3,
wherein the search object obtaining unit obtains the facility information obtained by a new search based on the search condition including the equipment corresponding to the additional image when the user selects the additional information, and
wherein the display unit displays the facility information obtained by the new search.
5. The information display device according to claim 1, wherein the additional information obtaining unit obtains the additional information based on the attribute and/or the current situation of the user obtained without receiving an input by the user.
6. An information display method executed by an information display device, comprising:
a receiving process configured to receive a search condition from a user;
a search object obtaining process configured to obtain search object information indicating a search object searched based on the search condition;
an additional information obtaining process configured to obtain additional information with respect to the search object, based on the search condition, an attribute and/or a current situation of the user; and
a display process configured to display an additional image indicating the additional information, together with the search object information.
7. A non-transitory computer-readable medium storing a program executed by an information display device including a computer, the program causing the computer to function as:
a receiving unit configured to receive a search condition from a user;
a search object obtaining unit configured to obtain search object information indicating a search object searched based on the search condition;
an additional information obtaining unit configured to obtain additional information with respect to the search object, based on the search condition, an attribute and/or a current situation of the user; and
a display unit configured to display an additional image indicating the additional information, together with the search object information.
8. (canceled)
9. An information display system comprising:
a receiving unit configured to receive a search condition from a user;
a first obtaining unit configured to obtain an attribute and/or a current situation of the user;
a second obtaining unit configured to obtain search object information indicating a search object searched based on the search condition;
a third obtaining unit configured to obtain additional information with respect to the search object, based on the search condition, an attribute and/or a current situation of the user; and
a display unit configured to display an additional image indicating the additional information, together with the search object information.
US16/081,868 2016-03-01 2016-03-01 Information display device, information display method, program and information display system Abandoned US20190080011A1 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2016/056213 WO2017149648A1 (en) 2016-03-01 2016-03-01 Information display device, information display method, program, and information display system

Publications (1)

Publication Number Publication Date
US20190080011A1 (en) 2019-03-14

Family

ID=59742611

Family Applications (1)

Application Number Title Priority Date Filing Date
US16/081,868 Abandoned US20190080011A1 (en) 2016-03-01 2016-03-01 Information display device, information display method, program and information display system

Country Status (3)

Country Link
US (1) US20190080011A1 (en)
JP (1) JPWO2017149648A1 (en)
WO (1) WO2017149648A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110241882A1 (en) * 2010-04-01 2011-10-06 Sony Ericsson Mobile Communications Japan, Inc. Mobile terminal, location-based service server, and information providing system
US20140310268A1 (en) * 2013-06-27 2014-10-16 Google Inc. Location refinement
US20160041982A1 (en) * 2014-08-05 2016-02-11 Facebook, Inc. Conditioned Search Ranking Models on Online Social Networks
US20160196010A1 (en) * 2010-05-21 2016-07-07 Telecommunication Systems, Inc. Personal Wireless Navigation System
US10048079B2 (en) * 2014-06-19 2018-08-14 Denso Corporation Destination determination device for vehicle and destination determination system for vehicle

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3447816B2 (en) * 1994-09-27 2003-09-16 株式会社東芝 Information guidance device
JP2006195637A (en) * 2005-01-12 2006-07-27 Toyota Motor Corp Spoken dialogue system for vehicles
JP4588531B2 (en) * 2005-05-23 2010-12-01 株式会社ケンウッド SEARCH DEVICE, PROGRAM, AND SEARCH METHOD
JP2010237134A (en) * 2009-03-31 2010-10-21 Equos Research Co Ltd Destination presentation system and navigation system
JP5953336B2 (en) * 2014-05-30 2016-07-20 ヤフー株式会社 Distribution apparatus, distribution method, and distribution program
WO2015198376A1 (en) * 2014-06-23 2015-12-30 楽天株式会社 Information processing device, information processing method, program, and storage medium


Also Published As

Publication number Publication date
JPWO2017149648A1 (en) 2018-12-20
WO2017149648A1 (en) 2017-09-08

Legal Events

Date Code Title Description
AS Assignment

Owner name: PIONEER CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MITSUI, TETSUYA;REEL/FRAME:046771/0785

Effective date: 20180620

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION