CN115712379A - Interaction method for vehicle-mounted machine system, vehicle-mounted machine system and vehicle - Google Patents
- Publication number
- CN115712379A (application CN202110958307.2A)
- Authority
- CN
- China
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classification
- User Interface Of Digital Computer (AREA)
Abstract
The application provides an interaction method for a vehicle-mounted machine system, a vehicle-mounted machine system, and a vehicle comprising the vehicle-mounted machine system. The method comprises: receiving input of a first user to select at least a portion of content presented in a first display interface; detecting a first user sharing instruction; sharing the at least a portion of the content to a second display interface in response to detecting the first user sharing instruction; and presenting the at least a portion of the content in the second display interface, wherein the first display interface and the second display interface are associated with the vehicle-mounted machine system, and the at least a portion of the content includes at least a portion of an area of an interface of at least one application.
Description
Technical Field
The application relates to the field of vehicles, in particular to an interaction method for a vehicle-mounted machine system, the vehicle-mounted machine system and a vehicle comprising the vehicle-mounted machine system.
Background
Vehicles have become an inseparable part of people's lives and perform many functions beyond basic transportation. For example, a vehicle may include a display system, a sound system, a climate control system, a navigation system, an entertainment system, an interface with mobile devices, a communication system for communicating with devices and systems external to the vehicle, and so forth. The driver and passengers can enjoy the convenience of these additional functions, which can also increase interaction among the occupants in the vehicle, and even between occupants and others outside the vehicle, thereby improving ride quality.
A vehicle machine is shorthand for an In-Vehicle Infotainment (IVI) product installed in a vehicle; functionally, it enables information exchange between people and the vehicle, and between the vehicle and the outside world. For vehicles, there is currently a need to make vehicle intelligence and human-centered unique experiences feasible, enhance the user experience, and improve user satisfaction.
Disclosure of Invention
This application summarizes aspects of the embodiments and should not be used to limit the claims. Other embodiments are contemplated in accordance with the techniques described herein, as will become apparent to those skilled in the art upon study of the following drawings and detailed description, and are intended to be included within the scope of the present application.
The inventors of the present application recognized a need for an interaction method for a vehicle-mounted machine system and a corresponding system that increase interaction among occupants in a vehicle, make the ride more enjoyable, enhance the user experience, and improve user satisfaction.
According to an aspect of the present application, an interaction method for a car machine system is provided, which includes:
receiving input of a first user to select at least a portion of content presented in a first display interface;
detecting a first user sharing instruction;
sharing the at least a portion of content to a second display interface in response to detecting the first user sharing instruction; and
presenting the at least a portion of the content in the second display interface,
wherein the first display interface and the second display interface are associated with the in-vehicle machine system; the at least a portion of the content includes at least a portion of an area of an interface of at least one application.
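The four claimed steps above can be illustrated with a minimal sketch; all class and method names below are illustrative assumptions, not part of the claimed system.

```python
# Hypothetical sketch of the claimed interaction method; names are illustrative.
from dataclasses import dataclass, field


@dataclass
class DisplayInterface:
    name: str
    content: list = field(default_factory=list)

    def present(self, item):
        self.content.append(item)


class CarMachineSystem:
    """Mediates sharing between two display interfaces it is associated with."""

    def __init__(self, first: DisplayInterface, second: DisplayInterface):
        self.first, self.second = first, second
        self.selection = None

    def receive_selection(self, region):
        # Receive the first user's input selecting part of the content.
        self.selection = region

    def on_share_instruction(self):
        # On detecting the sharing instruction, share and present the
        # selected content on the second display interface.
        if self.selection is not None:
            self.second.present(self.selection)


first = DisplayInterface("driver")
second = DisplayInterface("rear-seat")
system = CarMachineSystem(first, second)
system.receive_selection("map area")
system.on_share_instruction()
# second.content == ["map area"]
```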
In an embodiment of the application, the at least a portion of the area corresponds to a first area of the interface of the application, and the method further comprises masking, without further involvement of the first user, a second area of the interface of the application that was not selected by the first user.
In an embodiment of the present application, the method further comprises: presenting the at least a portion of the content in the second display interface in synchronization with the first display interface, while not presenting the masked content in the second area.
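The masked synchronized view could be sketched as follows; the rectangle-based region model is an assumption for illustration, not the claimed implementation.

```python
# Hypothetical masking sketch: only the first-user-selected region of the
# application interface is mirrored; everything else is blanked out.
from dataclasses import dataclass


@dataclass(frozen=True)
class Rect:
    x: int
    y: int
    w: int
    h: int

    def contains(self, px: int, py: int) -> bool:
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h


def masked_frame(frame, selected: Rect):
    """Return a copy of `frame` (a 2-D list of pixels) with every pixel
    outside the selected region replaced by None (masked)."""
    return [
        [px if selected.contains(cx, ry) else None
         for cx, px in enumerate(row)]
        for ry, row in enumerate(frame)
    ]


frame = [[1, 2], [3, 4]]
shared = masked_frame(frame, Rect(0, 0, 1, 2))  # share the left column only
# shared == [[1, None], [3, None]]
```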
In an embodiment of the present application, the method further comprises: receiving a second user's input at the second display interface and synchronizing the second user's input to the first display interface.
In an embodiment of the present application, the method further comprises: receiving input of a second user at the second display interface, wherein the input of the second user is used for closing and/or returning the at least one part of shared content to the first display interface.
In an embodiment of the present application, the method further comprises: stopping sharing the at least a portion of the content to the second display interface in response to detecting a first user stop sharing instruction.
In an embodiment of the application, the first user sharing instruction and/or the first user stop sharing instruction comprises a user action and/or a user voice instruction.
In an embodiment of the application, the user action comprises a contact action of the user operating the first display interface and/or a non-contact action of the user, the non-contact action comprising one or more of a limb action, a head action, and a line-of-sight action.
In an embodiment of the application, the contact action includes one or more of tapping, dragging, swiping in a specific direction, and multi-finger operation, and the contact action is associated with the position, relative to the first display interface, of the second display interface that receives the at least a portion of the content.
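The association between swipe direction and the receiving display could be sketched as a simple lookup; the direction names and the display layout below are assumptions for illustration.

```python
# Hypothetical mapping from swipe direction to the display that receives
# the shared content, based on that display's position relative to the
# first display interface.
TARGETS_BY_DIRECTION = {
    "right": "front-passenger display",
    "down-left": "left rear-seat display",
    "down-right": "right rear-seat display",
}


def resolve_share_target(direction: str) -> str:
    """Return the display associated with a swipe in `direction`."""
    try:
        return TARGETS_BY_DIRECTION[direction]
    except KeyError:
        raise ValueError(f"no display mapped to swipe direction {direction!r}")


target = resolve_share_target("right")
# target == "front-passenger display"
```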
In an embodiment of the application, at least a portion of the content presented in the first display interface is selected by selecting at least a portion of an irregular area of the first display interface.
In an embodiment of the application, the first display interface includes a plurality of display regions that are pre-divided, wherein the method further comprises selecting at least one of the plurality of display regions to select at least a portion of the content presented in the first display interface.
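Selection over pre-divided display regions could be sketched as follows; the region names and their contents are assumptions for illustration.

```python
# Hypothetical pre-divided display regions: selecting one or more region
# IDs selects the content presented there for sharing.
REGIONS = {
    "nav": "navigation map",
    "media": "now-playing panel",
    "status": "vehicle status strip",
}


def select_regions(region_ids):
    """Return the content of the chosen pre-divided regions."""
    unknown = [r for r in region_ids if r not in REGIONS]
    if unknown:
        raise KeyError(f"unknown regions: {unknown}")
    return {r: REGIONS[r] for r in region_ids}


selection = select_regions(["nav", "media"])
# selection == {"nav": "navigation map", "media": "now-playing panel"}
```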
In an embodiment of the application, the first display interface and/or the second display interface is/are provided on a vehicle.
In an embodiment of the application, the first display interface and/or the second display interface is a display interface of a mobile terminal.
In an embodiment of the present application, the at least a portion of the content includes an input interface, a search interface, a game interface, a driving interface, and/or a question and answer interface.
In an embodiment of the present application, the method further comprises: in response to the first user sharing instruction, receiving input of a second user at the second display interface, processing the at least a portion of the content based on the second user's input, and synchronizing the processed content to the first display interface.
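The round trip in this embodiment, where the second user's input is processed and then synchronized back, could be sketched as follows; class and attribute names are assumptions.

```python
# Hypothetical two-way sync: a second user's edit is applied to the
# shared content and then mirrored back to the first display interface.
class SharedSession:
    def __init__(self, content):
        self.content = content
        self.views = {"first": content, "second": content}

    def apply_second_user_input(self, edit):
        # Process the shared content based on the second user's input,
        # then synchronize the result to both display interfaces.
        self.content = edit(self.content)
        self.sync()

    def sync(self):
        for view in self.views:
            self.views[view] = self.content


session = SharedSession("draft questionnaire")
session.apply_second_user_input(lambda c: c + " + answers")
# session.views["first"] == "draft questionnaire + answers"
```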
According to another aspect of the present application, there is provided a car machine system, comprising a processor and a memory, the memory storing processor-executable instructions, which when executed by the processor, implement the steps of:
receiving input of a first user to select at least a portion of content presented in a first display interface;
detecting a first user sharing instruction;
sharing the at least a portion of content to a second display interface in response to detecting the first user sharing instruction; and
presenting the at least a portion of the content in the second display interface,
wherein the first display interface and the second display interface are associated with the in-vehicle machine system; the at least a portion of the content includes at least a portion of an area of an interface of at least one application.
In an embodiment of the application, the at least a portion of the area corresponds to a first area of the interface of the application, and the instructions, when executed by the processor, further implement the step of: masking, without further involvement of the first user, a second region of the interface of the application that was not selected by the first user.
In an embodiment of the application, the instructions when executed by the processor further implement the steps of: receiving a second user's input at the second display interface and synchronizing the second user's input to the first display interface.
In an embodiment of the application, the instructions when executed by the processor further implement the steps of: stopping sharing the at least a portion of the content to the second display interface in response to detecting a first user stop sharing instruction.
According to yet another aspect of the present application, there is provided a vehicle comprising a vehicle machine system including a processor and a memory, the memory storing processor-executable instructions that, when executed by the processor, implement the steps of:
receiving a first input of a first user to select at least a portion of content presented in a first display interface;
detecting a first user sharing instruction;
sharing the at least a portion of content to a second display interface in response to detecting the first user sharing instruction;
presenting the at least a portion of the content in the second display interface;
receiving a second input by the first user or a second user;
synchronizing the second input at the first display interface and the second display interface;
receiving a sharing stopping instruction of the first user or the second user; and
stopping the sharing of the at least a portion of the content,
wherein the first display interface and the second display interface are associated with the in-vehicle machine system; the at least a portion of content includes at least a portion of an area of an interface of at least one application.
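The full lifecycle claimed for the vehicle aspect above, share, synchronize a second input, then stop on command, could be sketched as a small state machine; all names are illustrative assumptions.

```python
# Hypothetical sharing-lifecycle state machine for the vehicle aspect.
class SharingSession:
    def __init__(self):
        self.state = "idle"
        self.mirrored = []          # content presented on both interfaces

    def share(self, content):
        self.state = "sharing"
        self.mirrored = [content]

    def synchronize_input(self, user_input):
        # A second input (from either user) appears on both interfaces.
        if self.state != "sharing":
            raise RuntimeError("no active sharing session")
        self.mirrored.append(user_input)

    def stop(self):
        # On the stop-sharing instruction, the second interface
        # stops presenting the shared content.
        self.state = "idle"
        self.mirrored = []


s = SharingSession()
s.share("game interface")
s.synchronize_input("player 2 move")
s.stop()
# s.state == "idle" and s.mirrored == []
```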
Drawings
For a more complete understanding of embodiments of the present application, reference should be made to the embodiments illustrated in greater detail in the accompanying drawings and described below by way of example, wherein:
FIG. 1 illustrates a vehicle including a vehicle machine system according to an embodiment of the present application;
FIG. 2 illustrates an exemplary block topology diagram of an in-vehicle management system according to the present application;
fig. 3 shows a block diagram of steps implemented by executable instructions included in a car machine system according to an embodiment of the present application when executed;
FIG. 4a shows a schematic diagram of a first display interface according to an embodiment of the present application;
FIG. 4b shows another schematic diagram of a first display interface according to an embodiment of the present application;
fig. 5 shows a block diagram of steps implemented by executable instructions included in a car machine system according to an embodiment of the present application when executed;
fig. 6 shows a block diagram of steps implemented by executable instructions included in a car machine system according to an embodiment of the present application when executed;
FIG. 7 shows a schematic diagram of a first display interface and a second display interface according to an embodiment of the application;
FIG. 8 is a block diagram illustrating steps implemented by a car machine system when executable instructions included therein are executed according to an embodiment of the present application;
fig. 9 shows a block diagram of steps implemented by executable instructions included in a car machine system according to an embodiment of the present application when executed;
FIG. 10 shows a schematic view of a first display interface according to an embodiment of the present application;
FIG. 11 shows a schematic diagram of a first display interface according to an embodiment of the present application; and
fig. 12 shows a schematic diagram of interaction between a car machine system and a first display interface and a second display interface according to an embodiment of the present application.
Detailed Description
Embodiments of the present disclosure are described below. However, it is to be understood that the disclosed embodiments are merely examples and that other embodiments may take various and alternative forms. The figures are not necessarily to scale; certain features may be exaggerated or minimized to show details of particular components. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a representative basis for teaching one skilled in the art to variously employ the present application. As will be appreciated by one of skill in the art, various features illustrated and described with reference to any one of the figures may be combined with features illustrated in one or more other figures to produce embodiments that are not explicitly illustrated or described. The combination of features shown provides a representative embodiment for a typical application. However, various combinations and modifications of the features consistent with the teachings of the present disclosure may be desirable for certain specific applications or implementations.
Moreover, in this document, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. The terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
One or more embodiments of the present application will be described below with reference to the accompanying drawings. The flow diagrams illustrate processes performed by systems according to the present application; it should be understood that they need not be performed in the order shown: one or more steps may be omitted, one or more steps may be added, and in some embodiments steps may be performed in a different or reversed order, or even simultaneously.
The vehicle 100 referred to in the following embodiments may be a standard gasoline powered vehicle, a hybrid vehicle, an electric vehicle, a fuel cell vehicle, and/or any other type of vehicle, and may also be other vehicles such as a bus, a ship, or an aircraft. Vehicles include mobility-related components such as an engine, electric motor, transmission, suspension, drive shafts, and/or wheels, among others. The vehicle may be non-autonomous, semi-autonomous (e.g., some conventional motor functions are controlled by the vehicle), or autonomous (e.g., motor functions are controlled by the vehicle without direct driver input).
The following embodiments refer to "user", "driver", "occupant", "passenger", etc., which in one or more embodiments are used to illustrate the interaction between the vehicle and the user, and in some cases the roles may be exchanged or otherwise referred to without departing from the spirit of the application.
In order to increase the interaction among passengers in a vehicle, improve the riding interest, enhance the user experience, and improve the user satisfaction, the present application provides an interaction method 300 for a vehicle-mounted device system, as shown in fig. 3. The application also discloses a vehicle machine system 105 (hereinafter referred to as "system 105") and a vehicle 100 comprising the system, as shown in fig. 1.
In some embodiments, system 105 may be part of an on-board management system (also known as a vehicle computing system, VCS) 1 in motor vehicle 100, or may be separate from the on-board management system 1 and connected to it to share some or all of its functionality. As shown in the exemplary block topology of the on-board management system 1 in FIG. 2, the on-board management system 1 includes a processor (CPU) 3 and a memory 7, the memory 7 storing processor-executable instructions that, when executed by the processor 3, may implement steps S125, S128, S130, S135, S140, and S145 of method 300 shown in FIG. 3. Method 300 may begin at step S125. In step S128, input from a first user is received to select at least a portion of content presented in a first display interface. In step S130, a first user sharing instruction is detected. In step S135, the at least a portion of the content is shared to a second display interface in response to detecting the first user sharing instruction. In step S140, the at least a portion of the content is presented in the second display interface. Method 300 may end at step S145. The first display interface and the second display interface may be associated with the in-vehicle machine system 105, and the at least a portion of the content may include at least a portion of an area of an interface of at least one application, as described elsewhere herein. Alternatively, the second display interface may be another terminal that establishes a communication connection with the in-vehicle machine system 105; for example, a passenger may establish a wired or wireless communication connection with the in-vehicle machine system 105 using a mobile terminal, a PAD, a personal computer, or the like.
It should be understood that the above steps do not necessarily represent the execution order. For example, in some embodiments the first user sharing instruction may be detected first (step S130), then the first user's input may be received to select at least a portion of the content presented in the first display interface (step S128); the at least a portion of the content is then shared to the second display interface in response to the detected instruction (step S135) and presented in the second display interface (step S140).
The first user "sharing instruction" referred to herein or elsewhere in this application may include executable instructions that cause the specific content selected by the first user to be shared to the second user interface. In another embodiment, the sharing instruction further includes executable instructions such that a second user's input can be received at the second user interface and the shared specific content can be processed and updated by the processor based on that input. In yet another embodiment, the sharing instruction further includes executable instructions such that the specific content edited, processed, or updated based on the second user's input may be synchronized to the first user interface. In other words, the term "sharing instruction" may cover executable instructions for sharing the specific content itself, sharing processing authority over it, granting the authority for it to receive a second input, and performing subsequent operations such as updating and synchronizing the first and second user interfaces.
Next, an exemplary hardware environment of the on-board management system 1 for the vehicle 100 is explained with reference to FIG. 2. An example of an operating system built into this on-board management system 1 is the SYNC system manufactured by Ford Motor Company. The vehicle 100 provided with the on-board management system 1 may include one or more displays 4 located in the vehicle 100 (as shown in FIG. 1), which, alone or in cooperation, present vehicle information or interactive content, such as information related to the vehicle and its driving, and the display of and interaction with various application programs installed in the on-board management system. By way of example and not limitation, display types may include HUD (head-up display), CRT (cathode ray tube), LCD (liquid crystal display), LED (light-emitting diode), PDP (plasma display panel), laser, and VR (virtual reality) displays, and the like. The display may be located at any suitable location in the vehicle, such as, but not limited to, the center console, or projected on any suitable surface such as, but not limited to, a window or windshield.
The processor 3 in the on-board management system 1 controls at least a part of the system's operation. The processor 3 is capable of executing in-vehicle processing instructions and programs, such as the processor-executable instructions described herein with respect to the on-board management system 1. The processor 3 is connected to a non-persistent memory 5 and a persistent memory 7. The memories 5, 7 may comprise volatile and non-volatile memory such as read-only memory (ROM), random-access memory (RAM), and keep-alive memory (KAM), and may be implemented using any number of known memory devices, such as programmable read-only memories (PROMs), EPROMs (electrically programmable read-only memories), EEPROMs (electrically erasable programmable read-only memories), flash memory, or any other electronic, magnetic, optical, or combination memory device capable of storing data. The memories 5, 7 may store, for example, the processor-executable instructions of the on-board management system 1.
The processor 3 is also provided with a number of different inputs allowing the user to interact with it. In an illustrative embodiment, the inputs include a microphone 29 configured to receive voice signals, an auxiliary input 25 for input 33 (e.g., CD, tape, etc.), a USB (universal serial bus) input 23, a GPS (global positioning system) input 24, and a BLUETOOTH input 15. An input selector 51 is also provided to allow the user to switch between the various inputs. The microphone and auxiliary-connector inputs may be converted from analog to digital signals by a converter 27 before being passed to the processor. In addition, although not shown, a plurality of vehicle components and auxiliary components in communication with the on-board management system 1 may use a vehicle network (such as, but not limited to, a CAN (controller area network) bus) to transmit data to or receive data from the on-board management system 1 or its components.
Additionally, the processor 3 may communicate with a plurality of vehicle sensors and drivers via an input/output (I/O) interface, which may be implemented as a single integrated interface providing a plurality of raw data or signal conditioning, processing and/or conversion, short circuit protection, and the like. Further, by way of example and not limitation, the types of sensors communicatively connected to the processor 3 may include, for example, cameras, ultrasonic sensors, seat/pressure sensors, fuel level sensors, engine speed sensors, temperature sensors, photoplethysmography sensors, and the like, to recognize user interaction information such as button presses, voice, touch, text input, facial expressions or actions, hand gestures or actions, head gestures or actions, and limb gestures or actions, as well as to recognize information such as fuel level, powertrain faults, in-vehicle and out-vehicle temperatures, and the like.
The outputs of the in-vehicle management system 1 may include, but are not limited to, the display 4, the speaker 13, and various actuators. A speaker 13 may be connected to the amplifier 11 and receive its signal from the processor 3 through a digital-to-analog converter 9. The output of the system may also be output to a remote bluetooth device (e.g., personal navigation device 54) or USB device (e.g., vehicle navigation device 60) along the bi-directional data streams shown at 19, 21, respectively.
In one illustrative embodiment, the in-vehicle management system 1 communicates with a user's nomadic device 53 (e.g., cell phone, smart phone, personal digital assistant, etc.) using the antenna 17 of the BLUETOOTH transceiver 15. The nomadic device 53 can then communicate 59 with a cloud 125 outside the vehicle 31 through, for example, communication 55 with a cellular tower 57. In some embodiments, cellular tower 57 may be a Wi-Fi (Wireless local area network) access point. Signal 14 represents exemplary communication between nomadic device 53 and BLUETOOTH transceiver 15. Pairing of the nomadic device 53 and the BLUETOOTH transceiver 15 can be instructed through a button 52 or similar input, indicating to the processor 3 that the onboard BLUETOOTH transceiver will be paired with a BLUETOOTH transceiver in a nomadic device.
Data may be transferred between the processor 3 and the cloud 125 using, for example, a data plan (data-plan), data over voice (data over voice), or dual tone multi-frequency (DTMF) tones associated with the nomadic device 53. Alternatively, the in-vehicle management system 1 may include an in-vehicle modem 63 having an antenna 18 to communicate 16 data between the processor 3 and the nomadic device 53 over a voice band (voice band). The nomadic device 53 can then communicate 59 with a cloud 125 outside the vehicle 31 through, for example, communication 55 with a cellular tower 57. In some embodiments, the modem 63 may establish communications 20 directly with the cell tower for further use in communicating with the cloud 125. As a non-limiting example, modem 63 may be a USB cellular modem and communication 20 may be cellular communication.
In one illustrative embodiment, the processor is provided with an operating system that includes an API (application programming interface) for communicating with modem application software. The modem application software may access an embedded module or firmware on the BLUETOOTH transceiver 15 to complete wireless communication with a remote BLUETOOTH transceiver (such as one provided in a nomadic device). Bluetooth is a subset of the IEEE 802 PAN (personal area network) protocols. The IEEE 802 LAN (local area network) protocols include Wi-Fi and have considerable cross-functionality with IEEE 802 PAN. Both are suitable for wireless communication in a vehicle. Other communication means may include free-space optical communication (e.g., the infrared data protocol, IrDA) and non-standard consumer infrared (consumer IR) protocols, among others.
In an embodiment, the nomadic device 53 may be a wireless local area network (LAN) device capable of communicating over, for example, an 802.11 network (e.g., Wi-Fi) or a WiMAX (Worldwide Interoperability for Microwave Access) network. Other sources that may interact with the vehicle include a personal navigation device 54 having, for example, a USB connection 56 and/or an antenna 58; a vehicle navigation device 60 having a USB connection 62 or another connection; an onboard GPS device 24; or a remote navigation system (not shown) connected to the cloud 125.
Further, the processor 3 may communicate with a plurality of other auxiliary devices 65. These devices may be connected by a wireless 67 or wired 69 connection. Also or alternatively, the CPU may be connected to a vehicle-based wireless router 73 using, for example, a Wi-Fi 71 transceiver. This may allow the CPU to connect to remote networks within range of the local router 73. The auxiliary devices 65 may include, but are not limited to, personal media players, wireless health devices, mobile computers, and the like.
In an embodiment of the application, as shown in fig. 4a, the system 105 may comprise a first display interface 106, which first display interface 106 may be a display interface of the display 4, for example as described above. Additionally, the system 105 may include other display interfaces, such as a second display interface, a third display interface, etc., which may be associated with the system 105, e.g., mounted on the vehicle and may be located in different locations of the vehicle, e.g., may be located on a front seat back of the vehicle for use by rear occupants; the first display interface and/or the other plurality of display interfaces may also be display interfaces of a mobile terminal capable of establishing a connection with the system 105. The mobile terminal may comprise, for example, a mobile phone, a tablet, a laptop, etc. The mobile terminal may also be a wearable device, such as a "smart" watch, health ring, smart apparel, accessories, and the like. The mobile terminal may include the hardware and software necessary to establish communication with the system 105 in order to implement the method 300 described herein. The mobile terminal may or may not be located within the vehicle 100 and may communicate with the system 105 and other mobile terminals within the vehicle via a network and cloud.
The first display interface 106 may include at least one application 107. In the embodiment shown in fig. 4a, the first display interface 106 includes six applications, although the number of applications 107 is not limited thereto. When a user wishes to use an application, the user may open it in a manner common in the art, so that the interface of the opened application is presented in the first display interface 106, either full screen or picture-in-picture. As shown in fig. 4b, the interface 108 of the application 107 is presented in the first display interface 106 in picture-in-picture form, and the user can move the interface 108 to any position in the first display interface 106. The application 107 may be a video/music application that, when opened, presents user-selected video/music content in the first display interface 106 full screen or picture-in-picture. The application 107 may also be a social communication application, a document processing application, an entertainment application, a lifestyle utility application, a photography application, a reading application, a shopping application, a finance application, a travel application, or any other application available on or downloadable to the system 105.
There is a need for increased interaction between occupants while the vehicle 100 is traveling or temporarily parked. In one embodiment, a user of the first display interface 106 may wish to share content presented there with other users, for example the interface of at least one application presented in the first display interface 106. The users may also jointly process or complete a task through the first display interface 106 and/or other display interfaces, as described elsewhere herein. The other users may likewise be located in the vehicle 100, or may be outside the vehicle but in communicative connection with the system 105 or with the mobile terminal that includes the first display interface 106. Jointly processing or completing a task may include jointly performing vehicle driving operations, jointly editing documents, and jointly completing paintings, questionnaires, games, and so forth.
The present application is further described below with reference to fig. 5, on the basis of fig. 4a and 4b. Fig. 5 shows a method flow 500 implemented, when executed, by executable instructions included in the system 105 according to an embodiment of the present application.
Flow 500 may begin at block 505, for example in response to vehicle ignition. The flow 500 may also begin whenever the system 105 detects that a user is inside the vehicle 100. In one example, the system 105 may detect the user's presence through a sensor such as a microphone, a camera, a touch sensor, or a seat/pressure sensor, or through the pairing of a mobile terminal.
Next, the first display interface 106 is activated at block 510. For example, the user may turn on a power switch of the first display interface 106 or unlock it from a locked state. As needed, the user may then open any of the applications 107 to present the interface 108 of that application on the first display interface 106 in full-screen or picture-in-picture form.
Thereafter, at block 515, input from a user is received to select at least a portion of the content presented in the first display interface.
In one embodiment, the user may select content presented in the first display interface 106 by selecting a desired range in the first display interface 106, which may include at least a portion of the area of the interface 108 of the at least one application 107. For example, the user may draw a regular or irregular closed figure with at least one finger to select the desired sharing range, or may enlarge or reduce a rectangular, circular, or other shaped selection box with at least one finger. In another embodiment, the user may take a screenshot of the first display interface 106 and, when the complete screenshot is to be shared, share it as-is. The screenshot may also be cropped as needed to extract a portion of it, for example by selecting and cropping with a zoomable selection box. In yet another embodiment, the user may directly capture at least a portion of the first display interface 106, for example by capturing the selected range after selecting it as described above. Screenshots are particularly useful for comparison: for example, when a document processing application is used to compare multiple documents, differences between two or more documents can be compared visually using screenshots. In addition, the interfaces of several applications can be captured and shared simultaneously through a single screenshot.
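The cropping of a screenshot with a zoomable selection box can be sketched as follows. This is a hypothetical illustration, not the patent's implementation; the `crop` helper and the pixel-grid representation are assumptions made for the example.

```python
def crop(pixels, box):
    """Crop a 2-D pixel grid (a list of rows) to box = (left, top, right, bottom),
    with right/bottom exclusive, mimicking a zoomable selection box."""
    left, top, right, bottom = box
    return [row[left:right] for row in pixels[top:bottom]]

# A toy 6-row x 8-column "screenshot" whose pixels record their coordinates.
screenshot = [[(r, c) for c in range(8)] for r in range(6)]

# The user drags a selection box covering columns 2-4 and rows 1-3.
selection = crop(screenshot, (2, 1, 5, 4))
```

Only the cropped portion would then be handed to the sharing step, rather than the full screenshot.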
In further embodiments, the first display interface 106 may be divided into a plurality of display regions, and the user may select at least one of them to select content presented in the first display interface 106. As shown in fig. 11, the first display interface 106 may be divided into N rows and N columns: the cells of the first row may be labeled A1, A2, …, An; the second row B1, B2, …, Bn; the third row C1, C2, …, Cn; and so on, down to the Nth row, labeled N1, N2, …, Nn. The user may then select, for example, the display area 109 formed by B1, B2, C1, and C2 for sharing, either by a voice instruction or by selecting the corresponding division marks on the display interface.
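The row-and-column labeling just described can be modeled with a small sketch. The label scheme (A1 … Nn) follows the text; the helper names are illustrative assumptions.

```python
def label_to_cell(label):
    """Convert a grid label such as 'B2' to zero-based (row, col): A1 -> (0, 0)."""
    return ord(label[0]) - ord('A'), int(label[1:]) - 1

def selected_region(labels):
    """Bounding (top, left, bottom, right) of the cells the user named,
    e.g. the display area 109 formed by B1, B2, C1 and C2 in fig. 11."""
    cells = [label_to_cell(l) for l in labels]
    rows = [r for r, _ in cells]
    cols = [c for _, c in cells]
    return min(rows), min(cols), max(rows), max(cols)
```

A voice instruction naming the cells could thus resolve to a rectangular region of the interface to share.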
In further embodiments, the interface of at least one application 107 presented in the first display interface 106 may itself be selected, for example by finger selection, screenshot, or division of the display area as described above, or by voice instruction. In the embodiment shown in fig. 4b, the user may select the interface 108 of the application 107 presented in picture-in-picture form by tapping, double-clicking, or otherwise touching it; alternatively, the user may select the interface 108 by finger selection or with a selection box. Where interfaces of a plurality of applications are presented in the first display interface 106, the user may select several of them. In one or more embodiments, the content selected for sharing may include a still picture or a video picture formed of multiple frames.
Next, a user sharing instruction is detected at block 520. In an embodiment of the present application, the user sharing instruction may include a user action and/or a user voice instruction. The user action may include a contact action on the first display interface 106 and/or a non-contact action, where the non-contact action may include one or more of a limb action, a head action, and a gaze action. For example, the user's limb/hand may make a predetermined sharing-instruction gesture, which may be recognized as the user sharing instruction by a proximity sensor, a camera, or the like. The contact action may include one or more of clicking, dragging, sliding in a particular direction, and multi-finger operation. For example, the user's finger may tap a virtual share-confirmation button on the first display interface 106; the user may drag the selected content to a convenient position in the first display interface 106 and then slide it in a specific direction; or the user may move and slide the shared content with a multi-finger operation. Further, the contact action may be associated with the position, relative to the first display interface 106, of the other display interface that receives the selected content. For example, one or more of the driver's fingers may slide to the right to share the selected content to a second display interface carried or worn by an occupant seated to the driver's right, or vice versa.
As another example, one or more fingers of an occupant in a front seat of the vehicle 100 may slide toward the lower right to share the selected content to a display screen disposed at the rear of the front seat back (the display interface of that screen may also be a display interface as described herein), or to a second display interface carried or worn by an occupant in a rear seat.
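The direction-to-recipient mapping in the two examples above might look like the following sketch. The seat layout, display names, and classification thresholds are assumptions made purely for illustration.

```python
# Map a swipe direction on the first display interface to the display
# interface of the occupant seated in that direction (hypothetical layout).
TARGETS = {
    "right": "front-passenger display",
    "lower-left": "rear-left display",
    "lower-right": "rear-right display",
}

def resolve_target(dx, dy):
    """Classify a drag vector; dx grows rightward, dy grows downward."""
    if dy > abs(dx):                      # mostly downward: rear seats
        return TARGETS["lower-right"] if dx >= 0 else TARGETS["lower-left"]
    if dx > 0:                            # mostly rightward: front passenger
        return TARGETS["right"]
    return None                           # no recipient in that direction
```

A real system would derive the mapping from the actual seating positions of paired displays rather than a fixed table.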
Thereafter, a determination is made at block 525 as to whether a user sharing instruction has been detected. If a user action as described above is detected, or an instruction is received indicating that the user wishes to share, a user sharing instruction is deemed detected and the flow 500 proceeds to block 530; otherwise, the flow 500 returns to block 520.
At block 530, the selected at least a portion of the content presented in the first display interface 106 is shared to the second display interface. As described above, the second display interface may be a display interface of a display provided on the vehicle, or of a mobile terminal carried or worn by a user. The shared content may be transferred to the second display interface in real time, for example with transition effects similar to the slide transitions of presentation software. Where the content needs to be shared with several users, it may be transmitted to a plurality of display interfaces. In one or more embodiments, while sharing content to the second display interface, the first display interface may continue to present the shared content. For example, the driver may, by voice instruction, share the selected content to the display interfaces of mobile terminals carried or worn by multiple occupants in the rear seats, which then receive the shared content simultaneously. As another example, the occupant to the driver's right may slide toward the lower left and then toward the lower right on the first display interface 106 to share the content with the occupants on the left and right sides of the rear seat, respectively.
Thereafter, a user stop-sharing instruction is detected at block 535. In an embodiment of the present application, the stop-sharing instruction may likewise include a user action and/or a user voice instruction similar to those described above. For example, the user's limb/hand may make a predetermined stop-sharing gesture, which may be recognized through a proximity sensor or a camera; the user's finger may tap a virtual stop-sharing confirmation button on the first display interface 106; or one or more of the user's fingers may slide in the direction opposite to the sharing gesture.
Next, at block 540, it is determined whether a user stop-sharing instruction has been detected. If so, the flow proceeds to block 545 to cease sharing the selected at least a portion of the content to the second display interface. In an embodiment of the present application, after the stop-sharing instruction is executed, the shared content is no longer presented in the one or more other display interfaces that previously received it from the first display interface 106. Thereafter, the flow 500 ends at block 550. Alternatively, the flow 500 may end in response to the vehicle 100 being shut down, or at the request of a user.
If no user stop-sharing instruction is detected at block 540, the flow 500 returns to block 535.
In an embodiment of the present application, the flow 500 may further include, after sharing the selected content to the second display interface, receiving an input of a second user at the second display interface and synchronizing that input to the first display interface. For example, the flow 500 may proceed from block 530 to block A and from there to block 531 in fig. 6, where the second user's input is received at the second display interface, for example through a voice instruction. Voice instructions can make operation easier for users who find manual input inconvenient or impossible, such as elderly users. The second user's input is then synchronized to the first display interface at block 532. The same content may be displayed in both display interfaces as needed, or, after sharing or synchronization is complete, each display interface may present other content selected by its user. Thereafter, the flow 500 ends at block 533. In further embodiments, input on the second display interface may also be disabled as desired.
In an embodiment of the present application, after the driver shares the selected content in the first display interface 106 to a second display interface carried or worn by an occupant in a rear seat, the occupant's input may be received at the second display interface and synchronized back to the first display interface 106. For example, in the embodiment shown in fig. 7, the interface of a document processing application, here a document, is presented in the first display interface 106. A first user (e.g., the driver) enters text, e.g., "ABC", then selects a portion 120 of the display interface and shares it to a second display interface 121 carried by a second user (e.g., an occupant seated directly behind the driver), as indicated by arrow Y1. The portion 120 is then presented in the second display interface 121. The second user may enter text in the second display interface 121; for example, as indicated by arrow Y2, the second user adds the letters "DE" after the first user's input. The second user may then synchronize the portion 120 containing the added "DE" back to the first display interface 106, as indicated by arrow Y3. Those skilled in the art will understand that although a document processing application is illustrated, other applications can use the methods described herein, and although the input of letters is illustrated, other inputs fall within the scope of the present application. In this way, multi-person interaction and cooperation within the vehicle can be realized: documents, paintings, picture editing, shared office work, and the like can be completed by multiple persons, making the ride more engaging and improving collaboration efficiency.
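The "ABC"/"DE" exchange of fig. 7 can be modeled as a shared text fragment whose edits propagate to every display interface holding it. This is a minimal sketch; the class and method names are assumed for illustration.

```python
class SharedPortion:
    """Toy model of portion 120: text shared between display interfaces,
    with edits from either side synchronized to all of them."""
    def __init__(self):
        self.text = ""
        self.displays = set()

    def share_to(self, display):
        self.displays.add(display)

    def append(self, text):
        """Apply an edit and return each display's synchronized view."""
        self.text += text
        return {d: self.text for d in self.displays}

portion = SharedPortion()
portion.share_to("first")      # the driver's interface 106
portion.append("ABC")          # first user's input
portion.share_to("second")     # shared to interface 121 (arrow Y1)
views = portion.append("DE")   # second user's further input (arrows Y2/Y3)
```

After the second edit, both interfaces hold the same synchronized text, matching the state after arrow Y3.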
In further embodiments of the present application, the selected at least a portion of the content may include an input interface, a search interface, a game interface, a driving interface, and/or a question-and-answer interface, among others. These interfaces may be native to the system 105 or provided by applications as described above. In one embodiment, the first display interface may belong to a mobile terminal carried or worn by a child in a back seat who wishes to search for content of interest, such as an animation, a story, or a children's song. A search interface is presented on the first display interface, but the child may be unable to use an input method or operate the relevant interfaces and therefore cannot search. Under the concept of the present application, the child may share the search interface to a display interface carried or worn by an adult, for example in the driver's seat or another seat, or to a display interface of the system 105. The adult receiving the shared search interface may enter the relevant content on their display interface to help the child search, and synchronize the result back to the first display interface so that the child can view or listen to the content of interest. In another embodiment, the first display interface may be a display interface of the system 105 through which the driver is performing vehicle operations via a driving interface.
If the driver is unable to continue the vehicle operation, for example because of other tasks or physical discomfort, or if another occupant wishes to adjust vehicle settings, the driver can share the driving interface to, for example, the display interface of a mobile terminal carried by the occupant in the front passenger seat, who can then perform vehicle operations on the vehicle 100 through the driving interface presented there. The vehicle operations described herein may relate to steering, speed, and the like, and also to seating, air conditioning, ambient lighting, infotainment devices, and other in-vehicle settings.
In an embodiment of the application, in response to the first user's sharing instruction, an input of a second user may be received at the second display interface, the at least a portion of the content may be processed based on that input, and the processed content may be synchronized to the first display interface. In the embodiment shown in fig. 12, the first and second display interfaces may be associated with, and interact through, the in-vehicle machine system 105. The controller (e.g., the processor (CPU) 3 described above) and the memory (e.g., the memory 7 described above, not shown in fig. 12) of the in-vehicle system 105 may serve as the back end for transferring and storing data associated with both display interfaces. For example, the controller may control an application (App) presented on the first display interface in response to the first user's input, while the memory stores data related to the App. When the first user wishes to share, the controller shares the selected portion of the application to the second display interface based on the first user's input, and the second user may then provide input on the second display interface. Unlike typical screen mirroring or projection, sharing the application also shares the authority to provide input to it: the controller may control the application in response to the second user's input and update the application data stored in the memory accordingly. The application presented in the second display interface may subsequently be shared back to the first display interface or on to other display interfaces, as needed.
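The distinction drawn here, sharing input authority rather than merely mirroring pixels, can be sketched as follows. The session class, its method names, and the data layout are assumptions for illustration only.

```python
class AppSession:
    """Sketch: the controller accepts input only from display interfaces the
    App was shared to, and updates the stored application data accordingly."""
    def __init__(self):
        self.data = {"level": 1}        # App data held in the memory
        self.authorized = {"first"}     # the first display interface

    def share(self, display):
        """Sharing the App also grants the recipient input authority."""
        self.authorized.add(display)

    def handle_input(self, display, update):
        if display not in self.authorized:
            raise PermissionError(f"{display} may only view, not operate")
        self.data.update(update)

session = AppSession()
session.share("second")                        # first user shares the App
session.handle_input("second", {"level": 2})   # second user's input is honored
```

Under pure screen mirroring, by contrast, `handle_input` from the second display would be rejected, since only the pixels are shared.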
Those skilled in the art should understand that, in addition to or instead of the memory of the in-vehicle system, data related to the display interfaces may be saved in a cloud (e.g., the cloud 125 described above) and then synchronized to the corresponding display devices; in this case the effect of the present application can likewise be achieved.
For example, a first user seated in a rear seat of the vehicle is playing a game on a mobile terminal whose display interface serves as the first display interface. The first user plays through a game interface in the first display interface, and the memory of the in-vehicle system 105 stores the first user's current operation data for the game. For example, when the first user clears the first level, the memory stores the corresponding operation data. If desired, the first user may share the game interface to the display interface of a mobile terminal carried by a second user in the vehicle (e.g., an occupant in the front passenger seat), which then serves as the second display interface, and the controller of the in-vehicle machine system may update the stored operation data based on the second user's input. For example, the second user may continue the first user's game: since the first level has been cleared, the second user can start from the second level rather than from the first. In this case, the first display interface may also synchronously display the content of the second display interface in real time, for example as a video picture formed of multiple frames, so that the first user can watch the second user play. The second user may likewise share the game interface back to the first user or to other users in the vehicle, as desired.
In addition, in a two-player game, the first and second users can each operate their own game characters on the game interfaces in the first and second display interfaces through the in-vehicle machine system, thereby completing the game together. Multiplayer games may be implemented similarly by the methods of the present application. This improves ride quality and encourages interaction between vehicle occupants.
In the embodiment of the application, the controller of the in-vehicle machine system may also assign different permissions to multiple users, including but not limited to administrator permission and general permission. The administrator permission is the highest: a user who holds it can share content with any user associated with the in-vehicle machine system without the other party's consent. A user who holds only the general permission must obtain the other party's consent before sharing content to them. In one embodiment, the driver may be given administrator permission while other vehicle occupants are given general permission. The driver can then share content with other occupants as needed, but other occupants must obtain the driver's consent before sharing content to the driver and cannot share with the driver directly. This improves driving safety: content shared suddenly while the driver is concentrating could disturb the driver's normal operation. Those skilled in the art will understand that the specific settings of administrator and general permissions are not limited thereto and may be customized by the user, or provided by the vehicle supplier, as desired.
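The consent check described above reduces to a small predicate. This is a minimal sketch; the role constants and function name are assumptions, not the patent's terminology.

```python
ADMIN, GENERAL = "admin", "general"

def may_share(sender_role, recipient_consents):
    """Administrator users share without asking; general users need the
    recipient's prior consent (e.g. an occupant sharing to the driver)."""
    return sender_role == ADMIN or recipient_consents
```

The controller would evaluate such a predicate before transferring any selected content to the target display interface.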
In an embodiment of the present application, the flow 500 may further include, after the selected content has been shared to the second display interface, allowing the user of the second display interface to close the shared content and/or return it to the first display interface. The flow 500 may proceed from block 530 to block A and from block A to block 536 of fig. 8, where the second user's input is received at the second display interface. The flow then proceeds to block 537, where the content shared from the first display interface is closed and/or returned to the first display interface. In one embodiment, after the driver shares content to a second display interface carried or worn by an occupant of a front seat, that occupant may close the shared content and/or return it to the first display interface. In this way, the user of the second display interface may choose whether to refuse or end the sharing. In other embodiments, when it is inconvenient to receive the shared content, the user of the second display interface may minimize or temporarily hide it to "buffer" it, and present it in the second display interface later when viewing is convenient.
Referring now to fig. 9, a flow 900 of steps implemented, when executed, by executable instructions included in the system 105 is shown, according to an embodiment of the present application.
Similar to flow 500, flow 900 may begin at block 905.
Next, a first display interface is activated at block 910, and a user input is received at block 915 to select at least a portion of the interface of at least one application presented in the first display interface. The at least one application may be any one or more of the applications described above, and the selected portion may include at least a portion of the area of the application's interface, corresponding to a first area of that interface. In the embodiment shown in fig. 10, the user selects the interface 208 of a video/music application presented in the first display interface 206. The user may select the area 211 of the interface 208 in which the video is played as the first area, wishing to share only that area to one or more other display interfaces, without sharing unrelated content such as advertisements, bullet-screen comments, or private information (e.g., a login name). Under the concept of the present application, the user may select only the region 211 for sharing through the selection methods described above. Although the selected region 211 is shown as a regular shape in this embodiment, the desired region may be selected as an irregular shape in other embodiments.
Optionally, the flow 900 may proceed to block 920, where the parts of the interface of the at least one application not selected by the user may be masked without user involvement. In embodiments of the present application, masking may include filtering, hiding, overlaying, or otherwise keeping unselected content out of view. The unselected part of the application's interface may correspond to a second area of that interface. In the embodiment shown in fig. 10, the interface 208 is shared as a whole to one or more other display interfaces, but before sharing, the areas not selected by the user may be masked. For example, the advertisement area 212 may be shaded and not rendered, as indicated by arrow Z1. In other embodiments, unselected portions of the interface 208 may be hidden with a mosaic blur, or intercepted or filtered out. In further embodiments, the bullet-screen comments or other interactive content appearing in the area 212 may also be masked before sharing, without requiring the user to explicitly operate a close button on the interface (for example, the user need not explicitly close the bullet-screen comments on the interface 208), thereby enabling the desired content to be shared more conveniently.
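The masking at block 920 can be sketched by blanking every pixel outside the user-selected first area before the interface is shared. The pixel-grid representation and helper name below are assumptions made for illustration.

```python
def mask_outside(pixels, keep_box):
    """Blank out everything outside keep_box = (left, top, right, bottom),
    so the masked second area (e.g. advertisement area 212) is not rendered."""
    left, top, right, bottom = keep_box
    return [
        [px if (left <= c < right and top <= r < bottom) else None
         for c, px in enumerate(row)]
        for r, row in enumerate(pixels)
    ]

frame = [[1] * 4 for _ in range(4)]          # toy 4x4 interface frame
shared = mask_outside(frame, (1, 1, 3, 3))   # keep only the selected area 211
```

The whole frame can then be transmitted, but the receiving display interface has nothing to render in the masked second area.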
Next, the flow 900 proceeds to block 925, where a user sharing instruction is detected, and at block 930 it is determined whether such an instruction has been detected. If not, the flow 900 returns to block 925. If so, the flow proceeds to block 935 to share the selected portion of the interface of the at least one application to the second display interface. Content may be presented in the second display interface in synchronization with the first display interface, but the content in the masked second area is not presented. In the embodiment shown in fig. 10, the user-selected area 211 and the masked area 212 may be shared together to one or more other display interfaces, but only the content in the area 211 is presented there, not the masked area 212; that is, only the content the user wishes to share is presented. Blocking advertisements or other extraneous or even objectionable information is particularly advantageous when sharing with children. Content involving the user's personal privacy can be masked in a similar manner, protecting privacy while sharing.
Thereafter, the process 900 may end at block 940.
It will be appreciated by those skilled in the art that although the above embodiments describe the first user, the second user, and the vehicle occupants as all located within the vehicle, one of the first or second users may be located elsewhere. For example, when the driver is alone in the vehicle 100, the driver may share, via the network or cloud 125, at least a portion of the content presented in the display interface of the system 105, or of a mobile terminal the driver carries or wears, in real time with relatives and friends at home or in other non-vehicle locations, as if they were traveling with the driver. When there are other occupants in the vehicle, the driver can share the content with relatives and friends outside the vehicle while also sharing it with the occupants in the vehicle.
According to another aspect of the present application, there is also provided an in-vehicle machine system 105 comprising a processor and a memory, the memory storing processor-executable instructions which, when executed by the processor, implement the following steps: receiving an input of a first user to select at least a portion of content presented in a first display interface; detecting a first user sharing instruction; sharing the at least a portion of the content to a second display interface in response to detecting the first user sharing instruction; and presenting the at least a portion of the content in the second display interface, wherein the first display interface and the second display interface are associated with the in-vehicle system, and the at least a portion of the content includes at least a portion of an area of an interface of at least one application.
According to another aspect of the present application, there is also provided a vehicle 100, as shown in fig. 1, comprising the in-vehicle machine system 105 described herein, the in-vehicle machine system 105 including a processor and a memory, the memory storing processor-executable instructions which, when executed by the processor, implement the following steps: receiving a first input of a first user to select at least a portion of content presented in a first display interface; detecting a first user sharing instruction; sharing the at least a portion of the content to a second display interface in response to detecting the first user sharing instruction; presenting the at least a portion of the content in the second display interface; receiving a second input of the first user or a second user; synchronizing the second input on the first display interface and the second display interface; receiving a stop-sharing instruction of the first user or the second user; and stopping sharing of the at least a portion of the content, wherein the first display interface and the second display interface are associated with the in-vehicle machine system, and the at least a portion of the content includes at least a portion of an area of an interface of at least one application.
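The sequence of steps above (share on instruction, synchronize a later input across both interfaces, stop on a stop-sharing instruction from either user) can be sketched as a small session model. The class name `ShareSession`, its method names, and the dict-based displays are illustrative assumptions, not the patent's implementation.

```python
class ShareSession:
    """Illustrative model of the sharing flow: content is presented on both
    displays when sharing starts, later inputs are mirrored to both, and
    the second display is cleared when sharing stops."""

    def __init__(self, first_display, second_display):
        self.displays = [first_display, second_display]
        self.active = False

    def on_share_instruction(self, content):
        # Share and present the selected content on both interfaces.
        self.active = True
        for display in self.displays:
            display["content"] = content

    def on_user_input(self, edited_content):
        # Synchronize a second input (from either user) across both displays.
        if self.active:
            for display in self.displays:
                display["content"] = edited_content

    def on_stop_instruction(self):
        # Either user may stop sharing; the second display stops presenting.
        self.active = False
        self.displays[1]["content"] = None

first, second = {"content": "map"}, {"content": None}
session = ShareSession(first, second)
session.on_share_instruction("game interface")
session.on_user_input("game interface + move")   # mirrored to both displays
session.on_stop_instruction()                    # second display cleared
```

After the stop instruction the first display keeps the latest synchronized content while the second no longer presents it, matching the claimed stop-sharing behavior.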
It should be understood that all of the embodiments, features and advantages set forth above with respect to the system 105 according to the present application are equally applicable to the vehicle 100 according to the present application, provided they do not conflict with one another. That is, all embodiments of the system 105 and variations thereof described above may be directly transferred to and incorporated in the vehicle 100 according to the present application, and are not repeated here for the sake of brevity.
In summary, compared with the prior art, the present application provides an interaction method 300 for an in-vehicle machine system, an in-vehicle machine system 105, and a corresponding vehicle 100. The solution of the present application can increase interaction between occupants in the vehicle, make riding more enjoyable, enhance the user experience, and improve user satisfaction.
The features mentioned above in relation to different embodiments may be combined with each other to form further embodiments within the scope of the application, where technically feasible.
In this application, the use of the disjunctive is intended to include the conjunctive. The use of definite or indefinite articles is not intended to indicate cardinality. In particular, a reference to "the" object or "a"/"an" object is intended to denote one of possibly many such objects. Furthermore, the conjunction "or" may be used to convey features that are present simultaneously, rather than mutually exclusive alternatives. In other words, the conjunction "or" should be understood to include "and/or". The term "comprising" is inclusive and has the same scope as "including".
The above-described embodiments are possible examples of implementations of the present application and are given only for the purpose of enabling those skilled in the art to clearly understand its principles. Those skilled in the art will understand that the above discussion of any embodiment is merely exemplary in nature and is in no way intended to suggest that the scope of the disclosure of the embodiments herein (including the claims) is limited to these examples; under the general idea of the application, features from the above embodiments or from different embodiments may also be combined with each other to produce many other variations of the different aspects of the embodiments described above, which are not provided in the detailed description for the sake of brevity. Therefore, any omissions, modifications, substitutions, improvements, and the like that may be made without departing from the spirit and principles of the embodiments of the present application are intended to be included within the scope of the claims.
Claims (20)
1. An interaction method for an in-vehicle machine system, comprising:
receiving input of a first user to select at least a portion of content presented in a first display interface;
detecting a first user sharing instruction;
sharing the at least a portion of content to a second display interface in response to detecting the first user sharing instruction; and
presenting the at least a portion of the content in the second display interface,
wherein the first display interface and the second display interface are associated with the in-vehicle machine system; the at least a portion of the content includes at least a portion of an area of an interface of at least one application.
2. The method of claim 1, wherein the at least a portion of the area corresponds to a first area of the interface of the application, and further comprising masking a second area of the interface of the application not selected by the first user without involvement of the first user.
3. The method of claim 2, further comprising: presenting the at least a portion of the content in the second display interface synchronously with the first display interface, but not presenting content in the second area that is masked.
4. The method of claim 1, further comprising: receiving a second user's input at the second display interface and synchronizing the second user's input to the first display interface.
5. The method of claim 1, further comprising: receiving input of a second user at the second display interface, wherein the input of the second user is used to close the at least a portion of the shared content and/or return it to the first display interface.
6. The method of claim 1, further comprising: stopping sharing the at least a portion of the content to the second display interface in response to detecting a first user stop sharing instruction.
7. The method of claim 1, wherein the first user sharing instruction and/or the first user stop sharing instruction comprises a user action and/or a user voice instruction.
8. The method of claim 7, wherein the user action comprises a contact action of a user operating the first display interface and/or a user non-contact action, the non-contact action comprising one or more of a user limb action, a head action, and a line-of-sight action.
9. The method of claim 8, wherein the contact action includes one or more of a click, a drag, a slide in a particular direction, and a multi-finger operation, and wherein the contact action is associated with the position, relative to the first display interface, of the second display interface that receives the at least a portion of content.
10. The method of claim 1, wherein at least a portion of the content presented in the first display interface is selected by selecting at least a portion of an irregular area of the first display interface.
11. The method of claim 1, wherein the first display interface comprises a plurality of pre-divided display regions, wherein the method further comprises selecting at least one of the plurality of display regions to select at least a portion of the content presented in the first display interface.
12. The method of claim 1, wherein the first display interface and/or the second display interface is disposed on a vehicle.
13. The method of claim 1, wherein the first display interface and/or the second display interface is a display interface of a mobile terminal.
14. The method of claim 1, wherein the at least a portion of content comprises an input interface, a search interface, a game interface, a driving interface, and/or a question and answer interface.
15. The method of claim 1, further comprising: in response to the first user sharing instruction, receiving input of a second user on the second display interface, processing the at least a portion of the content based on the input of the second user, and synchronizing the processed at least a portion of the content to the first display interface.
16. An in-vehicle machine system comprising a processor and a memory, the memory storing processor-executable instructions that, when executed by the processor, implement the steps of:
receiving input of a first user to select at least a portion of content presented in a first display interface;
detecting a first user sharing instruction;
sharing the at least a portion of content to a second display interface in response to detecting the first user sharing instruction; and
presenting the at least a portion of the content in the second display interface,
wherein the first display interface and the second display interface are associated with the in-vehicle machine system; the at least a portion of the content includes at least a portion of an area of an interface of at least one application.
17. The system of claim 16, wherein the at least a portion of the area corresponds to a first area of an interface of the application, and the instructions when executed by the processor further implement: masking a second region of an interface of the application not selected by the first user without involvement of the first user.
18. The system of claim 16, wherein the instructions when executed by the processor further implement the steps of: receiving a second user's input at the second display interface and synchronizing the second user's input to the first display interface.
19. The system of claim 16, wherein the instructions when executed by the processor further implement the steps of: stopping sharing the at least a portion of the content to the second display interface in response to detecting a first user stop sharing instruction.
20. A vehicle comprising an in-vehicle machine system including a processor and a memory, the memory storing processor-executable instructions that, when executed by the processor, implement the steps of:
receiving a first input of a first user to select at least a portion of content presented in a first display interface;
detecting a first user sharing instruction;
sharing the at least a portion of content to a second display interface in response to detecting the first user sharing instruction;
presenting the at least a portion of the content in the second display interface;
receiving a second input of the first user or a second user;
synchronizing the second input at the first display interface and the second display interface;
receiving a stop-sharing instruction of the first user or the second user; and
stopping sharing of the at least a portion of the content,
wherein the first display interface and the second display interface are associated with the in-vehicle machine system; the at least a portion of the content includes at least a portion of an area of an interface of at least one application.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110958307.2A CN115712379A (en) | 2021-08-20 | 2021-08-20 | Interaction method for vehicle-mounted machine system, vehicle-mounted machine system and vehicle |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110958307.2A CN115712379A (en) | 2021-08-20 | 2021-08-20 | Interaction method for vehicle-mounted machine system, vehicle-mounted machine system and vehicle |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115712379A true CN115712379A (en) | 2023-02-24 |
Family
ID=85230126
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110958307.2A Pending CN115712379A (en) | 2021-08-20 | 2021-08-20 | Interaction method for vehicle-mounted machine system, vehicle-mounted machine system and vehicle |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115712379A (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116394756A (en) * | 2023-03-10 | 2023-07-07 | 浙江吉利控股集团有限公司 | A screen content auxiliary display method, device, vehicle and storage medium |
CN117170543A (en) * | 2023-11-03 | 2023-12-05 | 博泰车联网(南京)有限公司 | Multi-screen linkage method, device, equipment and computer readable storage medium |
CN117170543B (en) * | 2023-11-03 | 2024-01-26 | 博泰车联网(南京)有限公司 | Multi-screen linkage methods, devices, equipment and computer-readable storage media |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3072710B1 (en) | Vehicle, mobile terminal and method for controlling the same | |
CN110099836B (en) | Vehicle and method of controlling display therein | |
US10521179B2 (en) | Vehicle systems and methods | |
KR101730315B1 (en) | Electronic device and method for image sharing | |
CN107580104B (en) | Mobile terminal and control system including the same | |
KR101711835B1 (en) | Vehicle, Vehicle operating method and wearable device operating method | |
US8843553B2 (en) | Method and system for communication with vehicles | |
WO2013074897A1 (en) | Configurable vehicle console | |
CN112799499A (en) | A system and method for human-computer interaction of a motor vehicle | |
WO2016084360A1 (en) | Display control device for vehicle | |
US9428055B2 (en) | Vehicle, terminal communicating with the vehicle, and method of controlling the vehicle | |
JPWO2019124158A1 (en) | Information processing equipment, information processing methods, programs, display systems, and moving objects | |
CN112297842A (en) | Autonomous vehicle with multiple display modes | |
CN115712379A (en) | Interaction method for vehicle-mounted machine system, vehicle-mounted machine system and vehicle | |
KR20170007980A (en) | Mobile terminal and method for controlling the same | |
JP2014216714A (en) | Information providing apparatus for sharing information in vehicle, portable terminal, and program | |
JP6970377B2 (en) | In-vehicle user interface device | |
KR20160114486A (en) | Mobile terminal and method for controlling the same | |
KR101859043B1 (en) | Mobile terminal, vehicle and mobile terminal link system | |
KR101736820B1 (en) | Mobile terminal and method for controlling the same | |
KR101916425B1 (en) | Vehicle interface device, vehicle and mobile terminal link system | |
US11954950B2 (en) | Information interaction method and information interaction system | |
JP7571866B2 (en) | Vehicle display system, vehicle display method, and vehicle display program | |
US20250153568A1 (en) | In-Vehicle Infotainment System and Method Therefor, Vehicle, Medium and Program Product | |
WO2023171253A1 (en) | Presentation control device and presentation control method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |