
WO2013181310A2 - Control of device features based on vehicles state - Google Patents

Control of device features based on vehicles state

Info

Publication number
WO2013181310A2
WO2013181310A2 (PCT/US2013/043211; US2013043211W)
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
communication device
control module
feature control
features
Prior art date
Application number
PCT/US2013/043211
Other languages
French (fr)
Other versions
WO2013181310A3 (en)
Inventor
Christopher P. Ricci
Original Assignee
Flextronics Ap, Llc.
Priority date
Filing date
Publication date
Priority claimed from US 13/679,676 (published as US20130145065A1)
Application filed by Flextronics Ap, Llc.
Priority to EP13797358.2A (published as EP2856326A4)
Priority to CA2874651A (published as CA2874651A1)
Publication of WO2013181310A2
Publication of WO2013181310A3

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 - Network arrangements or protocols for supporting network services or applications
    • H04L 67/2866 - Architectures; Arrangements
    • H04L 67/30 - Profiles
    • H04L 67/306 - User profiles
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04W - WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 - Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/02 - Services making use of location information
    • H04W 4/029 - Location-based management or tracking services
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 - Network arrangements or protocols for supporting network services or applications
    • H04L 67/01 - Protocols
    • H04L 67/12 - Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04M - TELEPHONIC COMMUNICATION
    • H04M 1/00 - Substation equipment, e.g. for use by subscribers
    • H04M 1/72 - Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 - User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72448 - User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M 1/72463 - User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions to restrict the functionality of the device
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04W - WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 - Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/02 - Services making use of location information
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04W - WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 - Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/30 - Services specially adapted for particular environments, situations or purposes
    • H04W 4/40 - Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P]
    • H04W 4/48 - Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P] for in-vehicle communication
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04M - TELEPHONIC COMMUNICATION
    • H04M 1/00 - Substation equipment, e.g. for use by subscribers
    • H04M 1/72 - Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 - User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72448 - User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M 1/72463 - User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions to restrict the functionality of the device
    • H04M 1/724631 - User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions to restrict the functionality of the device by limiting the access to the user interface, e.g. locking a touch-screen or a keypad
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04M - TELEPHONIC COMMUNICATION
    • H04M 2250/00 - Details of telephonic subscriber devices
    • H04M 2250/12 - Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04W - WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 - Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/02 - Services making use of location information
    • H04W 4/021 - Services related to particular areas, e.g. point of interest [POI] services, venue services or geofences
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04W - WIRELESS COMMUNICATION NETWORKS
    • H04W 48/00 - Access restriction; Network selection; Access point selection
    • H04W 48/02 - Access restriction performed under specific conditions
    • H04W 48/04 - Access restriction performed under specific conditions based on user or terminal location or mobility data, e.g. moving direction, speed

Definitions

  • One way to instill comfort in a vehicle is to create an environment within the vehicle similar to that of an individual's home or place of comfort. Integrating features in a vehicle that are associated with comfort found in an individual's home can ease a traveler's transition from home to vehicle.
  • Several manufacturers have added comfort features in vehicles such as the following: leather seats, adaptive and/or personal climate control systems, music and media players, ergonomic controls, and in some cases Internet connectivity. However, because these manufacturers have added features to a conveyance, they have built comfort around a vehicle and failed to build a vehicle around comfort.
  • A method of controlling access to one or more features of a communication device associated with a vehicle comprises: providing a feature control module configured to receive input from at least one of a vehicle sensor and a non-vehicle sensor; determining a location of the communication device; and controlling, via the feature control module and based at least partially on the location of the communication device, user access to one or more features of the communication device.
  • the present disclosure can provide a number of advantages depending on the particular aspect, embodiment, and/or configuration.
  • drivers and other vehicle operators can operate their vehicles while texting, talking, surfing the Internet, streaming video, and generally using their mobile phones and/or other connected devices.
  • Using these devices while operating a vehicle may not only be considered unsafe, but may also contradict local, state, federal, and other laws.
  • the use of devices, especially communication devices, while driving causes greater distraction and is a leading cause of accidents among teenage drivers.
  • the present disclosure is directed to an intelligent system that is capable of recognizing a user and device and determining to allow or deny the user access to device features.
  • the system may recognize one or more characteristics associated with a user and/or device and limit access to device features at least partially based on the one or more characteristics. These characteristics may include but are not limited to location of the user and/or device, user profile settings, user preferences, registration status of the device, device settings, programmed conditions, and the like.
  • a user may be operating a device in the passenger seat of an automobile.
  • the user may have established a connection between the device and the vehicle (e.g., via Bluetooth, direct electrical connection, wireless, radio frequency (RF), infrared (IR), etc.).
  • the vehicle feature control system may utilize one or more of the vehicle/device sensors to determine the location of the device user. These sensors may include cameras, weight sensors, IR detectors, temperature sensors, GPS, triangulation and/or position sensors, and combinations thereof. Many vehicles, especially cars, utilize sensors of this type to activate and/or deactivate airbag and/or safety restraint system components.
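  • As an illustrative (non-limiting) sketch of how such sensor inputs might be combined, the following example infers which seat zone holds the device user from seat weight readings and a per-seat signal-strength measurement for the device. The class, function, threshold, and zone names are assumptions introduced for illustration and are not defined by the disclosure.

```python
# Hypothetical sketch: inferring a device user's seat zone from seat weight
# sensors and a per-seat signal-strength reading for the device.
from dataclasses import dataclass


@dataclass
class SeatReading:
    zone: str               # e.g., "driver", "front_passenger", "rear_left"
    weight_kg: float        # occupant weight reported by the seat sensor
    device_rssi_dbm: float  # device signal strength measured near this seat


def infer_user_zone(readings: list[SeatReading],
                    min_occupant_weight_kg: float = 25.0) -> str | None:
    """Return the occupied zone where the device signal is strongest."""
    occupied = [r for r in readings if r.weight_kg >= min_occupant_weight_kg]
    if not occupied:
        return None
    # Assume the seat with the strongest (least negative) RSSI holds the device.
    return max(occupied, key=lambda r: r.device_rssi_dbm).zone


if __name__ == "__main__":
    readings = [
        SeatReading("driver", 80.0, -42.0),
        SeatReading("front_passenger", 62.0, -63.0),
        SeatReading("rear_left", 0.0, -70.0),
    ]
    print(infer_user_zone(readings))  # -> "driver"
```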
  • a feature control module may determine that feature access should not be controlled.
  • the feature control module may determine to limit access to one or more features of the device.
  • the feature control module may refer to other factors when determining to allow or deny a user access to a device's features. Among these other factors are jurisdictional and/or federal laws, contractual rules/obligations, programmed conditions, vehicle state, emergency contingencies, and combinations thereof. Contractual rules/obligations may include but are not limited to contract limitations associated with employment contracts, insurance contracts, general agreements, governmental contracts, and the like. These rules and/or laws may be used in determining feature control of a device. For instance, a vehicle may be detected to be "in motion" by the feature control module and various vehicle/device sensors. Moreover, the feature control module may be configured to communicate to a database to determine laws governing the use of communication devices in the current geographical location of the vehicle.
  • a local law may prohibit the use of communication devices by a driver of a vehicle while that vehicle is in motion.
  • the feature control module may determine to deny access to device features.
  • the feature control module may communicate with the device to deactivate the features of the device. This deactivation may be coupled with a presented warning in the form of a visual and/or audible alert on the device and/or vehicle dash display. Additionally, it is anticipated that the feature control module may reactivate these deactivated features once the vehicle is in a state of rest and/or parked.
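  • A minimal sketch of this deactivate/warn/reactivate cycle follows, assuming a simple controller class; the class name, feature labels, and alert text are illustrative and not specified by the disclosure.

```python
# Hypothetical controller: disable restricted features while the vehicle moves,
# surface an alert, and restore the features once the vehicle is at rest/parked.
class FeatureController:
    def __init__(self, restricted_features: set[str]):
        self.restricted_features = restricted_features
        self.deactivated: set[str] = set()

    def on_vehicle_state(self, in_motion: bool, in_park: bool) -> str | None:
        """Return an alert message for the device/dash display, if any."""
        if in_motion and not self.deactivated:
            self.deactivated = set(self.restricted_features)
            return "Device features disabled while the vehicle is in motion."
        if (not in_motion or in_park) and self.deactivated:
            self.deactivated.clear()
            return "Vehicle at rest: device features restored."
        return None


ctrl = FeatureController({"texting", "browser", "video_streaming"})
print(ctrl.on_vehicle_state(in_motion=True, in_park=False))
print(ctrl.on_vehicle_state(in_motion=False, in_park=True))
```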
  • the feature control module may itself receive satellite location information, from a satellite positioning system receiver in the vehicle or from a satellite positioning system receiver in the communication device, alone or in conjunction with vehicle-related state, configuration, and/or operation information (speed, parking sensors, etc.) to determine the current vehicle state, configuration, and/or operation.
  • Exemplary on-board vehicle sensors that may be accessed by the feature control module include: a wheel state sensor to sense one or more of vehicle speed, acceleration, deceleration, wheel rotation, wheel speed (e.g., wheel revolutions-per-minute), wheel slip, and the like; a power source energy output sensor to sense a power output of an on-board power source (e.g., an engine or energy storage device) by measuring one or more of current engine speed (e.g., revolutions-per-minute), energy input and/or output (e.g., voltage, current, fuel consumption, and torque), and the like; a switch state sensor to determine a current activation or deactivation state of a power source; a transmission setting sensor to determine a current setting of the vehicle transmission (e.g., gear selection or setting); a gear controller sensor to determine a current setting of a gear controller; a power controller sensor to determine a current setting of a power controller (e.g., throttle); a brake sensor to determine a current state (braking or non-braking) of a vehicle braking system; a seating system sensor to determine a seat setting and current weight of a seated occupant, if any, in a selected seat of the vehicle seating system; a safety system state sensor to determine a current state of a vehicular safety system (e.g., air bag setting (deployed or undeployed) and/or seat belt setting (engaged or not engaged)); a light setting sensor (e.g., current headlight, emergency light, brake light, parking light, fog light, interior or passenger compartment light, and/or tail light state (on or off)); a brake control (e.g., pedal) setting sensor; an accelerator pedal setting sensor; and the like.
  • the feature control module can disallow/deactivate use of texting, video streaming, and other applications.
  • the applications may be allowed and activated.
  • these features may be controlled in accordance with local/state/federal laws as well as administrative agency laws, insurance contract, governmental contracts, general agreements, and/or employment contracts.
  • communication modes such as texting, tweeting, email, and the like may be enabled or disabled based on vehicle location.
  • Vehicle location may be mapped against applicable laws of a governmental entity, such as a city, municipality, county, province, state, country, and the like.
  • capabilities of the device may be enabled or disabled based on contract requirements, employer rules or policies, etc.
  • a feature control module may be programmed to control a specific device, or group of devices, based on settings associated with a user.
  • the registering party may be prompted to input specific information via a control panel, the device, and/or a dash display interface.
  • the registration of devices may be password-protected and even associated with a master key or pass.
  • the registration process will grant the feature control module permission to control one or more features of the device.
  • the feature control module may be configured to control one or more communication features of the device regardless of registration permission. This unauthorized control of device communication features may be achieved by affecting the transmission of signals sent to and/or from the device.
  • a teenage driver may own a particular communication device.
  • This device may have a unique media access control (MAC) address or other unique hardware/software identifier.
  • the device may be registered with the feature control module by an authorized user (e.g., a parent, guardian, or governmental entity). During the registration process, the authorized user may configure the settings associated with the device and teenage driver to be especially strict. In other words, the authorized user may determine to disable all communication functions of the device while the vehicle is in motion.
  • the authorized user may determine to allow telephonic connections while in motion but disable other features such as texting, emailing, and surfing the Internet (e.g., disable the browser capability). Additionally or alternatively, an authorized user may determine that communication devices inside a vehicle (associated with any person, and even in any area) shall be controlled by the feature control module. In this instance, the feature control module may prevent the exchange of communication signals to and from one or more devices inside a vehicle.
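  • One possible shape for such per-device registration records is sketched below; the identifiers, field names, and in-memory store are assumptions for illustration only.

```python
# Hypothetical registration store keyed by a unique hardware identifier
# (e.g., a MAC address), recording who registered the device and which
# features remain enabled while the vehicle is in motion.
from dataclasses import dataclass, field


@dataclass
class DeviceProfile:
    device_id: str                  # MAC address or other unique identifier
    registered_by: str              # authorized user (e.g., parent or guardian)
    allow_while_moving: set[str] = field(default_factory=set)


REGISTRY: dict[str, DeviceProfile] = {}


def register_device(device_id: str, registered_by: str,
                    allow_while_moving: set[str] | None = None) -> DeviceProfile:
    profile = DeviceProfile(device_id, registered_by, allow_while_moving or set())
    REGISTRY[device_id] = profile
    return profile


# Strict profile: every communication function disabled while moving.
register_device("AA:BB:CC:DD:EE:FF", "parent")
# Looser profile: voice calls stay available while moving.
register_device("11:22:33:44:55:66", "parent", allow_while_moving={"telephony"})
print(REGISTRY["11:22:33:44:55:66"].allow_while_moving)
```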
  • the feature control module may determine to control one or more features based on vehicle state and/or condition. In one embodiment, access to features of a device may be overridden. This overriding control may be beneficial in the case of an emergency. For instance, the feature control module may determine that a vehicle and/or one or more users are in a state of emergency. If a vehicle has been involved in a collision or accident, one or more sensors associated with the vehicle are configured to report the incident. In accordance with the present disclosure, the feature control module may receive input from the multiple sensors to determine appropriate device feature control. For example, a car may be involved in a roll-over accident.
  • Although the wheels of the car may still be moving and the vehicle is not in "park," the presence of the accident may be reported by the sensors, and therefore functionality of device features may be returned to the one or more devices associated with the vehicle.
  • a user in a vehicle may have suffered a seizure, or illness, that causes the user to shake uncontrollably. This movement and/or condition may be detected by the device associated with that user and as such signal an emergency event associated with the user.
  • the feature control module may receive this input and return device feature functionality for a period of time.
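  • The override described here might be modeled as a timed window during which restrictions are lifted, as in the following sketch; the class name and default window length are assumptions, since the disclosure only states that functionality may be returned for a period of time.

```python
# Hypothetical emergency override: once triggered by crash or health sensors,
# device features are treated as unrestricted for a limited time window.
import time


class EmergencyOverride:
    def __init__(self, window_seconds: float = 600.0):
        self.window_seconds = window_seconds
        self.activated_at: float | None = None

    def trigger(self) -> None:
        """Called when sensors report a collision, seizure, or similar event."""
        self.activated_at = time.monotonic()

    def features_unrestricted(self) -> bool:
        if self.activated_at is None:
            return False
        return (time.monotonic() - self.activated_at) < self.window_seconds


override = EmergencyOverride(window_seconds=600)
override.trigger()
print(override.features_unrestricted())  # True within the override window
```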
  • each of the expressions "at least one of A, B and C", "at least one of A, B, or C", "one or more of A, B, and C", "one or more of A, B, or C" and "A, B, and/or C" means A alone, B alone, C alone, A and B together, A and C together, B and C together, or A, B and C together.
  • the term "computer-readable medium” as used herein refers to any tangible storage and/or transmission medium that participate in providing instructions to a processor for execution. Such a medium may take many forms, including but not limited to, non-volatile media, volatile media, and transmission media. Non-volatile media includes, for example, NVRAM, or magnetic or optical disks. Volatile media includes dynamic memory, such as main memory.
  • Computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, or any other magnetic medium, magneto-optical medium, a CD-ROM, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, a solid state medium like a memory card, any other memory chip or cartridge, a carrier wave as described hereinafter, or any other medium from which a computer can read.
  • a digital file attachment to e-mail or other self- contained information archive or set of archives is considered a distribution medium equivalent to a tangible storage medium.
  • Where the computer-readable media is configured as a database, the database may be any type of database, such as relational, hierarchical, object-oriented, and/or the like. Accordingly, the disclosure is considered to include a tangible storage medium or distribution medium and prior art-recognized equivalents and successor media, in which the software implementations of the present disclosure are stored.
  • The term "module," as used herein, refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and software that is capable of performing the functionality associated with that element.
  • Exemplary satellite navigation systems include the Global Positioning System (GPS) (US), GLONASS (Russia), the Galileo positioning system (EU), the Compass navigation system (China), and the Regional Navigational Satellite System (India).
  • The term "vehicle," as used herein, does not require that a conveyance moves or is capable of movement.
  • Typical vehicles may include but are in no way limited to cars, trucks, motorcycles, busses, automobiles, trains, railed conveyances, boats, ships, marine conveyances, submarine conveyances, airplanes, space craft, flying machines, human- powered conveyances, and the like.
  • The terms "dash" and "dashboard," and variations thereof, as used herein, are used interchangeably and include any panel and/or area of a vehicle disposed adjacent to an operator, user, and/or passenger.
  • Typical dashboards may include but are not limited to one or more control panel, instrument housing, head unit, indicator, gauge, meter, light, audio equipment, computer, screen, display, HUD unit, and graphical user interface.
  • The term "communication device," as used herein, refers to any type of device capable of communicating with one or more of another device and/or across a communications network, via a communications protocol, and the like.
  • exemplary communication devices may include but are not limited to smartphones, handheld computers, laptops, netbooks, notebook computers, subnotebooks, tablet computers, scanners, portable gaming devices, phones, pagers, GPS modules, portable music players, and other Internet-enabled and/or network-connected devices.
  • FIG. 1 is a block diagram depicting a feature control system in accordance with one embodiment of the present disclosure;
  • FIG. 2 is a block diagram depicting areas and zones associated with a vehicle in accordance with one embodiment of the present disclosure;
  • FIG. 3 is a flow diagram depicting a first feature control system method in accordance with embodiments of the present disclosure;
  • FIG. 4 is a flow diagram depicting a second feature control system method in accordance with embodiments of the present disclosure;
  • FIG. 5 is a flow diagram depicting a third feature control system method in accordance with embodiments of the present disclosure;
  • FIG. 6 is a flow diagram depicting a fourth feature control system method in accordance with embodiments of the present disclosure.
  • the feature control system can comprise one device or a compilation of devices.
  • the feature control system may include one or more communications devices, such as cellular telephones, or other smart devices.
  • This device, or these devices, may be capable of communicating with other devices and/or to an individual or group of individuals. Further, this device, or these devices, can receive user input in unique ways.
  • the device(s) may be electrical, mechanical, electro-mechanical, software-based, and/or combinations thereof.
  • the feature control system 100 comprises a feature control module 104 in communication with one or more of a communication device 108, sensor 136, 140, user 112, memory 106, 120, 124, server 122, and communication network 116.
  • the feature control module 104 is configured to control one or more device 108 features based on rules and/or input received. It is anticipated that the input received may be from one or more device 108, sensor 136, 140, and/or user 112.
  • rules may be stored in one or more memory 106, 120, 124 of the feature control system 100.
  • the feature control module 104 may detect the presence of a device 108 by a physical or wireless connection. Upon detecting the device 108, the feature control module 104 may determine to control features of the device 108 based on the stored rules. These stored rules may direct a course of action based on input detected at the sensors 136, 140 and/or device 108. If the sensors 136, 140 report that the device 108 and user 112 are in the driver's seat of the vehicle, the rules may determine to limit access to device 108 features.
  • a vehicle comprises the feature control module 104 in its software and/or hardware implementation.
  • the feature control module 104 may be located remotely from a vehicle and substantially perform all of the functions and operations as described herein.
  • the feature control module 104 may be integrated into the device 108.
  • the feature control module 104 and/or its functionality could be split between the device 108 and an in-vehicle representation.
  • the split embodiment may further control the device 108 by limiting the device's 108 ability to perform specific functions while coupled and/or decoupled from the feature control module 104 of the vehicle.
  • the location of the feature control module 104 may vary, for the purposes of this disclosure, the feature control module 104 will be described as residing locally within a vehicle.
  • the feature control module 104 may be configured to receive one or more inputs. These one or more inputs may be used to determine whether to control features associated with a device such as device 108.
  • a device in wireless and/or physical communication with the feature control module 104 may be controlled.
  • the feature control module 104 may affect the control of a device's features via control of one or more of the device display, communications, state, applications, and/or combinations thereof.
  • a feature control module 104 may receive permission to control a device 108. This permission may be granted upon a registration of the device 108 with the feature control module 104. Furthermore, this type of registration may be achieved via the installation and/or operation of an application on the device 108.
  • the application may at least facilitate communications between the device 108 and the feature control module 104, control the state of the device 108 at the direction of the feature control module 104, and/or control a user's 112 access to one or more features of the device 108.
  • the feature control module 104 may affect the communications ability of any device 108 within a specific area of the vehicle based on signal attenuation and/or interference techniques.
  • the device 108 may include a global positioning system (GPS) receiver.
  • the GPS receiver may further comprise a GPS module that is capable of providing absolute location information to other components of the device 108 and/or the feature control module 104.
  • An accelerometer(s)/gyroscope(s) may also be included.
  • the accelerometer/gyroscope may comprise at least one accelerometer and at least one gyroscope. For example, a signal from the accelerometer/gyroscope can be used to determine an orientation of the device 108. This orientation may be used by the feature control module to determine a state of the device 108.
  • the device 108 may include a dual-screen phone, smartpad, and/or vehicle console as described in respective U.S. Patent Application Nos. 13/222,921, filed August 31, 2011, entitled “DESKTOP REVEAL EXPANSION,” and 13/247,581, filed September 28, 2011, entitled “SMARTPAD ORIENTATION,” and 13/420,240, filed March 14, 2012, entitled
  • the device 108 may be associated with one or more user 112.
  • a user 112 may be identified by one or more of characteristics, preferences, identification, and usage.
  • historical data relating to the one or more user 112 may be stored by the device 108 in a memory 106, 120, 124.
  • the memory may be local 120, remote 106, 124, and/or combinations thereof.
  • the communication network 116 may be any type of known communication medium or collection of communication mediums and may use any type of protocols to transport messages between endpoints.
  • the communication network 116 may include wired and/or wireless communication technologies.
  • the Internet is an example of the communication network 116 that constitutes an IP network consisting of many computers and other communication devices located all over the world, which are connected through many telephone systems and other means.
  • Examples of the communication network 116 include, without limitation, a standard Plain Old Telephone System (POTS), an Integrated Services Digital Network (ISDN), the Public Switched Telephone Network (PSTN), a Local Area Network (LAN), a Wide Area Network (WAN), a cellular communication network, a cable communication network, a satellite communication network, any type of enterprise network, and any other type of packet-switched or circuit-switched network known in the art. It can be appreciated that the communication network 116 need not be limited to any one network type, and instead may be comprised of a number of different networks and/or network types. In some embodiments, the communication network 116 may comprise a controller area network, or CANbus, associated with vehicle, automotive, and/or automation communications.
  • the server 122 may comprise a general purpose programmable processor or controller for executing application programming or instructions.
  • the server 122 may include multiple processor cores, and/or implement multiple virtual processors.
  • the server 122 may include multiple physical processors.
  • the server may comprise a specially configured application specific integrated circuit (ASIC) or other integrated circuit, a digital signal processor, a controller, a hardwired electronic or logic circuit, a programmable logic device or gate array, a special purpose computer, or the like.
  • the server 122 generally functions to run programming code or instructions implementing various functions of the feature control system 100 and/or feature control module 104.
  • the vehicle sensors 132 may include but are not limited to one or more of a throttle position sensor, accelerator pedal angle sensor, speed sensor, speedometer, vehicle speed sensor, wind speed, radar, brake position sensor, brake wear sensor, steering/torque sensor, transmission sensor, oxygen sensor, headlight sensor, ambient lighting sensor, vision system sensor, ranging sensor, parking sensor, heating venting and air conditioning (HVAC) sensor, turbine speed sensor, input speed sensor, water sensor, air-fuel ratio meter, blind spot monitor, crankshaft position sensor, engine temperature sensor, cabin temperature sensor, hall effect sensor, manifold absolute pressure sensor, mass flow sensor, microphone, camera sensor, crash detection sensor, safety restraint sensors, weight sensor, radio frequency (RF) sensor, infrared sensor (IR), vehicle control system sensors, location and/or position sensors, Wi-Fi sensor, cellular data sensor, Bluetooth sensor, and the like.
  • the one or more vehicle sensors 132 may be located in different areas or zones of a vehicle. For instance, a first sensor 136a may be located in a proximal portion of a vehicle, while a second sensor 136b may be located in a distal portion of the vehicle. As can be appreciated, the number of vehicle sensors 132 may vary according to vehicle type and/or vehicle control system complexity. In an exemplary embodiment, the vehicle sensors 132 may be configured to communicate across a communication network 116 and/or directly with the feature control module 104. One example of a communication network in a typical automotive application may include utilizing the CANbus and associated protocol.
  • the feature control module 104 may employ the use of one or more non-vehicle sensors 140.
  • the non-vehicle sensors 140 may include one or more type of vehicle sensor 132 described herein. However, the non-vehicle sensors 140 may be separated from the vehicle. Additionally or alternatively, the non-vehicle sensors 140 may comprise sensors associated with one or more other devices. For instance, the non-vehicle sensors 140 may be associated with at least one device 108. These sensors may include but are not limited to one or more of an accelerometer/gyroscope, GPS, compass, camera, microphone, audio input/output, temperature sensor, health monitoring sensors, and the like.
  • Fig. 2 is a block diagram depicting areas and zones associated with a vehicle 204 in accordance with one embodiment of the present disclosure.
  • a vehicle 204 may comprise one or more areas 208, 216, 220.
  • the areas 208, 216, 220 may in fact be a volume of space and/or a point location (e.g., a docking location, holder, power port, signal port, and so on).
  • These one or more areas 208, 216, 220 may be located inside (208) or outside (216, 220) of a vehicle 204.
  • the one or more areas 208, 216, 220 of a vehicle 204 may occupy different, overlapping, or substantially similar physical positions in and/or about the vehicle 204.
  • the inside of a vehicle 204 may comprise a first area 208a and a second area 208b.
  • the first area 208a may occupy a different physical location of the vehicle 204 than the second area 208b.
  • the areas 208 may be subdivided into one or more zones 212.
  • the one or more zones 212 may completely occupy an area 208 of the vehicle 204. Additionally or alternatively, the one or more zones 212 may occupy a portion of an area 208 of the vehicle 204.
  • a vehicle 204 may comprise a first area 208a including a first zone 212a and a second zone 212b.
  • This first area 208a may correspond to the proximal portion of a vehicle 204.
  • the first zone 212a may represent a driver/operator seat of a vehicle 204, while the second zone 212b may represent a proximal passenger seat of a vehicle 204.
  • a second area 208b may include a third zone 212c, a fourth zone 212d, and a fifth zone 212e.
  • This second area 208b may represent a passenger area of a vehicle 204.
  • the third zone 212c, fourth zone 212d, and fifth zone 212e may represent individual passenger seats, and/or areas, in the passenger area of the vehicle 204.
  • each area 208, 216, 220 and/or zone 212 associated with a vehicle 204 may comprise one or more sensors to determine a presence in and/or adjacent to each area 208, 216, 220 and/or zone 212.
  • the sensors may include vehicle sensors 132 and/or non-vehicle sensors 140 as described herein. It is anticipated that the sensors may be configured to communicate with a vehicle control system and/or the feature control module 104.
  • the sensors may communicate with a device 108.
  • a vehicle operator may be located in a second outside area 220 associated with a vehicle 204. As the operator approaches the first outside area 216 associated with the vehicle 204, the feature control module 104 may determine to control features associated with one or more device 108. In an exemplary embodiment, the feature control module 104 may determine to control features associated with the device 108 of the vehicle operator. In this scenario, the feature control module 104 may determine to control a vehicle status application on the device 108. Once the vehicle operator enters the vehicle 204, the sensors 132, 140 may determine that the vehicle operator is in an area 208 and/or zone 212. As is further described herein, the feature control module 104 may utilize the device 108, and/or user 112, location information to control features of the device 108 based on rules.
  • Figs. 3-6 depict multiple methods of the feature control system 100 operation.
  • the feature control system 100 methods may be controlled manually via user input and/or automatically via a processor.
  • Fig. 3 is a flow diagram depicting a first feature control system method 300 in accordance with embodiments of the present disclosure.
  • the method 300 begins at step 304 by detecting one or more devices 108 associated with the vehicle 204. Detection may include a voluntary registration and/or communication between a vehicle 204 and a device 108. Among other things, this type of registration and/or communication may be facilitated via the installation of an application on the device 108.
  • the application may provide one or more of a communication protocol, use permissions, and access to the feature control module 104. For example, a user may turn on a newly presented device 108 inside a vehicle 204, and as a result may be prompted to register the device 108 with the vehicle 204.
  • This registration prompt process may be effected automatically and/or manually.
  • the feature control module 104 utilizing one or more sensors 132, 140, may detect the presence of a device 108 and send a signal to the device 108 in the form of an installation prompt.
  • the feature control module 104 may communicate with a device 108 via a physical electrical connection.
  • the feature control module 104 may include an electrical interconnection configured to facilitate communications between the feature control module 104 and at least one device 108.
  • the electrical interconnection may also provide power to the device 108.
  • the feature control module 104 may communicate with a device 108 via one or more wireless protocols. It is anticipated that the wireless protocol may include, but is not limited to, one or more existing communications protocols and/or equivalents thereof. Common device 108 communications protocols may include Bluetooth®, Wi-Fi (IEEE 802.11 standards), RF, IR, and variations thereof. In some instances, a device 108 may be paired with one or more sensors used by the feature control module 104 to allow persistent and/or reestablished communications between the device 108 and the feature control module 104.
  • the method 300 continues at step 308 by determining the location of the one or more detected devices 108.
  • the location of a device 108 may be found using vehicle sensors 132 and/or non-vehicle sensors 140.
  • a device 108 may be detected using sensors 132, 140 found inside a vehicle 204.
  • the location of the device 108 inside the vehicle 204 may be obtained via the use of one or more of the vehicle sensors 132 and/or non-vehicle sensors 140 described herein.
  • the procedure of determining a location associated with a device 108 becomes more streamlined upon the physical connection to a known port/electrical connection of the vehicle 204. Moreover, if the device 108 is registered to a particular user 112, the location of the device 108 may be interpreted using stored preferences and/or settings. It is an aspect of the present disclosure that the device 108 itself may report a position/location. This location may be provided via typical device 108 location services such as GPS, Wi-Fi data, and/or cellular data.
  • different locations of a device 108 may provide different responses from the feature control module 104.
  • a device 108 may be determined to be in a location where use of a device 108 is considered to be highly-restricted.
  • the driver's seat and/or pilot area may be an example of such a highly-restricted use location.
  • the feature control module 104 may limit access to the device 108 and/or features of the device 108 based on rules assigned to this zone 212 and/or area 208.
  • another location of the vehicle 204 may be classified as a restricted location.
  • the feature control module 104 may determine to control access to the device 108 and/or features of the device 108 based on less restrictive rules than those used for the highly-restricted location.
  • a device 108 may be used in an unrestricted location. This unrestricted location may allow a user 112 complete access to a device 108 based on rules defined for the unrestricted location.
  • different areas 208 and/or zones 212 of a vehicle 204 may be classified as various levels of restricted use.
  • Although the highly-restricted, restricted, and unrestricted locations have been presented herein, it is an aspect of the present disclosure that many levels of restricted and/or unrestricted use may be utilized by the feature control module 104.
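  • The highly-restricted, restricted, and unrestricted classifications might be represented as an enumeration mapped onto zones 212, as in the following sketch; the zone labels and default level are assumptions for illustration.

```python
# Hypothetical mapping from vehicle zones to restriction levels.
from enum import Enum


class Restriction(Enum):
    UNRESTRICTED = 0
    RESTRICTED = 1
    HIGHLY_RESTRICTED = 2


ZONE_RESTRICTIONS = {
    "driver": Restriction.HIGHLY_RESTRICTED,    # e.g., zone 212a
    "front_passenger": Restriction.RESTRICTED,  # e.g., zone 212b
    "rear_left": Restriction.UNRESTRICTED,      # e.g., zone 212c
    "rear_center": Restriction.UNRESTRICTED,    # e.g., zone 212d
    "rear_right": Restriction.UNRESTRICTED,     # e.g., zone 212e
}


def restriction_for(zone: str) -> Restriction:
    # Default to a cautious level when the zone is unknown.
    return ZONE_RESTRICTIONS.get(zone, Restriction.RESTRICTED)


print(restriction_for("driver"))  # Restriction.HIGHLY_RESTRICTED
```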
  • the method 300 continues by determining one or more vehicle- device use laws (step 312).
  • vehicle-device use laws may be provided by an organization, governmental entity, group, individual, and/or combinations thereof. Additionally or alternatively, the laws may be created in response to detected input and/or conditions monitored by the feature control module 104, device 108, and/or sensors 132, 140.
  • the laws may be stored in local memory 106 by the feature control module 104, or the laws may be retrieved from another stored data memory 120, 124.
  • the feature control module may refer to a remote memory 120, 124 to determine laws and/or rules associated with a specific locality, region, user 112, and/or device 108.
  • the laws may be statutes and/or regulations that are enforced by a government entity. These laws may define vehicle, traffic, transportation, and/or safety rules associated with a given geographical region. Moreover, these laws may be stored locally and/or remotely as described herein. Furthermore, the laws may be updated from time to time to, among other things, account for changes in the laws. For example, the State of Idaho may ban the use of texting (i.e., sending a text message via some device 108) while driving, but may allow the use of a handheld mobile phone (e.g., device). In contrast, the State of Oregon may completely ban the use of handheld devices.
  • the feature control module 104 may refer to the laws of Idaho and determine to control the device 108 in accordance with Idaho law. However, once the user 112 is detected as being in Oregon, the feature control module 104 may control the device 108 based, at least in part, on the laws of Oregon. This procedure will be described further herein, however, it should be noted that the vehicle sensors 132 and/or other sensors 140 may determine at least one location of the device 108, and refer to laws associated with that at least one location to control the device 108 accordingly.
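  • A jurisdiction-keyed rule table is one way such a law lookup could be sketched; the entries below only mirror the illustrative Idaho/Oregon example above and are not actual statutory data.

```python
# Hypothetical law table: which device features must be restricted for a
# driver while the vehicle is in motion, keyed by detected jurisdiction.
LAWS_BY_JURISDICTION = {
    "ID": {"texting_while_driving": "prohibited", "handheld_calls": "permitted"},
    "OR": {"texting_while_driving": "prohibited", "handheld_calls": "prohibited"},
}


def restricted_features_for(jurisdiction: str) -> set[str]:
    laws = LAWS_BY_JURISDICTION.get(jurisdiction, {})
    restricted: set[str] = set()
    if laws.get("texting_while_driving") == "prohibited":
        restricted |= {"texting", "sms", "mms", "email"}
    if laws.get("handheld_calls") == "prohibited":
        restricted |= {"telephony", "touch_input"}
    return restricted


print(restricted_features_for("ID"))  # texting-related features only
print(restricted_features_for("OR"))  # handheld use restricted as well
```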
  • the method continues at step 316 by determining settings of the one or more associated devices 108.
  • These settings may include data relating to the feature control module 104, communications, permissions, device 108 control, methods, user preferences, historical data, and the like.
  • a device 108 may have multiple power states associated with its operation. Most devices, including smartphones, tablets, handheld computers, and the like, do not have simple "On/Off" states. To differentiate between these power states, the following terminology will be used to better define the multiple power states of a device 108. "Device Off” is used to indicate that the device 108 is completely turned off; in other words, virtually no power is being used by the communication device 108 in this state.
  • While in the Device Off state, the device 108 cannot receive or transmit typical communications, signals, alerts, and the like.
  • “Device On” is used to indicate that the device 108 is turned on, capable of receiving and transmitting communications, signals, and alerts, and power is directed to the device 108 display and all recruited components.
  • "Device On” may indicate that the device 108 display is fully powered.
  • a fully powered display may indicate that the device 108 is in a condition to detect input received at all areas of the display (e.g., touch-screen).
  • "Device Lock" is used to indicate that power to the device 108 display is reduced and/or limited; a Device Lock state may cause reduced power to be directed to the display (e.g., in a limited area or section of the display).
  • the feature control module 104 and/or application may transition the device 108 from a Device On state to a Device Lock state and vice versa.
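  • The Device Off / Device On / Device Lock states and the transition performed under module control could be sketched as follows; the state names follow the disclosure, while the enum and transition rule are illustrative assumptions.

```python
# Hypothetical device power-state model with a module-driven transition.
from enum import Enum, auto


class DeviceState(Enum):
    DEVICE_OFF = auto()   # no power; cannot send or receive communications
    DEVICE_ON = auto()    # fully powered display; all features reachable
    DEVICE_LOCK = auto()  # reduced power to the display; limited input area


def apply_control(state: DeviceState, controlled: bool) -> DeviceState:
    """Transition between Device On and Device Lock under module control."""
    if state is DeviceState.DEVICE_OFF:
        return state  # an unpowered device cannot be controlled
    return DeviceState.DEVICE_LOCK if controlled else DeviceState.DEVICE_ON


print(apply_control(DeviceState.DEVICE_ON, controlled=True))     # DEVICE_LOCK
print(apply_control(DeviceState.DEVICE_LOCK, controlled=False))  # DEVICE_ON
```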
  • the settings of a device 108 may be configured to lock the device 108, or operate the device 108 in a Device Lock state, when controlled by the feature control module 104.
  • a parent/guardian may configure a child's device 108 to be controlled in accordance with strict settings and/or preferences.
  • the parent/guardian may determine that a device 108 in any state other than the Device Lock state may be a distraction to a child while driving.
  • the parent/guardian can set the device 108 to respond to feature control module 104 controls by operating the device in a Device Lock state.
  • a parent/guardian may wish to configure the settings of a device 108 to be less strict and allow access to other features of the device 108.
  • a parent/guardian may configure a device 108 to only lock specific features associated with the device 108.
  • the device 108 may be controlled at higher levels of strictness than provided by the vehicle-device laws determined in step 312. These higher levels of strictness may be provided by user preferences and/or device 108 settings.
  • the device 108 state may override settings, laws, and/or preferences.
  • the method continues by determining the state of the device 108 (step 320).
  • States of the device 108 may include one or more power state (on, off, and/or locked), orientation (vertical, horizontal, angle, etc.), operation (e.g., input type, running and/or background applications), sensor states, and the like.
  • specific device 108 states may indicate one or more conditions related to the user 112, vehicle 204, and/or the device 108 itself.
  • a device 108 may be in an unpowered, or Device Off, state and as such the condition may preclude control by the feature control module 104.
  • the state information of the device 108 may indicate that the device 108 is operating in a Device On state and may be subject to control via the feature control module 104. It is an aspect that sensor information received from a device 108 may determine control via the feature control module 104. For instance, one or more sensors on a device 108 may detect an impact, shock, and/or other tactile input and may correlate the data (in some instances in combination with other data) to determine a response by the feature control module 104.
  • the vehicle state is determined at step 324.
  • This vehicle state may include but is not limited to vehicle motion (driving, stopped, etc.), position (geographically), speed, acceleration, deceleration, transmission state (in-park, engaged drive, engaged reverse, in-gear, neutral), component status (parking brake, airbag, safety restraint system, engine control unit (ECU) output, CANbus activity), occupants (number, position, weight, and the like), sensor information (temperatures, pressures, etc.), and combinations thereof.
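  • A vehicle-state snapshot covering these dimensions might be modeled as a simple record, as sketched below; the dataclass and field names are assumptions for illustration.

```python
# Hypothetical vehicle-state snapshot: motion, position, transmission,
# component status, occupants, and selected sensor readings.
from dataclasses import dataclass, field


@dataclass
class VehicleState:
    speed_kph: float = 0.0
    in_motion: bool = False
    transmission: str = "park"  # park, drive, reverse, in-gear, neutral
    location: tuple[float, float] | None = None  # latitude, longitude
    airbag_deployed: bool = False
    parking_brake_engaged: bool = True
    occupant_weights_kg: dict[str, float] = field(default_factory=dict)  # by zone


state = VehicleState(speed_kph=55.0, in_motion=True, transmission="drive",
                     location=(43.61, -116.20),
                     occupant_weights_kg={"driver": 80.0, "front_passenger": 62.0})
print(state.in_motion, state.transmission)
```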
  • a user 112 may be driving a vehicle 204 while attempting to simultaneously operate an associated device 108.
  • the feature control module 104 may control the device 108 and/or features of the device 108 accordingly. Additionally or alternatively, when the vehicle 204 is determined to be in a stationary state (i.e., not moving), and even in-park, the feature control module 104 may determine to cease controlling the device 108.
  • the vehicle 204 state may indicate an emergency condition.
  • the vehicle 204 via one or more sensors 132, 140 may indicate that the vehicle 204 has been subjected to substantial amounts of impact force, the airbag deployed, the anti-lock braking system engaged, the vehicle 204 instantaneously moved in a direction contrary to historical data collected over time, the speed of the vehicle reduced dramatically, and more.
  • These exemplary sensor responses may be indicative of an accident.
  • the feature control module 104 may be configured to address emergency scenarios, especially with respect to the control of one or more devices 108.
  • an emergency state may cause the feature control module 104 to provide unfettered access to the device 108 and/or its features.
  • an emergency state may cause the feature control module 104 to present an emergency message to the one or more devices 108.
  • This emergency message may be sent to emergency services personnel and/or a third party.
  • the emergency message may include details regarding the emergency, the state of the vehicle 204, the state of a user 112, and/or the state of the device 108.
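  • The emergency message could carry its details as a structured payload, as in the sketch below; the field names and JSON encoding are assumptions, since the disclosure only lists the kinds of details the message may include.

```python
# Hypothetical emergency-message payload builder.
import json
from datetime import datetime, timezone


def build_emergency_message(vehicle_state: dict, user_state: dict,
                            device_state: dict) -> str:
    payload = {
        "type": "emergency",
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "vehicle": vehicle_state,  # e.g., location, speed, airbag status
        "user": user_state,        # e.g., occupant count, reported condition
        "device": device_state,    # e.g., power state, battery level
    }
    return json.dumps(payload)


msg = build_emergency_message(
    {"location": [43.61, -116.20], "speed_kph": 0, "airbag_deployed": True},
    {"occupants": 2},
    {"power_state": "Device On"},
)
print(msg)
```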
  • the feature control module 104 is configured to control one or more devices 108 based at least in part on rules (step 328).
  • the feature control module may utilize any one or more of the steps presented herein in determining control of the one or more devices 108.
  • the rules may direct that all of the steps disclosed herein be considered before the specific control of a device 108 is initiated.
  • These rules may include at least one algorithm to provide a controlling action response from the feature control module 104.
  • the rules may use sensor information collected, settings, laws, and more in determining a control action.
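  • One way such a rule might combine the location, law, setting, device-state, and vehicle-state inputs gathered above is sketched below; the argument names and precedence are illustrative assumptions rather than the disclosed algorithm.

```python
# Hypothetical control rule: returns the set of features to restrict;
# an empty set means the device is left uncontrolled.
def decide_control(zone_restriction: str, law_restricted: set[str],
                   setting_restricted: set[str], device_powered: bool,
                   vehicle_in_motion: bool, emergency: bool) -> set[str]:
    if emergency or not device_powered or not vehicle_in_motion:
        return set()
    if zone_restriction == "unrestricted":
        return set()
    # Laws and user/guardian settings combine; the stricter result wins.
    return law_restricted | setting_restricted


print(decide_control("highly_restricted",
                     law_restricted={"texting", "browser"},
                     setting_restricted={"telephony"},
                     device_powered=True, vehicle_in_motion=True,
                     emergency=False))
```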
  • Control of a device 108 may take a number of forms.
  • control of a device 108 may include restricting access to specific applications, programs, and/or features of the device 108.
  • a user 112 whose device is being controlled by a feature control module 104 may be allowed to access the home screen of a device 108 to check the time and/or date.
  • this user 112 may be restricted, by the feature control module 104, from accessing a communications interface (e.g., telephone, texting, SMS, MMS, email, web browsers, and the like).
  • the user may be restricted from accessing programs that require physical input at the device 108.
  • a user 112 may be allowed to use the device 108 to send some form of communication and/or interface with the device 108 using voice commands and/or visual input.
  • the control of a device 108 may include transitioning the device 108 from one state to another.
  • various device 108 states may include Device On, Device Off, and Device Lock.
  • the rules may refer to location of the device 108 to activate and/or deactivate a control action.
  • a control message may be presented to an interface associated with the device 108 to indicate that the device 108 is controlled or released from control.
  • control of a device 108 may include blocking communications to and/or from the device 108.
  • This type of communications control may be activated in one or more of an area 208, a zone 212, and a device 108. For instance, if one or more devices 108 are detected in a given area 208, the feature control module 104 may determine to control all of the devices 108 together. This control may include interfering with the devices' 108 communication abilities.
  • Fig. 4 is a flow diagram depicting a second feature control system method 400 in accordance with embodiments of the present disclosure.
  • the method 400 is directed to detecting a device 108 and any associated settings for the control of the device 108.
  • the method begins at step 404 and proceeds by detecting one or more device 108 (step 408).
  • detection may be achieved through physical and/or wireless techniques.
  • the disclosed detection techniques may be automatically performed and/or manually initiated. If no device 108 is detected, the method ends (step 442).
  • the method 400 continues by determining whether any settings are associated with the device 108 (step 412). These settings may include data associated with a user, device, application, and/or feature control module 104. Typical settings may be stored in device data 120, at the feature control module 104 system data 106, and/or remotely in stored data 124. If no settings are detected, the user 112 may be prompted to enter settings, and/or configure the device 108 (step 416).
  • the user 112 may enter settings as prompted (step 420).
  • the user 112 may enter settings information at one or more of the device 108, interface to the feature control module 104, and/or at a server 122.
  • the settings may be prompted via at least one application running on the device, a server, and/or running as part of the feature control module 104. If the user fails to enter settings as prompted, the method 400 may continue by optionally controlling the device 108 based on default settings (step 424) and/or end the method (step 442).
  • the method 400 may continue by controlling the device 108 based at least in part on the settings and on rules stored in memory (step 428).
  • the feature control module 104 may control one or more behaviors of the device 108. For example, rules may dictate that while a vehicle 204 is in motion, the device 108 should be controlled for all communications applications. Additionally or alternatively, a user 112 may enter settings directing that, when controlled by a feature control module 104, the device 108 should be transitioned to a Device Lock state.
  • the user 112 may wish to have an alert/notification pushed to the device 108 interface to indicate that the device 108 is being controlled.
  • This alert/notification may be provided in the form of a message. It is anticipated that vehicle 204 and/or device 108 conditions may be continually monitored by the feature control module 104 to modify the control method 400. Once a device 108 is controlled, the method may return to detecting any available devices 108 (step 408). If no device 108 is found, the method ends (step 442).
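  • The settings flow of method 400 might reduce to a simple precedence rule (stored settings, then user-entered settings, then defaults), as sketched below with assumed names and default values.

```python
# Hypothetical settings resolution mirroring steps 412-424 of method 400.
DEFAULT_SETTINGS = {"lock_when_controlled": True, "notify_user": True}


def resolve_settings(stored: dict | None, user_supplied: dict | None) -> dict:
    if stored:
        return stored              # settings already associated with the device
    if user_supplied:
        return user_supplied       # settings entered when the user was prompted
    return dict(DEFAULT_SETTINGS)  # optional control based on default settings


print(resolve_settings(None, None))  # falls back to the defaults
print(resolve_settings({"lock_when_controlled": False, "notify_user": True}, None))
```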
  • a flow diagram is shown depicting a third feature control system method 500 in accordance with embodiments of the present disclosure.
  • the method 500 discloses a feature control module 104 utilizing device 108 and/or vehicle 204 location to determine at least one control action.
  • the method begins at step 504 and proceeds by detecting one or more device 108 (step 508). If no device is found, the method ends (step 520).
  • the method 500 continues by determining the location of the device 108 (and/or vehicle 204)(step 512).
  • the location of the device 108 may refer to physical location of the device 108 inside or outside of a vehicle 204.
  • a specific location of the device 108 may be determined. Additionally or alternatively, the device 108 may be determined to be in a general location inside the vehicle 204. Depending on the rules and/or state of the vehicle 204, the specific location of the device 108 may be important to the feature control module 104 in determining to control the device 108 or its features. For example, a device 108 detected in the driver's seat of a vehicle 204 may be controlled differently than a device 108 detected in the rear passenger seat of a vehicle 204. As one example, a device 108 in the driver's seat may be controlled to more strict conditions. On the other hand, a device 108 found in the rear passenger location may be unrestricted or minimally restricted.
  • the location of the device 108 may include a location of the vehicle 204.
  • a location of the device 108 detected inside a vehicle 204 may be provided by a GPS or other location service of the vehicle and/or the device 108 itself.
  • This geographical location of the vehicle 204 may be used by the feature control module 104 in initiating a control action.
  • the feature control module 104 may refer to laws associated with the geographical location of the vehicle 204 in controlling the device 108. In some instances, these laws may be related to traffic and/or vehicle-device use statutes created by a government or third party.
  • the method 500 continues by controlling the device 108 based at least partially on the location of the device 108 and stored rules (step 516). As provided in an example above, a device 108 may be controlled in accordance with laws based on the location of the device 108 in the vehicle 204. The method 500 may continue by returning to the step of detecting devices (step 508). If no device is found, the method ends (step 520).
  • Fig. 6 is a flow diagram depicting a fourth feature control system method 600 in accordance with embodiments of the present disclosure.
  • the method 600 is directed to determining a state of a vehicle 204 to provide control action guidance for the feature control module 104.
  • the feature control module 104 may be configured to cease control of a device and/or its applications based on a number of states associated with a vehicle 204. One of these overriding control states is an emergency detected by the feature control module 104.
  • the method 600 begins at step 604 and proceeds by determining whether one or more devices 108 have been detected (step 608). If no device is found, the method ends (step 628).
  • a vehicle state may be determined by one or more inputs provided via the vehicle sensors 132, non-vehicle sensors 140, device 108, and a user 112.
  • the method 600 may interpret the nature of the vehicle state determined in step 612.
  • the feature control module 104 may determine whether the vehicle is in a state of emergency or not (step 616).
  • an emergency state may be determined from a number of vehicle 204 inputs.
  • various vehicle sensors 132 may indicate that an oil line associated with the vehicle 204 is losing pressure, the engine is reaching an unusually high predetermined temperature, and the safety restraint sensors detect impact at the front of the vehicle 204. This combination of sensor inputs may be enough to qualify as an emergency.
  • the user 112 may input an override command to indicate an emergency state. This override command may be in the form of video, voice, tactile, or other input.
  • the feature control module 104 may be directed to override specific controlled features of the device 108 (step 620). In other words, the feature control module 104 may allow access to all, or less than all, of the features of the device 108. For example, in the event of an emergency, a user's 112 access to the communications applications of a device 108 may be considered important if not critical.
  • a detected emergency state may prevent the restricted control of the device's communication hardware and/or software.
  • the method 600 continues by controlling the device 108 based at least partially on the vehicle state and stored rules (step 624). For example, one or more sensors 132, 140 may indicate that a vehicle 204 has reduced speed in a short amount of time. However, the feature control module 104 may determine that this type of scenario is not an emergency. As such, the device 108 may be controlled in accordance with the current vehicle state and rules. For instance, the vehicle state may indicate that the vehicle 204 is stopped and in-park. In this case, the feature control module 104 may allow access to features of the device 108.
  • the feature control module 104 may control the device 108 differently (e.g., restricting access to features of the device 108).
  • the method 600 may continue by returning to the step of detecting devices (step 608). If no device is found, the method ends (step 628). A minimal code sketch illustrating the control flow of methods 500 and 600 appears after this list.
  • the exemplary systems and methods of this disclosure have been described in relation to a feature control module 104 and associated devices 108. As suggested by this disclosure, features may be shared between a feature control module 104 and a device 108. However, to avoid unnecessarily obscuring the present disclosure, the preceding description omits a number of known structures and devices. This omission is not to be construed as a limitation of the scope of the claims. Specific details are set forth to provide an understanding of the present disclosure. It should, however, be appreciated that the present disclosure may be practiced in a variety of ways beyond the specific details set forth herein.
  • while the exemplary aspects, embodiments, and/or configurations illustrated herein show the various components of the system collocated, certain components of the system can be located remotely, at distant portions of a distributed network, such as a LAN and/or the Internet, or within a dedicated system.
  • the components of the system can be combined into one or more devices, such as a Personal Computer (PC), laptop, netbook, smart phone, Personal Digital Assistant (PDA), tablet, etc., or collocated on a particular node of a distributed network, such as an analog and/or digital telecommunications network, a packet-switched network, or a circuit-switched network.
  • the components of the system can be arranged at any location within a distributed network of components without affecting the operation of the system.
  • the various components can be located in a switch such as a PBX and media server, gateway, in one or more communications devices, at one or more users' premises, or some combination thereof.
  • one or more functional portions of the system could be distributed between a telecommunications device(s) and an associated computing device.
  • the various links connecting the elements can be wired or wireless links, or any combination thereof, or any other known or later developed element(s) that is capable of supplying and/or communicating data to and from the connected elements.
  • These wired or wireless links can also be secure links and may be capable of communicating encrypted information.
  • Transmission media used as links can be any suitable carrier for electrical signals, including coaxial cables, copper wire and fiber optics, and may take the form of acoustic or light waves, such as those generated during radio-wave and infra-red data communications.
  • any device(s) or means capable of implementing the methodology illustrated herein can be used to implement the various aspects of this disclosure.
  • Exemplary hardware that can be used for the disclosed embodiments, configurations and aspects includes computers, handheld devices, telephones (e.g., cellular, Internet enabled, digital, analog, hybrids, and others), and other hardware known in the art.
  • exemplary hardware for implementing the disclosed embodiments may also include one or more processors (e.g., a single or multiple microprocessors), memory, nonvolatile storage, input devices (e.g., keyboards and pointing devices), and output devices (e.g., a display).
  • alternative software implementations including, but not limited to, distributed processing or component/object distributed processing, parallel processing, or virtual machine processing can also be constructed to implement the methods described herein.
  • the disclosed methods may be readily implemented in conjunction with software using object or object-oriented software development environments that provide portable source code that can be used on a variety of computer or workstation platforms.
  • the disclosed system may be implemented partially or fully in hardware using standard logic circuits or VLSI design. Whether software or hardware is used to implement the systems in accordance with this disclosure is dependent on the speed and/or efficiency requirements of the system, the particular function, and the particular software or hardware systems or microprocessor or microcomputer systems being utilized.
  • the disclosed methods may be partially implemented in software that can be stored on a storage medium, executed on a programmed general-purpose computer with the cooperation of a controller and memory, a special purpose computer, a microprocessor, or the like.
  • the systems and methods of this disclosure can be implemented as a program embedded on a personal computer such as an applet, JAVA® or CGI script, as a resource residing on a server or computer workstation, as a routine embedded in a dedicated measurement system, system component, or the like.
  • the system can also be implemented by physically incorporating the system and/or method into a software and/or hardware system.
  • the present disclosure in various aspects, embodiments, and/or configurations, includes components, methods, processes, systems and/or apparatus substantially as depicted and described herein, including various aspects, embodiments, configurations, subcombinations, and/or subsets thereof. Those of skill in the art will understand how to make and use the disclosed aspects, embodiments, and/or configurations after understanding the present disclosure.
  • the present disclosure in various aspects, embodiments, and/or configurations, includes providing devices and processes in the absence of items not depicted and/or described herein or in various aspects, embodiments, and/or configurations hereof, including in the absence of such items as may have been used in previous devices or processes, e.g., for improving performance, achieving ease and/or reducing cost of implementation.
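For illustration only, the following is a minimal Python sketch of the control loop described for methods 500 and 600 in the list above. The class, function, and rule names (VehicleState, is_emergency, control_device, RULES, and so on) are hypothetical and not taken from the disclosure; the sketch simply assumes that a device's in-vehicle zone, a coarse vehicle state, and stored rules are available as inputs.

    # Hypothetical sketch of the method 500/600 control loops (not from the disclosure).
    from dataclasses import dataclass

    @dataclass
    class VehicleState:
        speed_mph: float          # e.g., from a wheel/vehicle speed sensor
        in_park: bool             # e.g., from a transmission setting sensor
        impact_detected: bool     # e.g., from safety restraint/crash sensors
        user_override: bool       # e.g., a user-supplied emergency override command

    def is_emergency(state: VehicleState) -> bool:
        # Method 600, step 616: interpret sensor inputs as an emergency or not.
        return state.impact_detected or state.user_override

    def control_device(device_zone: str, state: VehicleState, rules: dict) -> set:
        """Return the set of device features to restrict for this control cycle."""
        if is_emergency(state):
            # Method 600, step 620: override restrictions so communication stays available.
            return set()
        if state.in_park and state.speed_mph == 0.0:
            # Stopped and in park: rules may allow full access.
            return set()
        # Method 500, step 516: apply stored rules keyed by the device's zone.
        return set(rules.get(device_zone, []))

    # Example stored rules (hypothetical): stricter in the driver zone than in rear seats.
    RULES = {"driver": {"texting", "video", "browser"}, "rear_passenger": set()}

    restricted = control_device("driver", VehicleState(35.0, False, False, False), RULES)
    print(restricted)  # {'texting', 'video', 'browser'}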

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Selective Calling Equipment (AREA)
  • Traffic Control Systems (AREA)

Abstract

Methods and systems for controlling device features based on vehicle state and device location are provided. Specifically, the device may be any type of electrical device capable of transmitting and/or receiving a signal (such as a phone, tablet, computer, music player, and/or other entertainment device). In some instances, the device may be associated with one or more vehicles. Although the device may be configured to run one or more applications, the functionality of the one or more applications may be controlled by a system associated with the vehicle. In some cases, this control may depend on the device application type, device location (either inside or outside of a vehicle), law, operator state, and/or vehicle state.

Description

CONTROL OF DEVICE FEATURES BASED ON VEHICLE STATE
CROSS REFERENCE TO RELATED APPLICATIONS
[0001] The present application claims the benefits of and priority, under 35 U.S.C. § 119(e), to U.S. Provisional Application Serial Nos. 61/560,509, filed on November 16, 2011, entitled "Complete Vehicle Ecosystem"; 61/637,164, filed on April 23, 2012, entitled "Complete Vehicle Ecosystem"; 61/646,747, filed on May 14, 2012, entitled "Branding of Electrically Propelled Vehicles Via the Generation of Specific Operating Sounds"; 61/653,275, filed on May 30, 2012, entitled "Vehicle Application Store for Console"; 61/653,264, filed on May 30, 2012, entitled "Control of Device Features Based on Vehicle State"; 61/653,563, filed on May 31, 2012, entitled "Complete Vehicle Ecosystem"; 61/663,335, filed on June 22, 2012, entitled "Complete Vehicle Ecosystem"; 61/672,483, filed on July 17, 2012, entitled "Vehicle Climate Control"; and
61/714,016, filed on October 15, 2012, entitled "Vehicle Middleware." The entire disclosures of the applications listed above are hereby incorporated by reference, in their entirety, for all that they teach and for all purposes.
[0002] This application is also related to U.S. Patent Application Nos. 13/420,236, filed on March 14, 2012, entitled, "Configurable Vehicle Console"; 13/420,240, filed on March 14, 2012, entitled "Removable, Configurable Vehicle Console"; 13/462,593, filed on May 2, 2012, entitled "Configurable Dash Display"; 13/462,596, filed on May 2, 2012, entitled "Configurable Heads-Up
Dash Display"; / , filed on November 16, 2012, entitled "Implementation of Conquest
Functionality in Automotive Console" (Attorney Docket No. 6583-228); / , , filed on
Novem ber 16, 2012, entitled "Gesture Recognition for On-Board Display" (Attorney Docket No.
6583-229); / , filed on November 16, 2012, entitled "Vehicle Application Store for
Console" (Attorney Docket No. 6583-230); / , , filed on Novem ber 16, 2012, entitled
"Sharing Applications/Med ia Between Car and Phone (Hydroid)" (Attorney Docket No. 6583-231);
/ , filed on November 16, 2012, entitled "In-Cloud Connection for Car M ultimedia"
(Attorney Docket No. 6583-232); / , , filed on Novem ber 16, 2012, entitled "Music
Streaming" (Attorney Docket No. 6583-233); / , , filed on Novem ber 16, 2012, entitled
"Insurance Tracking" (Attorney Docket No. 6583-235); / , , filed on Novem ber 16, 2012, entitled "Law Breaking/Behavior Sensor" (Attorney Docket No. 6583-236); / , , filed on
Novem ber 16, 2012, entitled "Etiquette Suggestion" (Attorney Docket No. 6583-237);
/ , , filed on November 16, 2012, entitled "Parking Space Finder Based on Parking Meter
Data" (Attorney Docket No. 6583-238); / , filed on November 16, 2012, entitled
"Parking Meter Expired Alert" (Attorney Docket No. 6583-239); / , , filed on Novem ber 16, 2012, entitled "Object Sensing (Pedestrian Avoidance/Accident Avoidance)" (Attorney Docket
No. 6583-240); / , , filed on November 16, 2012, entitled "Proximity Warning Relative to
Other Cars" (Attorney Docket No. 6583-241); / , , filed on November 16, 2012, entitled
"Street Side Sensors" (Attorney Docket No. 6583-242); / , filed on November 16, 2012, entitled "Car Location" (Attorney Docket No. 6583-243); / , filed on November 16, 2012, entitled "Universal Bus in the Car" (Attorney Docket No. 6583-244); / , filed on
November 16, 2012, entitled "Mobile Hot Spot/Router/Application Share Site or Network"
(Attorney Docket No. 6583-245); / , , filed on November 16, 2012, entitled "Universal
Console Chassis for the Car" (Attorney Docket No. 6583-246); / , , filed on November 16,
2012, entitled "Middleware" (Attorney Docket No. 6583-247); / , , filed on November 16,
2012, entitled "Real Time Traffic" (Attorney Docket No. 6583-248); / , , filed on
November 16, 2012, entitled "Map Updating" (Attorney Docket No. 6583-249); / , , filed on November 16, 2012, entitled "Communications Based on Vehicle Diagnostics and Indications"
(Attorney Docket No. 6583-250); / , , filed on November 16, 2012, entitled "Felon
Identifier" (Attorney Docket No. 6583-251); / , , filed on November 16, 2012, entitled
"Behavioral Tracking and Vehicle Applications" (Attorney Docket No. 6583-252); / , filed on November 16, 2012, entitled "Improvements to Controller Area Network Bus" (Attorney
Docket No. 6583-314); / , , filed on November 16, 2012, entitled "Location Information
Exchange Between Vehicle and Device" (Attorney Docket No. 6583-315); / , , filed on
November 16, 2012, entitled "In Car Communication Between Devices" (Attorney Docket No.
6583-316); / , filed on November 16, 2012, entitled "Configurable Hardware Unit for
Car Systems" (Attorney Docket No. 6583-317); / , , filed on November 16, 2012, entitled
"Feature Recognition for Configuring a Vehicle Console and Associated Devices" (Attorney Docket
No. 6583-318); / , , filed on November 16, 2012, entitled "Configurable Vehicle Console"
(Attorney Docket No. 6583-412); / , , filed on November 16, 2012, entitled "Configurable
Dash Display" (Attorney Docket No. 6583-413); / , , filed on November 16, 2012, entitled
"Configurable Heads-Up Dash Display" (Attorney Docket No. 6583-414); and / , , filed on
November 16, 2012, entitled "Removable, Configurable Vehicle Console" (Attorney Docket No. 6583-415). The entire disclosures of the applications listed above are hereby incorporated by reference, in their entirety, for all that they teach and for all purposes.
BACKGROUND
[0003] Whether using private, commercial, or public transport, the movement of people and/or cargo has become a major industry. In today's interconnected world, daily travel is essential to engaging in commerce. Commuting to and from work can account for a large portion of a traveler's day. As a result, vehicle manufacturers have begun to focus on making this commute, and other journeys, more enjoyable.
[0004] Currently, vehicle manufacturers attempt to entice travelers to use a specific conveyance based on any number of features. Most of these features focus on vehicle safety or efficiency. From the addition of safety-restraints, air-bags, and warning systems to more efficient engines, motors, and designs, the vehicle industry has worked to appease the supposed needs of the traveler. Recently, however, vehicle manufacturers have shifted their focus to user and passenger comfort as a primary concern. Making an individual more comfortable while traveling instills confidence and pleasure in using a given vehicle, increasing an individual's preference for a given manufacturer and/or vehicle type.
[0005] One way to instill comfort in a vehicle is to create an environment within the vehicle similar to that of an individual's home or place of comfort. Integrating features in a vehicle that are associated with comfort found in an individual's home can ease a traveler's transition from home to vehicle. Several manufacturers have added comfort features in vehicles such as the following: leather seats, adaptive and/or personal climate control systems, music and media players, ergonomic controls, and in some cases Internet connectivity. However, because these manufacturers have added features to a conveyance, they have built comfort around a vehicle and failed to build a vehicle around comfort.
SUMMARY
[0006] There is a need for a vehicle ecosystem that can integrate both physical and mental comforts while seamlessly operating with current electronic devices to result in an intuitive and immersive user experience. These and other needs are addressed by the various aspects, embodiments, and/or configurations of the present disclosure. Also, while the disclosure is presented in terms of exemplary embodiments, it should be appreciated that individual aspects of the disclosure can be separately claimed.
[0007] A method of controlling access to one or more features of a communication device associated with a vehicle is described. In some embodiments, the method comprises:
establishing a connection between the communication device and a feature control module, wherein the feature control module is configured to receive input from at least one of a vehicle sensor and a non-vehicle sensor; determining a location of the communication device; and controlling, via the feature control module and based at least partially on the location of the communication device, user access to one or more features of the communication device.
[0008] The present disclosure can provide a number of advantages depending on the particular aspect, embodiment, and/or configuration. Currently, drivers and other vehicle operators can operate their vehicles while texting, talking, surfing the Internet, streaming video, and generally using their mobile phones and/or other connected devices. Using these devices while operating a vehicle may not only be considered unsafe, but may also contradict local, state, federal, and other laws. Moreover, the use of devices, especially communication devices, while driving causes greater distraction and is a leading cause of accidents among teenage drivers.
[0009] Among other things, the present disclosure is directed to an intelligent system that is capable of recognizing a user and device and determining to allow or deny the user access to device features. In particular, the system may recognize one or more characteristics associated with a user and/or device and limit access to device features at least partially based on the one or more characteristics. These characteristics may include but are not limited to location of the user and/or device, user profile settings, user preferences, registration status of the device, device settings, programmed conditions, and the like. For example, a user may be operating a device in the passenger seat of an automobile. Moreover, the user may have established a connection between the device and the vehicle (e.g., via Bluetooth, direct electrical connection, wireless, radio frequency (RF), infrared (IR), etc.). In this example, the vehicle feature control system may utilize one or more of the vehicle/device sensors to determine the location of the device user. These sensors may include cameras, weight sensors, IR detectors, temperature sensors, GPS, triangulation and/or position sensors, and combinations thereof. Many vehicles, especially cars, utilize sensors of this type to activate and/or deactivate airbag and/or safety restraint system components. Upon detecting that the user and/or device in this case is located in a passenger seat, a feature control module may determine that feature access should not be controlled. On the other hand, if the user was seated in a vehicle operation seat (e.g., driver's seat) the feature control module may determine to limit access to one or more features of the device.
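For illustration, a minimal Python sketch (all names and thresholds are hypothetical, not from the disclosure) of how readings from seat weight sensors and a device-proximity measurement might be fused to classify whether the device user occupies the driver's seat or a passenger seat:

    # Hypothetical sensor-fusion sketch: classify the occupied seat of a device user.
    SEAT_WEIGHT_THRESHOLD_KG = 25.0   # assumed threshold for "seat occupied"

    def classify_device_seat(seat_weights_kg: dict, device_signal_by_seat: dict) -> str:
        """seat_weights_kg: e.g., {'driver': 72.0, 'front_passenger': 0.0}
        device_signal_by_seat: relative device signal strength measured near each seat."""
        occupied = {s for s, w in seat_weights_kg.items() if w >= SEAT_WEIGHT_THRESHOLD_KG}
        if not occupied:
            return "unknown"
        # Attribute the device to the occupied seat with the strongest nearby signal.
        return max(occupied, key=lambda s: device_signal_by_seat.get(s, float("-inf")))

    seat = classify_device_seat(
        {"driver": 72.0, "front_passenger": 61.0},
        {"driver": -38.0, "front_passenger": -55.0},  # e.g., RSSI in dBm
    )
    restrict = (seat == "driver")  # driver's seat => limit access to device features
    print(seat, restrict)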
[0010] It is anticipated that the feature control module may refer to other factors when determining to allow or deny a user access to a device's features. Among these other factors are jurisdictional and/or federal laws, contractual rules/obligations, programmed conditions, vehicle state, emergency contingencies, and combinations thereof. Contractual rules/obligations may include but are not limited to contract limitations associated with employment contracts, insurance contracts, general agreements, governmental contracts, and the like. These rules and/or laws may be used in determining feature control of a device. For instance, a vehicle may be detected to be "in motion" by the feature control module and various vehicle/device sensors. Moreover, the feature control module may be configured to communicate to a database to determine laws governing the use of communication devices in the current geographical location of the vehicle. For the sake of example, a local law may prohibit the use of communication devices by a driver of a vehicle while that vehicle is in motion. Based on the vehicle state (i.e., in motion), the location of the user (i.e., driver's seat), and the local law (i.e., prohibiting use of devices by drivers of a moving vehicle) the feature control module may determine to deny access to device features. In some embodiments, the feature control module may communicate with the device to deactivate the features of the device. This deactivation may be coupled with a presented warning in the form of a visual and/or audible alert on the device and/or vehicle dash display. Additionally, it is anticipated that the feature control module may reactivate these deactivated features once the vehicle is in a state of rest and/or parked.
[0011] In some embodiments, the feature control module may itself receive satellite location information from a satellite positioning system receiver in the vehicle or from a satellite positioning system receiver in the communication device, alone or in conjunction with vehicle-related state, configuration, and/or operation information (speed, parking sensors, etc.), to determine the current vehicle state, configuration, and/or operation. Exemplary on-board vehicle sensors that may be accessed by the feature control module include a wheel state sensor to sense one or more of vehicle speed, acceleration, deceleration, wheel rotation, wheel speed (e.g., wheel revolutions-per-minute), wheel slip, and the like, a power source energy output sensor to sense a power output of an on-board power source (e.g., an engine or energy storage device) by measuring one or more of current engine speed (e.g., revolutions-per-minute), energy input and/or output (e.g., voltage, current, fuel consumption, and torque), and the like, a switch state sensor to determine a current activation or deactivation state of a power source
activation/deactivation switch, a transmission setting sensor to determine a current setting of the vehicle transmission (e.g., gear selection or setting), a gear controller sensor to determine a current setting of a gear controller, a power controller sensor to determine a current setting of a power controller (e.g., throttle), a brake sensor to determine a current state (braking or non-braking) of a vehicle braking system, a seating system sensor to determine a seat setting and current weight of seated occupant, if any, in a selected seat of the vehicle seating system, a safety system state sensor to determine a current state of a vehicular safety system (e.g., air bag setting (deployed or undeployed) and/or seat belt setting (engaged or not engaged)), a light setting sensor (e.g., current headlight, emergency light, brake light, parking light, fog light, interior or passenger compartment light, and/or tail light state (on or off)), a brake control (e.g., pedal) setting sensor, an accelerator pedal setting sensor, a clutch pedal setting sensor, an emergency brake pedal setting sensor, a door setting (e.g., open, closed, locked or unlocked) sensor, a window setting (open or closed) sensor, and other sensors known to those of skill in the vehicle art. When, for example, a vehicle is in motion, the feature control module can disallow/deactivate use of texting, video streaming, and other applications. Once the vehicle is determined to be in a "parked" condition (e.g., in "Park"), or otherwise motionless, the applications may be allowed and activated. As previously stated, these features may be controlled in accordance with local/state/federal laws as well as administrative agency laws, insurance contracts, governmental contracts, general agreements, and/or employment contracts.
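As an illustration of the in-motion/parked gating described above, here is a minimal Python sketch (the function names, thresholds, and feature categories are assumptions, not the disclosed implementation) that derives a coarse vehicle state from a few of the sensors listed and uses it to allow or block application classes:

    # Hypothetical gating of device applications on coarse vehicle state.
    def vehicle_state(wheel_speed_rpm: float, transmission_setting: str) -> str:
        if transmission_setting == "park" and wheel_speed_rpm == 0.0:
            return "parked"
        return "in_motion" if wheel_speed_rpm > 0.0 else "stopped"

    BLOCKED_WHILE_MOVING = {"texting", "video_streaming", "browser"}

    def allowed_applications(all_apps: set, state: str) -> set:
        # While moving, disallow texting/streaming/etc.; when parked, allow everything.
        return all_apps - BLOCKED_WHILE_MOVING if state == "in_motion" else set(all_apps)

    apps = {"texting", "video_streaming", "browser", "navigation", "phone"}
    print(allowed_applications(apps, vehicle_state(850.0, "drive")))   # moving: restricted
    print(allowed_applications(apps, vehicle_state(0.0, "park")))      # parked: all allowed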
[0012] In another embodiment, communication modes, such as texting, tweeting, email, and the like may be enabled or disabled based on vehicle location. Vehicle location may be mapped against applicable laws of a governmental entity, such as a city, municipality, county, province, state, country, and the like. Alternatively, capabilities of the device may be enabled or disabled based on contract requirements, employer rules or policies, etc.
[0013] In yet another embodiment, a feature control module may be programmed to control a specific device, or group of devices, based on settings associated with a user. During a registration process between a device and a vehicle, via the feature control module, the registering party may be prompted to input specific information via a control panel, the device, and/or a dash display interface. The registration of devices may be password-protected and even associated with a master key or pass. In some embodiments, the registration process will grant the feature control module permission to control one or more features of the device. In other embodiments, the feature control module may be configured to control one or more communication features of the device regardless of registration permission. This unauthorized control of device communication features may be achieved by affecting the transmission of signals sent to and/or from the device.
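A minimal sketch of such a registration exchange might look like the following (Python; the data model, field names, and password check are hypothetical illustrations, not the disclosed implementation):

    # Hypothetical device registration granting the feature control module permission.
    import hashlib
    from dataclasses import dataclass, field

    @dataclass
    class Registration:
        device_id: str                      # e.g., a MAC address or other unique identifier
        controllable_features: set = field(default_factory=set)

    class RegistrationStore:
        def __init__(self, master_pass: str):
            self._master_hash = hashlib.sha256(master_pass.encode()).hexdigest()
            self._registry = {}  # device_id -> Registration

        def register(self, device_id: str, features: set, password: str) -> bool:
            # Registration is password-protected; only an authorized user may grant control.
            if hashlib.sha256(password.encode()).hexdigest() != self._master_hash:
                return False
            self._registry[device_id] = Registration(device_id, set(features))
            return True

        def may_control(self, device_id: str, feature: str) -> bool:
            reg = self._registry.get(device_id)
            return bool(reg) and feature in reg.controllable_features

    store = RegistrationStore(master_pass="parent-passphrase")
    store.register("AA:BB:CC:DD:EE:FF", {"texting", "browser"}, password="parent-passphrase")
    print(store.may_control("AA:BB:CC:DD:EE:FF", "texting"))  # True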
[0014] To better illustrate the concept of controlling device features based on settings, the example of a teenage driver is provided. In this example, a teenage driver may own a particular communication device. This device may have a unique media access control (MAC) address or other unique hardware/software identifier. In one embodiment, the device may be registered with the feature control module by an authorized user (e.g., a parent, guardian, or governmental entity). During the registration process, the authorized user may configure the settings associated with the device and teenage driver to be especially strict. In other words, the authorized user may determine to disable all communication functions of the device while the vehicle is in motion. On the other hand, the authorized user may determine to allow telephonic connections while in motion but disable other features such as texting, emailing, and surfing the Internet (e.g., disable the browser capability). Additionally or alternatively, an authorized user may determine that communication devices inside a vehicle (associated with any person, and even in any area), shall be controlled by the feature control module. In this instance, the feature control module may prevent the exchange of communication signals to and from one or more device inside a vehicle.
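To make the settings example concrete, the following is a hedged Python sketch (the policy structure, identifier, and feature names are invented for illustration) of per-device rules an authorized user might configure for the teenage driver scenario:

    # Hypothetical per-device feature policy keyed by a unique hardware identifier.
    POLICIES = {
        # Teen driver's phone: allow voice calls in motion, block everything else.
        "AA:BB:CC:DD:EE:FF": {"while_moving": {"allow": {"phone"},
                                               "deny": {"texting", "email", "browser"}},
                              "while_parked": {"allow": "all", "deny": set()}},
    }

    def feature_allowed(device_id: str, feature: str, moving: bool) -> bool:
        policy = POLICIES.get(device_id)
        if policy is None:
            return True  # unregistered devices fall back to default handling
        rule = policy["while_moving"] if moving else policy["while_parked"]
        if rule["allow"] == "all":
            return feature not in rule["deny"]
        return feature in rule["allow"] and feature not in rule["deny"]

    print(feature_allowed("AA:BB:CC:DD:EE:FF", "texting", moving=True))   # False
    print(feature_allowed("AA:BB:CC:DD:EE:FF", "phone", moving=True))     # True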
[0015] In some embodiments, the feature control module may determine to control one or more features based on vehicle state and/or condition. In one embodiment, access to features of a device may be overridden. This overriding control may be beneficial in the case of an emergency. For instance, the feature control module may determine that a vehicle and/or one or more users are in a state of emergency. If a vehicle has been involved in a collision or accident, one or more sensors associated with the vehicle are configured to report the incident. In accordance with the present disclosure, the feature control module may receive input from the multiple sensors to determine appropriate device feature control. For example, a car may be involved in a roll-over accident. Although the wheels of the car may still be moving, and the vehicle is not in "park," the presence of the accident may be reported by the sensors and therefore functionality of device features may be returned to the one or more devices associated with the vehicle. Alternatively, a user in a vehicle may have suffered a seizure, or illness, that causes the user to shake uncontrollably. This movement and/or condition may be detected by the device associated with that user and as such signal an emergency event associated with the user. The feature control module may receive this input and return device feature functionality for a period of time.
[0016] The phrases "at least one", "one or more", and "and/or" are open-ended expressions that are both conjunctive and disjunctive in operation. For example, each of the expressions "at least one of A, B and C", "at least one of A, B, or C", "one or more of A, B, and C", "one or more of A, B, or C" and "A, B, and/or C" means A alone, B alone, C alone, A and B together, A and C together, B and C together, or A, B and C together.
[0017] The term "a" or "an" entity refers to one or more of that entity. As such, the terms "a" (or "an"), "one or more" and "at least one" can be used interchangeably herein. It is also to be noted that the terms "comprising", "including", and "having" can be used interchangeably.
[0018] The term "automatic" and variations thereof, as used herein, refers to any process or operation done without material human input when the process or operation is performed. However, a process or operation can be automatic, even though performance of the process or operation uses material or immaterial human input, if the input is received before performance of the process or operation. Human input is deemed to be material if such input influences how the process or operation will be performed. Human input that consents to the performance of the process or operation is not deemed to be "material." [0019] The term "computer-readable medium" as used herein refers to any tangible storage and/or transmission medium that participate in providing instructions to a processor for execution. Such a medium may take many forms, including but not limited to, non-volatile media, volatile media, and transmission media. Non-volatile media includes, for example, NVRAM, or magnetic or optical disks. Volatile media includes dynamic memory, such as main memory.
Common forms of computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, or any other magnetic medium, magneto-optical medium, a CD-ROM, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, a solid state medium like a memory card, any other memory chip or cartridge, a carrier wave as described hereinafter, or any other medium from which a computer can read. A digital file attachment to e-mail or other self-contained information archive or set of archives is considered a distribution medium equivalent to a tangible storage medium. When the computer-readable media is configured as a database, it is to be understood that the database may be any type of database, such as relational, hierarchical, object-oriented, and/or the like. Accordingly, the disclosure is considered to include a tangible storage medium or distribution medium and prior art-recognized equivalents and successor media, in which the software implementations of the present disclosure are stored.
[0020] The term "module" as used herein refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and software that is capable of performing the functionality associated with that element.
[0021]
[0022] The term "satellite positioning system receiver" refers to a wireless receiver or transceiver to receive and/or send location signals from and/or to a satellite positioning system, such as the Global Positioning System ("GPS") (US), GLONASS (Russia), Galileo positioning system (EU), Compass navigation system (China), and Regional Navigational Satellite System (India).
[0023] The terms "determine," "calculate," and "compute," and variations thereof, as used herein, are used interchangeably and include any type of methodology, process, mathematical operation or technique.
[0024] It shall be understood that the term "means" as used herein shall be given its broadest possible interpretation in accordance with 35 U.S.C., Section 112, Paragraph 6. Accordingly, a claim incorporating the term "means" shall cover all structures, materials, or acts set forth herein, and all of the equivalents thereof. Further, the structures, materials or acts and the equivalents thereof shall include all those described in the summary of the invention, brief description of the drawings, detailed description, abstract, and claims themselves. [0025] The term "vehicle" as used herein includes any conveyance, or model of a conveyance, where the conveyance was originally designed for the purpose of moving one or more tangible objects, such as people, animals, cargo, and the like. The term "vehicle" does not require that a conveyance moves or is capable of movement. Typical vehicles may include but are in no way limited to cars, trucks, motorcycles, busses, automobiles, trains, railed conveyances, boats, ships, marine conveyances, submarine conveyances, airplanes, space craft, flying machines, human-powered conveyances, and the like.
[0026] The terms "dash" and "dashboard" and variations thereof, as used herein, are used interchangeably and include any panel and/or area of a vehicle disposed adjacent to an operator, user, and/or passenger. Typical dashboards may include but are not limited to one or more control panel, instrument housing, head unit, indicator, gauge, meter, light, audio equipment, computer, screen, display, HUD unit, and graphical user interface.
[0027] The terms "communication device," "smartphone," and "mobile device," and variations thereof, as used herein, are used interchangeably and include any type of device capable of communicating with one or more of another device and/or across a communications network, via a communications protocol, and the like. Exemplary communication devices may include but are not limited to smartphones, handheld computers, laptops, netbooks, notebook computers, subnotebooks, tablet computers, scanners, portable gaming devices, phones, pagers, GPS modules, portable music players, and other Internet-enabled and/or network-connected devices.
[0028] The preceding is a simplified summary of the disclosure to provide an understanding of some aspects of the disclosure. This summary is neither an extensive nor exhaustive overview of the disclosure and its various aspects, embodiments, and/or configurations. It is intended neither to identify key or critical elements of the disclosure nor to delineate the scope of the disclosure but to present selected concepts of the disclosure in a simplified form as an introduction to the more detailed description presented below. As will be appreciated, other aspects, embodiments, and/or configurations of the disclosure are possible utilizing, alone or in combination, one or more of the features set forth above or described in detail below.
BRIEF DESCRIPTION OF THE DRAWINGS
[0029] Fig. 1 is a block diagram depicting a feature control system in accordance with one embodiment of the present disclosure;
[0030] Fig. 2 is a block diagram depicting areas and zones associated with a vehicle in accordance with one embodiment of the present disclosure;
[0031] Fig. 3 is a flow diagram depicting a first feature control system method in accordance with embodiments of the present disclosure; [0032] Fig. 4 is a flow diagram depicting a second feature control system method in accordance with embodiments of the present disclosure;
[0033] Fig. 5 is a flow diagram depicting a third feature control system method in accordance with embodiments of the present disclosure; and
[0034] Fig. 6 is a flow diagram depicting a fourth feature control system method in accordance with embodiments of the present disclosure.
[0035] In the appended figures, similar components and/or features may have the same reference label. Further, various components of the same type may be distinguished by following the reference label by a letter that distinguishes among the similar components. If only the first reference label is used in the specification, the description is applicable to any one of the similar components having the same first reference label irrespective of the second reference label.
DETAILED DESCRIPTION
[0036] Presented herein are embodiments of a feature control system. The feature control system can comprise one device or a compilation of devices. Furthermore, the feature control system may include one or more communications devices, such as cellular telephones, or other smart devices. This device, or devices, may be capable of communicating with other devices and/or to an individual or group of individuals. Further, this device, or these devices, can receive user input in unique ways. As described herein, the device(s) may be electrical, mechanical, electro-mechanical, software-based, and/or combinations thereof.
[0037] For purposes of explanation, numerous details are set forth in order to provide a thorough understanding of the present invention. It should be appreciated, however, that the present invention may be practiced in a variety of ways beyond the specific details set forth herein.
[0038] Referring to Fig. 1, a block diagram is shown depicting a feature control system 100 in accordance with one embodiment of the present disclosure. In general, the feature control system 100 comprises a feature control module 104 in communication with one or more of a communication device 108, sensor 136, 140, user 112, memory 106, 120, 124, server 122, and communication network 116. In some embodiments, the feature control module 104 is configured to control one or more device 108 features based on rules and/or input received. It is anticipated that the input received may be from one or more device 108, sensor 136, 140, and/or user 112. Moreover, rules may be stored in one or more memory 106, 120, 124 of the feature control system 100. For example, the feature control module 104 may detect the presence of a device 108 by a physical or wireless connection. Upon detecting the device 108, the feature control module 104 may determine to control features of the device 108 based on the stored rules. These stored rules may direct a course of action based on input detected at the sensors 136, 140 and/or device 108. If the sensors 136, 140 report that the device 108 and user 112 are in the driver's seat of the vehicle, the rules may determine to limit access to device 108 features.
[0039] In an exemplary embodiment, a vehicle comprises the feature control module 104 in its software and/or hardware implementation. However, the feature control module 104 may be located remotely from a vehicle and substantially perform all of the functions and operations as described herein. For example, the feature control module 104 may be integrated into the device 108. Additionally or alternatively, the feature control module 104 and/or its functionality could be split between the device 108 and an in-vehicle representation. For instance, the split embodiment may further control the device 108 by limiting the device's 108 ability to perform specific functions while coupled and/or decoupled from the feature control module 104 of the vehicle. Although it can be appreciated that the location of the feature control module 104 may vary, for the purposes of this disclosure, the feature control module 104 will be described as residing locally within a vehicle.
[0040] In some embodiments, the feature control module 104 may be configured to receive one or more inputs. These one or more inputs may be used to determine whether to control features associated with a device such as device 108. In general, a device in wireless and/or physical communication with the feature control module 104 may be controlled. The feature control module 104 may affect the control of a device's features via control of one or more of the device display, communications, state, applications, and/or combinations thereof. In one embodiment, a feature control module 104 may receive permission to control a device 108. This permission may be granted upon a registration of the device 108 with the feature control module 104. Furthermore, this type of registration may be achieved via the installation and/or operation of an application on the device 108. In an exemplary embodiment, the application may at least facilitate communications between the device 108 and the feature control module 104, control the state of the device 108 at the direction of the feature control module 104, and/or control a user's 112 access to one or more features of the device 108. However, it is an aspect of the present disclosure that the feature control module 104 may affect the communications ability of any device 108 within a specific area of the vehicle based on signal attenuation and/or interference techniques.
[0041] The device 108 may include a global positioning system (GPS) receiver. In accordance with embodiments of the present disclosure, the GPS receiver may further comprise a GPS module that is capable of providing absolute location information to other components of the device 108 and/or the feature control module 104. An accelerometer(s)/gyroscope(s) may also be included. In some embodiments, the accelerometer/gyroscope may comprise at least one accelerometer and at least one gyroscope. For example, a signal from the
accelerometer/gyroscope can be used to determine an orientation of the device 108. This orientation may be used by the feature control module to determine a state of the device 108.
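As a rough illustration of deriving an orientation from accelerometer output (Python; the axis conventions and the face-down heuristic are assumptions, not the disclosed method):

    # Hypothetical orientation estimate from a 3-axis accelerometer reading (in g).
    import math

    def orientation(ax: float, ay: float, az: float) -> tuple:
        """Return (pitch_deg, roll_deg) computed from the gravity direction."""
        pitch = math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))
        roll = math.degrees(math.atan2(ay, az))
        return pitch, roll

    def likely_face_down(az: float) -> bool:
        # A simple heuristic the feature control module might use as a device-state hint.
        return az < -0.8

    print(orientation(0.0, 0.0, 1.0))      # device lying flat, face up
    print(likely_face_down(-0.95))         # True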
[0042] It is anticipated that the device 108 may include a dual-screen phone, smartpad, and/or vehicle console as described in respective U.S. Patent Application Nos. 13/222,921, filed August 31, 2011, entitled "DESKTOP REVEAL EXPANSION," and 13/247,581, filed September 28, 2011, entitled "SMARTPAD ORIENTATION," and 13/420,240, filed March 14, 2012, entitled
"REMOVABLE, CONFIGURABLE VEHICLE CONSOLE." Each of the aforementioned documents is incorporated herein by this reference in their entirety for all that they teach and for all purposes.
[0043] The device 108 may be associated with one or more user 112. In some embodiments, a user 112 may be identified by one or more of characteristics, preferences, identification, and usage. In addition, historical data relating to the one or more user 112 may be stored by the device 108 in a memory 106, 120, 124. As can be appreciated the memory may be local 120, remote 106, 124, and/or combinations thereof.
[0044] The communication network 116 may be any type of known communication medium or collection of communication mediums and may use any type of protocols to transport messages between endpoints. The communication network 116 may include wired and/or wireless communication technologies. The Internet is an example of the communication network 116 that constitutes an IP network consisting of many computers and other communication devices located all over the world, which are connected through many telephone systems and other means. Other examples of the communication network 116 include, without limitation, a standard Plain Old Telephone System (POTS), an Integrated Services Digital Network (ISDN), the Public Switched Telephone Network (PSTN), a Local Area Network (LAN), a Wide Area Network (WAN), a cellular communication network, a cable communication network, a satellite communication network, any type of enterprise network, and any other type of packet-switched or circuit-switched network known in the art. It can be appreciated that the communication network 116 need not be limited to any one network type, and instead may be comprised of a number of different networks and/or network types. In some embodiments, the communication network 116 may comprise a controller area network, or CANbus, associated with vehicle, automotive, and/or automation communications. Moreover, it is anticipated that
communications between various components of the feature control system 100 can be carried by one or more busses. [0045] The server 122 may comprise a general purpose programmable processor or controller for executing application programming or instructions. In accordance with at least some embodiments, the server 122 may include multiple processor cores, and/or implement multiple virtual processors. In accordance with still other embodiments, the server 122 may include multiple physical processors. As a particular example, the server may comprise a specially configured application specific integrated circuit (ASIC) or other integrated circuit, a digital signal processor, a controller, a hardwired electronic or logic circuit, a programmable logic device or gate array, a special purpose computer, or the like. The server 122 generally functions to run programming code or instructions implementing various functions of the feature control system 100 and/or feature control module 104.
[0046] The vehicle sensors 132 may include but are not limited to one or more of a throttle position sensor, accelerator pedal angle sensor, speed sensor, speedometer, vehicle speed sensor, wind speed, radar, brake position sensor, brake wear sensor, steering/torque sensor, transmission sensor, oxygen sensor, headlight sensor, ambient lighting sensor, vision system sensor, ranging sensor, parking sensor, heating venting and air conditioning (HVAC) sensor, turbine speed sensor, input speed sensor, water sensor, air-fuel ratio meter, blind spot monitor, crankshaft position sensor, engine temperature sensor, cabin temperature sensor, hall effect sensor, manifold absolute pressure sensor, mass flow sensor, microphone, camera sensor, crash detection sensor, safety restraint sensors, weight sensor, radio frequency (RF) sensor, infrared sensor (IR), vehicle control system sensors, location and/or position sensors, Wi-Fi sensor, cellular data sensor, Bluetooth sensor, and the like. In some embodiments, the one or more vehicle sensors 132 may be located in different areas or zones of a vehicle. For instance a first sensor 136a may be located in a proximal portion of a vehicle, while a second sensor 136b may be located in a distal portion of the vehicle. As can be appreciated the number of vehicle sensors 132 may vary according to vehicle type and/or vehicle control system complexity. In an exemplary embodiment, the vehicle sensors 132 may be configured to communicate across a communication network 116 and/or directly with the feature control module 104. One example of a communication network in a typical automotive application may include utilizing the CANbus and associated protocol.
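For illustration, a sketch of reading one vehicle sensor value over the CANbus using the python-can library (assumed to be available). The arbitration ID (0x3E9) and the scaling below are invented for illustration; real frame IDs and signal encodings are vehicle-specific and are not given by the disclosure.

    # Hypothetical sketch: reading a vehicle-speed frame over CAN using python-can.
    import can

    VEHICLE_SPEED_ID = 0x3E9  # invented arbitration ID for illustration only

    def read_vehicle_speed_kph(bus: can.BusABC, timeout: float = 1.0):
        msg = bus.recv(timeout=timeout)
        if msg is None or msg.arbitration_id != VEHICLE_SPEED_ID:
            return None
        raw = int.from_bytes(msg.data[0:2], byteorder="big")
        return raw * 0.01  # assumed scaling: 0.01 km/h per bit

    if __name__ == "__main__":
        bus = can.interface.Bus(channel="can0", bustype="socketcan")
        print("vehicle speed:", read_vehicle_speed_kph(bus))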
[0047] In some embodiments, the feature control module 104 may employ the use of one or more non-vehicle sensors 140. The non-vehicle sensors 140 may include one or more type of vehicle sensor 132 described herein. However, the non-vehicle sensors 140 may be separated from the vehicle. Additionally or alternatively, the non-vehicle sensors 140 may comprise sensors associated with one or more other devices. For instance, the non-vehicle sensors 140 may be associated with at least one device 108. These sensors may include but are not limited to one or more of an accelerometer/gyroscope, GPS, compass, camera, microphone, audio input/output, temperature sensor, health monitoring sensors, and the like.
[0048] Fig. 2 is a block diagram depicting areas and zones associated with a vehicle 204 in accordance with one embodiment of the present disclosure. In general, a vehicle 204 may comprise one or more areas 208, 216, 220. The areas 208, 216, 220 may in fact be a volume of space and/or a point location (e.g., a docking location, holder, power port, signal port, and so on). These one or more areas 208, 216, 220 may be located inside (208) or outside (216, 220) of a vehicle 204. It is an aspect of the present disclosure that the one or more areas 208, 216, 220 of a vehicle 204 may occupy different, overlapping, or substantially similar physical positions in and/or about the vehicle 204. For instance, the inside of a vehicle 204 may comprise a first area 208a and a second area 208b. As depicted, the first area 208a may occupy a different physical location of the vehicle 204 than the second area 208b. In some embodiments, the areas 208 may be subdivided into one or more zones 212. The one or more zones 212 may completely occupy an area 208 of the vehicle 204. Additionally or alternatively, the one or more zones 212 may occupy a portion of an area 208 of the vehicle 204. It is anticipated that the one or more areas 208 of a vehicle 204 may comprise different zone 212 to area 208 ratios. For example, a vehicle 204 may comprise a first area 208a including a first zone 212a and a second zone 212b. This first area 208a may correspond to the proximal portion of a vehicle 204. The first zone 212a may represent a driver/operator seat of a vehicle 204, while the second zone 212b may represent a proximal passenger seat of a vehicle 204. Continuing the example above, a second area 208b may include a third zone 212c, a fourth zone 212d, and a fifth zone 212e. This second area 208b may represent a passenger area of a vehicle 204. The third zone 212c, fourth zone 212d, and fifth zone 212e may represent individual passenger seats, and/or areas, in the passenger area of the vehicle 204.
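A compact sketch of the area/zone model described above (Python; the zone-to-seat mapping mirrors the example in the text, but the data structure itself is an assumption):

    # Hypothetical data model for vehicle areas 208/216/220 and zones 212.
    from dataclasses import dataclass, field

    @dataclass
    class Zone:
        zone_id: str          # e.g., "212a"
        description: str      # e.g., "driver/operator seat"

    @dataclass
    class Area:
        area_id: str          # e.g., "208a" (inside) or "216"/"220" (outside)
        inside_vehicle: bool
        zones: list = field(default_factory=list)

    vehicle_areas = [
        Area("208a", True, [Zone("212a", "driver/operator seat"),
                            Zone("212b", "proximal passenger seat")]),
        Area("208b", True, [Zone("212c", "rear passenger seat"),
                            Zone("212d", "rear passenger seat"),
                            Zone("212e", "rear passenger seat")]),
        Area("216", False, []),   # first outside area
        Area("220", False, []),   # second outside area
    ]

    zone_lookup = {z.zone_id: a.area_id for a in vehicle_areas for z in a.zones}
    print(zone_lookup["212a"])  # "208a"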
[0049] In some embodiments, each area 208, 216, 220 and/or zone 212 associated with a vehicle 204 may comprise one or more sensors to determine a presence in and/or adjacent to each area 208, 216, 220 and/or zone 212. The sensors may include vehicle sensors 132 and/or non-vehicle sensors 140 as described herein. It is anticipated that the sensors may be configured to communicate with a vehicle controls system and/or the feature control module 104.
Additionally or alternatively, the sensors may communicate with a device 108. The
communication of sensors with the vehicle 204 may initiate and/or terminate the control of device 108 features. For example, a vehicle operator may be located in a second outside area 220 associated with a vehicle 204. As the operator approaches the first outside area 216 associated with the vehicle 204, the feature control module 104 may determine to control features associated with one or more device 108. In an exemplary embodiment, the feature control module 104 may determine to control features associated with the device 108 of the vehicle operator. In this scenario, the feature control module 104 may determine to control a vehicle status application on the device 108. Once the vehicle operator enters the vehicle 204, the sensors 132, 140 may determine that the vehicle operator is in an area 208 and/or zone 212. As is further described herein, the feature control module 104 may utilize the device 108, and/or user 112, location information to control features of the device 108 based on rules.
[0050] Figs. 3-6 depict multiple methods of the feature control system 100 operation. In some embodiments, the feature control system 100 methods may be controlled manually via user input and/or automatically via a processor.
[0051] Fig. 3 is a flow diagram depicting a first feature control system method 300 in accordance with embodiments of the present disclosure. The method 300 begins at step 304 by detecting one or more devices 108 associated with the vehicle 204. Detection may include a voluntary registration and/or communication between a vehicle 204 and a device 108. Among other things, this type of registration and/or communication may be facilitated via the installation of an application on the device 108. In some embodiments, the application may provide one or more of a communication protocol, use permissions, and access to the feature control module 104. For example, a user may turn on a newly presented device 108 inside a vehicle 204, and as a result may be prompted to register the device 108 with the vehicle 204. This registration prompt process may be effected automatically and/or manually. In some embodiments, the feature control module 104, utilizing one or more sensors 132, 140, may detect the presence of a device 108 and send a signal to the device 108 in the form of an installation prompt.
[0052] In other embodiments, the feature control module 104 may communicate with a device 108 via a physical electrical connection. For instance, the feature control module 104 may include an electrical interconnection configured to facilitate communications between the feature control module 104 and at least one device 108. In one embodiment of the present disclosure the electrical interconnection may provide power to the device 108 via this electrical interconnection.
[0053] In yet another embodiment, the feature control module 104 may communicate with a device 108 via one or more wireless protocol. It is anticipated that the wireless protocol may include, but is not limited to, one or more existing communications protocols and/or equivalents thereof. Common device 108 communications protocols may include Bluetooth®, Wi-Fi (IEEE 802.11 standards), RF, IR, and variations thereof. In some instances, a device 108 may be paired with one or more sensors used by the feature control module 104 to allow persistent and/or reestablishing communications between the device 108 and the feature control module 104.
[0054] The method 300 continues at step 312 by determining the location of the one or more detected devices 108. In accordance with some embodiments of the present disclosure, the location of a device 108 may be found using vehicle sensors 132 and/or non-vehicle sensors 140. For example, a device 108 may be detected using sensors 132, 140 found inside a vehicle 204. The location of the device 108 inside the vehicle 204 may be obtained via the use of
triangulation, sensing, and/or ranging techniques (e.g., measuring signal strength from different points, ping and response, and/or similar position detecting procedures). The procedure of determining a location associated with a device 108 becomes more streamlined upon the physical connection to a known port/electrical connection of the vehicle 204. Moreover, if the device 108 is registered to a particular user 112, the location of the device 108 may be interpreted using stored preferences and/or settings. It is an aspect of the present disclosure that the device 108 itself may report a position/location. This location may be provided via typical device 108 location services such as GPS, Wi-Fi data, and/or cellular data.
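One plausible (and purely illustrative) way to implement the strongest-signal heuristic mentioned above, assuming each in-cabin sensor reports a received signal strength for the device:

    # Hypothetical device-zone estimate: pick the zone whose sensor hears the device loudest.
    def estimate_zone(rssi_by_sensor: dict, sensor_to_zone: dict) -> str:
        """rssi_by_sensor: e.g., {'sensor_136a': -42.0, 'sensor_136b': -63.0} (dBm)."""
        best_sensor = max(rssi_by_sensor, key=rssi_by_sensor.get)
        return sensor_to_zone.get(best_sensor, "unknown")

    sensor_to_zone = {"sensor_136a": "212a", "sensor_136b": "212c"}
    print(estimate_zone({"sensor_136a": -42.0, "sensor_136b": -63.0}, sensor_to_zone))  # 212a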
[0055] In some embodiments, different locations of a device 108 may provide different responses from the feature control module 104. For example, a device 108 may be determined to be in a location where use of a device 108 is considered to be highly-restricted. The driver's seat and/or pilot area may be an example of such a highly-restricted use location. As such, the feature control module 104 may limit access to the device 108 and/or features of the device 108 based on rules assigned to this zone 212 and/or area 208. In accordance with the present disclosure, another location of the vehicle 204 may be classified as a restricted location. In such locations, the feature control module 104 may determine to control access to the device 108 and/or features of the device 108 based on less restrictive rules than those used for the highly-restricted location. In some embodiments, a device 108 may be used in an unrestricted location. This unrestricted location may allow a user 112 complete access to a device 108 based on rules defined for the unrestricted location. As can be appreciated, different areas 208 and/or zones 212 of a vehicle 204 may be classified as various levels of restricted use. Although the highly-restricted, restricted, and unrestricted locations have been presented herein, it is an aspect of the present disclosure that many levels of restricted and/or unrestricted use may be utilized by the feature control module 104.
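The tiered restriction scheme could be represented as simply as the following sketch (Python; the tier names follow the text, while the feature lists and zone assignments are invented):

    # Hypothetical mapping of zones to restriction levels and levels to blocked features.
    ZONE_RESTRICTION = {"212a": "highly_restricted",   # driver/operator seat
                        "212b": "restricted",          # front passenger seat
                        "212c": "unrestricted"}        # rear passenger seat

    BLOCKED_FEATURES = {"highly_restricted": {"texting", "email", "browser", "video"},
                        "restricted": {"video"},
                        "unrestricted": set()}

    def blocked_for_zone(zone_id: str) -> set:
        return BLOCKED_FEATURES[ZONE_RESTRICTION.get(zone_id, "restricted")]

    print(blocked_for_zone("212a"))  # driver zone: most features blocked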
[0056] In some embodiments, the method 300 continues by determining one or more vehicle- device use laws (step 312). These vehicle-device use laws may be provided by an organization, governmental entity, group, individual, and/or combinations thereof. Additionally or alternatively, the laws may be created in response to detected input and/or conditions monitored by the feature control module 104, device 108, and/or sensors 132, 140. The laws may be stored in local memory 106 by the feature control module 104, or the laws may be retrieved from another stored data memory 120, 124. In some cases, the feature control module may refer to a remote memory 120, 124 to determine laws and/or rules associated with a specific locality, region, user 112, and/or device 108.
[0057] In an exemplary embodiment, the laws may be statutes and/or regulations that are enforced by a government entity. These laws may define vehicle, traffic, transportation, and/or safety rules associated with a given geographical region. Moreover, these laws may be stored locally and/or remotely as described herein. Furthermore, the laws may be updated from time to time to, among other things, account for changes in the laws. For example, the State of Idaho may ban texting (i.e., sending a text message via some device 108) while driving, but may allow the use of a handheld mobile phone (e.g., a device 108). In contrast, the State of Oregon may completely ban the use of handheld devices. While the user 112 is traveling in Idaho, the feature control module 104 may refer to the laws of Idaho and determine to control the device 108 in accordance with Idaho law. However, once the user 112 is detected as being in Oregon, the feature control module 104 may control the device 108 based, at least in part, on the laws of Oregon. This procedure will be described further herein; however, it should be noted that the vehicle sensors 132 and/or other sensors 140 may determine at least one location of the device 108, and the feature control module 104 may refer to laws associated with that at least one location to control the device 108 accordingly.
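The following sketch illustrates one possible lookup of vehicle-device use laws by jurisdiction, mirroring the Idaho/Oregon example above. The rule contents are illustrative placeholders, not actual legal data, and the structure is an assumption rather than the disclosed implementation.

```python
# Hypothetical sketch: per-jurisdiction device-use rules with a strict
# default for regions that are not in the local store. True means allowed.

LAWS_BY_REGION = {
    "ID": {"texting_while_driving": False, "handheld_calls": True},
    "OR": {"texting_while_driving": False, "handheld_calls": False},
}

STRICT_DEFAULT = {"texting_while_driving": False, "handheld_calls": False}

def laws_for_region(region_code):
    """Return the rule set for the detected region, else a strict default."""
    return LAWS_BY_REGION.get(region_code, STRICT_DEFAULT)

print(laws_for_region("ID")["handheld_calls"])  # True
print(laws_for_region("OR")["handheld_calls"])  # False
```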
[0058] The method continues at step 316 by determining settings of the one or more associated devices 108. These settings may include data relating to the feature control module 104, communications, permissions, device 108 control, methods, user preferences, historical data, and the like. As can be appreciated, a device 108 may have multiple power states associated with its operation. Most devices, including smartphones, tablets, handheld computers, and the like, do not have simple "On/Off" states. To differentiate between these power states, the following terminology will be used to better define the multiple power states of a device 108. "Device Off" is used to indicate that the device 108 is completely turned off; in other words, virtually no power is being used by the communication device 108 in this state. When in the "Device Off" state, the device 108 cannot receive or transmit typical communications, signals, alerts, and the like. "Device On" is used to indicate that the device 108 is turned on, is capable of receiving and transmitting communications, signals, and alerts, and that power is directed to the device 108 display and all required components. In some embodiments, "Device On" may indicate that the device 108 display is fully powered. In another embodiment, a fully powered display may indicate that the device 108 is in a condition to detect input received at all areas of the display (e.g., touch-screen). "Device Lock" is used to indicate that power to the
communication device 108 display is limited, but the device 108 is capable of receiving and transmitting communications, signals, alerts, and the like. Device Lock saves battery power by reducing power supplied to the display while still allowing applications to present an alert on the display or another indicator upon direction of the feature control module 104 and/or an application. In an embodiment where the display may comprise a touch-screen, a Device Lock state may cause reduced power to be directed to the display (e.g., to a limited area or section of the display). In accordance with some embodiments of the present disclosure, the feature control module 104 and/or application may transition the device 108 from a Device On state to a Device Lock state and vice versa.
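A minimal sketch of the three power states defined above, and of a transition helper of the kind a feature control module 104 might use, is shown below; the enumeration and the rule that an unpowered device cannot be transitioned remotely are assumptions for illustration only.

```python
# Hypothetical sketch: the Device Off / Device On / Device Lock states and a
# guard that only permits On<->Lock transitions while the device is powered.

from enum import Enum

class DeviceState(Enum):
    OFF = "device_off"    # virtually no power; cannot send or receive
    ON = "device_on"      # display fully powered; full send/receive
    LOCK = "device_lock"  # display power limited; can still send/receive

def transition(current, requested):
    """Return the resulting state after a requested transition."""
    if current is DeviceState.OFF:
        return current  # an unpowered device cannot be driven remotely
    if requested in (DeviceState.ON, DeviceState.LOCK):
        return requested
    return current

print(transition(DeviceState.ON, DeviceState.LOCK))  # DeviceState.LOCK
```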
[0059] The settings of a device 108 may be configured to lock the device 108, or operate the device 108 in a Device Lock state, when controlled by the feature control module 104. For instance, a parent/guardian may configure a child's device 108 to be controlled in accordance with strict settings and/or preferences. In this instance, the parent/guardian may determine that a device 108 may be a distraction to a child, while driving, in any state other than the Device Lock state. As such, the parent/guardian can set the device 108 to respond to feature control module 104 controls by operating the device in a Device Lock state. In contrast, a parent/guardian may wish to configure the settings of a device 108 to be less strict and allow access to other features of the device 108. In this case, a parent/guardian may configure a device 108 to only lock specific features associated with the device 108. In any event, the device 108 may be controlled at higher levels of strictness than provided by the vehicle-device laws determined in step 312. These higher levels of strictness may be provided by user preferences and/or device 108 settings. In some cases, the device 108 state may override settings, laws, and/or preferences.
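One way to reconcile jurisdiction laws with stricter user or guardian settings, as described above, is to keep the stricter answer for each feature. The sketch below assumes a simple boolean convention (True = allowed) and hypothetical feature keys; it is not the disclosed precedence scheme.

```python
# Hypothetical sketch: merge laws and settings so that a feature is allowed
# only when neither source forbids it (i.e., the stricter source wins).

def merge_strictest(laws, settings):
    merged = {}
    for feature in set(laws) | set(settings):
        merged[feature] = laws.get(feature, True) and settings.get(feature, True)
    return merged

# Example: the law allows handheld calls, but a guardian's settings do not.
print(merge_strictest({"handheld_calls": True}, {"handheld_calls": False}))
# -> {'handheld_calls': False}
```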
[0060] The method continues by determining the state of the device 108 (step 320). States of the device 108 may include one or more power states (on, off, and/or locked), orientation (vertical, horizontal, angle, etc.), operation (e.g., input type, running and/or background applications), sensor states, and the like. Among other things, specific device 108 states may indicate one or more conditions related to the user 112, vehicle 204, and/or the device 108 itself. For example, a device 108 may be in an unpowered, or Device Off, state, and as such the condition may preclude control by the feature control module 104. On the other hand, the state information of the device 108 may indicate that the device 108 is operating in a Device On state and may be subject to control via the feature control module 104. It is an aspect of the present disclosure that sensor information received from a device 108 may inform control decisions made by the feature control module 104. For instance, one or more sensors on a device 108 may detect an impact, shock, and/or other tactile input, and the data (in some instances in combination with other data) may be correlated to determine a response by the feature control module 104.
[0061] The vehicle state is determined at step 324. This vehicle state may include, but is not limited to, vehicle motion (driving, stopped, etc.), position (geographical), speed, acceleration, deceleration, transmission state (in-park, engaged drive, engaged reverse, in-gear, neutral), component status (parking brake, airbag, safety restraint system, engine control unit (ECU) output, CANbus activity), occupants (number, position, weight, and the like), sensor information (temperatures, pressures, etc.), and combinations thereof. In an exemplary embodiment, a user 112 may be driving a vehicle 204 while attempting to simultaneously operate an associated device 108. Upon detecting that the vehicle 204 is moving, the feature control module 104 may control the device 108 and/or features of the device 108 accordingly. Additionally or alternatively, when the vehicle 204 is determined to be in a stationary state (i.e., not moving), and even in-park, the feature control module 104 may determine to cease controlling the device 108.
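As a non-limiting sketch of the motion-based decision described above, the function below controls a device only while the vehicle is moving and releases it when the vehicle is stationary and in-park; the field names are assumptions, not disclosed data structures.

```python
# Hypothetical sketch: decide whether the feature control module should be
# controlling a paired device from a small vehicle-state dictionary.

def should_control_device(vehicle_state):
    moving = vehicle_state.get("speed_kph", 0) > 0
    parked = vehicle_state.get("transmission") == "park"
    return moving and not parked

print(should_control_device({"speed_kph": 45, "transmission": "drive"}))  # True
print(should_control_device({"speed_kph": 0, "transmission": "park"}))    # False
```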
[0062] In some embodiments, the vehicle 204 state may indicate an emergency condition. For example, the vehicle 204, via one or more sensors 132, 140, may indicate that the vehicle 204 has been subjected to substantial amounts of impact force, that the airbag deployed, that the anti-lock braking system engaged, that the vehicle 204 instantaneously moved in a direction contrary to historical data collected over time, that the speed of the vehicle reduced dramatically, and more. These exemplary sensor responses may be indicative of an accident. In any event, the feature control module 104 may be configured to address emergency scenarios, especially with respect to the control of one or more devices 108. In one embodiment, an emergency state may cause the feature control module 104 to provide unfettered access to the device 108 and/or its features. In another embodiment, an emergency state may cause the feature control module 104 to present an emergency message to the one or more devices 108. This emergency message may be sent to emergency services personnel and/or a third party. Furthermore, the emergency message may include details regarding the emergency, the state of the vehicle 204, the state of a user 112, and/or the state of the device 108.
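The sketch below shows one hypothetical way to flag an emergency from several of the sensor responses listed above: each indicator is checked and an emergency is declared when at least two agree. The signal names and thresholds are illustrative assumptions, not the disclosed criteria.

```python
# Hypothetical sketch: declare an emergency when two or more independent
# sensor indicators are present in the reported vehicle sensor data.

def is_emergency(sensors):
    indicators = [
        sensors.get("airbag_deployed", False),
        sensors.get("impact_g", 0.0) > 4.0,             # severe impact force
        sensors.get("speed_drop_kph_per_s", 0.0) > 30,  # dramatic deceleration
        sensors.get("abs_engaged", False),
    ]
    return sum(indicators) >= 2

print(is_emergency({"airbag_deployed": True, "impact_g": 6.2}))  # True
print(is_emergency({"abs_engaged": True}))                       # False
```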
[0063] The feature control module 104 is configured to control one or more devices 108 based at least in part on rules (step 328). In general, the feature control module 104 may utilize any one or more of the steps presented herein in determining control of the one or more devices 108. In some embodiments, the rules may direct that all of the steps disclosed herein be considered before the specific control of a device 108 is initiated. These rules may include at least one algorithm for determining a control action to be taken by the feature control module 104. The rules may take into account collected sensor information, settings, laws, and more in determining a control action.
[0064] Control of a device 108 may take a number of forms. In some embodiments, control of a device 108 may include restricting access to specific applications, programs, and/or features of the device 108. For example, a user 112 whose device is being controlled by a feature control module 104 may be allowed to access the home screen of a device 108 to check the time and/or date. However, this user 112 may be restricted, by the feature control module 104, from accessing a communications interface (e.g., telephone, texting, SMS, MMS, email, web browsers, and the like). Additionally or alternatively, the user may be restricted from accessing programs that require physical input at the device 108. For instance, a user 112 may still be allowed to use the device 108 to send some form of communication and/or interface with the device 108 using voice commands and/or visual input.
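A minimal sketch of per-feature restriction, along the lines of the example above (home screen and voice interaction allowed, typed messaging blocked), follows; the feature names and the two lists are hypothetical and only illustrate the idea.

```python
# Hypothetical sketch: allow or block individual device features while the
# device is under control, rather than locking the whole device.

BLOCKED_WHILE_CONTROLLED = {"sms", "mms", "email", "web_browser", "phone_keypad"}
ALWAYS_ALLOWED = {"home_screen", "clock", "voice_commands", "emergency_call"}

def is_feature_allowed(feature, controlled):
    if feature in ALWAYS_ALLOWED:
        return True
    return not (controlled and feature in BLOCKED_WHILE_CONTROLLED)

print(is_feature_allowed("clock", controlled=True))  # True
print(is_feature_allowed("sms", controlled=True))    # False
```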
[0065] In some embodiments, the control of a device 108 may include transitioning the device 108 from one state to another. Among other things, various device 108 states may include Device On, Device Off, and Device Lock. In accordance with the present disclosure, and as previously stated, the rules may refer to the location of the device 108 to activate and/or deactivate a control action. Additionally or alternatively, a control message may be presented to an interface associated with the device 108 to indicate that the device 108 is controlled or released from control.
[0066] In other embodiments, the control of a device 108 may include blocking
communications to and/or from the device 108. This type of communications control may be activated in one or more of an area 208, a zone 212, and a device 108. For instance, if one or more devices 108 are detected in a given area 208, the feature control module 104 may determine to control all of the devices 108 together. This control may include interfering with the communication abilities of the devices 108.
[0067] Fig. 4 is a flow diagram depicting a second feature control system method 400 in accordance with embodiments of the present disclosure. In general, the method 400 is directed to detecting a device 108 and any associated settings for the control of the device 108. The method begins at step 404 and proceeds by detecting one or more devices 108 (step 408). As disclosed above, detection may be achieved through physical and/or wireless techniques. Moreover, the disclosed detection techniques may be automatically performed and/or manually initiated. If no device 108 is detected, the method ends (step 442).
[0068] Upon detecting a device, however, the method 400 continues by determining whether any settings are associated with the device 108 (step 412). These settings may include data associated with a user, device, application, and/or feature control module 104. Typical settings may be stored in device data 120, in the feature control module 104 system data 106, and/or remotely in stored data 124. If no settings are detected, the user 112 may be prompted to enter settings and/or configure the device 108 (step 416).
[0069] At this point, the user 112 may enter settings as prompted (step 420). In other words, the user 112 may enter settings information at one or more of the device 108, an interface to the feature control module 104, and/or a server 122. The prompt for settings may be provided via at least one application running on the device 108, on a server, and/or as part of the feature control module 104. If the user fails to enter settings as prompted, the method 400 may continue by optionally controlling the device 108 based on default settings (step 424), and/or the method may end (step 442).
[0070] If settings are available, or if the user 112 enters settings as prompted, the method 400 may continue by controlling the device 108 based at least in part on the settings and on rules stored in memory (step 428). In an exemplary embodiment, the feature control module 104 may control one or more behaviors of the device 108. For example, rules may dictate that while a vehicle 204 is in motion, the device 108 should be controlled for all communications applications. Additionally or alternatively, a user 112 may enter settings directing that, when controlled by a feature control module 104, the device 108 should be transitioned to a Device Lock state.
Moreover, the user 112 may wish to have an alert/notification pushed to the device 108 interface to indicate that the device 108 is being controlled. This alert/notification may be provided in the form of a message. It is anticipated that vehicle 204 and/or device 108 conditions may be continually monitored by the feature control module 104 to modify the control method 400. Once a device 108 is controlled, the method may return to detecting any available devices 108 (step 408). If no device 108 is found, the method ends (step 442).
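The sketch below outlines the method 400 loop in code form: detect a device, load or prompt for settings, fall back to defaults, then control. The callable parameters stand in for the detection, storage, prompting, and control steps and are assumptions, not disclosed interfaces.

```python
# Hypothetical sketch of the method-400 flow (steps 408-428). The loop ends
# when the detector reports that no device is present (step 442).

def run_method_400(detect, load_settings, prompt_user, control, defaults):
    device = detect()                                   # step 408
    while device is not None:
        settings = load_settings(device)                # step 412
        if settings is None:
            settings = prompt_user(device) or defaults  # steps 416-424
        control(device, settings)                       # step 428
        device = detect()                               # back to step 408
```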
[0071] Referring to Fig. 5, a flow diagram is shown depicting a third feature control system method 500 in accordance with embodiments of the present disclosure. Among other things, the method 500 discloses a feature control module 104 utilizing device 108 and/or vehicle 204 location to determine at least one control action. The method begins at step 504 and proceeds by detecting one or more devices 108 (step 508). If no device is found, the method ends (step 520).

[0072] Upon detecting a device 108, the method 500 continues by determining the location of the device 108 (and/or vehicle 204) (step 512). The location of the device 108 may refer to the physical location of the device 108 inside or outside of a vehicle 204. In the event that a device 108 is determined to be located inside a vehicle 204, a specific location of the device 108 may be determined. Additionally or alternatively, the device 108 may be determined to be in a general location inside the vehicle 204. Depending on the rules and/or state of the vehicle 204, the specific location of the device 108 may be important to the feature control module 104 in determining to control the device 108 or its features. For example, a device 108 detected in the driver's seat of a vehicle 204 may be controlled differently than a device 108 detected in the rear passenger seat of a vehicle 204. As one example, a device 108 in the driver's seat may be controlled under stricter conditions. On the other hand, a device 108 found in the rear passenger location may be unrestricted or minimally restricted.
[0073] In some embodiments, the location of the device 108 may include a location of the vehicle 204. In other words, a location of the device 108 detected inside a vehicle 204 may be provided by a GPS or other location service of the vehicle and/or the device 108 itself. This geographical location of the vehicle 204 may be used by the feature control module 104 in initiating a control action. In particular, the feature control module 104 may refer to laws associated with the geographical location of the vehicle 204 in controlling the device 108. In some instances, these laws may be related to traffic and/or vehicle-device use statutes created by a government or third party.
[0074] When the location of the device 108 is determined, the method 500 continues by controlling the device 108 based at least partially on the location of the device 108 and stored rules (step 516). As provided in an example above, a device 108 may be controlled in accordance with laws based on the location of the device 108 in the vehicle 204. The method 500 may continue by returning to the step of detecting devices (step 508). If no device is found, the method ends (step 520).
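A comparable sketch of the method 500 loop, driving control from the device's in-vehicle location and stored rules, is shown below; the rules structure and the callable parameters are illustrative assumptions.

```python
# Hypothetical sketch of the method-500 flow (steps 508-516): locate each
# detected device and apply the restriction level stored for that location.

def run_method_500(detect, locate, rules, control):
    device = detect()                          # step 508
    while device is not None:
        zone = locate(device)                  # step 512, e.g., "driver"
        restriction = rules.get(zone, "restricted")
        control(device, restriction)           # step 516
        device = detect()                      # back to step 508
```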
[0075] Fig. 6 is a flow diagram depicting a fourth feature control system method 600 in accordance with embodiments of the present disclosure. In general, the method 600 is directed to determining a state of a vehicle 204 to provide control action guidance for the feature control module 104. In some embodiments, the feature control module 104 may be configured to cease control of a device and/or its applications based on a number of states associated with a vehicle 204. One of these overriding control states is an emergency detected by the feature control module 104.

[0076] The method 600 begins at step 604 and proceeds by determining whether one or more devices 108 have been detected (step 608). If no device is found, the method ends (step 628). However, upon detecting a device 108, the method 600 continues by determining a state of the vehicle 204 (step 612). A vehicle state may be determined by one or more inputs provided via the vehicle sensors 132, non-vehicle sensors 140, device 108, and/or a user 112.
[0077] In some embodiments, the method 600 may interpret the nature of the vehicle state determined in step 612. In particular, the feature control module 104 may determine whether the vehicle is in a state of emergency or not (step 616). As described above, an emergency state may be determined from a number of vehicle 204 inputs. For example, various vehicle sensors 132 may indicate that an oil line associated with the vehicle 204 is losing pressure, the engine is reaching an unusually high predetermined temperature, and the safety restraint sensors detect impact at the front of the vehicle 204. This combination of sensor inputs may be enough to qualify as an emergency. In some embodiments, the user 112 may input an override command to indicate an emergency state. This override command may be in the form of video, voice, tactile, or other input.
[0078] Upon detecting an emergency state of the vehicle 204, the feature control module 104 may be directed to override specific controlled features of the device 108 (step 620). In other words, the feature control module 104 may allow access to all, or less than all, of the features of the device 108. For example, in the event of an emergency, a user's 112 access to the communications applications of a device 108 may be considered important if not critical.
Therefore, a detected emergency state may prevent the restricted control of the device's communication hardware and/or software.
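As a non-limiting sketch of this override branch, the function below lifts restrictions when the vehicle state is judged an emergency and otherwise applies the normal state-based control described next; the callable parameters are hypothetical stand-ins, not disclosed interfaces.

```python
# Hypothetical sketch of the method-600 branch at steps 616-624: release the
# device in an emergency, otherwise control it from the vehicle state.

def apply_method_600(device, vehicle_state, is_emergency, control, release):
    if is_emergency(vehicle_state):      # step 616
        release(device)                  # step 620: unrestricted access
    else:
        control(device, vehicle_state)   # step 624: state-and-rules control
```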
[0079] In the event that the vehicle state is not determined to be an emergency, the method 600 continues by controlling the device 108 based at least partially on the vehicle state and stored rules (step 624). For example, one or more sensors 132, 140 may indicate that a vehicle 204 has reduced speed in a short amount of time. However, the feature control module 104 may determine that this type of scenario is not an emergency. As such, the device 108 may be controlled in accordance with the current vehicle state and rules. For instance, the vehicle state may indicate that the vehicle 204 is stopped and in-park. In this case, the feature control module 104 may allow access to features of the device 108. Upon detecting a state change of the vehicle 204, the feature control module 104 may control the device 108 differently (e.g., restricting access to features of the device 108). The method 600 may continue by returning to the step of detecting devices (step 608). If no device is found, the method ends (step 628).

[0080] The exemplary systems and methods of this disclosure have been described in relation to a feature control module 104 and associated devices 108. As suggested by this disclosure, features may be shared between a feature control module 104 and a device 108. However, to avoid unnecessarily obscuring the present disclosure, the preceding description omits a number of known structures and devices. This omission is not to be construed as a limitation of the scope of the claims. Specific details are set forth to provide an understanding of the present disclosure. It should, however, be appreciated that the present disclosure may be practiced in a variety of ways beyond the specific details set forth herein.
[0081] Furthermore, while the exemplary aspects, embodiments, and/or configurations illustrated herein show the various components of the system collocated, certain components of the system can be located remotely, at distant portions of a distributed network, such as a LAN and/or the Internet, or within a dedicated system. Thus, it should be appreciated that the components of the system can be combined into one or more devices, such as a Personal Computer (PC), laptop, netbook, smart phone, Personal Digital Assistant (PDA), tablet, etc., or collocated on a particular node of a distributed network, such as an analog and/or digital telecommunications network, a packet-switched network, or a circuit-switched network. It will be appreciated from the preceding description, and for reasons of computational efficiency, that the components of the system can be arranged at any location within a distributed network of components without affecting the operation of the system. For example, the various components can be located in a switch such as a PBX and media server, gateway, in one or more communications devices, at one or more users' premises, or some combination thereof.
Similarly, one or more functional portions of the system could be distributed between a telecommunications device(s) and an associated computing device.
[0082] Furthermore, it should be appreciated that the various links connecting the elements can be wired or wireless links, or any combination thereof, or any other known or later developed element(s) that is capable of supplying and/or communicating data to and from the connected elements. These wired or wireless links can also be secure links and may be capable of communicating encrypted information. Transmission media used as links, for example, can be any suitable carrier for electrical signals, including coaxial cables, copper wire and fiber optics, and may take the form of acoustic or light waves, such as those generated during radio-wave and infra-red data communications.
[0083] Also, while the flowcharts have been discussed and illustrated in relation to a particular sequence of events, it should be appreciated that changes, additions, and omissions to this sequence can occur without materially affecting the operation of the disclosed embodiments, configuration, and aspects.
[0084] A number of variations and modifications of the disclosure can be used. It would be possible to provide for some features of the disclosure without providing others.
[0085] In some embodiments, the systems and methods of this disclosure can be
implemented in conjunction with a special purpose computer, a programmed microprocessor or microcontroller and peripheral integrated circuit element(s), an ASIC or other integrated circuit, a digital signal processor, a hard-wired electronic or logic circuit such as a discrete element circuit, a programmable logic device or gate array such as a PLD, PLA, FPGA, or PAL, any comparable means, or the like. In general, any device(s) or means capable of implementing the methodology illustrated herein can be used to implement the various aspects of this disclosure. Exemplary hardware that can be used for the disclosed embodiments, configurations, and aspects includes computers, handheld devices, telephones (e.g., cellular, Internet enabled, digital, analog, hybrids, and others), and other hardware known in the art. Some of these devices include processors (e.g., a single or multiple microprocessors), memory, nonvolatile storage, input devices, and output devices. Furthermore, alternative software implementations including, but not limited to, distributed processing or component/object distributed processing, parallel processing, or virtual machine processing can also be constructed to implement the methods described herein.
[0086] In yet another embodiment, the disclosed methods may be readily implemented in conjunction with software using object or object-oriented software development environments that provide portable source code that can be used on a variety of computer or workstation platforms. Alternatively, the disclosed system may be implemented partially or fully in hardware using standard logic circuits or VLSI design. Whether software or hardware is used to implement the systems in accordance with this disclosure is dependent on the speed and/or efficiency requirements of the system, the particular function, and the particular software or hardware systems or microprocessor or microcomputer systems being utilized.
[0087] In yet another embodiment, the disclosed methods may be partially implemented in software that can be stored on a storage medium and executed on a programmed general-purpose computer with the cooperation of a controller and memory, a special purpose computer, a microprocessor, or the like. In these instances, the systems and methods of this disclosure can be implemented as a program embedded on a personal computer, such as an applet, JAVA® or CGI script, as a resource residing on a server or computer workstation, as a routine embedded in a dedicated measurement system, system component, or the like. The system can also be implemented by physically incorporating the system and/or method into a software and/or hardware system.
[0088] Although the present disclosure describes components and functions implemented in the aspects, embodiments, and/or configurations with reference to particular standards and protocols, the aspects, embodiments, and/or configurations are not limited to such standards and protocols. Other similar standards and protocols not mentioned herein are in existence and are considered to be included in the present disclosure. Moreover, the standards and protocols mentioned herein and other similar standards and protocols not mentioned herein are periodically superseded by faster or more effective equivalents having essentially the same functions. Such replacement standards and protocols having the same functions are considered equivalents included in the present disclosure.
[0089] The present disclosure, in various aspects, embodiments, and/or configurations, includes components, methods, processes, systems, and/or apparatus substantially as depicted and described herein, including various aspects, embodiments, configurations, subcombinations, and/or subsets thereof. Those of skill in the art will understand how to make and use the disclosed aspects, embodiments, and/or configurations after understanding the present disclosure. The present disclosure, in various aspects, embodiments, and/or configurations, includes providing devices and processes in the absence of items not depicted and/or described herein or in various aspects, embodiments, and/or configurations hereof, including in the absence of such items as may have been used in previous devices or processes, e.g., for improving performance, achieving ease of implementation, and/or reducing cost of implementation.
[0090] The foregoing discussion has been presented for purposes of illustration and description. The foregoing is not intended to limit the disclosure to the form or forms disclosed herein. In the foregoing Detailed Description, for example, various features of the disclosure are grouped together in one or more aspects, embodiments, and/or configurations for the purpose of streamlining the disclosure. The features of the aspects, embodiments, and/or configurations of the disclosure may be combined in alternate aspects, embodiments, and/or configurations other than those discussed above. This method of disclosure is not to be interpreted as reflecting an intention that the claims require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed aspect, embodiment, and/or configuration. Thus, the following claims are hereby incorporated into this Detailed Description, with each claim standing on its own as a separate preferred embodiment of the disclosure.

[0091] Moreover, though the description has included a description of one or more aspects, embodiments, and/or configurations and certain variations and modifications, other variations, combinations, and modifications are within the scope of the disclosure, e.g., as may be within the skill and knowledge of those in the art, after understanding the present disclosure. It is intended to obtain rights which include alternative aspects, embodiments, and/or configurations to the extent permitted, including alternate, interchangeable, and/or equivalent structures, functions, ranges, or steps to those claimed, whether or not such alternate, interchangeable, and/or equivalent structures, functions, ranges, or steps are disclosed herein, and without intending to publicly dedicate any patentable subject matter.

Claims

What is claimed is:
1. A method of controlling access to one or more features of a communication device associated with a vehicle, comprising:
establishing, by a microprocessor executable feature control module, a connection with the communication device, wherein the feature control module is configured to receive input from at least one sensor;
determining, by the feature control module, a location of the communication device relative to the vehicle; and
controlling, via the feature control module and based at least partially on the location of the communication device, user access to one or more features of the communication device.
2. The method of claim 1, wherein the connection between the communication device and feature control module is established via manually registering the communication device with the feature control module.
3. The method of claim 1, wherein the connection between the communication device and feature control module is established via automatically registering the communication device with the feature control module.
4. The method of claim 3, wherein automatically registering the communication device further comprises storing in a memory an identifier associated with the communication device.
5. The method of claim 1, wherein the location of the communication device is determined to be inside the vehicle.
6. The method of claim 5, wherein the inside of the vehicle is arranged into one or more areas, and wherein the communication device is located in a specific area of the one or more areas.
7. The method of claim 6, wherein the specific area is associated with an operating area of the vehicle, and wherein the feature control module restricts access to the one or more features of the communication device.
8. The method of claim 6, wherein the specific area is associated with a passenger area of the vehicle, and wherein the feature control module allows unrestricted access to the one or more features of the communication device.
9. The method of claim 1, further comprising:
referring to one or more rules relating to operating the communication device while operating the vehicle; and wherein user access to the one or more features of the communication device is controlled based at least partially on the one or more rules.
10. The method of claim 9, wherein the one or more rules correspond to laws associated with a geographical region, and wherein the laws are stored in a memory.
11. The method of claim 1, further comprising:
referring to one or more settings associated with the communication device; and wherein user access to the one or more features of the communication device is controlled based at least partially on the one or more settings.
12. The method of claim 1, further comprising:
determining a state of the vehicle associated with the communication device, wherein determining the vehicle state further comprises:
receiving input from the at least one sensor; and
interpreting whether the input received indicates an emergency state associated with the vehicle;
wherein user access to the one or more features of the communication device is controlled based at least partially on the determined state of the vehicle.
13. The method of claim 12, wherein the vehicle is determined to be in an emergency state, and wherein unrestricted user access to the one or more features of the communication device is allowed.
14. The method of claim 12, wherein the vehicle is determined to be in a parked state, and wherein unrestricted user access to the one or more features of the communication device is allowed.
15. The method of claim 12, wherein the vehicle is determined to be in a moving state, and wherein user access to the one or more features of the communication device is restricted.
16. A tangible, non-transitory computer readable medium having instructions stored thereon that, when executed by a processor, perform the method comprising:
establishing a connection with a communication device;
receiving input from at least one sensor;
determining a location of the communication device relative to a vehicle; and controlling, based at least partially on the location of the communication device, user access to one or more features of the communication device.
17. The non-transitory computer readable medium of claim 16, wherein the method further comprises: referring to one or more rules relating to operating the communication device while operating the vehicle, wherein the one or more rules correspond to laws associated with a geographical region, and wherein the laws are stored in a memory; and
wherein user access to the one or more features of the communication device is controlled based at least partially on the one or more rules.
18. The non-transitory computer readable medium of claim 16, wherein the method further comprises:
referring to one or more settings associated with the communication device; and wherein user access to the one or more features of the communication device is controlled based at least partially on the one or more settings.
19. A system for controlling access to one or more features of a communication device associated with a vehicle, comprising:
a feature control module, wherein the feature control module is configured to control the communication device via communication across a communication network;
at least one sensor; and
a microprocessor executable feature control module operable to:
establish a connection with the communication device, wherein the feature control module is configured to receive input from the at least one sensor;
determine a location of the communication device relative to the vehicle; and control, based at least partially on the location of the communication device, user access to one or more features of the communication device.
20. The system of claim 19, further comprising:
a rules management server, wherein the rules management server is configured to control access to one or more rules relating to operating the communication device while operating the vehicle; and
wherein the method further comprises:
referring to one or more rules relating to operating the communication device while operating the vehicle; and
wherein user access to the one or more features of the communication device is controlled based at least partially on the one or more rules.