The present invention relates to a controller for providing an electric skateboard with an autonomous control interface wirelessly linked to a user interface. The controller preferably provides path planning to an autonomous skateboard.
Existing electric skateboards work well only in situations relating to sport and stunt riding. What is needed is a smart skateboard with trucks controlled by an autonomous control system requiring minimal user instruction, allowing the electric skateboard to plan its path autonomously and pick its own route using environmental tracking and object detection sensors. New methodologies for path planning are required, involving novel skateboards that travel from a starting point to an ending point by means of autonomous control system sensors and by GPS waypoints created manually from user interface input.
Thus, the present invention provides a manual control mode and an autonomous control mode selectable by an operator not on board, or by a rider onboard, to control an autonomous skateboard, the autopilot methodology being programmed to govern one or more navigation processes of an electric motorized skateboard. Preferably, the autonomous skateboard provides WIFI or Bluetooth linking a user interface system with an autonomous skateboard interface, wherein each interface communicates with and receives instructions from the operator/rider, such instructions including tasks for path planning, and gathers environmental sensor data from the autonomous skateboard, the sensor data including short range LIDAR sensor, camera, GPS, and other data for calculating skateboard speed, compass heading, absolute position, relative position, and other environment sensor data. Further, the autonomous skateboard controller includes a processing unit having software for computing logic, a central processing unit, memory, storage, and communication signals and instructions.
Preferably, the autonomous skateboard interface, the user interface, an environment sensor array, and processors are combined as a singular integrated unit, whereby the rider may temporarily engage an autopilot system to autonomously control the skateboard so the rider can take a break, or a potential operator wanting to ride an autonomous skateboard may summon it to drive directly to her or him. While riding, the operator may engage their smartphone to access user interface settings including electronic identification information and instructions, input and output data, and mechanical identifiers based on machine-readable identification information and electronic identifiers; or the autonomous skateboard simply controls itself automatically and detects an operator with respect to the autonomous skateboard.
More specifically, Bluetooth links the operator's smartphone to the autonomous skateboard's controlling components; when riding, the rider may utilize a Smartphone APP to select a control mode and program their user preference settings, and/or, while the skateboard is working, monitor its operation including motor speed, battery level, compass heading, absolute position and relative position based on GPS local mapping, an odometer or trip meter, and environment sensor data. The operator may also upload software, review a summary of information useful to the operator, and store performance data to a Cloud management network.
The present invention includes an Autonomous Skateboard Controller System 400 that provides autonomous control to many different types of vehicles and, here, to an electric powered autonomous skateboard 100. Autonomous control means that after initialization, the autonomous skateboard 100 moves and/or accomplishes one or more tasks without further guidance from a human operator 101, whether the operator is riding onboard, has stepped off, or is located within the vicinity of the autonomous skateboard 100. The period of full autonomous control may range from less than a minute to an hour or more. In various aspects the Autonomous Skateboard Controller System 400 is associated with an Autonomous Drive Mode 600 setting and a Manual Drive Mode 700 setting; accordingly, the Autonomous Drive Mode 600 or the Manual Drive Mode 700 is engaged or disengaged by the operator's smartphone.
In various riding events, during operation of an autonomous skateboard 100, the operator 101 may opt to utilize their smartphone 801 to access one or more user interface preference settings via a User Interface System 800 wirelessly linked to a mobile app or "smartphone app". During a riding event, the operator 101 controls her or his autonomous skateboard either manually or autonomously, as she or he prefers: for short distances the operator may prefer manual control, and for longer distances the operator may prefer not to control the skateboard manually and can therefore disengage the manual drive mode and engage the autonomous drive mode; accordingly, in any riding event the operator 101 decides the drive mode option.
Respectively, the operator 101 of the autonomous skateboard 100A or the autonomous skateboard 100B accesses control settings via her or his Bluetooth connected smartphone 801; the smartphone 801 is configured with user preference settings based on Smartphone APP software, the software programming being associated with wirelessly controlling one or more electric motors 109 of the autonomous skateboard 100. The Smartphone APP or related mobile app is provided on the internet of things; an example of the Smartphone App 900 is detailed in
Accordingly hereon the autonomous skateboard controller system 400 may be referred to as (ASCS), the autonomous skateboard 100 may be referred to as (AS) or (AS 100), the autonomous drive mode 600 may be referred to as (ADM), the manual drive mode 700 may be referred to as (MDM), and the user interface system 800 may be referred to as (UIS).
Suitable autonomous vehicles such as the autonomous skateboard 100A and the autonomous skateboard 100B include WIFI and/or Bluetooth connectivity adapted for linking the User Interface System 800 (UIS) to the ASCS 400, wherein a built-in Bluetooth communication module 802 is associated with a communication link between the autonomous skateboard 100 and the operator's Smartphone APP 900, and provides a wireless link to one or more environmental sensors and processors associated with ASCS drive control methodologies; the AS is detailed herein.
In greater detail
In various elements the drive wheel 108 provides an axle configured to couple the electric motor 109 by bearings and bolting means, accordingly a front truck 110a attaches to a base section 106 situated at the front end 103, and a rear truck 110b attaches to a base section 106 situated at the rear end 104.
In various elements the elongated skateboard platform 102 provides front and rear footing placement for the operator 101, and a base section 106 for attaching the front and rear trucks 110a, 110b.
In various elements, during manual drive mode 700, the operator 101 controls the battery power that provides the electric motors with speed control, based on the power level regulated, via motor controllers 212, to the front truck 110a and rear truck 110b electric motor arrangements 109a-109b; manual steering control of the autonomous skateboard 100 is also provided by the operator's 101 riding skill, body posture, and footing placement.
In greater detail
In various elements the suspension adapter 111 is configured for attaching the truck 110 to a base section 106 of the platform 102, the suspension adapter 111 comprising: a truck plate 111a, a hanger 111b, a bushing 111c, a kingpin 111d that connects the hanger, bushing, and truck plate together, and an axle 111e housed in the hanger 111b.
In one aspect the suspension adapters 111a and 111b are connected on the upper portion of the autonomous skateboard 100A trucks 110a, 110b; the suspension adapters 111a and 111b are utilized to improve ride comfort and traction, free stuck wheels, and reduce rider fatigue.
In one element the deformation sensors 112 may be mounted directly to the suspension adapter 111 and/or mounted on the truck 110 to measure an induced stress caused by the operator's weight.
In one or more elements the deformation sensor 112 is linked to a gyroscope sensor 210 and an accelerometer 211, which control velocity and other motorized operations of the autonomous skateboard 100; respectively, the deformation sensor 112 associated with the truck 110 is configured to sense the strain level 112c induced by rotation speed and twisting angle differences at a connection point generated at the intersection of a top portion of the truck 110 and the platform's base section 106.
Accordingly the truck 110 utilizes the deformation sensor 112 attached to the suspension adapter 111: the deformation sensor senses operator weight and center of gravity strain induced by forces exerted upon the front and rear trucks 110a, 110b, and senses weight imbalance as the operator's 101 center of gravity moves linearly in response to the balance level of the autonomous skateboard 100 via a gyroscope sensor 210 attached to a section of the platform 102. The gyroscope sensor 210 is configured to sense inclination of the platform 102; when working, the electric motor(s) 109 are configured to drive the wheels 108 only when the autonomous skateboard 100 is properly oriented, via one or more load sensors 209, in a reasonable riding position, such as substantially level to the ground.
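By way of a non-limiting illustration only, the following minimal sketch models the gating logic described above, in which motor drive is enabled only when a rider is detected by the load sensors 209 and the platform 102 is substantially level per the gyroscope 210. The function names, threshold values, and sensor-access conventions are hypothetical placeholders and are not part of the disclosed hardware interface.

```python
# Illustrative sketch: gate motor drive on rider presence and platform level,
# and derive a center-of-gravity offset from the deformation sensors 112.
# All thresholds and sensor-access functions are hypothetical placeholders.

MAX_INCLINE_DEG = 15.0   # assumed "substantially level" tolerance
MIN_LOAD_KG = 20.0       # assumed minimum load indicating a rider on board

def cg_offset(front: float, rear: float) -> float:
    """Return the rider's center-of-gravity offset in [-1, 1].

    Positive values lean toward the front truck 110a, negative toward
    the rear truck 110b. front/rear are strain-gauge readings from the
    deformation sensors 112 on each suspension adapter 111.
    """
    total = front + rear
    if total == 0:
        return 0.0
    return (front - rear) / total

def motors_may_drive(load_kg: float, incline_deg: float) -> bool:
    """Electric motors 109 drive the wheels 108 only when the board is
    occupied (load sensors 209) and substantially level (gyroscope 210)."""
    return load_kg >= MIN_LOAD_KG and abs(incline_deg) <= MAX_INCLINE_DEG

if __name__ == "__main__":
    cg = cg_offset(front=42.0, rear=35.0)   # rider leaning slightly forward
    print(f"CG offset: {cg:+.2f}")
    print("drive enabled:", motors_may_drive(load_kg=70.0, incline_deg=3.0))
```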
In greater detail
In various connectivity elements, the platform comprises the deck section 105 and the base section 106, wherein the compartment 200 is contained within the platform base section 106 and provides a cavity for containing an electrical wiring array 201 linking to internal devices; the electrical wiring array 201 is connectively linked to a USB port 216, and the USB port can be connected to an external power source such as an AC 110 outlet via an external USB power cord 217. The electrical wiring array 201 is configured for linking battery power directly to the following Bluetooth connected devices: LED head lamps 202a, LED turn signals 202b, braking lamps 202c, a special effects LED cord 203 synchronized to speakers 204 or used for brighter illumination, cameras 205, and the ASCS environment sensor array including but not limited to a LIDAR sensor 206 (e.g., 2D, 3D, color LIDAR), RADAR 207, and sonar 208.
The Bluetooth connected devices, when paired, are linked to the autonomous skateboard controller system 400 by means of a built-in Bluetooth communication module 802; the Bluetooth communication module 802 transmits data signals associated with the motion control sensor array 209-211 and the external environment sensor array. The platform 102 and the compartment 200 further contain the wiring array 201 linking to the gyroscopic sensor 210, accelerometer 211, and a motor controller 212, wherein the platform components communicate via the built-in Bluetooth communication module 802, configured as a wireless link between the autonomous skateboard controller system 400 and the user interface system 800, the user interface system being associated with the operator's Smartphone APP 900, detailed in
In greater detail
In various elements the power control module 213 further comprises a receiver 213b and a processor 213c for monitoring the battery charger's charge level 215a associated with one or more removable battery packs 214 during a charging process, wherein the battery charger 215, via wiring array 201, connects the battery packs 214 in series. Respectively, the battery packs 214, when fully charged, can be switched out and used later to extend the operator's riding time; the spent battery packs are placed back in the compartment or recharged later.
Accordingly, the Bluetooth connected devices listed herein attach to platform sections and compartment portions, wherein the compartment 200 contains the electrical wiring array 201 linking to a battery 214, a power control module 213, and a battery charger 215; the battery charger subsequently connects to the USB power cord 217 and an AC outlet or other power source.
Accordingly, the deformation sensor 112 and the internal sensors 209, 210, and 211 are contained between the deck and base sections 105-106; respectively the gyroscopic sensor 210 (with fuzzy logic 210a) and the accelerometer 211 provide data based on load sensor data 209a, gyroscope sensor data 210a, and accelerometer sensor data 211a, and the motor controller 212 is associated with a server 212a, a processor 212b, and motor controller sensor data 212c. Respectively the gyroscope sensor 210 provides an intelligent weight and motion controlling means, and the accelerometer 211 is configured to measure balance, which is achieved as soon as the rider steps on the upper deck section 105; subsequently the preferred power level, associated with the drive modes, is activated by the motor controller 212.
Respectively, the AS 100 may be self-powered by regenerated power from the electric motors 109: a minimal amount of regeneration power is captured to maintain a battery charge level 215a sufficient to run the motor controller 212 and allow the low-drag torque control 212d, which is useful when the battery 214 has been nearly depleted; the regenerative battery charging process is initiated by the braking activity 107 of slowing down or stopping. Accordingly, the velocity of the drive wheels 108 provides a regenerative braking means 107 associated with maintaining a charge level 215a to the battery.
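As a non-limiting illustration of the regenerative charge maintenance just described, the following sketch folds braking energy back into the charge level 215a and gates the low-drag torque control 212d on a low-battery condition. The regeneration model, gain, and thresholds are assumptions made solely for illustration.

```python
# Illustrative sketch of regenerative-braking charge maintenance. The
# proportional-to-velocity regeneration model and all constants are
# hypothetical placeholders, not disclosed values.

LOW_BATTERY = 0.05       # assumed charge fraction enabling low-drag mode 212d

def regen_power(wheel_velocity: float, braking: bool) -> float:
    """Approximate regenerated watts captured during braking activity 107."""
    K_REGEN = 2.5  # hypothetical watts per (m/s)
    return K_REGEN * wheel_velocity if braking else 0.0

def update_charge(charge_level: float, wheel_velocity: float,
                  braking: bool, dt: float, capacity_wh: float = 180.0) -> float:
    """Fold regenerated energy back into the battery charge level 215a."""
    energy_wh = regen_power(wheel_velocity, braking) * dt / 3600.0
    return min(1.0, charge_level + energy_wh / capacity_wh)

charge = 0.04
for _ in range(600):                       # one minute of braking, 0.1 s steps
    charge = update_charge(charge, wheel_velocity=4.0, braking=True, dt=0.1)
low_drag = charge <= LOW_BATTERY           # low-drag torque control 212d gate
print(f"charge: {charge:.3f}, low-drag mode: {low_drag}")
```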
In greater detail
In various environments 430 the gyroscope sensor 210 and the accelerometer sensor 211 may measure a motion signal 301 of an operator's motion 312, for example by pushing or shaking the foot pad/board on the platform 102, and/or a 3-dimensional moving response of the AS in the x, y, z direction 315 and velocity 316 associated with the operator's motion 312 and/or the AS's motion 311.
In one example, the motions 311/312 may include a predefined motion input 301, including, for example, the operator 101 hopping on and/or off the AS 100. The operator utilizes one or more riding skills associated with motion control, which include: motion to engage a drive mode 701-704, motion to engage propulsion 705-715, and motion to engage trajectory 709, 715, 716-719, see
In one example, a deformation sensor 112 output may be computed as a weight signal 302 and a gravity angle signal 309, generating one or more move control signals 307 including, for example, forward, backward, accelerate, and/or brake signals 109a from a signal processing unit 304. The signal processing unit 304 may combine and process the deformation output signal 306 with the motion signals 301 to produce the one or more move control signals 307 relayed to the autonomous drive mode 600.
In some aspects control signals 307 may control the AS 100 to move in a direction, including for example, a forward direction or a backward direction, or an initial orientation direction (IOD) 321. The direction of the AS's motion 311 may be determined based on the deformation output signal 306 associated with the weight signal 308 and the gravity angle signal 309.
In some aspects control signals may control the speed of the AS 100, for example to accelerate or to apply the braking means 107. In one example, the speed of the AS's motion 311 may be determined based on the operator's motion 312, such as shaking the AS 100. In another example, the speed of the AS's motion 311 may be determined based on the deformation output signal 306 associated with the weight signal 308 and the gravity angle signal 309. Respectively, the deformation sensor comprises a strain gauge 112a configured to sense strain induced by imbalanced forces exerted upon the drive wheel 108; the deformation sensor 112 senses the strain level 112b induced by the operator's weight, and senses the strain level induced by rotation speed and twisting angle differences at a connection point generated at the connection of a suspension adapter 111a of the drive wheel 108a attached at the platform's front end 103, and at the connection of a suspension adapter 111b of the drive wheel 108b attached at the platform's rear end 104.
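A minimal sketch of the signal fusion described above follows: the signal processing unit 304 combines the deformation output signal 306 (weight signal 308, gravity angle signal 309) with the motion signals 301 to produce a move control signal 307. The fusion rule, field names, and thresholds are hypothetical placeholders, not the disclosed algorithm.

```python
# Illustrative sketch of the signal processing unit 304 fusing deformation
# and motion inputs into move control signals 307. The fusion rule and all
# constants are hypothetical placeholders.

from dataclasses import dataclass

@dataclass
class MoveControlSignal:          # move control signal 307
    direction: str                # "forward", "backward", or "hold"
    speed_command: float          # normalized 0..1 accelerate/brake demand

def process(weight_signal: float, gravity_angle_deg: float,
            predefined_motion: str | None = None) -> MoveControlSignal:
    """Fuse deformation output 306 and motion input 301 into one command."""
    if predefined_motion == "hop_off":          # operator 101 hopping off
        return MoveControlSignal("hold", 0.0)   # brake to a stop
    if weight_signal <= 0:                      # no rider detected
        return MoveControlSignal("hold", 0.0)
    direction = "forward" if gravity_angle_deg > 0 else "backward"
    # Demand scales with lean magnitude, clipped to the unit interval.
    speed = min(1.0, abs(gravity_angle_deg) / 10.0)
    return MoveControlSignal(direction, speed)

print(process(weight_signal=70.0, gravity_angle_deg=4.5))
```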
Accordingly, the internal motion control sensors 209, 210, and 211 are contained between the deck and base sections 105/106: respectively the gyroscopic sensor 210 (with fuzzy logic 210a) and the accelerometer 211, the load sensor 209 providing data based on gyroscope sensor data 210a and accelerometer sensor data 211a, and a motor controller 212 configured having a server 212a, a processor 212b, sensor data 212c, and low-drag torque control 212d. Respectively the gyroscope sensor 210 provides an intelligent weight and motion controlling means via the motor controller 212, and the accelerometer 211 is configured to measure balance, which is achieved as soon as the operator 101 steps on the upper deck section 105 and is detected on the deck section 105 by the load sensors 209a, 209b; accordingly the preferred power level is activated to furnish battery power to the drive wheel's motor 109 via the power control module 213 associated with the motor controller 212.
For example, a motor controller server 212a is configured to sense the drive wheel 108 speeds and adjust the electric motor 109 torque to keep the drive wheel 108 rotational velocities relatively similar, especially in situations when the right drive wheel 108a has more traction compared to the left drive wheel 108b; this may be sensed by a motor controller processor 212b, which is configured to read the strain gauge sensors on the electric motors 109 contained therein and determine which drive wheel 108 has more operator 101 weight and therefore more traction.
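The traction-equalizing behaviour above can be sketched as a simple proportional correction that shifts torque from the spinning wheel to the gripping wheel. The gain and torque units are assumptions for illustration only; the disclosed motor controller server 212a is not limited to this rule.

```python
# Illustrative sketch of the motor controller server 212a equalizing drive
# wheel 108a/108b rotational velocities when traction differs. The gain
# value is a hypothetical placeholder.

def balance_torque(base_torque: float, speed_a: float, speed_b: float,
                   k_p: float = 0.1) -> tuple[float, float]:
    """Shift torque toward the slower (higher-traction) wheel so both
    wheels keep relatively similar rotational velocities."""
    error = speed_a - speed_b            # positive: wheel 108a spinning faster
    correction = k_p * error
    torque_a = base_torque - correction  # back off the slipping wheel
    torque_b = base_torque + correction  # lean on the gripping wheel
    return torque_a, torque_b

# Right wheel 108a has lost traction and spun up relative to 108b:
print(balance_torque(base_torque=5.0, speed_a=12.0, speed_b=10.0))
```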
In another example, when an operator 101 of the autonomous skateboard 100A leans his or her body toward the drive wheel 108a, for example, the front truck's deformation sensor 112a may receive a higher pressure compared with the rear truck's deformation sensor 112b. After signal correction and compensation from the gyroscope sensor 210 and the accelerometer sensor 211 in the motion input 301 according to the environment 430, the movement and the operator's motion may be acquired and outputted to the PID control 317 and driving control block 318.
In one example the autonomous skateboard 100A (e.g., whilst in manual drive mode 700) may be steered by the operator 101 shifting his or her weight to the right or left to complete a right turn or a left turn through the mechanical turn movement of the front truck's 110a first and second drive wheels 108a, 108b, wherein the drive wheels' motors 109a-109b provide drive wheel motion 513.
Primarily, to control the velocity setpoint of the autonomous skateboard 100A, speed is determined by using the strain gauge(s) of the deformation sensor 112 to establish the center of gravity (CG) of the operator 101; wherein, when the CG is sensed toward the front truck's two motors 109a-109b the desired speed will be incremented (faster) in the speed loop 320, and when the CG is sensed toward the rear truck's two motors 109a-109b the desired speed will be decremented (slower) in the speed loop 320; the rate of increment/decrement may be determined by the amplitude of the CG from center. This method has the advantage of allowing the operator 101 to comfortably stand centered on the board while powering forward at the desired speed. The operator 101 leans forward to accelerate (increase velocity) and leans back to slow down (decrease velocity) until zero speed is reached, e.g., braking means 107.
Primarily, to control the velocity setpoint of the autonomous skateboard 100B, speed is determined by using the strain gauge(s) of the deformation sensor 112 to sense the center of gravity (CG) of the operator 101; wherein, when the CG is sensed toward the front truck's one motor 109 the desired speed will be incremented (faster) in the speed controller loop, and when the CG is sensed toward the rear truck's one motor 109 the desired speed will be decremented (slower) in the speed loop 320; the rate of increment/decrement may be determined by the amplitude of the CG from center. This method has the advantage of allowing the operator 101 to comfortably stand centered on the board while powering forward at the desired speed. The operator 101 leans forward to accelerate (increase velocity) and leans back to slow down (decrease velocity) until zero speed is reached, e.g., braking means 107.
Another control method is to use the above described method to sense CG but to increment or decrement a torque set point in a torque controller loop instead of a speed loop 320. The operator 101 would lean forward to increment the commanded torque set point and lean back to decrement the commanded torque set point; the rate of increment/decrement may be determined by the amplitude of the CG from center.
A selectable option would allow an advanced operator 101, when leaning back, to also continue in reverse after zero speed is reached; the operator would select to travel in a reverse direction to back up whilst steering left or right.
Another control method is to use the sensed CG to directly control the commanded motor drive torque setpoint. The operator 101 would need to continually lean forward to maintain forward torque and maintain a lean back to apply negative torque.
Another control method is to use the sensed CG to directly control the commanded motor drive velocity setpoint. The operator 101 would need to continually lean forward to maintain forward velocity and lean back to reduce velocity.
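The CG-based setpoint methods above admit a compact sketch: the setpoint is incremented when the CG is sensed toward the front truck and decremented toward the rear, at a rate proportional to the CG amplitude, with an optional reverse range for the advanced-operator option. Gains, limits, and the update rate are hypothetical placeholders, not disclosed values.

```python
# Illustrative sketch of the CG-based velocity setpoint method (speed loop
# 320). All gains and limits are hypothetical placeholders.

def update_setpoint(setpoint: float, cg_offset: float, dt: float,
                    rate: float = 1.5, max_speed: float = 8.0,
                    allow_reverse: bool = False) -> float:
    """cg_offset in [-1, 1]: positive = lean toward front truck 110a.

    Lean forward to increment the setpoint, lean back to decrement it
    toward zero (braking means 107); with allow_reverse, an advanced
    operator 101 may continue past zero into reverse."""
    setpoint += rate * cg_offset * dt     # rate scales with CG amplitude
    lower = -max_speed if allow_reverse else 0.0
    return max(lower, min(max_speed, setpoint))

v = 0.0
for _ in range(100):                      # rider leans forward for 10 s
    v = update_setpoint(v, cg_offset=0.4, dt=0.1)
print(f"velocity setpoint: {v:.2f} m/s")
```

The same update rule serves the torque-loop variant: read the returned value as a commanded torque setpoint in a torque controller loop rather than a velocity in the speed loop 320.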
In greater detail
For example, the perception system 407 may receive sensor data 403 from one or more external environmental sensor arrays situated on sections of the platform and compartment 200, including the LIDAR sensor 206 (e.g., 2D, 3D, color LIDAR), RADAR 207, and sonar 208 based on MEMS technology 322; other data is gathered by one or more video cameras 205 (e.g., image capture devices). The localizer system 416 may receive sensor array data 403 including but not limited to global positioning system (GPS) data 408, inertial measurement unit (IMU) data 409, map data 410, route data 411, Route Network Definition File (RNDF) data 412, odometry data 413, wheel encoder data 414, and map tile data 415. Accordingly, the localizer system 416, having a planner system 417 with memory 418, may receive object data 419 from sources other than the sensor system, such as a data store utilizing memory 418, or Cloud Data Management 443 and Performance Management Network 444.
Accordingly the perception system 407 may process sensor data 406 to generate object data 419 that may be received by the planner system 417. Object data 419 may include but is not limited to data representing object classification 420, object type 421, object track 422, object location 423, predicted object path 424, predicted object trajectory 425, and object velocity 426, in an environment 430.
Accordingly the localizer system 416 may process sensor data 403, and optionally, other data, to generate position and orientation data 427, local pose data 428 that may be received by the planner system 417. The local pose data 428 may include, but is not limited to, data representing a location 429 of the AS 100 in the environment 430 via (GPS) 408, (IMU) data 409, map data 410, route data 411, (RNDF) data 412 and odometry data 413, wheel encoder data 414, and map tile data 415, and a global coordinate system 431 for example.
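For illustration, the records exchanged among the perception system 407, localizer system 416, and planner system 417 might be modeled as follows. The field names and types are hypothetical placeholders chosen to mirror the reference numerals above, not a disclosed data format.

```python
# Illustrative sketch of object data 419 and local pose data 428 records.
# Field names and types are hypothetical placeholders.

from dataclasses import dataclass, field

@dataclass
class ObjectData:                  # object data 419
    classification: str            # object classification 420
    object_type: str               # object type 421
    track_id: int                  # object track 422
    location: tuple[float, float]  # object location 423 (x, y)
    predicted_path: list[tuple[float, float]] = field(default_factory=list)  # 424
    velocity: float = 0.0          # object velocity 426

@dataclass
class LocalPose:                   # local pose data 428
    location: tuple[float, float]  # location 429 in environment 430
    heading_deg: float             # orientation from IMU data 409
    frame: str = "global"          # global coordinate system 431

pedestrian = ObjectData("dynamic pedestrian", "pedestrian", 7, (12.0, 3.5),
                        [(12.0, 3.5), (11.0, 3.0)], velocity=1.4)
pose = LocalPose(location=(0.0, 0.0), heading_deg=90.0)
print(pedestrian.track_id, pose.heading_deg)
```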
The implementations described herein may be applied to any device or system that can be configured to display an image, whether in motion (such as video) or at rest (such as still images), and whether text, graphics, or pictures. More specifically, the implementations described include, but are not limited to, mobile phones, multimedia Internet enabled cellular telephones, mobile television receivers, wireless devices, smartphones, Bluetooth connected devices, personal digital assistants (PDAs), wireless e-mail receivers, and hand-held or portable computers. The teachings herein also may be used in non-display applications including, but not limited to, electronic switching devices, radio frequency filters, sensors, accelerometers, gyroscopes, motion sensing devices, magnetometers, inertial components for consumer electronics, parts of consumer electronics products, varactors, liquid crystal devices, electrophoretic devices, drive schemes, manufacturing processes, and electronic test equipment. Accordingly, the present teachings are not limited to the implementations shown in the figures, but instead have wide applicability as will be readily apparent to those skilled in the art.
A sensor system of the AS 100 comprising processors for determining, based at least in part on the sensor data, a location of the AS 100 within the environment 430, wherein the location 429 of the AS 100 identifies a position and orientation via load sensors 209 of the AS 100 within the environment 430 according to global coordinate system 431.
The ASCS is associated with calculating, based at least in part on the location 429 of the autonomous skateboard 100 and at least a portion of the sensor data 403, a trajectory 425 of the AS 100, wherein the trajectory 425 indicates a planned path, associated with GPS 408, for navigating the AS 100 between at least a first location 429a and a second location 429b within the environment 430.
The ASCS is associated with identifying, based at least in part on the sensor data 406, an object 421 within the environment 430; and determining a location of the object 421 in the environment 430, wherein the location 429 of the object 421 identifies a position and orientation 427 of the object within the environment according to the global coordinate system 431; and determining, based at least in part on the location 429 of the object 421 and the location of the AS 100, to provide a visual alert 432 from a light emitter 433.
The ASCS is associated with selecting a light pattern 434 from a plurality of light emitter 433 patterns, wherein a first one of light patterns 434 is associated with a first level of urgency of the visual alert, and a second one of the light patterns is associated with a second level of urgency of the visual alert; selecting, from a plurality of light emitters 433 of the AS 100, a light emitter 433 to provide the visual alert 432; and causing the light emitter 433 to provide the visual alert 432, the light emitter emitting light indicative of the light pattern 434 into the environment 430.
The ASCS is associated with calculating, based at least in part on the location of the object 421 and the trajectory 425 of the AS 100, an orientation 427 of the AS 100 relative to the location 429 of the object 421; selecting the light emitter is based at least in part on the orientation of the AS 100 relative to the location 429 of the object.
The ASCS is associated with estimating, based at least in part on the location 429 of the object 421 and the location 429 of the AS 100, a threshold event 435 associated with causing the light emitter 433 to provide the visual alert 432; and detecting an occurrence of the threshold event 435; and wherein causing the light emitter 433 of the AS 100 to provide the visual alert 432 is based at least in part on the occurrence of the threshold event 435.
The ASCS is associated with calculating, based at least in part on the location 429 of the object 421 and the location 429 of the AS 100, a distance between the AS 100 and the object 421; and wherein selecting the light pattern 434 is based at least in part on that distance, the threshold event 435 according to a threshold distance 436 or a threshold time, and a second threshold distance 437.
The ASCS is configured with a setting for selecting the light pattern 434 based at least in part on one or more of a first threshold event 435 according to a threshold distance or a threshold time, wherein the first threshold distance 436 is associated with the light pattern 434a and a second threshold distance 437 is associated with a different light pattern 434b, wherein the first threshold distance and the second threshold distance are less than the distance between the object 421 and the AS 100, and wherein the threshold time is shorter in duration than the time associated with the location 429 of the AS 100 and the location of the object becoming coincident with each other.
The ASCS is associated with calculating, based at least in part on the location 429 of the object 421 and the trajectory 425 of the AS 100, a time associated with the location 429 of the AS 100 and the location of the object being coincident with each other; and wherein causing the light emitter 433 of the AS 100 to provide the visual alert 432 is based at least in part on the time.
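A minimal sketch of the threshold-based alert escalation above follows: the light pattern 434 escalates as the distance between the AS 100 and the object 421 crosses the first threshold distance 436 and second threshold distance 437. The distances and pattern names are hypothetical placeholders.

```python
# Illustrative sketch of threshold-based visual alert 432 selection.
# Threshold values and pattern names are hypothetical placeholders.

import math

THRESHOLD_1 = 10.0   # first threshold distance 436 (m) -> light pattern 434a
THRESHOLD_2 = 4.0    # second threshold distance 437 (m) -> light pattern 434b

def select_light_pattern(as_pos: tuple[float, float],
                         obj_pos: tuple[float, float]) -> str | None:
    """Return the pattern for the light emitter 433, or None if no alert."""
    distance = math.dist(as_pos, obj_pos)
    if distance <= THRESHOLD_2:
        return "pattern_434b_high_urgency"   # second level of urgency
    if distance <= THRESHOLD_1:
        return "pattern_434a_low_urgency"    # first level of urgency
    return None                              # no threshold event 435 yet

print(select_light_pattern((0.0, 0.0), (6.0, 2.0)))   # within first threshold
```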
The ASCS is associated with determining an object classification for the object 421, the object classification determined from a plurality of object classifications, wherein the object classifications include a static pedestrian object classification, a dynamic pedestrian object classification, a static car object classification, and a dynamic car object classification; wherein selecting the light pattern 434 is based at least in part on the object 421 classification.
The ASCS is associated with accessing map data associated with the environment 430, the map data accessed from a data store of the AS 100; and determining position data and orientation data associated with the AS 100; and wherein determining the location 429 of the AS 100 within the environment 430 is based at least in part on the map data 410, the position data and the orientation data.
The ASCS is associated with selecting a different light pattern 434 from the plurality of light patterns based at least in part on a first location of the object before the visual alert is provided and a second location of the object after the visual alert is provided, and causing the light emitter 433 to provide a second visual alert, wherein the light emitter emits light indicative of the different light pattern into the environment 430.
Wherein the light emitter 433 includes a sub-section 439 and the light pattern includes a sub-pattern 438 associated with the sub-section 439, the sub-section being configured to emit light indicative of the sub-pattern 438, wherein at least one of the sub-patterns 438 is indicative of one or more of a signaling function of the AS 100 or a braking function 107 of the AS 100, and wherein at least one other sub-pattern 438 is indicative of the visual alert 432. The ASCS is further associated with receiving data representing a sensor signal 108a indicative of a rate of rotation of a drive wheel 108 of the AS 100, and modulating the light pattern 434 based at least in part on the rate of the drive wheel's electric motor 109 rotation.
The planner system 417 may process the object data and the local pose data 428, provided via GPS 408, to compute a motion path (e.g., a trajectory 425 of the AS) for the AS 100 to travel through the environment 430, the computed path being determined in part by object data 421 in the environment 430 that may create an obstacle to one or more other vehicles and/or may pose a collision threat to the AS 100.
In various aspects the autonomous skateboard controller system 400 may employ a micro controller or central processors, memory, and a sensor array to provide autonomous control to many different types of the autonomous skateboard 100. Autonomous control means that after initialization, the AS 100 moves and/or accomplishes one or more tasks without further guidance from the operator 101, even if the operator 101 is riding the AS 100, is located within a few steps of the AS 100, or is within the vicinity of the AS 100.
The environmental sensor array links to a processing unit which communicates with the ASCS 400. The communication between the ASCS and the AS 100 may be carried on any suitable data bus, with CAN (e.g. ISO 11898-1) and/or PWM buses preferred. Wirelessly, via WIFI 440 and/or Bluetooth 441, the ASCS 400 synchronously links the user interface system 800 (UIS) to the Internet 442, Cloud Data 443, and Performance Management Network 444.
The ASCS 400 provides autonomous control to an AS 100, comprising: the UIS 800 that communicates with the AS 100 and provides instructions to the vehicle regarding acceleration, braking, steering, or a combination thereof; the UIS 800 communicates with and receives instructions from an operator 101, the instructions including task instructions, path planning information, or both.
The ASCS is associated with an environmental sensor array 407 that receives sensor data from the AS 100 and communicates the sensor data 421 to the UIS, such data including AS 100 speed, compass heading, absolute position, relative position, or a combination thereof. The autonomous control continues for a period of at least one hour after instructions are provided to the operator interface; an established sensor array monitors electric motor operating conditions, battery charge level 215a, and electrical systems, or a combination thereof, of the AS 100.
The ASCS is associated with at least one sensor that monitors motion of the AS 100, including rate of acceleration, pitch rate, roll rate, yaw rate, or a combination thereof; the at least one sensor that monitors motion includes the accelerometer, the gyroscope, and the motor controller 212.
The ASCS is associated with programming for path planning for the AS 100, and such path planning includes marking a path of waypoints on a digitized geospatial representation, the waypoints marking a path that is the perimeter of a scan area that the AS 100 then scans; wherein the AS 100 scans the scan area by traveling to waypoints within the scan area, and respectively the ASCS 400 employs a digitized geospatial representation that provides absolute position of the AS 100 in creating the scan area, or provides relative position through the use of an ad hoc grid.
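To illustrate the scan-area behaviour just described, the sketch below takes a perimeter of waypoints and fills the enclosed area with interior waypoints for the AS 100 to visit. The boustrophedon ("lawnmower") fill over the perimeter's bounding box is an assumed strategy for illustration, not the disclosed scanning algorithm.

```python
# Illustrative sketch of filling a scan area, marked by perimeter waypoints
# on a digitized geospatial representation, with interior waypoints. The
# lawnmower sweep is an assumed strategy.

def scan_area_waypoints(perimeter: list[tuple[float, float]],
                        spacing: float) -> list[tuple[float, float]]:
    """Generate a boustrophedon sweep over the bounding box of the
    perimeter waypoints, at the given spacing in map units."""
    xs = [p[0] for p in perimeter]
    ys = [p[1] for p in perimeter]
    waypoints, y, leftward = [], min(ys), False
    while y <= max(ys):
        row_x = (max(xs), min(xs)) if leftward else (min(xs), max(xs))
        waypoints.append((row_x[0], y))
        waypoints.append((row_x[1], y))
        y += spacing
        leftward = not leftward          # reverse direction each pass
    return waypoints

perimeter = [(0.0, 0.0), (20.0, 0.0), (20.0, 10.0), (0.0, 10.0)]
print(scan_area_waypoints(perimeter, spacing=5.0))
```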
In various aspects the ASCS 400 includes a mechanism for receiving communication from a smartphone 801 or the internet 442 such that a user can communicate with the AS 100 through the mechanism.
In greater detail
As shown
In greater detail
In greater detail
In greater detail
In various aspects the autonomous skateboard controller system 400 may be requested by the operator to assist during riding activity 706, whereby the operator may instruct the ASCS 400 to link to a micro-processor or processor of the user interface system 800 to temporarily deactivate the manual drive mode 700 and switch over to engage the Autonomous Drive Mode 600, thus allowing selective or minimal supervision from the operator 101, whereby the operator discontinues controlling the autonomous skateboard 100 (e.g., semiautonomous), or, vice versa, switches over to manual drive and regains control; respectively, these driving modes may be alternated over a period of time during riding events.
In various aspects the autonomous skateboard controller system 400 may be required to assist the operator automatically 710 if the operator 101 steps off 714 or falls 715, whereby the operator verbally instructs 716 the ASCS 400 to move toward the operator 101; this action is achieved via the software algorithms programmed in the ASCS.
In various aspects the autonomous skateboard controller system 400 may be employed to provide full autonomous control accomplished without any further guidance from the operator 101; this action is achieved once the operator disengages the Manual Drive Mode 700.
In one or more elements the communication established between the ASCS and the autonomous skateboards 100A/100B may be carried on any suitable data bus with CAN (e.g. ISO 11898-1) and/or PWM buses; working wirelessly via WIFI and/or Bluetooth, the autonomous skateboard controller system 400 synchronously links the ASCS to the user interface system 800 to compute a motion path in one or more environments 430.
In greater detail
In greater detail
Accordingly, in one or more applications, the Smartphone APP 900 may receive future over-the-air software and firmware updates and manage social media and the internet of things via a Global network 901. The Smartphone APP 900 allows the vehicle rider 101 to select listings on a menu 902 by finger prompting 903 (i.e., swiping gestures).
Respectively the Smartphone APP 900 controls the following settings in relation to the virtually controlled electric skateboard components 916, configured to be wirelessly controlled via the user interface, the virtual settings listed on the menu 902 as: a power on 903 and off switch 904; power switches 905; driving modes 906 (Beginner Drive Mode A, Normal Drive Mode B, Master Drive Mode C); a motor controller 907; a battery power level 908; a charging gauge 909; GPS 910 (mapping a route 910A, a distance 910B); an LED lamp switch 911/206-207; user music 912; a speaker setting 913; a camera setting 914; an anti-theft alarm module switch 915; and the Bluetooth connected devices mentioned in
The smartphone's mobile app communicates with the ASCS wirelessly: via WIFI 440 and/or Bluetooth 441, the ASCS 400 synchronously links the user interface system 800 (UIS) to the Internet 442, Cloud Data 443, and Performance Management Network 444.
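As a non-limiting illustration of relaying a menu 902 selection to the board, the sketch below serializes one user-interface setting for the Bluetooth link 441. The message encoding, setting names, and validation are hypothetical placeholders; no particular Bluetooth stack or disclosed protocol is implied.

```python
# Illustrative sketch of a Smartphone APP 900 setting relayed over the
# Bluetooth link 441 to the ASCS 400. The JSON encoding and the set of
# setting names are hypothetical placeholders.

import json

def build_command(setting: str, value) -> bytes:
    """Serialize one user-interface selection for the Bluetooth module 802."""
    allowed = {"power", "drive_mode", "led_lamps", "music",
               "speaker", "camera", "anti_theft_alarm"}
    if setting not in allowed:
        raise ValueError(f"unknown setting: {setting}")
    return json.dumps({"setting": setting, "value": value}).encode("utf-8")

# Engage the Autonomous Drive Mode 600 from the menu 902:
packet = build_command("drive_mode", "autonomous")
print(packet)
```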
Some implementations provide automatic display mode selection according to a reference hierarchy. For example, such an implementation may provide automatic display mode selection for mobile display devices, where each display mode corresponds to a set of display parameter settings; the display parameter settings may include a color depth setting, a brightness setting, a color gamut setting, a frame rate setting, a contrast setting, and a gamma setting. Some implementations may involve a trade-off between the display parameter settings and power consumption. In some instances, one of the criteria may correspond to the application or "app" running on the display device. Various conditions, such as battery status and ambient light conditions, may also correspond to the display mode. In some implementations, the display parameter setting information or other device configuration information can be updated according to information received by the display device from another device, such as from a server.
In order to optimize the display criteria for a particular user, some implementations involve creating a user profile and controlling a display, such as the display of a mobile display device, in accordance with the user profile. In some examples, the display criteria may include brightness, contrast, bit depth, resolution, color gamut, frame rate, power consumption, or gamma. In some implementations, the user profile may be used to optimize other display device operations, such as audio performance, touch/gesture recognition, speech recognition, or target tracking such as head tracking. In some such instances, the user profile may be used to optimize audio settings for the mobile display device user, such as volumes and the relative amounts of bass and treble, in accordance with a personal hearing profile. In some implementations, display parameter setting information or other device configuration information corresponding to data in a user profile may be received by the display device from another device, such as a server. In some examples, the corresponding data in the user profile may include demographic data.
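The trade-off between image quality and power consumption described above can be sketched as a simple selection rule driven by the user profile, battery level, and ambient light. The profile fields, thresholds, and returned settings are hypothetical placeholders for illustration only.

```python
# Illustrative sketch of selecting display parameter settings from a user
# profile and battery state. Profile fields and values are hypothetical
# placeholders.

def select_display_mode(profile: dict, battery_level: float,
                        ambient_lux: float) -> dict:
    """Pick brightness/frame-rate/color-depth settings honoring the user's
    stated preference for image quality versus battery life."""
    prefers_quality = profile.get("prefers_quality_over_battery", False)
    if battery_level < 0.2 and not prefers_quality:
        # Low battery and no quality preference: conserve power.
        return {"brightness": 0.3, "frame_rate": 30, "color_depth": 16}
    brightness = min(1.0, 0.4 + ambient_lux / 10000.0)  # brighter in sunlight
    return {"brightness": brightness, "frame_rate": 60, "color_depth": 24}

profile = {"prefers_quality_over_battery": True}
print(select_display_mode(profile, battery_level=0.15, ambient_lux=8000.0))
```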
In various implementations, as the operator 101 will understand, greatly optimized display settings, and corresponding levels of power consumption, may be provided to the user with respect to a given scenario. Implementations involving a user profile can provide an additional level of optimization according to the wishes or characteristics of a particular user. In some implementations, default display parameter setting information can be determined according to known demographics of users and used to control the display without the need for associated user input. Implementations in which the process of building a user profile is distributed over a period of time may allow a detailed user profile to be built without placing an excessive burden on the user during initial setup. For example, over multiple uses of a mobile display device, which may include multiple illumination and usage conditions or multiple applications, a substantial amount of information about the user's visual function, including color perception, can be obtained through a series of brief vision tests or A/B image prompts dispersed over a period of time, without imposing a significant burden on the user. In some implementations, the period of time may be several days, a week, or a month. The visual function information may be used to optimize the visual quality of the display for the user. In some instances, the visual function information may be used to increase the intensity of colors the user struggles to perceive. In some implementations, the visual function information may be used to optimize display power consumption, for example by reducing power spent on color depth the user does not care about. Furthermore, acquiring a considerable amount of information about the user's willingness to exchange image quality for power may allow additional display power savings. In some examples, the power savings can be expressed as battery life. Some of the user interfaces disclosed herein allow user preference information, including visual function information and information about the user's willingness to exchange image quality for battery life, to be suitably acquired.
As disclosed in more detail elsewhere herein, some methods may involve obtaining various types of user information for the user profile. Some implementations may involve providing a user prompt for user information and obtaining the user information in response to the user prompt. Some such implementations may involve providing the user prompt via a mobile display device. For example, the user information may include biometric information and user identification information such as a user name and user preference data. In some implementations, the user identification information may be used to associate the user information with a specific user profile. For example, it may be useful to distinguish user information obtained from multiple users of a single mobile display device.
Some user information may be obtained "passively", without the need for the user to respond to prompts or enter information. For example, some user information may be obtained according to how, when, or where a mobile display device is used. Such user information may include user-selected settings, mobile display device location information, the type of application run on the mobile display device, the content provided by the mobile display device, the time at which the mobile display device is used, and the environmental conditions under which it is used. In some instances, the user-selected settings for the mobile display device may include a text size setting, a brightness setting, or an audio volume setting. According to some implementations, the environmental conditions may include temperature or ambient light intensity. As with user information obtained through user responses, information can be passively acquired over a number of time periods of mobile display device use.
Some implementations may allow a user to maintain a plurality of user profiles. For example, the user may generally have habitual access to an outlet for charging the mobile display device; during such times, the user may want to control the mobile display device settings in accordance with a first user profile that prefers display image quality over battery life.
Preference data may be stored in a user profile. Some such implementations may involve creating or updating a user profile maintained locally, such as a user profile stored in the memory of the mobile display device, and may also involve sending user profile information to another device via the network interface of the mobile display device. For example, the other device may be a server capable of creating or updating the user profile.
The user interface system is the portion of the ASCS that communicates with the operator, who is required to provide instructions, as shown in
Another class of sensors includes antennae for sending and receiving information wirelessly, and includes RF, UWB and antennae for communications such as discussed elsewhere in this application. RFID tags may also be used to send and receive information or otherwise identify the vehicle. Moreover, RFID tags may also be used to receive positioning information or receive instructions and/or task performing information.
Preferably, the sensors are solid state devices based on MEMS technology as these are very small, are light weight and have the necessary accuracy while not being cost prohibitive. Each utilized sensor provides a suitable output signal containing the information measured by the sensor. The sensor output signal may be in any data format useable by the processing unit, but preferably will be digital. Furthermore, wireline or wireless communication links may be utilized to transfer signals between the sensor array and the processing unit.
For all communication that takes place within the ASCS or between the ASCS and outside components, any suitable protocol may be used such as CAN, USB, Firewire, JAUS (Joint Architecture for Unmanned Systems), TCP/IP, or the like. For all wireless communications, any suitable protocol may be used such as standards or proposed standards in the IEEE 802.11 or 802.15 families, related to Bluetooth, WiMax, Ultrawide Band or the like. For communication that takes place between the ASCS and a central computer, protocols like Microsoft Robotics Studio or JAUS may be used. For long range communication between the ASCS and the operator, existing infrastructure like internet or cellular networks may be used. For that purpose, the ASCS may use the IEEE 802.11 interface to connect to the internet 442 or may be equipped with a cellular modem.
In greater detail
Respectively the Smartphone APP 900 controls the following settings in relation to the virtual control settings: a Power ON switch 904; a Power OFF switch 905; Driving Modes 906-908 (Beginner Drive Mode A 906, Normal Drive Mode B 907, Master Drive Mode C 908); a Motor controller 909; a Battery power level 910; a Charging gauge 911; GPS 912 (mapping a route 912A, distance 912B); an LED lamp switch 913a, 913b; User Music 914; a Speaker setting 915; a Camera setting 916; and an Anti-theft alarm module switch 917.
The present invention also comprises a method of path planning for an AS. Path planning is providing a plurality of waypoints for the AS to follow as it moves. With the current method, path planning can be done remotely from the AS, where remotely means that the human operator is not physically touching the AS and may be meters or kilometers away from the vehicle, or the AS may be locating the operator 101.
The method of path planning comprises marking a path of waypoints on a digitized geospatial representation and utilizing coordinates of the waypoints of the marked path. Marking a path comprises drawing a line from a first point to a second point.
Any of several commercially available digitized geospatial representations that provide absolute position (e.g. GPS coordinates) may be used in this method and include Google Earth and Microsoft Virtual Earth. Other representations with absolute position information may also be used such as those that are proprietary or provided by the military.
Moreover, digitized geospatial representations with relative position information may also be used such as ad hoc grids like those described in U.S. Patent Publication 20050215269. The ad hoc grids may be mobile, stationary, temporary, permanent or combinations thereof, and find special use within building and under dense vegetative ground cover where GPS may be inaccessible. Other relative position information may be used such as the use of cellular networks to determine relative position of cell signals to one another.
Combinations of ASCS absolute and relative position information may be used, especially in situations where the AS travels in and out of buildings or dense vegetation.
The ASCS 400 coordinates of the waypoints of the marked path are then utilized, whether that means storing the data for later use, caching the data in preparation for near-term use, or immediately using the data by communicating the data to an outside controller (e.g. an ASCS). For example, the data may be communicated to the processing unit of the ASCS, such as through the operator interface. The processing unit may then issue instructions through the user interface system 800 to operate the AS, or otherwise store the data in the processing unit.
Moreover, other types of ASCS path planning may also be utilized; for example, recording the movement of the vehicle when operated by a human could be used to generate waypoints. Other types of manual path planning may also be used. In addition, path planning may be accomplished through the use of image recognition techniques, for example, planning a path based on a camera 205 mounted to the platform 102 to avoid objects. In another embodiment, path planning may be accomplished by identifying portions of a digitized geospatial representation that are likely to indicate a road or street suitable for the AS to travel on.
With any type of path planning, the generated waypoint data may be manipulated through hardware or software to smooth the data, remove outliers or otherwise clean up or compress the data to ease the utilization of the data.
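The data cleanup just described might be sketched as follows: outlier waypoints are dropped and the remainder smoothed with a short moving average before the path is utilized. The window size and the outlier threshold are hypothetical placeholders, not disclosed values.

```python
# Illustrative sketch of smoothing generated waypoint data and removing
# outliers. The jump threshold and 3-point window are hypothetical
# placeholders.

import math

def smooth_waypoints(points: list[tuple[float, float]],
                     max_jump: float = 15.0) -> list[tuple[float, float]]:
    """Drop outlier waypoints that jump farther than max_jump from the
    previous kept point, then apply a 3-point moving average."""
    cleaned = [points[0]]
    for p in points[1:]:
        if math.dist(p, cleaned[-1]) <= max_jump:
            cleaned.append(p)                      # keep plausible points
    smoothed = []
    for i in range(len(cleaned)):
        window = cleaned[max(0, i - 1):i + 2]      # neighbors within window
        smoothed.append((sum(p[0] for p in window) / len(window),
                         sum(p[1] for p in window) / len(window)))
    return smoothed

raw = [(0, 0), (1, 1), (50, 50), (2, 2), (3, 3)]   # (50, 50) is an outlier
print(smooth_waypoints([(float(x), float(y)) for x, y in raw]))
```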
Moreover, the marked path may include boundary conditions (e.g. increasingly hard boundaries) on either side of the path to permit the AS to select a path that avoids objects that may be found on the original marked path.
In various aspects the autonomous skateboard controller system 400 may employ a micro controller or central processors, memory, and sensors to provide autonomous control to many different types of the autonomous skateboard 100. Autonomous control means that after initialization, the vehicle moves and/or accomplishes one or more tasks without further guidance from a human operator, even if the human operator is located on or within the vicinity of the autonomous skateboard 100.
The ASCS also includes an operator interface. The operator interface is the portion of the AS that communicates with the operator (e.g., a human being or central computer system). For all autonomous AS's, at some point, a human operator is required to at least initiate or re-initiate the vehicle. To do this, the operator interface receives instructions (e.g., voice instructions or virtual finger gestures) from the operator, as shown in
The environmental sensor array links to a processing unit which communicates with the autonomous skateboard controller system 400 (ASCS). The communication between the ASCS and the autonomous skateboards 100A/100B may be carried on any suitable data bus, with CAN (e.g. ISO 11898-1) and/or PWM buses preferred. Wirelessly, via WIFI and/or Bluetooth, the autonomous skateboard controller system 400 synchronously links the skateboard interface with the user interface system 800.
Throughout the present disclosure, particular embodiments of the examples may be presented in a range format. Descriptions in range format are merely for convenience and brevity and should not be construed as an inflexible limitation on the disclosed range.
The described embodiments of the invention are intended to be merely exemplary and numerous variations and modifications will be apparent to those skilled in the art. All such variations and modifications are intended to be within the scope of the present invention as defined in the appended claims.
A notice of issuance for a continuation-in-part patent application in reference to application Ser. No. 15/379,474; filing date: Dec. 14, 2017; titled "Powered Skateboard System Comprising Inner-Motorized Wheels"; and relating to patent application Ser. No. 13/872,054; filing date: Apr. 26, 2013; titled "Robotic Omniwheel"; and relating to patent application Ser. No. 12/655,569; titled "Robotic Omniwheel Vehicle"; filing date: Jan. 4, 2010; U.S. Pat. No. 8,430,192 B2.
| | Number | Date | Country |
| --- | --- | --- | --- |
| Parent | 15379474 | Dec 2016 | US |
| Child | 16365644 | | US |