The present invention relates to a controller for a motorized bicycle wirelessly linking to a user interface system, a mobile app, a mobile phone, or a combination thereof. The autonomous bicycle controller system preferably provides path planning to a semi-manually controlled autonomous bicycle.
Existing motorized bicycles work well only for joy riding. What is needed is a smarter bicycle whose drive wheels are controlled by an autonomous control system with minimal user instruction, allowing the motorized bicycle to plan paths autonomously and pick its own path by means of environmental tracking and object detection sensors. New methodologies are required for path planning for motorized bicycles, operating manually and/or autonomously, to travel from a starting point to an ending point by means of autonomous control system sensors and by GPS waypoints created from user interface input or created by a global network system.
The present invention provides a manual control mode and an autonomous control mode selection for an operator not on board, or a rider onboard, to control an autonomous bicycle, the autopilot methodology being programmed to govern one or more navigation processes of the autonomous bicycle. Preferably, the autonomous bicycle provides WIFI or Bluetooth linking a user interface system to an autonomous bicycle controller system (ABCS); the ABCS gathers environmental sensor data from the autonomous bicycle, the sensor data including short range LIDAR sensor, cameras, GPS, etc. for calculating motorized speed, compass heading, absolute position, relative position, and other environment sensor data. Further, the autonomous bicycle controller system includes a processing unit having software for computing logic, a central processing unit, memory, storage, communication signals and instruction. Preferably, a potential operator wanting to ride an autonomous bicycle may summon the autonomous bicycle to drive directly to her or him, and while riding the operator may engage their smartphone to access their personalized user interface system settings. The user interface system includes electronic identification information and instruction, input and output data, and mechanical identifiers based on machine-readable identification information and electronic identifiers for automatically controlling the autonomous bicycle. The operator may wish to upload software, review a summary of important information useful to the operator, and store performance data to a Cloud management network, the Global Internet Network providing Cloud Database Management Network(s).
The present invention includes an autonomous bicycle 100 that accomplishes one or more tasks with or without guidance from an operator 101 when riding or when the operator has stepped off; the operator may also summon one or more autonomous bicycles to drive to where the operator is or wherever the operator directs. The period of full autonomous control may range from less than a minute to an hour or more. In various aspects the Autonomous Bicycle Controller System 400 is associated with an Autonomous Drive Mode 600 setting and a Manual Drive Mode 700 setting; the Autonomous Drive Mode 600 or the Manual Drive Mode 700 is engaged or disengaged by an operator's instructions by means of a smartphone comprising mobile apps, the mobile apps including a personal riding app or a networking rental system.
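The drive-mode engagement described above can be sketched in Python; the class and attribute names are illustrative assumptions, not part of the disclosure, and only the mode numerals 600/700 come from the specification:

```python
from enum import Enum

class DriveMode(Enum):
    AUTONOMOUS = 600   # Autonomous Drive Mode 600
    MANUAL = 700       # Manual Drive Mode 700

class DriveModeController:
    """Hypothetical sketch: the operator's smartphone instruction engages
    one drive mode, which implicitly disengages the other."""
    def __init__(self) -> None:
        self.mode = DriveMode.MANUAL   # assumed startup default

    def engage(self, requested: DriveMode) -> DriveMode:
        # Engaging the requested mode disengages the previous mode.
        self.mode = requested
        return self.mode

ctrl = DriveModeController()
ctrl.engage(DriveMode.AUTONOMOUS)   # e.g., operator summons the bicycle
```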
In various riding events, during an operation of an autonomous bicycle 100, the operator 101 may opt to utilize their smartphone 801 to access one or more user interface preference settings via a User Interface System 800 wirelessly linking to a mobile phone app or “smartphone app”.
In a networking rental system event, the operator is associated with a mobile phone operating system providing a user login, a verification of payment for rental minutes, etc. The autonomous bicycle operator/renter 101 is contracted and responsible for controlling the rented autonomous bicycle either manually or autonomously for short distances.
During personal riding, the autonomous bicycle operator 101 is associated with controlling her or his autonomous bicycle either manually or autonomously, whichever she or he prefers. For short distances the operator may prefer to manually control their autonomous bicycle 100; when riding longer distances the operator may prefer not to manually control their autonomous bicycle 100, and can therefore disengage the manual drive mode and engage the autonomous drive mode. Accordingly, in any riding event the operator 101 decides the drive mode option.
Respectively, the operator 101 of the autonomous bicycle 100 accesses control settings by her or his Bluetooth-connected smartphone 801; the smartphone 801 is configured with user preference settings based on various Smartphone APP software, the software programming being associated with wirelessly controlling one or more electric motors 109 of the autonomous bicycle 100. The Smartphone APP or related mobile app is provided on the internet of things; an example of the Smartphone App 900 is detailed in
Accordingly, hereafter the autonomous bicycle controller system 400 may be referred to as (ABCS), and the autonomous bicycle 100 as (AB) or (AB 100).
In various elements the autonomous bicycle 100 includes WIFI and/or Bluetooth connectivity adapted for linking the User Interface System 800 (UIS) to the ABCS 400, wherein a built-in Bluetooth communication mode 802 is associated with a communication link between the autonomous bicycle 100 and the operator's Smartphone APP 900, and provides a wireless link to one or more environmental sensors and processors associated with ABCS drive control methodologies; the AB 100 is detailed herein.
In greater detail
The framework's front end 103 and rear end 104 provide for attaching a front drive wheel 107 and a rear drive wheel 108, each drive wheel comprising a motor 109 and a brake 110, a motor sensor 109a and a brake sensor 110a, and a suspension fork 111 providing a connection at the drive wheel axis, with a deformation sensor 112 contained within a section on an upper portion of the suspension fork 111; the drive wheels 107 and 108 comprise an axle configured with bearings and bolting means for coupling a motor 109 and brake arrangement thereon. Accordingly, the front and rear drive wheels 107/108 are configured with a steering actuator 113 and motor controllers 212a, 212b linking with one or more load sensors 209 and the environment sensor array, see
In various elements, the deformation sensor 112 senses a strain level 112b induced by an operator's weight exerted on the front drive wheel 107 and rear drive wheel 108 during autonomous drive mode running maneuvers.
Accordingly, the load sensors 209 are contained on the foot pegs 106a, 106b; respectively the load sensors link to a gyroscope sensor 210 providing an intelligent weight and motion controlling means configured to measure balance, which is achieved as soon as the operator 101 steps with one or both feet on the foot pegs 106a, 106b.
Subsequently, when the operator 101 is detected stepping on one or both foot pegs 106, the load sensors 209 activate to begin furnishing battery power to the drive wheel motor 109, wherein the load sensors 209 link to the power control module 213 and the motor controller 212a or 212b, which are governed by the operator 101, the ABCS 400 or a combination of both.
Subsequently, when the operator 101 dismounts or is not detected on the foot pegs 106, the load sensors 209 deactivate to stop furnishing battery power to the drive wheel motor 109.
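The load-sensor power gating of the two preceding paragraphs can be sketched as a simple predicate; the 5 kg detection threshold is an assumed noise floor for illustration, not a value from the disclosure:

```python
def motor_power_enabled(left_peg_load_kg: float,
                        right_peg_load_kg: float,
                        min_load_kg: float = 5.0) -> bool:
    """Battery power is furnished to the drive wheel motor 109 only while
    an operator is detected on at least one foot peg 106a/106b, i.e. a peg
    load above an assumed noise threshold."""
    return left_peg_load_kg >= min_load_kg or right_peg_load_kg >= min_load_kg

motor_power_enabled(32.0, 30.0)   # operator on both pegs -> power furnished
motor_power_enabled(0.0, 0.0)     # operator dismounted  -> power cut
```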
In various elements the framework comprises a deformation sensor 112 associated with the suspension fork 111 motion, velocity and trajectory control operations, wherein the front suspension fork 111a supports a front drive wheel 107 and a rear suspension fork 111b supports a rear drive wheel 108 (e.g., the drive wheels may be configured having spokes, tires, a rigid rim, a flexing rim or a combination thereof).
In various elements the framework's front end 103 couples the suspension fork 111a to an intersection of the steering column 116; respectively the deformation sensor 112b senses a strain level induced by rotation speed and twisting angle differences at a connection point generated at the intersection of the steering column's base 121 and the front drive wheel 107. In various elements, the deformation sensor 112b senses a strain level induced by rotation speed and twisting angle differences at a connection point generated at the intersection of the rear suspension fork 111b and the framework's rear end 104.
In various elements the framework further comprises a steering column, the steering column 116 comprising a control panel 200 (CP) and a right handle 117 and a left handle 118. The control panel 200 (CP) contains the control system components disclosed in
The framework further comprises a steering column 116, which is employed to steer the autonomous bicycle during autonomous drive mode operation by means of a steering actuator 205. The steering actuator 205 is utilized when the manual drive mode is not engaged; in some events the steering actuator is autonomously engaged by the ABCS when the operator is distracted or not onboard, whereby the ABCS immediately engages the autonomous drive mode. Accordingly, the autonomous drive mode works to temporarily steer the AB 100 in the environment 330 until the operator regains manual control; if not, the ABCS deactivates the AB autonomous drive mode 600 correspondingly with the UIS 800.
During a semi-manual control operation of the manual drive mode 700, the AB operator 101 may select one or more methods for controlling steering motion and velocity motion: the operator may manually engage both foot pegs 106a, 106b in a synchronized manner to adjust the speed to her or his riding level; the AB operator 101 may manually engage the thumb throttle 119 or the left thumb brake lever 120 to control velocity; and the operator 101 may manually engage the right handle and the left handle to control steering and trajectory.
The steering column's right handle 117 supports a right thumb throttle 119 and the left handle 118 supports a left thumb brake lever 120; during manual drive mode the throttle and lever assist the operator to control speed and trajectory, and the steering column's handles are utilized by the AB operator 101 for manually steering the AB 100, much like driving a vehicle with accelerator and brake pedals, only technologically smarter.
Another control method is to use the above described method to sense CG but to increment or decrement a torque set point in a torque controller loop instead of a speed loop 320. The operator 101 would lean forward to increment the commanded torque set point and lean back to decrement the commanded torque set point; the rate of increment/decrement may be determined by the amplitude of the CG from center of the foot pegs 106a and 106b.
A selectable option would allow an advanced operator 101, when leaning back, to also continue in reverse after zero speed is reached for braking 110; the brake 110 arrangement may contain a cable 114 and brake pad 115 and an electrical wiring array 201 associated with linking battery power to sensors and power source 203-213.
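The CG-based torque control described above can be sketched as a set-point update; the gain and torque limits are assumed tuning values, not figures from the disclosure, and the negative limit models the optional continue-in-reverse behavior:

```python
def update_torque_setpoint(torque_nm: float,
                           cg_offset_m: float,
                           gain_nm_per_m: float = 4.0,   # assumed gain
                           min_nm: float = -10.0,        # negative = reverse
                           max_nm: float = 40.0) -> float:
    """Leaning forward (positive CG offset from the center of the foot pegs
    106a/106b) increments the commanded torque set point; leaning back
    decrements it. The rate scales with the amplitude of the CG offset."""
    torque_nm += gain_nm_per_m * cg_offset_m
    return max(min_nm, min(max_nm, torque_nm))

t = update_torque_setpoint(10.0, 0.05)   # leaning forward 5 cm -> torque rises
t = update_torque_setpoint(t, -0.10)     # leaning back 10 cm -> torque falls
```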
A motion detection example: during autonomous drive control 600, a motor controller server 212a configured to sense the drive wheels' 107/108 motor speed may adjust the motor 109 torque to keep the drive wheel 107/108 rotational velocities relatively similar, especially in situations when the front drive wheel 107 has more traction compared to the rear drive wheel 108, which may be sensed by a motor controller processor 212b configured to read the strain gauge sensors on the one or more motors 109a, 109b, and thereby determine which drive wheel bears more operator 101 weight and therefore more traction.
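The traction-balancing behavior just described can be sketched as a proportional correction on the wheel-speed difference; the gain `kp` and the RPM figures are assumed illustration values:

```python
def balance_wheel_torques(front_rpm: float, rear_rpm: float,
                          front_torque_nm: float, rear_torque_nm: float,
                          kp: float = 0.05) -> tuple:
    """If one drive wheel spins faster (less traction), shift commanded
    torque toward the slower wheel, which carries more weight and
    therefore more traction."""
    speed_error = front_rpm - rear_rpm        # positive: front wheel slipping
    delta = kp * speed_error
    return front_torque_nm - delta, rear_torque_nm + delta

# Front wheel slipping at 220 rpm vs. rear at 200 rpm: torque shifts rearward.
front, rear = balance_wheel_torques(220.0, 200.0, 15.0, 15.0)
```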
Accordingly, the steering column's right handle 117 supports a right thumb throttle 119 that assists the operator to control speed and trajectory, and the steering column's handles are utilized by the AB operator 101 for manually steering the AB 100.
As a manually controlled AB 100, during manual drive mode 700 the AB operator 101 may disengage autonomous drive mode 600 settings to manually control the autonomous bicycle 100. When the autonomous drive mode ends, operation of the deformation sensors 112 allows the operator to activate the steering column's right handle 117 and left handle 118 for manually steering the AB 100 during manual drive mode 700, and the AB operator 101 utilizes the thumb throttle 119 and the left thumb brake lever 120 for manually controlling motor velocity during manual drive mode 700.
In various elements the framework's front end 103 and rear end 104 provide for attaching a front drive wheel 107 and rear drive wheel 108 via the suspension fork 111 providing a connection at the drive wheel axis, and the drive wheels 107 and 108 comprise an axle configured with bearings and bolting means for coupling a motor 109.
The brake arrangement links to a left thumb brake lever 120 to slow down (decrease velocity) until zero speed is reached, as the left thumb brake lever 120 engages a cable associated with a brake pad to slow or stop the drive wheel 107/108; the autonomous bicycle's brake 110 is conventional, however the cable is associated with the motor controller 212.
In various elements the brake 110 arrangement may contain a cable 114 and brake pad 115 and an electrical wiring array 201 associated with linking battery power to sensors and power source 213-214. The framework further comprises a steering column 116; the steering may be autonomously controlled to steer the autonomous bicycle during autonomous drive mode operation, or manually controlled via the steering column handles 117-118.
In various elements the steering column 116 comprises a control panel 400 (CP) and a right handle 117 and a left handle 118. The control panel 400 (CP) contains the control system components disclosed in
The steering column 116 is further configured with a base portion 121 and coupling means 122 for attaching or detaching the steering column 116 onto the front suspension fork 111a, front drive wheel 107 and brake 110 arrangements.
During manual drive mode 700, in one example, to control the velocity setpoint of the autonomous bicycle 100 the operator 101 could engage the thumb throttle 119 or engage the left thumb brake lever 120.
In various elements the thumb throttle 119 and the left thumb brake lever may be defined by different colors to help visually discern the accelerator from the brake; this can be useful when operating at high speeds or with distractions.
In various elements the AB 100 framework's environment sensor array components are housed within the steering column 116.
In various elements the AB 100 framework's front end 103 supports a front suspension fork 111a, a front drive wheel 107, and a front motor 109a; and the framework's rear end 104 supports a rear suspension fork 111b, a rear drive wheel 108, and a rear motor 109b (e.g., the drive wheels may be configured having spokes, tires, a rigid rim, a flexing rim or a combination thereof).
In greater detail
Accordingly, the ABCS environment sensor array includes but is not limited to: LIDAR sensor 206 (e.g., 2D, 3D, color LIDAR), RADAR 207, sonar 208, load sensors 209, gyroscopic sensor 210, and accelerometer 211. Respectively, the load sensor 209 or "orientation sensor" is configured to measure an orientation of the operator's presence on the seat 105. The steering actuator 113 and gyroscopic sensor 210 are adapted to maintain fore-and-aft balancing of the autonomous bicycle 100, and accordingly the accelerometer 211 and the motor controller 212 are associated to control a preferred battery power level. Accordingly, the steering actuator 204, load sensor 209, gyroscopic sensor 210, accelerometer 211b and motor controller 212 are electronically linked via a wiring array 201 to the power control module 213 contained within the compartment 200.
In various connectivity elements, the compartment 200 provides a wired connection means for linking battery power to internal devices and to external devices, wherein the electrical wiring array 201 is connectively linked to a USB port 216; the USB port may be connected to an external power source, such as an AC 110V outlet, via an external USB power cord 217.
In
In various elements the power control module 213 further comprises a receiver 213b and a processor 213c for monitoring the battery charger's charge level 215a associated with one or more removable battery packs 214a, 214b during a charging process, wherein the battery charger 215, via wiring array 201, connects the battery packs 214a, 214b in a parallel-series arrangement. Respectively, the battery packs 214a, 214b when fully charged can be switched out and used later to extend the operator's riding time; the spent battery packs are placed back in the compartment or recharged later.
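The pack-monitoring role of the processor 213c can be sketched as a simple classifier over the charge level 215a; the threshold fractions are assumed values for illustration:

```python
def pack_status(charge_fraction: float,
                low: float = 0.2,    # assumed depletion threshold
                full: float = 0.95   # assumed ready-to-swap threshold
                ) -> str:
    """Classify a removable battery pack 214a/214b during charging:
    fully charged packs can be switched out to extend riding time."""
    if charge_fraction >= full:
        return "ready-to-swap"
    if charge_fraction <= low:
        return "depleted"
    return "charging"

pack_status(0.98)   # pack 214a fully charged, may be swapped out
pack_status(0.10)   # spent pack placed back in the compartment to recharge
```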
In greater detail
Accordingly, the deformation sensor 112 and steering actuators 113 are contained on sections of the drive wheels 107, 108, the motion sensors and cameras 205-211 are situated on sections of the framework 102 and sections of the steering column 116, and the motor controller 212 is contained within a section of the compartment 200.
Respectively, in autonomous drive mode 600 the gyroscopic sensor 210 (including fuzzy logic 210a) and an accelerometer 211 provide data based on load sensor data 209a, gyroscope sensor data 210a and accelerometer sensor data 211a, and the motor controller 212 is associated with a server 212a, a processor 212b, and motor controller sensor data 212c. Respectively, the gyroscope sensor 210 provides an intelligent weight and motion controlling means, and an accelerometer 211 is configured to measure balance, which is achieved as soon as the AB operator 101 steps on the foot pegs 106a, 106b; subsequently the preferred power level is associated with the motor controller 212, the deformation sensor 112 and the steering actuator 113.
Respectively, the AB 100 may be self-powered by regenerated power from the one or more drive wheel motors 109a, 109b, provided a minimal amount of regeneration power is captured to maintain a battery charge level 215a sufficient to run the motor controller 212; the low-drag torque control 212d is useful when the battery 214 has been nearly depleted. A regenerative battery charging process is initiated by the braking activity 110 of slowing down or stopping. Accordingly, the velocity of the drive wheels 107/108 provides that a motor 109 may be associated with a regenerative braking means 110 for maintaining a charge level 215a to the battery 214a and/or 214b.
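The regenerative-braking charge capture described above can be sketched as a charge-level update per control step; the regeneration rate and capacity are assumed illustration values:

```python
def regen_charge_step(charge_level: float,
                      wheel_rpm: float,
                      brake_engaged: bool,
                      regen_rate_per_rpm: float = 1e-5,  # assumed rate
                      capacity: float = 1.0) -> float:
    """While braking activity 110 slows a spinning drive wheel 107/108, a
    fraction of its rotation is returned to the battery charge level 215a;
    otherwise the charge level is unchanged by this step."""
    if brake_engaged and wheel_rpm > 0.0:
        charge_level += regen_rate_per_rpm * wheel_rpm
    return min(charge_level, capacity)   # cannot exceed pack capacity

level = regen_charge_step(0.05, 300.0, brake_engaged=True)   # charge rises
```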
In greater detail
In various environments 430, the gyroscope sensor 210 and the accelerometer sensor 211 may measure a motion signal 301 of an operator's motion 312, e.g., pushing against the foot pegs 106 or suspension forks 111, and/or a 3-dimensional moving response of the AB in the x, y, z direction 315 and velocity 316 associated with the operator's motion 312, the AB's motion 311, or a combination thereof.
In one example, the motions 311/312 may include a predefined motion input 301, including, for example, the operator 101 hopping on and/or off the AB 100. The operator 101 utilizes one or more riding skills associated with motion control, which include: motion to engage a drive mode 701-704, motion to engage propulsion, and motion to engage trajectory, see
In one example, a deformation sensor 112 output may be computed from a weight signal 302 and a gravity angle signal 309, generating one or more move control signals 307 including, for example, forward, backward, accelerate, and/or brake signals 109a from a signal processing unit 304. The signal processing unit 304 may combine and process the deformation output signal 306 with the motion signals 301 to produce the one or more move control signals 307 relayed to the autonomous drive mode 600.
In some aspects control signals 307 may control the AB 100 to move in a direction, including for example, a forward direction or a backward direction, or an initial orientation direction (IOD) 321. The direction of the AB's motion 311 may be determined based on the deformation output signal 306 associated with the weight signal 308 and the gravity angle signal 309.
In some aspects control signals may control the speed of the AB 100, for example, to accelerate or to engage braking means 110. In one example, the speed of the AB's motion 311 may be determined based on the operator's motion 312, such as shaking the AB 100. In another example, the speed of the AB motion 311 may be determined based on the deformation output signal 306 associated with the weight signal 308 and the gravity angle signal 309. Respectively, the deformation sensor comprises a strain gauge 112a configured to sense strain induced by imbalanced forces exerted upon the drive wheels 107 and 108; the deformation sensor 112 senses a strain level 112b induced by an operator's weight; the deformation sensor 112 senses a strain level induced by rotation speed and twisting angle differences at a connection point generated at the connection of the suspension fork 111a of the drive wheel 107 attached on the framework's front end 103; and the deformation sensor 112 senses a strain level induced by rotation speed and twisting angle differences at a connection point generated at the connection of the suspension fork 111b of the drive wheel 108 attached on the framework's rear end 104.
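Combining the weight signal 308 and gravity angle signal 309 into a move control signal 307 might be sketched as follows; the deadband, scaling divisor, and sign convention are assumptions for illustration only:

```python
def move_control_signal(weight_signal_kg: float,
                        gravity_angle_deg: float,
                        deadband_deg: float = 2.0) -> float:
    """Sketch of the signal processing unit 304: the sign of the returned
    move control signal 307 gives direction (positive = forward, negative
    = backward), its magnitude (0..1) scales speed."""
    if weight_signal_kg <= 0.0 or abs(gravity_angle_deg) < deadband_deg:
        return 0.0                      # no operator, or lean within deadband
    direction = 1.0 if gravity_angle_deg > 0.0 else -1.0
    magnitude = min(abs(gravity_angle_deg) / 15.0, 1.0)  # assumed scaling
    return direction * magnitude

move_control_signal(70.0, 6.0)    # leaning forward -> forward command
move_control_signal(70.0, -6.0)   # leaning back    -> backward command
```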
Accordingly, the load sensors 209 are contained between the seat 105 and foot pegs 106a, 106b; respectively the gyroscopic sensor 210 (with fuzzy logic 210a), an accelerometer 211 and the load sensor 209 provide data based on gyroscope sensor data 210a and accelerometer sensor data 211a, and a motor controller 212 is configured having a server 212a, a processor 212b, sensor data 212c and low-drag torque control 212d. Respectively, the gyroscope sensor 210 provides an intelligent weight and motion controlling means via the motor 109, a steering actuator 113 and the motor controller 212, and an accelerometer 211 is configured to measure balance, which is achieved as soon as the operator 101 sits on the seat 105 or places one or both feet on the foot pegs 106a, 106b.
Subsequently, when the operator 101 is detected on the seat 105 or footing is detected on one or both foot pegs 106, the load sensors 209 activate to begin furnishing battery power to the drive wheel motor 109, wherein the load sensors 209 link to the power control module 213 and the motor controller 212a or 212b, which are governed by the operator 101, the ABCS 400 or a combination of both.
Another control method is to use the above described method to sense CG but to increment or decrement a torque set point in a torque controller loop instead of a speed loop 320. The operator 101 would lean forward to increment the commanded torque set point and lean back to decrement the commanded torque set point; the rate of increment/decrement may be determined by the amplitude of the CG from center of the foot pegs 106a and 106b.
A selectable option would allow an advanced operator 101, when leaning back, to also continue in reverse after zero speed is reached for braking 110 via cable and brake pad 114/115.
In greater detail
For example, perception system 407 may receive sensor system data 406 from one or more external environmental sensor arrays situated on sections of the framework 102 and control panel 200, wherein the LIDAR sensor 206 (e.g., 2D, 3D, color LIDAR), RADAR 207 and sonar 208 are based on MEMS technology 322, and other data is gathered by one or more video cameras 205 (e.g., image capture devices); whereas localizer system 405 may receive sensor system data 406 including but not limited to global positioning system (GPS) data 408, inertial measurement unit (IMU) data 409, map data 410, route data 411, Route Network Definition File (RNDF) data 412, odometry data 413, wheel encoder data 414, and map tile data 415. Accordingly, the localizer system 405 and a planner system 416 having memory 417 may receive object data 418 from sources other than sensor systems, such as utilizing memory 417 via a data store 431, Cloud Data Management 432, a Performance Management Network 433, or a global satellite coordinate system 434.
Accordingly, perception system 407 may process sensor data to generate object data 418 that may be received by the planner system 416. Object data 418 may include but is not limited to data representing object classification 419, object type 420, object track 421, object location 422, predicted object path 423, predicted object trajectory 424, object velocity 425, and an object library 426 in an environment 430.
Accordingly, the localizer system 405 may process sensor data, and optionally other data, to generate position and orientation data 427 and local pose data 428 that may be received by the planner system 416. The local pose data 428 may include, but is not limited to, data representing a location 429 in the environment 430 via GPS 408, IMU data 409, map data 410, route data 411, RNDF data 412, odometry data 413, wheel encoder data 414, map tile data 415, and the global satellite coordinate system 434, for example.
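The object data 418 fields enumerated above can be sketched as a simple record passed from the perception system 407 to the planner system 416; the field types and sample values are assumptions:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class ObjectData:
    """Sketch of object data 418 produced by the perception system 407."""
    classification: str          # object classification 419
    object_type: str             # object type 420
    location: Tuple[float, float]  # object location 422 in environment 430
    velocity_mps: float          # object velocity 425

# Example record a planner system 416 might receive (values hypothetical).
obj = ObjectData("dynamic pedestrian", "person", (12.0, 3.5), 1.4)
```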
The implementations described herein may be configured to display images, whether in motion (such as video) or at rest (still images), and whether text, graphics, or pictures. The implementations may be used in a variety of devices or systems including, but not limited to: mobile phones, multimedia Internet-enabled cellular telephones, mobile television receivers, wireless devices, smartphones, Bluetooth-connected devices 203-212, personal digital assistants (PDA) 818, wireless e-mail receivers, and hand-held or portable computers. Teachings herein also apply to, but are not limited to: electronic switching devices, radio frequency filters, sensors, accelerometers, gyroscopes, motion sensing devices, magnetometers, inertial components for consumer electronics, parts of consumer electronics products, varactors, liquid crystal devices, electrophoretic devices, driving methods, manufacturing processes and electronic test equipment, and can be used in non-display applications. Accordingly, the present teachings are not limited to the implementations shown in the figures, but instead have wide applicability, as will be readily apparent to those skilled in the art.
The sensor system of the AB 100 comprises processors for determining, based at least in part on the sensor data, a location of the AB 100 within the environment 430, wherein the location 429 of the AB 100 identifies a position and orientation, via load sensors 209, of the AB 100 within the environment 430 according to the global coordinate system 431.
The ABCS is associated with calculating, based at least in part on the location 429 of the autonomous bicycle 100 and at least a portion of the sensor data 403, a trajectory 425 of the AB 100, wherein the trajectory 425 indicates a planned path, associated with GPS 408, for navigating the AB 100 between at least a first location 429a and a second location 429b within the environment 430.
The ABCS is associated with identifying, based at least in part on the sensor data 406, an object 421 within the environment 430; and determining a location of the object 421 in the environment 430, wherein the location 429 of the object 421 identifies a position and orientation 427 of the object within the environment according to the global coordinate system 431; and determining, based at least in part on the location 429 of the object 421 and the location of the AB 100, to provide a visual alert 432 from a light emitter 433.
The ABCS is associated with selecting a light pattern 434 from a plurality of light emitter 433 patterns, wherein a first one of light patterns 434 is associated with a first level of urgency of the visual alert, and a second one of the light patterns is associated with a second level of urgency of the visual alert; selecting, from a plurality of light emitters 433 of the AB 100, a light emitter 433 to provide the visual alert 432; and causing the light emitter 433 to provide the visual alert 432, the light emitter emitting light indicative of the light pattern 434 into the environment 430.
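The urgency-based light-pattern selection described above can be sketched as a lookup; the pattern names and the two-level urgency scale are hypothetical stand-ins for the disclosed light patterns 434:

```python
def select_light_pattern(urgency: str) -> str:
    """Map a visual alert 432 urgency level to a light pattern 434:
    a first pattern for a first level of urgency, a second pattern for
    a second (higher) level of urgency."""
    patterns = {
        "first": "slow-pulse",    # hypothetical first light pattern
        "second": "rapid-strobe", # hypothetical second light pattern
    }
    if urgency not in patterns:
        raise ValueError(f"unknown urgency level: {urgency!r}")
    return patterns[urgency]

select_light_pattern("first")    # lower-urgency visual alert
select_light_pattern("second")   # higher-urgency visual alert
```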
The ABCS is associated with calculating, based at least in part on the location of the object 421 and the trajectory 425 of the AB 100, an orientation 427 of the AB 100 relative to the location 429 of the object 421; selecting the light emitter is based at least in part on the orientation of the AB 100 relative to the location 429 of the object.
The ABCS is associated with estimating, based at least in part on the location 429 of the object 421 and the location 429 of the AB 100, a threshold event 435 associated with causing the light emitter 433 to provide the visual alert 432; and detecting an occurrence of the threshold event 435; and wherein causing the light emitter 433 of the AB 100 to provide the visual alert 432 is based at least in part on the occurrence of the threshold event 435.
The ABCS is associated with calculating, based at least in part on the location 429 of the object 421 and the location 429 of the AB 100, a distance between the AB 100 and the object 421; and wherein selecting the light pattern 434 is based at least in part on the distance, a threshold event 435 according to a threshold distance 436 or a threshold time, and a second threshold distance 437.
The ABCS is configured with a setting for selecting the light pattern 434 based at least in part on one or more of a first threshold event 435 according to a threshold distance or a threshold time, wherein the first threshold distance 436 is associated with the light pattern 434a and a second threshold distance 437 is associated with a different light pattern 434b, wherein the first threshold distance and the second threshold distance are less than the distance between the object 421 and the AB 100, and wherein the threshold time 436 is shorter in duration as compared to a time associated with the location 429 of the AB 100 and the location of the object being coincident with each other.
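The two-threshold pattern selection above can be sketched as follows; the threshold distances (20 m and 8 m) are assumed values, and only the pattern labels 434a/434b come from the specification:

```python
def pattern_for_distance(distance_m: float,
                         first_threshold_m: float = 20.0,   # threshold 436
                         second_threshold_m: float = 8.0):  # threshold 437
    """Light pattern 434a applies within the first threshold distance 436;
    the different pattern 434b applies within the nearer second threshold
    distance 437; beyond both, no visual alert is selected."""
    if distance_m <= second_threshold_m:
        return "434b"        # object close: more urgent pattern
    if distance_m <= first_threshold_m:
        return "434a"        # object approaching: first pattern
    return None              # object beyond both thresholds

pattern_for_distance(25.0)   # no alert yet
pattern_for_distance(15.0)   # pattern 434a
pattern_for_distance(5.0)    # pattern 434b
```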
The ABCS is associated with calculating, based at least in part on the location 429 of the object 421 and the trajectory 425 of the AB 100, a time associated with the location 429 of the AB 100 and the location of the object being coincident with each other; and wherein causing the light emitter 433 of the AB 100 to provide the visual alert 432 is based at least in part on the time.
The ABCS is associated with determining an object classification for the object 421, the object classification determined from a plurality of object classifications, wherein the object classifications include a static pedestrian object classification, a dynamic pedestrian object classification, a static car object classification, and a dynamic car object classification; wherein selecting the light pattern 434 is based at least in part on the object 421 classification.
The ABCS is associated with accessing map data associated with the environment 430, the map data accessed from a data store of the AB 100, and determining position data and orientation data associated with the AB 100 and wherein determining the location 429 of the AB 100 within the environment 430 is based at least in part on the map data 410, the position data and the orientation data.
The ABCS is associated with selecting a different light pattern 434 from the plurality of light patterns based at least in part on a first location of the object before the visual alert is provided and a second location of the object after the visual alert is provided, and causing the light emitter 433 to provide a second visual alert, wherein the light emitter emits light indicative of the different light pattern into the environment 430.
Wherein the light emitter 433 includes a sub-section and the light pattern includes a sub-pattern 438 associated with the sub-section 439, the sub-section being configured to emit light indicative of the sub-pattern 438, wherein at least one of the sub-patterns 438 is indicative of one or more of a signaling function of the AB 100 or a braking function 114/115 of the AB 100, and wherein at least one other sub-pattern 438 is indicative of the visual alert 432. The ABCS is further associated with receiving data representing a sensor signal 108a indicative of a rate of rotation of a drive wheel 108 of the AB 100, and modulating the light pattern 434 based at least in part on the rate of rotation of the drive wheel's electric motor 109.
The planner system 417 may process the object data and the local pose data 428, provided via GPS 408, to compute a motion path (e.g., a trajectory 425) for the AB 100 to travel through the environment 430. The computed path is determined in part by object data 421 in the environment 430 that may present an obstacle, such as bicycles, skateboards or other vehicles, which may pose a collision threat to the AB 100.
In various aspects the autonomous bicycle controller system 400 may employ a micro controller or central processors, memory, and a sensor array to provide autonomous control to many different types of the autonomous bicycle 100. Autonomous bicycle control means that after initialization, the AB 100 moves and/or accomplishes one or more tasks without further guidance from the operator 101, even if the operator 101 is riding the AB 100, or the operator 101 is located within a few steps of the AB 100, or within the vicinity of the AB 100.
The environmental sensor array links to a processing unit which communicates with the ABCS 400. The communication between the ABCS and the AB 100 may be carried on any suitable data bus, with CAN (e.g. ISO 11898-1) and/or PWM buses preferred. Wirelessly via WIFI 440 and/or Bluetooth 441, the ABCS 400 synchronously links the user interface system 800 (UIS) to the Internet 442, Cloud Data 443 and Performance Management Network 444.
The ABCS 400 for providing autonomous control to the AB 100, comprising: a UIS 800 that communicates with the AB 100 and provides instructions to the vehicle regarding acceleration, braking, steering or a combination thereof; the UIS 800 communicating with and receiving instructions from an operator 101, the instructions including task instructions, path planning information or both.
The ABCS is associated with an environmental sensor array 407 that receives sensor data from the AB 100 and communicates the sensor data 421, such data including AB 100 speed, compass heading, absolute position, relative position or a combination thereof. The ABCS is associated with at least one sensor that monitors motion of the AB 100, including rate of acceleration, pitch rate, roll rate, yaw rate or a combination thereof, and the at least one sensor that monitors motion includes the accelerometer, the gyroscope, and the motor controller 212. While calculating the friction circle from the tire and the road surface state in the current running state, the output adjusting means calculates a command value for the braking amount corresponding to the braking operation amount. The output adjusting means controls the operation of the steering actuator 205 for the front drive wheel 107 and the rear drive wheel 108, and the brake operations of both, by sending a command value to the braking force control means of the motor controller 212 and the functions of the motor controllers 212a, 212b, wherein the front brake 110a and rear brake 110b are activated by the brake-by-wire type braking control means 114; the braking control is engaged by the operator's leaning backward motion, by the operator engaging a brake throttle/switch 120, or a combination thereof. An output adjusting means for rate of acceleration, pitch rate, roll rate and yaw rate is constituted in the ABCS, wherein, when the detected angular velocity reaches the stability limit or exceeds the friction-circle threshold, the general control unit determines that sudden braking is required and sends a command value to the braking control means, establishing the stability limit or threshold grip of the front and rear wheel tires such that traveling is always controlled.
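The friction-circle stability check described above might be sketched as follows. The friction coefficient, the use of the gravitational constant, and the limiter logic are assumptions for illustration and not the specification's actual braking control law.

```python
import math

# Hypothetical sketch of a friction-circle braking limiter: the combined
# longitudinal and lateral acceleration demand must stay inside mu * g.

G = 9.81  # gravitational acceleration, m/s^2

def within_friction_circle(longitudinal_accel, lateral_accel, mu):
    """True when the combined tire force demand stays inside the
    friction circle mu * g."""
    return math.hypot(longitudinal_accel, lateral_accel) <= mu * G

def limit_braking(requested_decel, lateral_accel, mu):
    """Reduce the commanded deceleration so the combined demand stays on
    or inside the friction circle (the sudden-braking stability limit)."""
    max_total = mu * G
    if math.hypot(requested_decel, lateral_accel) <= max_total:
        return requested_decel
    # Remaining longitudinal capacity after lateral demand is accounted for.
    remaining_sq = max_total ** 2 - lateral_accel ** 2
    return math.sqrt(remaining_sq) if remaining_sq > 0 else 0.0
```

In a real controller the friction coefficient would itself be estimated from tire and road-surface state, as the text describes.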
The ABCS is associated with programming for path planning to the AB 100 and such path planning includes marking a path of waypoints on a digitized geospatial representation, the waypoints mark a path that is the perimeter of a scan area that the AB 100 then scans; wherein the AB 100 scans the scan area by traveling to waypoints within the scan area, and respectively the ABCS 400 employs a digitized geospatial representation that provides absolute position of the AB 100 in creating the scan area or provides relative position through the use of an ad hoc grid.
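The scan-area behavior above, in which perimeter waypoints bound an area that the AB then covers by traveling to interior waypoints, might be sketched for a rectangular area as follows; the back-and-forth lane pattern and the spacing parameter are assumptions, and real coverage planning over arbitrary polygons is more involved.

```python
# Hypothetical sketch: given perimeter waypoints bounding a rectangular
# scan area, generate interior waypoints on a back-and-forth path.

def scan_waypoints(perimeter, spacing):
    """perimeter: list of (x, y) corner waypoints marking the scan area;
    spacing: lateral distance between successive scan lanes."""
    xs = [p[0] for p in perimeter]
    ys = [p[1] for p in perimeter]
    x_min, x_max = min(xs), max(xs)
    y_min, y_max = min(ys), max(ys)
    waypoints = []
    y = y_min
    left_to_right = True
    while y <= y_max:
        if left_to_right:
            waypoints += [(x_min, y), (x_max, y)]
        else:
            waypoints += [(x_max, y), (x_min, y)]
        left_to_right = not left_to_right  # alternate lane direction
        y += spacing
    return waypoints
```

The same waypoint list works with absolute positions (e.g., GPS coordinates) or with relative positions on an ad hoc grid, as the text notes.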
In various aspects the ABCS 400 includes a mechanism for receiving communication from a smartphone 801 or the internet 442 such that an operator 101 can communicate with the AB 100 through the control panel 400 (CP).
Respectively, the autonomous bicycle controller system 400 may be required to assist the operator 101 automatically if the operator 101 steps off the foot pegs 106a, 106b, whereby the operator verbally instructs the ABCS 400 to move toward the operator 101; this action is achieved by software algorithms programmed in the ABCS 400.
In various aspects the autonomous bicycle controller system 400 may be employed to provide full autonomous control, accomplished without any further guidance from the operator 101; this action is achieved once the operator disengages the Manual Drive Mode 700.
In various aspects the autonomous bicycle controller system 400 may be requested by the operator to assist the operator during riding activity 706. The operator may instruct the ABCS 400 to link to a micro-processor or processor of the user interface system 800 to temporarily deactivate the manual control mode 700 and engage the Autonomous Drive Mode 600, thus allowing selective or minimal supervision from the operator 101 while the operator discontinues controlling the autonomous bicycle 100 (e.g., semi-autonomous operation), or, vice versa, to switch over to manual drive and regain control. These driving modes may be alternated over a period of time during riding events.
In one or more elements, the communication established between the ABCS and the autonomous bicycle 100 may be carried on any suitable data bus, with CAN (e.g. ISO 11898-1) and/or PWM buses preferred. Working wirelessly via WIFI 440 and/or Bluetooth 441, the autonomous bicycle controller system 400 synchronously links the ABCS to the user interface system 800 to compute a motion path in one or more environments 430.
Accordingly, in one or more applications, the Smartphone APP 900 may receive future over-the-air software and firmware updates and manage social media and the Internet of Things via a Global Internet Network 815 providing Cloud Database Management Network(s) 816, or Cloud Data Management 432 and Performance Management Network 433, and a global satellite coordinate system 434. The Smartphone APP 900 allows the AB operator 101 to select listings on a menu 902 by finger prompting 903 (e.g., swiping gestures).
Respectively, the Smartphone APP 900 controls the following settings in relation to the virtually controlled autonomous bicycle components 916, configured to be wirelessly controlled via the user interface, the virtual settings listed on the menu 902 as: a power on switch 903 and off switch 904; power switches 905; driving modes 906: Beginner Drive Mode A, Normal Drive Mode B, Master Drive Mode C; a motor controller 907; a battery power level 908; a charging gauge 909; GPS 910: mapping a route 910A, a distance 910B; an LED lamp switch 911/206-207; User Music 912; a speaker setting 913; a camera setting 914; an anti-theft alarm for the alarm module switch 915; and Bluetooth connected devices.
The smartphone's mobile app communicates with the ABCS wirelessly via WIFI 440 and/or Bluetooth 441; the ABCS 400 synchronously links the user interface system 800 (UIS) to the Internet 442, Cloud Data 443 and Performance Management Network 444, and/or the Global Internet Network 815 providing Cloud Database Management Network(s) 816.
Some implementations provide automatic display mode selection according to a reference hierarchy. For example, such an implementation may provide automatic display mode selection for mobile display devices in which each display mode corresponds to a set of display parameter settings; the display parameter settings may include a color depth setting, a brightness setting, a color gamut setting, a frame rate setting, a contrast setting and a gamma setting. Some implementations may involve a trade-off between the display parameter settings and power consumption. In some instances, one of the criteria may correspond to the application running on the display device. Various battery status conditions, as well as ambient light conditions, may correspond to the display mode. In some implementations, the display parameter setting information, or other device configuration information, can be updated according to information received by the display device from another device, such as from a server.
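A rule-based sketch of automatic display mode selection under the criteria above (battery status, ambient light, running application); the mode names, thresholds, and parameter values are illustrative assumptions, not values from the disclosure.

```python
# Hypothetical display modes: each mode maps to a set of display
# parameter settings, trading image quality against power consumption.

DISPLAY_MODES = {
    "power_saver":  {"brightness": 0.3, "frame_rate": 30, "color_depth": 6},
    "balanced":     {"brightness": 0.6, "frame_rate": 60, "color_depth": 8},
    "high_quality": {"brightness": 1.0, "frame_rate": 60, "color_depth": 10},
}

def select_display_mode(battery_level, ambient_lux, app_type="generic"):
    """Pick a display mode from battery status, ambient light, and the
    running application, in that order of precedence."""
    if battery_level < 0.2:
        return "power_saver"   # low battery dominates every other criterion
    if app_type == "video" and ambient_lux > 500:
        return "high_quality"  # bright surroundings and a media application
    return "balanced"
```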
In order to optimize the display criteria for a particular user, some implementations involve creating a user profile and controlling a display, such as the display of the mobile display device, in accordance with the user profile. In some examples, the display criteria may include brightness, contrast, bit depth, resolution, color gamut, frame rate, power consumption, or gamma. In some implementations, the user profile may be used to optimize other display device operations, such as audio performance, touch/gesture recognition, speech recognition, target tracking, or head tracking. In some such instances, audio settings for the mobile display device, such as volume and the relative amounts of bass and treble, may be associated with the user profile in order to optimize them in accordance with a personal hearing profile of the user. In some implementations, the display parameter setting information, or other device configuration information corresponding to data in a user profile, may be received by the display device from another device, such as a server. In some examples, the corresponding data in the user profile may include demographic data.
In various implementations the AB operator 101 may be provided with greatly optimized display settings, and the corresponding level of power consumption, with respect to the usage scenario. Implementations involving a user profile can yield an additional level of optimization according to the wishes or characteristics of a particular user. In some implementations, default display parameter setting information can be determined according to known user demographics and may be used to control the display without the need for associated user input. Implementations in which the process of building a user profile is distributed over a period of time may allow a detailed user profile to be built without placing an excessive burden on the user during initial setup. For example, through a series of brief vision tests or A/B image prompts dispersed over a period of time, which may span multiple uses of the mobile display device, multiple illumination and usage conditions, or multiple applications, a limited but useful amount of information about the user's visual function, including color perception, can be obtained through the display device without imposing a significant burden on the user. In some implementations, the period of time may span several days, a week or a month. The visual function information may be used to optimize the visual quality of the display for the user. In some instances, visual function information may be used to increase the intensity of colors the user struggles to perceive. In some implementations, visual function information may be used to optimize display power consumption, for example by reducing the power spent on color depth. Furthermore, a considerable amount of information may thereby be acquired about the user's willingness to trade image quality for power, allowing additional display power savings.
In some examples, the power can be expressed as battery life. Some of the user interfaces disclosed herein allow user preference information, including visual function information and information about the user's willingness to trade image quality for battery life, to be suitably acquired.
As disclosed in more detail elsewhere herein, some methods may involve obtaining various types of user information for the user profile. Some implementations may involve providing a user prompt for user information and obtaining the user information in response to the user prompt. Some such implementations may involve providing the user prompt via a mobile display device. For example, user information may include user identification information such as biometric information, a user name, or user preference data. In some implementations, user identification information may be used to associate the user information with a specific user profile. For example, it may be useful to distinguish user information obtained from a plurality of users of a single mobile display device.
Some user information may be obtained "passively", without the user needing to respond to prompts or enter the information. For example, some user information may be obtained according to how, when, or where the mobile display device is used. Such user information may include the user-selected settings of the mobile display device, mobile display device location information, the type of application run on the mobile display device, the content provided by the mobile display device, the time at which the mobile display device is used, and the environmental conditions in which the mobile display device is used. In some instances, the user-selected settings for mobile display devices may include a text size setting, a brightness setting, or an audio volume setting. According to some implementations, the environmental conditions may include temperature or ambient light intensity. As with user information obtained through user responses, information can be passively acquired over a number of time periods that may include use of the mobile display device.
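Passive collection of user information as described above might be sketched as follows; the event fields and the summarization into a single brightness preference are illustrative assumptions.

```python
import time

# Hypothetical sketch of "passive" user-information collection: usage
# events are logged without prompting the user, then summarized into a
# profile preference. Field names are illustrative only.

def record_usage(profile, app_type, brightness, ambient_lux):
    """Append one passively observed usage event to the profile."""
    profile.setdefault("events", []).append({
        "timestamp": time.time(),
        "app_type": app_type,
        "brightness": brightness,   # user-selected setting at the time
        "ambient_lux": ambient_lux, # environmental condition at the time
    })

def preferred_brightness(profile):
    """Summarize passively collected settings into a single preference."""
    events = profile.get("events", [])
    if not events:
        return None
    return sum(e["brightness"] for e in events) / len(events)
```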
Some implementations may allow a user to maintain a plurality of user profiles. For example, the user may habitually have access to an outlet for charging the mobile display device. During such times, the user may want to control the mobile display device settings in accordance with a first user profile that prefers display image quality over battery life.
The preference data may be stored in a user profile. Some such implementations, such as those in which the user profile is stored in the memory of the mobile display device, may involve creating or updating a user profile maintained locally; they may also involve sending user profile information to another device via the network interface of the mobile display device. For example, the other device may be a server capable of creating or updating the user profile.
The user interface system is the portion of the ABCS that communicates with the operator, who is required to provide instructions, as shown in the drawings.
Another class of sensors includes antennae for sending and receiving information wirelessly, and includes RF, UWB and antennae for communications such as discussed elsewhere in this application. RFID tags may also be used to send and receive information or otherwise identify the AB 100. Moreover, RFID tags may also be used to receive positioning information or receive instructions and/or task performing information.
Preferably, the sensors are solid state devices. The sensor output signal may be in any data format useable by the processing unit, but preferably will be digital. Furthermore, wireline or wireless communication links may be utilized to transfer signals between the sensor array and the processing unit.
For all communication that takes place within the ABCS or between the ABCS and outside components, any suitable protocol may be used such as CAN, USB, Firewire, JAUS (Joint Architecture for Unmanned Systems), TCP/IP, or the like. For all wireless communications, any suitable protocol may be used such as standards or proposed standards in the IEEE 802.11 or 802.15 families, related to Bluetooth, WiMax, Ultrawide Band or the like. For communication that takes place between the ABCS and a central computer, protocols like Microsoft Robotics Studio or JAUS may be used. For long range communication between the ABCS and the operator, existing infrastructure like internet or cellular networks may be used. For that purpose, the ABCS may use the IEEE 802.11 interface to connect to the internet 442 or may be equipped with a cellular modem.
Respectively, the autonomous bicycle's Smartphone APP 900 controls the following settings in relation to the virtual control settings: Power ON switch 904; Power OFF switch 905; Driving modes 906-908: Beginner Drive Mode A 906, Normal Drive Mode B 907, Master Drive Mode C 908; Motor controller 909; Battery power level 910; Charging gauge 911; GPS 912: mapping a route 912A, distance 912B; LED Lamp switches 913a, 913b; User Music 914; Speaker setting 915; Camera setting 916; and Anti-theft alarm for the alarm module switch 917.
The present invention also comprises a method of path planning for an AB. Path planning provides a plurality of waypoints for the AB to follow as it moves. With the current method, path planning can be done remotely from the AB, where remotely means that the human operator is not physically touching the AB and may be meters or kilometers away from the AB, or the AB may be locating the operator 101.
The method of path planning comprises marking a path of waypoints on a digitized geospatial representation and utilizing the coordinates of the waypoints of the marked path. Marking a path comprises drawing a line from a first point to a second point.
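Marking a path by drawing a line from a first point to a second point can be sketched as simple linear interpolation of waypoints between the two coordinates; the coordinate frame and the waypoint count are assumptions for illustration.

```python
# Hypothetical sketch: generate evenly spaced waypoints along a straight
# line from a first point to a second point.

def line_waypoints(start, end, count):
    """Return `count` evenly spaced (x, y) waypoints from start to end,
    inclusive of both endpoints."""
    if count < 2:
        raise ValueError("need at least the start and end waypoints")
    x0, y0 = start
    x1, y1 = end
    step = 1.0 / (count - 1)
    return [(x0 + (x1 - x0) * i * step, y0 + (y1 - y0) * i * step)
            for i in range(count)]
```

The resulting coordinate list is exactly the kind of waypoint data the ABCS stores, caches, or communicates to the processing unit.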
Any of several commercially available digitized geospatial representations that provide absolute position (e.g. GPS coordinates) may be used in this method and include Google Earth and Microsoft Virtual Earth. Other representations with absolute position information may also be used such as those that are proprietary or provided by the military.
Moreover, digitized geospatial representations with relative position information may also be used such as ad hoc grids like those described in U.S. Patent Publication 20050215269. The ad hoc grids may be mobile, stationary, temporary, permanent or combinations thereof, and find special use within building and under dense vegetative ground cover where GPS may be inaccessible. Other relative position information may be used such as the use of cellular networks to determine relative position of cell signals to one another.
Combinations of absolute and relative position information may be used by the ABCS, especially in situations where the AB travels in and out of buildings or dense vegetation.
The ABCS 400 then utilizes the coordinates of the waypoints of the marked path, whether that means storing the data for later use, caching the data in preparation for near-term use, or immediately using the data by communicating it to an outside controller (e.g. an ABCS). For example, the data may be communicated to the processing unit of the ABCS, such as through the operator interface. The processing unit may then issue instructions through the user interface system 800 to operate the AB, or otherwise store the data in the processing unit.
Moreover, other types of ABCS path planning may also be utilized; for example, recording the movement of the vehicle when operated by a human could be used to generate waypoints. Other types of manual path planning may also be used. In addition, path planning may be accomplished through the use of image recognition techniques, for example, planning a path based on a camera 105 mounted to the platform 102 to avoid objects. In another embodiment, path planning may be accomplished by identifying portions of a digitized geospatial representation that are likely to indicate a road or street suitable for the AB to travel on.
With any type of path planning, the generated waypoint data may be manipulated through hardware or software to smooth the data, remove outliers or otherwise clean up or compress the data to ease the utilization of the data.
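The waypoint clean-up step above might be sketched with a 3-point moving average to smooth the path and a jump-distance test to drop outliers; both filters and their parameters are illustrative assumptions, not the specification's actual method.

```python
# Hypothetical waypoint clean-up: smoothing plus outlier removal.

def smooth(waypoints):
    """3-point moving average over (x, y) waypoints; endpoints are kept."""
    if len(waypoints) < 3:
        return list(waypoints)
    out = [waypoints[0]]
    for prev, cur, nxt in zip(waypoints, waypoints[1:], waypoints[2:]):
        out.append(((prev[0] + cur[0] + nxt[0]) / 3,
                    (prev[1] + cur[1] + nxt[1]) / 3))
    out.append(waypoints[-1])
    return out

def drop_outliers(waypoints, max_jump):
    """Drop any waypoint farther than max_jump from the last kept point."""
    if not waypoints:
        return []
    kept = [waypoints[0]]
    for p in waypoints[1:]:
        dist = ((p[0] - kept[-1][0]) ** 2 + (p[1] - kept[-1][1]) ** 2) ** 0.5
        if dist <= max_jump:
            kept.append(p)
    return kept
```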
Moreover, the marked path may include boundary conditions (e.g. increasingly hard boundaries) on either side of the path to permit the AB to select a path that avoids objects that may be found on the original marked path.
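The "increasingly hard boundaries" on either side of the marked path can be sketched as a corridor cost function: deviation is free inside a soft band, increasingly expensive beyond it, and impassable at the hard boundary. The quadratic form and the band widths are assumptions for illustration.

```python
# Hypothetical corridor cost around the marked path, used so a planner
# can deviate to avoid objects while staying near the original path.

def boundary_cost(lateral_offset, soft=1.0, hard=3.0):
    """Cost of deviating `lateral_offset` meters from the marked path."""
    d = abs(lateral_offset)
    if d <= soft:
        return 0.0                  # freely usable corridor
    if d >= hard:
        return float("inf")         # hard boundary: never crossed
    # Quadratically increasing cost between the soft and hard boundaries.
    return ((d - soft) / (hard - soft)) ** 2
```

A planner minimizing path cost plus obstacle cost under this function naturally selects a detour inside the corridor when an object sits on the original marked path.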
In various aspects the autonomous bicycle controller system 400 may employ a micro controller or central processors, memory, and sensors to provide autonomous control to many different types of the autonomous bicycle 100. Autonomous control means that after initialization, the vehicle moves and/or accomplishes one or more tasks without further guidance from a human operator, even if the human operator is located on or within the vicinity of the autonomous bicycle 100.
The ABCS also includes an operator interface. The operator interface is the portion of the AB that communicates with the operator (e.g., a human being or central computer system). For all autonomous ABs, at some point, a human operator is required to at least initiate or re-initiate the AB 100. To do this, the operator interface receives instructions (e.g., voice instructions or virtual finger gestures) from the operator, as shown in the drawings.
Preferably, the sensors are solid state devices based on MEMS technology 322 as these are very small, are light weight and have the necessary accuracy while not being cost prohibitive. Each utilized sensor provides a suitable output signal containing the information measured by the sensor. The sensor output signal may be in any data format useable by the processing unit, but preferably will be digital. Furthermore, wireline or wireless communication links may be utilized to transfer signals between the sensor array and the processing unit.
The environmental sensor array links to a processing unit which communicates with the autonomous bicycle controller system 400 (ABCS). The communication between the ABCS and the autonomous bicycle 100 may be carried on any suitable data bus, with CAN (e.g. ISO 11898-1) and/or PWM buses preferred. Wirelessly via WIFI 440 and/or Bluetooth 441, the autonomous bicycle controller system 400 synchronously links the manual drive mode 700 with the user interface system 800.
Throughout the present disclosure, particular example embodiments may be presented in a range format. Range-format descriptions are merely for convenience and brevity and should not be construed as an inflexible limitation on the disclosed range.
The described embodiments of the invention are intended to be merely exemplary and numerous variations and modifications will be apparent to those skilled in the art. All such variations and modifications are intended to be within the scope of the present invention as defined in the appended claims.
A notice of issuance for a continuation-in-part patent application in reference to application Ser. No. 15/451,405, filing date Mar. 6, 2017, title: "Vehicle Comprising Autonomous Steering Column System"; relating to patent application Ser. No. 13/872,054, filing date Apr. 26, 2013, title: "Robotic Omniwheel"; and in reference to patent application Ser. No. 12/655,569, title: "Robotic Omniwheel Vehicle", filing date Jan. 4, 2010, U.S. Pat. No. 8,430,192 B2.
| | Number | Date | Country |
|---|---|---|---|
| Parent | 15451405 | Mar 2017 | US |
| Child | 16370981 | | US |