This application claims the benefit of Japanese Patent Application No. 2018-052996, filed on Mar. 20, 2018, which is hereby incorporated by reference herein in its entirety.
The present disclosure relates to a work support system which supports a user of a mobile body functioning as a mobile office, and to an information processing method.
In recent years, studies on providing service using a mobile body which autonomously travels have been underway. For example, Patent document 1 discloses a mobile office which is realized by causing a plurality of vehicles, each having usable office equipment disposed in the car, to gather at a predetermined location and coupling the vehicles to a connection car.
[Patent document 1] Japanese Patent Laid-Open No. 9-183334
As a service form of a mobile body which autonomously travels, for example, a form can be considered in which space within the mobile body is provided as space in which a user does predetermined work. For example, by placing predetermined facility, such as office equipment to be used by the user to execute predetermined work, within the mobile body, space within the mobile body can be provided as space in which the user executes the predetermined work. The user of the service can, for example, move to a destination (for example, a place of work or a business trip destination) while executing predetermined work within the mobile body.
Incidentally, the content of work to be executed within the mobile body varies depending on the user. Further, the efficiency of work executed within the mobile body also differs depending on the user. Therefore, depending on the content of the work to be dealt with, a case can occur where the user is unable to appropriately complete the work to be executed within the mobile body.
The present disclosure has been made in view of such a problem, and an object of the present disclosure is to provide a support technique which enables the user to appropriately execute predetermined work within the mobile body.
To achieve the above-described object, one aspect of the present disclosure is exemplified by a work support system. The present work support system supports execution of work of a user using predetermined facility in a first mobile body including the predetermined facility among one or more mobile bodies. The present work support system may include a judging unit configured to judge whether the user needs a break on the basis of user information relating to the user who is executing work within the first mobile body, and a managing unit configured to instruct the first mobile body to provide predetermined service to the user when it is judged that the user needs a break.
According to such a configuration, when it is judged that the user who is executing work within an office vehicle is put into a state where the user needs a break, it is possible to provide predetermined service for a break to the user in accordance with the preference of the user. This makes it possible to provide a support technique which enables the user to appropriately execute predetermined work within the mobile body.
Further, in another aspect of the present disclosure, the user information may include at least biological information acquired from the user who is executing work within the first mobile body. According to such a configuration, it is possible to judge whether the user needs a break on the basis of transition and change of a state indicated by the biological information.
Further, in another aspect of the present disclosure, the user information may include a time of acquisition, the biological information may include an image acquired from the user, and the judging unit may judge whether or not the user needs a break on the basis of at least one of the number of times a biological phenomenon occurs in the acquired biological information per unit period and a proportion of a work period, determined from the image, in an elapsed time period. According to such a configuration, it is possible to improve judgement accuracy as to whether the user needs a break.
Further, in another aspect of the present disclosure, when it is judged that the user needs a break, the managing unit may notify the first mobile body of one of acoustic data, image data and news data selected in accordance with preference of the user. According to such a configuration, it is possible to provide service for a break for allowing the user to get refreshed on the basis of data selected in accordance with intention of the user.
Further, in another aspect of the present disclosure, when it is judged that the user needs a break, the managing unit may notify the first mobile body of a control instruction of controlling at least one of: lighting, daylighting, and air conditioning within the first mobile body, view from the first mobile body, and tilt of a chair to be used by the user in accordance with preference of the user. According to such a configuration, it is possible to provide an environment state appropriate for a break via equipment controlled in accordance with intention of the user.
Further, in another aspect of the present disclosure, when it is judged that the user needs a break, the managing unit may instruct a second mobile body which is able to provide goods or service among the one or more mobile bodies to provide the goods or the service to the user within the first mobile body. According to such a configuration, it is possible to provide various kinds of service of the second mobile body which provides goods or service to the user during a break of the user.
Further, another aspect of the present disclosure is exemplified by an information processing method executed by a computer of a work support system. Still further, another aspect of the present disclosure is exemplified by a program to be executed by a computer of the work support system. Note that the present disclosure can be regarded as a work support system or an information processing apparatus which includes at least part of the above-described processing and means. Further, the present disclosure can be regarded as an information processing method which executes at least part of the processing performed by the above-described means. Still further, the present disclosure can be regarded as a computer readable storage medium in which a computer program for causing a computer to execute this information processing method is stored. The above-described processing and means can be freely combined and implemented unless technical contradiction occurs.
According to the present disclosure, it is possible to provide a support technique which enables a user to appropriately execute predetermined work within a mobile body.
A work support system according to an embodiment will be described below with reference to the drawings. The following configuration of the embodiment is an example, and the present work support system is not limited to the configuration of the embodiment.
<1. System Configuration>
First, an outline of the mobile body system will be described. The mobile body system includes a plurality of autonomously traveling vehicles 30a, 30b and 30c which autonomously travel on the basis of a provided command, and a center server 20 which issues the command. In the following description, an autonomously traveling vehicle will also be referred to as a “vehicle”, and the plurality of autonomously traveling vehicles 30a, 30b and 30c will also be collectively referred to as a “vehicle 30”.
The vehicle 30 is an automated driving vehicle which provides predetermined mobility service in accordance with various needs of the user, and is a vehicle which can autonomously travel on a road. Further, the vehicle 30 is a multipurpose mobile body which can change the exterior and interior of the own vehicle and can arbitrarily select a vehicle size in accordance with the application and purpose of the mobility service to be provided. Examples of such a multipurpose mobile body which can autonomously travel can include, for example, a self-propelled electric vehicle called an Electric Vehicle (EV) palette. The vehicle 30 provides predetermined mobility service, such as a work environment by a mobile office, movement of the user or transport of baggage, and sales of goods, to the user in accordance with needs of the user expressed via a user terminal 40 or by an arbitrary user.
The center server 20, which is an apparatus which manages the plurality of vehicles 30 constituting the mobile body system, issues an operation command to each vehicle 30. The vehicle 30 creates an operation plan in response to the operation command from the center server 20 and autonomously travels to a destination in accordance with the operation plan. Note that the vehicle 30 includes means for acquiring location information, acquires location information at a predetermined period and transmits the location information to the center server 20 and a management server 50. The user terminal 40 is, for example, a small computer such as a smartphone, a mobile phone, a tablet terminal, a personal digital assistant or a wearable computer (such as a smart watch). However, the user terminal 40 may be a PC (Personal Computer) connected to the center server 20 and the work support management server 50 via a network N.
In the mobile body system illustrated in
The vehicle 30 which constitutes the mobile body system includes an information processing apparatus and a communication apparatus for controlling the own vehicle, providing a user interface with the user who utilizes the own vehicle and transmitting and receiving information to/from various kinds of servers on the network. In addition to processing which can be executed by the vehicle 30 alone, the vehicle 30 provides functions and service added by the various kinds of servers on the network to the user in cooperation with the various kinds of servers on the network.
For example, the vehicle 30 has a user interface controlled by a computer, accepts a request from the user, responds to the user, executes predetermined processing in response to the request from the user, and reports a processing result to the user. The vehicle 30 accepts speech, an image or an instruction from the user through input/output equipment of the computer, and executes processing. However, for a request which is unable to be processed alone among the requests from the user, the vehicle 30 notifies the center server 20 and the management server 50 of the request, and executes processing in cooperation with the center server 20 and the management server 50. The requests which are unable to be processed by the vehicle 30 alone can include, for example, requests for acquisition of information from a database on the center server 20 and the management server 50, recognition or inference by the learning machine 60 which cooperates with the management server 50, or the like.
Note that the vehicle 30 does not always have to be an unmanned vehicle, and a sales staff, a service staff, a maintenance staff, or the like, who provide the service may be on board in accordance with application and purpose of the mobility service to be provided. Further, the vehicle 30 does not always have to be a vehicle which always autonomously travels. For example, the above-described staff may drive the vehicle or assist driving of the vehicle in accordance with circumstances. While, in
The work support system 1 according to the present embodiment includes the vehicle 30 which functions as a mobile type office (hereinafter, also referred to as an “office vehicle 30W”) and the management server (work support management server) 50 in the configuration. At the office vehicle 30W, for example, predetermined facility such as office equipment to be used by the user who reserves utilization to execute predetermined work is placed within the vehicle. Then, the office vehicle 30W provides space within the vehicle in which the predetermined facility is placed to the user as space in which the user executes predetermined work. A utilization form of the office vehicle 30W is arbitrary, and, for example, work such as a task and movie watching may be executed while the office vehicle 30W moves to a destination such as a place of work and a business trip destination from home, or the above-described predetermined work may be executed while the vehicle is parked or stopped at a destination such as on a road and open space at which the office vehicle 30W can be parked or stopped. The “office vehicle 30W” is one example of the “first mobile body”.
In the work support system 1 according to the present embodiment, the management server 50 manages a state of the user who executes predetermined work at the office vehicle 30W. For example, the management server 50 acquires biological information indicating the state of the user from the user who is executing work, and judges a degree of fatigue of the user who is executing the work on the basis of the biological information. Such judgement of the degree of fatigue while the user is executing the work can be, for example, recognized or inferred by the learning machine 60 which cooperates with the management server 50.
For example, the learning machine 60 is an information processing apparatus which has a neural network having a plurality of layers and which executes deep learning. The learning machine 60 executes inference processing, recognition processing, or the like, upon request from the management server 50. For example, the learning machine 60 executes convolution processing of receiving input of a parameter sequence {x_i, i=1, 2, ..., N} and performing a product-sum operation on the input parameter sequence with weighting coefficients {w_i,j,l} (here, j is a value between 1 and an element count M to be subjected to the convolution operation, and l is a value between 1 and the number of layers L), and pooling processing of decimating part of the output of an activating function which determines a result of the convolution processing. The learning machine 60 repeatedly executes the processing described above over the plurality of layers L and outputs an output parameter (or an output parameter sequence) {y_k, k=1, ..., P} at a fully connected layer in a final stage. In this case, the input parameter sequence {x_i} is, for example, a pixel sequence which is one frame of an image, a data sequence indicating a speech signal, a string of words included in natural language, or the like. Further, the output parameter (or the output parameter sequence) {y_k} is, for example, a characteristic portion of an image which is an input parameter, a defect in the image, a classification result of the image, a characteristic portion in speech data, a classification result of speech, an estimation result obtained from a string of words, or the like.
The learning machine 60 receives input of a large number of combinations of existing input parameter sequences and correct output values (training data) and executes learning processing in supervised learning. Further, the learning machine 60, for example, executes processing of clustering or abstracting the input parameter sequence in unsupervised learning. In the learning processing, the coefficients {w_i,j,l} in the respective layers are adjusted so that a result obtained by executing the convolution processing (and the output of the activating function), the pooling processing and the processing in the fully connected layer on an existing input parameter sequence approaches the correct output value. The adjustment of the coefficients {w_i,j,l} in the respective layers is executed by letting an error based on a difference between the output of the fully connected layer and the correct output value propagate backward from the output layer toward the input layer. Then, by inputting an unknown input parameter sequence {x_i} in a state where the coefficients {w_i,j,l} in the respective layers have been adjusted, the learning machine 60 outputs a recognition result, a determination result, a classification result, an inference result, or the like, for the unknown input parameter sequence {x_i}.
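The forward path described above (convolution, activation, pooling and a fully connected final stage) can be sketched as follows. This is a minimal one-dimensional illustration in Python with NumPy, not the actual implementation of the learning machine 60; the layer count, kernel and weight values are hypothetical.

```python
import numpy as np

def relu(x):
    # activating function which determines the result of the convolution processing
    return np.maximum(0.0, x)

def conv1d(x, w):
    # product-sum operation of the input parameter sequence with the weighting coefficients
    n, m = len(x), len(w)
    return np.array([np.dot(x[i:i + m], w) for i in range(n - m + 1)])

def max_pool(x, size=2):
    # pooling: decimate by keeping only the maximum of each window
    return np.array([x[i:i + size].max() for i in range(0, len(x) - size + 1, size)])

def forward(x, conv_weights, fc_weight, fc_bias):
    h = np.asarray(x, dtype=float)
    for w in conv_weights:                  # repeat over the plurality of layers
        h = max_pool(relu(conv1d(h, w)))
    return fc_weight @ h + fc_bias          # fully connected layer in the final stage

# Hypothetical input parameter sequence {x_i} and weighting coefficients
x = [0.1, 0.5, 0.2, 0.8, 0.4, 0.9, 0.3, 0.7]
y = forward(x, [np.array([0.5, -0.2, 0.3])],
            fc_weight=np.ones((2, 3)), fc_bias=np.zeros(2))
# y corresponds to the output parameter sequence {y_k}
```

In the learning processing, the coefficients passed as `conv_weights`, `fc_weight` and `fc_bias` would be adjusted by backpropagation rather than fixed as here.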
For example, the learning machine 60 extracts a face portion of the user from an image frame acquired by the office vehicle 30W. Further, the learning machine 60 recognizes speech of the user from speech data acquired by the office vehicle 30W and accepts a command by the speech. Further, the learning machine 60 determines the state of the user who is executing the work from an image of the face of the user to generate state information. The state information generated by the learning machine 60 is, for example, a classification of behavior (such as, for example, the frequency and interval of yawning, and the size of a pupil) for determining fatigue based on an image of a face portion of the user. As such a classification, for example, the state is classified into evaluation values in four stages such that a favorable state for executing work is classified as “4”, a slightly favorable state is classified as “3”, a slightly fatigued state is classified as “2” and a fatigued state is classified as “1”. The image may be, for example, one which indicates temperature distribution of a face surface obtained from an infrared camera. The learning machine 60 reports the determined state information of the user to the management server 50, and the management server 50 judges, on the basis of the reported state information, whether the user who is executing predetermined work is put into a state where the user needs a break. The management server 50 then provides service for a break, for allowing the user to get refreshed and alleviating fatigue accumulated during execution of work, to the user who is judged to be in a state where the user needs a break. Note that, in the present embodiment, learning executed by the learning machine 60 is not limited to machine learning by deep learning, and the learning machine 60 may execute learning by a typical perceptron, learning by other neural networks, search using a genetic algorithm, statistical processing, or the like.
However, the management server 50 may determine the state of the user from the image of the face of the user in accordance with the number of times work is interrupted, the number of times of yawning, the frequency with which the size of the pupil exceeds a predetermined size, and the frequency with which the period while the eyelids are closed exceeds a predetermined time period.
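A threshold-type judgement of this kind can be sketched as follows; the threshold values, field names and combination logic are hypothetical and would in practice be tuned, for example per user or by the learning machine 60.

```python
def needs_break(yawns_per_hour, long_eye_closures, work_ratio,
                yawn_limit=6, closure_limit=3, min_work_ratio=0.6):
    """Judge whether the user needs a break from simple counts.

    yawns_per_hour    -- number of times yawning occurred per unit period
    long_eye_closures -- times the eyelids stayed closed beyond a set period
    work_ratio        -- proportion of the work period in the elapsed time
    All threshold defaults are hypothetical.
    """
    fatigued = (yawns_per_hour >= yawn_limit
                or long_eye_closures >= closure_limit)
    distracted = work_ratio < min_work_ratio
    return fatigued or distracted
```

For example, `needs_break(7, 0, 0.9)` returns `True` because the yawn count exceeds the limit, while `needs_break(2, 0, 0.9)` returns `False`.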
As a form of the service for a break, the management server 50, for example, adjusts the environment within the office vehicle during a break to an environment state which is different from that upon execution of work and which enables the user to get refreshed. Here, the environment refers to physical, chemical or biological conditions which are felt by the user through the five senses and which affect the living body of the user. Examples of the environment can include, for example, brightness and dimming of lighting within the office vehicle, daylighting from outside, view from the office vehicle, a temperature, humidity and an air volume of air conditioning within the office vehicle, tilt of a chair within the office vehicle, and the like. The management server 50 can provide service for a break matched to the preference of the user by, for example, adjusting the environment state within the office vehicle in accordance with a request from the user.
Further, as another form of the service for a break, the management server 50 may provide service of distributing acoustic data such as sound and music, image data, news, or the like, in accordance with the request from the user. The acoustic data includes classical music, ambient music such as healing music, popular music, sound such as the chirrup of a bird, the murmur of a stream and the sound of waves lapping against a sandy beach, and a speech message registered in advance (for example, a message from a family member). The image data includes a moving image in accordance with the season, such as floating petals of cherry blossoms and autumn colors of leaves, an image of a shot landscape of a mountain, a river, a lake, the moon, or the like, a recorded image which is impressive for the user (for example, a scene in which a player who the user cheers for did well in the Olympics), a moving image registered in advance (for example, a scene in which a child of the user is playing), or the like. The management server 50 can provide the service for a break for allowing the user to get refreshed by providing the above-described distributed data selected in accordance with the preference of the user via an in-vehicle display or acoustic equipment such as a speaker.
Further, as another form of the service for a break, the management server 50 may provide service such as tea service of providing light meal, tea, coffee, or the like, or service such as massage, sauna and shower in accordance with the request from the user. The above-described service is, for example, provided via the vehicle 30 which functions as a mobile type store (hereinafter, the vehicle 30 which functions as a store will be also referred to as a “store vehicle 30S”) directed to selling goods or providing service to the user. The store vehicle 30S includes, for example, facility, equipment, or the like, for operation of the store within the vehicle, and provides store service of the own vehicle to the user. The “store vehicle 30S” is one example of the “second mobile body”.
The management server 50, for example, selects the store vehicle 30S which provides tea service or store service such as massage, sauna and shower from among the vehicles 30 constituting the mobile body system. For example, the management server 50 acquires location information, vehicle attribute information, or the like, from the store vehicle 30S. The management server 50 then selects, on the basis of the acquired location information, vehicle attribute information, or the like, the store vehicle 30S which is located around the office vehicle 30W and which provides the tea service or the store service such as massage, sauna and shower. The management server 50 notifies the center server 20 of an instruction for causing the selected store vehicle to meet the office vehicle 30W at the point where the office vehicle 30W is located. Here, meeting refers to dispatching the store vehicle 30S to the location of the office vehicle 30W and causing the store vehicle 30S to provide service in cooperation with the office vehicle 30W. An operation command for causing the store vehicle to meet at the point where the office vehicle 30W is located is issued via the center server 20 which cooperates with the management server 50.
When the center server 20 accepts a dispatch instruction from the management server 50, the center server 20 acquires location information of the store vehicle 30S to be dispatched and the office vehicle 30W which is a dispatch destination at the present moment. The center server 20 specifies a moving route in which, for example, a point where the store vehicle 30S is located is set as a starting point and the office vehicle 30W is set as a destination point after movement. The center server 20 then transmits an operation command indicating “moving from the starting point to the destination point” to the store vehicle 30S. By this means, the center server 20 can dispatch the store vehicle 30S by causing the store vehicle 30S to travel along a predetermined route from a current location to a destination which is a point where the office vehicle 30W is located, and provide service of the store vehicle to the user. Note that the operation command may include a command to the store vehicle 30S for providing service such as “temporarily dropping by a predetermined point (for example, the point where the office vehicle 30W is located)”, “letting the user get on or off the vehicle” and “providing tea service” to the user, in addition to the command for traveling.
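An operation command of this kind can be pictured as a simple structured message built from the starting point and the destination point; the field names and the tea-service task below are hypothetical illustrations, not the actual message format of the center server 20.

```python
def make_operation_command(store_vehicle_id, start, destination):
    """Build an operation command telling the store vehicle to move from
    its current point (start) to the point where the office vehicle is
    located (destination) and provide service there.  Hypothetical format."""
    return {
        "vehicle_id": store_vehicle_id,
        "route": {"start": start, "destination": destination},
        "tasks": [
            {"action": "move", "to": destination},         # command for traveling
            {"action": "stop_at", "point": destination},   # temporarily drop by the point
            {"action": "provide_service", "name": "tea_service"},
        ],
    }

# Dispatch a store vehicle to the office vehicle's location (coordinates are examples)
cmd = make_operation_command("30S-001", (35.681, 139.767), (35.690, 139.700))
```

The center server 20 would transmit such a command to the store vehicle 30S, which then creates its own operation plan from the route and task list.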
The management server 50 can provide tea service of providing light meal, tea, coffee, or the like, or service such as massage, sauna and shower to the user via the store vehicle 30S. In the work support system 1 according to the present embodiment, it is possible to provide service for a break suitable for allowing the user to get refreshed and alleviating fatigue in accordance with preference of the user. As a result, in the work support system 1 according to the present embodiment, it is possible to provide a support technique which enables the user to appropriately execute predetermined work within the mobile body.
<2. Equipment Configuration>
The EV palette includes a boxlike body 1Z, and four wheels TR-1 to TR-4 provided at anterior and posterior portions in a traveling direction at both sides of a lower part of the body 1Z. The four wheels TR-1 to TR-4 are coupled to a drive shaft which is not illustrated and are driven by a drive motor 1C illustrated in
As illustrated in
In the present embodiment, the EV palette adjusts an environment state of the in-vehicle space in accordance with a control instruction notified from the management server 50. For example, the chair C1 has an actuator which adjusts a height of a seating surface and tilt of a back. The EV palette varies the height of the seating surface and the tilt of the back of the chair C1 in accordance with the control instruction. Further, the windows W1 to W4 respectively have actuators which drive opening and closing of curtains or window shades. The EV palette adjusts daylighting from outside the vehicle and view of the outside of the vehicle by adjusting opening of a window shade, or the like, in accordance with the control instruction. Further, the EV palette adjusts dimming of the ceiling light L1 which is in-vehicle lighting and a temperature, humidity, an air volume, or the like, of the in-vehicle space by the air conditioner AC1 in accordance with the control instruction. At the EV palette, the environment state of the in-vehicle space which functions as the mobile type office is adjusted to an environment state which is suitable for allowing the user to get refreshed and alleviating fatigue.
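Handling such a control instruction can be sketched as a simple dispatcher over the adjustable items; the field names are hypothetical, and a real EV palette would drive the corresponding actuators rather than update a dictionary.

```python
def apply_control_instruction(instruction, state):
    """Apply a control instruction from the management server to the
    in-vehicle environment state (hypothetical field names)."""
    allowed = {
        "chair_tilt_deg",     # tilt of the back of the chair C1
        "chair_height_mm",    # height of the seating surface
        "shade_opening_pct",  # window shades W1-W4: daylighting and view
        "light_level_pct",    # dimming of the ceiling light L1
        "ac_temp_c",          # air conditioner AC1: temperature
        "ac_humidity_pct",    # air conditioner AC1: humidity
        "ac_fan_level",       # air conditioner AC1: air volume
    }
    for key, value in instruction.items():
        if key in allowed:
            state[key] = value    # here the corresponding actuator would be driven
        # unknown fields are ignored rather than rejected
    return state

# A break-time instruction: dim the light, recline the chair, open the shades
state = apply_control_instruction(
    {"light_level_pct": 30, "chair_tilt_deg": 25, "shade_opening_pct": 80},
    state={})
```

Ignoring unknown fields keeps the vehicle tolerant of newer server-side instruction formats, at the cost of silently dropping misspelled fields.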
Further, the EV palette of the present embodiment acquires speech, an image and biological information of the user with the microphone 1F, the image sensor 1H and the biosensor 1J illustrated in
It is assumed in
As illustrated in
Further, the EV palette includes the steering motor 1B, the drive motor 1C, and a secondary battery 1D which supplies power to the steering motor 1B and the drive motor 1C. Further, the EV palette includes a wheel encoder 19 which detects a rotation angle of each wheel, and a steering angle encoder 1A which detects a steering angle which is the traveling direction of the wheel. Still further, the EV palette includes the control system 10, a communication unit 15, a GPS receiving unit 1E, a microphone 1F and a speaker 1G. Note that, while not illustrated, the secondary battery 1D supplies power also to the control system 10, or the like. However, a power supply which supplies power to the control system 10, or the like, may be provided separately from the secondary battery 1D which supplies power to the steering motor 1B and the drive motor 1C.
The control system 10 is also referred to as an Electronic Control Unit (ECU). As illustrated in
The obstacle sensor 18 is an ultrasonic sensor, a radar, or the like. The obstacle sensor 18 emits an ultrasonic wave, an electromagnetic wave, or the like, in a detection target direction, and detects existence, a location, relative speed, or the like, of an obstacle in the detection target direction on the basis of a reflected wave.
The camera 17 is an imaging apparatus using an image sensor such as a Charge-Coupled Device (CCD), a Metal-Oxide-Semiconductor (MOS) or a Complementary Metal-Oxide-Semiconductor (CMOS) sensor. The camera 17 acquires an image at predetermined time intervals called a frame period, and stores the image in a frame buffer, which is not illustrated, within the control system 10. An image stored in the frame buffer with a frame period is referred to as frame data.
The steering motor 1B controls a direction of a cross line on which a plane of rotation of the wheel intersects with a horizontal plane, that is, an angle which becomes a traveling direction by rotation of the wheel, in accordance with an instruction signal from the control system 10. The drive motor 1C, for example, drives and rotates the wheels TR-1 to TR-4 in accordance with the instruction signal from the control system 10. However, the drive motor 1C may drive one pair of wheels TR-1 and TR-2 or the other pair of wheels TR-3 and TR-4 among the wheels TR-1 to TR-4. The secondary battery 1D supplies power to the steering motor 1B, the drive motor 1C and parts connected to the control system 10.
The steering angle encoder 1A detects a direction of the cross line on which the plane of rotation of the wheel intersects with the horizontal plane (or an angle of the rotating shaft of the wheel within the horizontal plane), which becomes the traveling direction by rotation of the wheel, at predetermined detection time intervals, and stores the direction in a register which is not illustrated, in the control system 10. In this case, for example, a direction to which the rotating shaft of the wheel is orthogonal with respect to the traveling direction (direction of the arrow AR1) in
The communication unit 15 is a communication unit for communicating with, for example, various kinds of servers, or the like, on a network N1 through a mobile phone base station and a public communication network connected to the mobile phone base station. The communication unit 15 performs wireless communication with a wireless signal and a wireless communication scheme in accordance with predetermined wireless communication standards.
The Global Positioning System (GPS) receiving unit 1E receives radio waves of time signals from a plurality of satellites (Global Positioning Satellites) which orbit the earth and stores the radio waves in a register, which is not illustrated, in the control system 10. The microphone 1F detects sound or speech (also collectively referred to as acoustics), converts the sound or speech into a digital signal and stores the digital signal in a register, which is not illustrated, in the control system 10. The speaker 1G is driven by a D/A converter and an amplifier connected to the control system 10 or a signal processing unit which is not illustrated, and reproduces acoustics including sound and speech.
The CPU 11 of the control system 10 executes a computer program expanded in the memory 12 in an executable form, and executes processing as the control system 10. The memory 12 stores the computer program to be executed by the CPU 11, data to be processed by the CPU 11, or the like. The memory 12 is, for example, a Dynamic Random Access Memory (DRAM), a Static Random Access Memory (SRAM), a Read Only Memory (ROM), or the like. The image processing unit 13 processes data in the frame buffer obtained for each predetermined frame period from the camera 17 in cooperation with the CPU 11. The image processing unit 13, for example, includes a Graphics Processing Unit (GPU) and an image memory which serves as the frame buffer. The external storage device 14, which is a non-volatile storage device, is, for example, a Solid State Drive (SSD), a hard disk drive, or the like.
For example, as illustrated in
Further, the control system 10 processes images acquired from the camera 17 for each frame data in cooperation with the image processing unit 13, for example, detects change based on a difference in images and recognizes an obstacle. Further, the control system 10 analyzes a speech signal obtained from the microphone 1F and responds to intention of the user obtained through speech recognition. Note that the control system 10 may transmit frame data of an image from the camera 17 and speech data obtained from the microphone 1F from the communication unit 15 to the center server 20 and the management server 50 on the network. Then, it is also possible to cause the center server 20 and the management server 50 to share analysis of the frame data of the image and the speech data.
Still further, the control system 10 displays images, characters and other information on the display 16. Further, the control system 10 detects operation on the display with the touch panel 16A and accepts instructions from the user. Further, the control system 10 responds to instructions received from the user via the display with the touch panel 16A, the camera 17 or the microphone 1F by outputting to the display 16, the display with the touch panel 16A or the speaker 1G.
Further, the control system 10 acquires a face image of the user in the indoor space from the image sensor 1H and notifies the management server 50 of the face image. The image sensor 1H is an imaging apparatus using an image sensor, as with the camera 17. However, the image sensor 1H may be an infrared camera. Further, the control system 10 acquires the biological information of the user via the biosensor 1J and notifies the management server 50 of the biological information. Further, the control system 10 adjusts the environment of the indoor space via the environment adjusting unit 1K in accordance with a control instruction notified from the management server 50. Further, the control system 10 outputs the acoustic data, the image data, the news, or the like, distributed from the management server 50 to the display 16 and the speaker 1G via the environment adjusting unit 1K.
While the interface IF1 is illustrated in
As illustrated in
The heart rate sensor J1, which is also referred to as a heart rate meter or a pulse wave sensor, irradiates blood vessels of the human body with light from a Light Emitting Diode (LED), and determines a heart rate from changes in blood flow indicated by the reflected light. The heart rate sensor J1 is, for example, worn on the body, such as on the wrist of the user. Note that the blood flow sensor J3 has a light source (laser) and a light receiving unit (photodiode) and measures a blood flow rate on the basis of the Doppler shift of light scattered by moving hemoglobin. Therefore, the heart rate sensor J1 and the blood flow sensor J3 can share a detecting unit.
The blood pressure sensor J2 has a compression garment (cuff) which performs compression by air being pumped into it after the cuff is wound around the upper arm, a pump which pumps air into the cuff, and a pressure sensor which measures the pressure of the cuff. The blood pressure sensor J2 determines a blood pressure on the basis of fluctuation of the cuff pressure in synchronization with the beating of the heart during a depressurization stage after the cuff has first been compressed (oscillometric method). However, the blood pressure sensor J2 may be one which shares a detecting unit with the above-described heart rate sensor J1 and blood flow sensor J3 and which has a signal processing unit that converts the change of the blood flow detected at the detecting unit into a blood pressure.
The electrocardiographic sensor J4 has an electrode and an amplifier, and acquires an electrical signal generated by the heart when worn on the chest. The body temperature sensor J5, which is a so-called electronic thermometer, measures a body temperature in a state where the body temperature sensor J5 is in contact with a body surface of the user. However, the body temperature sensor J5 may be an infrared thermography device. That is, the body temperature sensor J5 may be one which collects infrared light emitted from the face, or the like, of the user, and measures a temperature on the basis of the luminance of the infrared light radiated from the surface of the face.
The environment adjusting unit 1K includes at least one of a light adjusting unit K1, a daylighting control unit K2, a curtain control unit K3, a volume control unit K4, an air conditioning control unit K5, a chair control unit K6 and a display control unit K7. That is, the environment adjusting unit 1K is a combination of one or a plurality of these control units. However, the environment adjusting unit 1K of the present embodiment is not limited to the configuration in
The light adjusting unit K1 controls the LED built in the ceiling light L1 in accordance with a light amount designated value and a light wavelength component designated value included in the control instruction and adjusts a light amount and a wavelength component of light emitted from the ceiling light L1. The daylighting control unit K2 instructs the actuators of the window shades provided at the windows W1 to W4 and adjusts daylighting and view from the windows W1 to W4 in accordance with a daylighting designated value included in the control instruction. Here, the daylighting designated value is, for example, a value designating an opening (from fully opened to closed) of the window shade. In a similar manner, the curtain control unit K3 instructs the actuators of the curtains provided at the windows W1 to W4 and adjusts opened/closed states of the curtains at the windows W1 to W4 in accordance with an opening designated value for the curtain included in the control instruction. Here, the opening designated value is, for example, a value designating an opening (fully opened to closed) of the curtain.
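The designated values described above can be pictured as a single control-instruction payload routed to the individual control units. The following sketch is illustrative only: the field names, the 0-to-100 opening scale and the routing function are assumptions, not the format actually used by the management server 50.

```python
from dataclasses import dataclass

# Hypothetical layout of a control instruction carrying the designated
# values described above; field names and value ranges are assumptions.
@dataclass
class ControlInstruction:
    light_amount: int       # light amount designated value for the ceiling light L1
    light_wavelength: int   # light wavelength component designated value
    shade_opening: int      # daylighting designated value: 0 = closed, 100 = fully opened
    curtain_opening: int    # opening designated value for the curtain

def dispatch(inst: ControlInstruction) -> dict:
    """Map each designated value to the control unit that consumes it."""
    return {
        "K1": (inst.light_amount, inst.light_wavelength),  # light adjusting unit
        "K2": inst.shade_opening,                          # daylighting control unit
        "K3": inst.curtain_opening,                        # curtain control unit
    }
```

For example, a payload with the shades fully closed and the curtains half open would route `0` to K2 and `50` to K3.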
The volume control unit K4 adjusts the sound quality and volume of sound output by the control system 10 from the speaker 1G in accordance with a sound designated value included in the control instruction. Here, the sound designated value specifies, for example, whether high or low frequencies are emphasized, a degree of emphasis, a degree of an echo effect, a maximum volume value, a minimum volume value, or the like.
The air conditioning control unit K5 adjusts an air volume and air direction from the air conditioner AC1 and a set temperature in accordance with an air conditioning designated value included in the control instruction. Further, the air conditioning control unit K5 controls ON or OFF of a dehumidification function at the air conditioner AC1 in accordance with the control instruction. The chair control unit K6 instructs the actuator of the chair C1 to adjust a height of the seating surface and tilt of the back of the chair C1 in accordance with the control instruction. The display control unit K7 reproduces the distributed acoustic data, image data, news, or the like, and outputs the acoustic data, the image data, the news, or the like, to the display 16 and the speaker 1G.
Functional configurations of the center server 20 and the management server 50 in the work support system 1 will be described next using
In
The operation command generating unit 202 issues a command of an operation route and estimated time of arrival at a destination to the office vehicle 30W in accordance with the operation plan of the office vehicle 30W. Further, in response to a notification of the vehicle attribute information of a vehicle to be dispatched from the management server 50 with which the vehicle is to cooperate, the operation command generating unit 202 generates an operation command for the vehicle. The notification of the vehicle to be dispatched from the management server 50 includes information of the office vehicle 30W which is a dispatch destination, and the store vehicle 30S which can provide predetermined service to a user who is on board the office vehicle. The operation command generating unit 202 acquires location information of the office vehicle 30W and the store vehicle 30S at the present moment. The operation command generating unit 202 then specifies a moving route in which a point where the store vehicle 30S is located at the present moment is set as a starting point, and a point where the office vehicle 30W is located is set as a destination, for example, with reference to map data stored in an external storage device, or the like. The operation command generating unit 202 then generates an operation command to the destination from the vehicle location at the present moment for the store vehicle 30S. Note that the operation command includes instructions, or the like, such as “temporarily dropping by”, “letting the user get on or off the vehicle” and “providing tea service” for the user who utilizes the store vehicle 30S at the destination.
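The route and command generation described above might be sketched as follows, with a straight-line distance standing in for the map-data route search; the function name, the coordinate representation and the fixed travel speed are assumptions for illustration.

```python
import math

def make_operation_command(store_location, office_location, speed_kmh=30.0):
    """Build an operation command whose starting point is the current location
    of the store vehicle 30S and whose destination is the office vehicle 30W."""
    # Straight-line distance stands in for the map-data route search.
    distance_km = math.hypot(office_location[0] - store_location[0],
                             office_location[1] - store_location[1])
    return {
        "start": store_location,
        "destination": office_location,
        "eta_minutes": distance_km / speed_kmh * 60.0,
        # Instructions for the destination, as listed in the description.
        "instructions": ["temporarily dropping by",
                         "letting the user get on or off the vehicle",
                         "providing tea service"],
    }
```

A real implementation would of course derive the route and estimated time of arrival from map data and traffic conditions rather than a straight line.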
In the vehicle management DB 203, vehicle operation information regarding the plurality of vehicles 30 which autonomously travel is stored.
The management server 50 will be described next using
The support instruction generating unit 502 collects biological information of the user transmitted at a predetermined period from the vehicle 30 which functions as the office vehicle 30W, and accumulates the biological information as user state management information in the work support management DB 503, which will be described later. The biological information of the user is, for example, measured with the biosensor 1J over a certain period of time at intervals of a predetermined period. The support instruction generating unit 502 accumulates the biological information measured with the biosensor 1J in the work support management DB 503 and hands over the measured biological information to the learning machine 60 which cooperates with the management server 50. The learning machine 60 reports state information indicating the state of the user who is executing work to the management server 50 on the basis of the biological information which is handed over. The management server 50 stores the state information reported from the learning machine 60 in the user state management information and judges, on the basis of the state information, whether the user who is executing predetermined work has been put into a state where the user needs a break.
The support instruction generating unit 502, for example, gives the user who is executing work within the office vehicle 30W a notification concerning a break. The support instruction generating unit 502 makes a recommendation, for example via a speech message or a display message, encouraging the user to temporarily stop the work which is being executed and have a break. The support instruction generating unit 502 then presents a menu of services for a break which can be provided at the office vehicle 30W. The menu is, for example, displayed on a screen of the display with the touch panel 16A.
When the support instruction generating unit 502, for example, accepts selection operation of the “relaxing environment (5021)”, the support instruction generating unit 502 further displays a list of environment adjustment services which can be provided in accordance with the predetermined facility provided at the office vehicle 30W. The support instruction generating unit 502 then issues a control instruction to facility equipment selected from the list via operation input to the display with the touch panel 16A or the microphone 1F. At the office vehicle 30W, the height of the seating surface and the tilt of the back of the chair C1, daylighting from outside the car, the view of the outside of the car, dimming of lighting within the car, the temperature, humidity, or air volume of the in-vehicle space set by the air conditioner AC1, or the like, are adjusted in accordance with the control instruction.
Further, when the support instruction generating unit 502, for example, accepts selection operation of the “healing image/music (5022)”, the support instruction generating unit 502 further displays a distribution list of acoustic data and image data which can be provided via the display 16 and the speaker 1G. The support instruction generating unit 502 then distributes the acoustic data and the image data selected from the distribution list via operation input to the display with the touch panel 16A or the microphone 1F. The various kinds of data distributed to the office vehicle 30W are, for example, reproduced via the volume control unit K4 and the display control unit K7 and output to the display 16 and the speaker 1G.
Processing in the case where the “news (5023)” is selected is similar to that in the case where the “healing image/music (5022)” is selected. The support instruction generating unit 502 displays a watch list of news images and speech which can be provided via the display 16 and the speaker 1G. The watch list includes a TV channel and a radio channel which provide news images and speech. The support instruction generating unit 502 distributes the news image and speech selected from the watch list via operation input to the display with the touch panel 16A or the microphone 1F.
Further, when the support instruction generating unit 502 accepts selection operation of the “utilization of store (5024)”, the support instruction generating unit 502 further displays a list of store service which can be provided to the user within the office vehicle. The support instruction generating unit 502 accepts service selected from the list of the store service via operation input to the display with the touch panel 16A or the microphone 1F. The support instruction generating unit 502 then generates a dispatch instruction of dispatching the store vehicle 30S which can provide the selected store service to the office vehicle 30W and notifies the center server 20 with which the vehicle cooperates of the dispatch instruction. The center server 20 generates an operation command to the store vehicle 30S on the basis of the accepted dispatch instruction from the management server 50.
The work support management DB 503 will be described next. As illustrated in
The office vehicle information is management information which manages vehicle attribute information for each vehicle which functions as the office vehicle 30W.
The store vehicle information is management information managing the vehicle attribute information for each vehicle which functions as the store vehicle 30S.
The distributed data management information is information managing image data, acoustic data, news, or the like, to be distributed while the user is on a break.
The user state management information is information managing a state of the user who utilizes the office vehicle 30W. The user state management information is managed for each user. In the user state management information, information regarding the state of the user who is executing work, acquired from the biosensor 1J, the microphone 1F and the image sensor 1H mounted on the office vehicle 30W is managed.
<3. Processing Flow>
Processing relating to work support in the present embodiment will be described next with reference to
The processing in the flowchart in
The management server 50 acquires the location information and the vehicle attribute information transmitted from each of the office vehicle 30W and the store vehicle 30S (S1). The management server 50 stores the vehicle attribute information acquired from the office vehicle 30W in the office vehicle information of the work support management DB 503 in association with the vehicle ID (S2). The vehicle attribute information acquired from the office vehicle 30W includes a user ID, reservation time and work schedule of the user who utilizes the office vehicle 30W, and an identification number of the predetermined facility provided within the car for executing work. Further, the management server 50 stores the vehicle attribute information acquired from the store vehicle 30S in the store vehicle information of the work support management DB 503 in association with the vehicle ID (S3). The vehicle attribute information acquired from the store vehicle 30S includes a type of service to be provided by the store vehicle, a line of goods, opening hours, or the like. After the processing in S3, the processing illustrated in
The management server 50 acquires the biological information of the user who is executing work, which is transmitted from the office vehicle 30W (S11). The management server 50 stores the acquired biological information in the user state management information of the work support management DB 503 in association with time information and the user ID (S12). The user ID is specified on the basis of the vehicle ID. The management server 50 then estimates the state of the user who is executing work on the basis of the acquired biological information (S13). The state of the user who is executing work is, for example, estimated by the learning machine 60 which cooperates with the management server 50. The learning machine 60, for example, estimates the state of the user at the present moment on the basis of tendency to exhibit a relatively favorable state (duration of a favorable state, a proportion of a favorable state, a proportion of a favorable state with respect to duration, or the like) from temporal transition of the acquired biological information (a pulse, speech, an image, or the like). For example, the management server 50 may instruct the learning machine 60 illustrated in
The management server 50 stores the evaluation value indicating the state of the user which is the estimation result in the work support management DB 503 (S14). The management server 50 stores the evaluation value indicating the state of the user in a state field in a row corresponding to the biological information, in the user state management information. After the processing in S14, the processing illustrated in
In the case where the evaluation value indicating the state of the user is equal to or greater than the predetermined threshold (S22: “Yes”), the management server 50 judges that it is possible to continuously execute work, and finishes the processing illustrated in
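The judgment in S22 can be summarized as a simple threshold comparison. In the sketch below, the four-stage evaluation values (4 = favorable, 1 = fatigued) follow the classification described later in this specification, while the default threshold and the function name are assumptions for illustration.

```python
def needs_break(evaluation_value: int, threshold: int = 3) -> bool:
    """S22: the user can continue work while the evaluation value stays at or
    above the threshold; otherwise a break recommendation (S23) is triggered."""
    return evaluation_value < threshold
```

With a threshold of 3, an evaluation value of 1 or 2 (a fatigued or slightly fatigued state) would trigger the break recommendation and the presentation of the service menu.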
The office vehicle 30W, for example, accepts operation input from the user with respect to the menu of the service for a break displayed at the display with the touch panel 16A, and specifies an item of the service for a break selected from the menu. The office vehicle 30W transmits the item of the service for a break selected in accordance with preference of the user to the management server 50 connected to the network N via the communication unit 15. In the example in
In the processing in S25 in
Here, the processing in S26 executed by the management server 50 is one example of a “managing unit that instructs the first mobile body to provide predetermined service to a user when it is judged that the user needs a break”.
In the flowchart in
(Provision of Environment Adjustment Service)
In the processing in S32, the management server 50 notifies the office vehicle 30W of a list of items of the environment which can be adjusted at the office vehicle 30W via the environment adjusting unit 1K. At the office vehicle 30W, for example, the list of adjustable environment items is presented to the user via the display 16, the display with the touch panel 16A and the speaker 1G.
The office vehicle 30W, for example, accepts operation input from the user with respect to the list of environment items displayed at the display with the touch panel 16A. Further, the office vehicle 30W accepts, by speech via the microphone 1F, a selection instruction for an item from the list of environment items displayed at the display 16 or notified via the speaker 1G. The office vehicle 30W transmits the environment item selected in accordance with the preference of the user to the management server 50. The management server 50 receives the environment item which is selected in accordance with the preference of the user and which is transmitted from the office vehicle 30W.
In the processing in S33, the management server 50 notifies the office vehicle 30W of a control instruction of adjusting the environment within the car selected in accordance with preference of the user. The management server 50, for example, notifies the office vehicle 30W of a control instruction of controlling at least one of: lighting, daylighting, and air conditioning within the office vehicle, view of outside from the office vehicle 30W, and a height and tilt of the chair. The environment adjusting unit 1K of the office vehicle 30W controls a light adjusting unit K1, a daylighting control unit K2, a curtain control unit K3, an air conditioning control unit K5 and a chair control unit K6 in accordance with the notified control instruction to adjust the environment within the car to a state appropriate for a break. After the processing in S33, the processing illustrated in
(Provision of Distribution Service)
In the processing in S34, the management server 50 notifies the office vehicle 30W of a list of distributed data which can be provided at the office vehicle 30W. The management server 50, for example, acquires information on the channels in which acoustic data, image data, and news images and speech are to be provided, with reference to the distributed data management information in the work support management DB 503. The management server 50 then makes a notification of the acquired distributed data management information as a list. For example, the distributed data management information described using
The office vehicle 30W accepts operation input from the user with respect to the list of distributed data in similar manner to provision of the environment adjustment service. The office vehicle 30W transmits the item of distributed data selected in accordance with preference of the user to the management server 50. The management server 50 receives the item of distributed data which is selected in accordance with preference of the user, and which is transmitted from the office vehicle 30W.
In the processing in S35, the management server 50 notifies the office vehicle 30W of the distributed data such as the image data and the acoustic data selected in accordance with preference of the user. The environment adjusting unit 1K of the office vehicle 30W controls a volume control unit K4 and a display control unit K7 to reproduce the notified distributed data. The reproduced distributed data is provided to the user via acoustic equipment such as the display 16 and the speaker 1G. At the office vehicle 30W, it is possible to provide service for a break of allowing the user to get refreshed via the distributed data selected in accordance with preference of the user.
(Provision of Store Utilization Service)
In the processing in S36, the management server 50 searches for the store vehicle 30S which is available to the user on the basis of the location information of the store vehicle 30S and the office vehicle 30W, and the vehicle attribute information of the store vehicle 30S with reference to the work support management DB 503. As a result of the search, a plurality of store vehicles 30S located around the office vehicle 30W are searched for. The management server 50 extracts types of service and lines of goods which can be provided from the store vehicle information of the searched store vehicle 30S and generates a list of service which can be provided to the user. The management server 50 then notifies the office vehicle 30W of the generated list of service to be provided by each store vehicle (S37). At the office vehicle 30W, for example, the list of service to be provided by each store vehicle is presented to the user via the display 16, the display with the touch panel 16A or the speaker 1G.
The office vehicle 30W accepts operation input from the user with respect to the list of service to be provided by each store vehicle in a similar manner to provision of the environment adjustment service. The office vehicle 30W transmits service utilizing the store vehicle 30S selected in accordance with preference of the user to the management server 50. The management server 50 receives the service utilizing the store vehicle 30S selected in accordance with preference of the user.
In the processing in S38, the management server 50 specifies the store vehicle 30S which provides the service selected in accordance with the preference of the user. For example, the store vehicle 30S which provides tea service of offering a light meal, tea, coffee, or the like, or a service of massage, sauna, shower, or the like, is specified.
The management server 50 extracts a vehicle ID of the specified store vehicle 30S from the store vehicle information and generates a dispatch instruction of dispatching the store vehicle to the office vehicle 30W (S39). The dispatch instruction includes, for example, vehicle IDs of the store vehicle 30S and the office vehicle 30W, and an instruction of instructing the store vehicle 30S to “provide tea service” to the office vehicle 30W. The management server 50 notifies the center server 20 with which the management server 50 cooperates of the generated dispatch instruction. After the processing in S39, the processing illustrated in
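A dispatch instruction of the kind described above might be assembled as in the following sketch; the specification only states that the instruction carries the two vehicle IDs and the service to be provided, so the concrete payload layout and function name shown here are assumptions.

```python
def make_dispatch_instruction(store_vehicle_id: str, office_vehicle_id: str,
                              service: str) -> dict:
    """Build the dispatch instruction notified to the center server 20 (S39)."""
    return {
        "store_vehicle_id": store_vehicle_id,    # ID extracted from the store vehicle information
        "office_vehicle_id": office_vehicle_id,  # dispatch destination
        "instruction": f"provide {service}",     # e.g. "provide tea service"
    }
```

The center server 20 would then turn this payload into an operation command for the store vehicle 30S, as described earlier.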
As described above, it is possible to manage the state of the user who is executing work within the office vehicle on the basis of the user information including the biological information, work schedule, or the like, in the work support system 1 of the present embodiment. When it is judged that the user who is executing work is put into a state where the user needs a break, the work support system 1 provides predetermined service for a break to the user in accordance with preference.
For example, the work support system 1 of the present embodiment can judge whether the user who is executing work within the office vehicle needs a break on the basis of transition and change of a state indicated by the biological information included in the user information.
For example, the management server 50 of the work support system 1 notifies the office vehicle 30W of a control instruction for controlling the brightness and dimming of lighting within the vehicle, daylighting from outside, the view from the office vehicle, the temperature, humidity and air volume of air conditioning within the office vehicle, the tilt of the chair within the office vehicle, and the like. At the office vehicle 30W, the corresponding lighting, curtains, window shades, air conditioner, chair, or the like, are controlled in accordance with the control instruction, so that it is possible to provide an environment state appropriate for a break.
Further, for example, the management server 50 notifies the office vehicle 30W of distributed data such as acoustic data such as sound and music, image data and news. At the office vehicle 30W, the distributed data is reproduced via acoustic equipment such as the display and the speaker, so that it is possible to provide service for a break for allowing the user to get refreshed.
Further, for example, the management server 50 instructs the store vehicle 30S which provides tea service such as a light meal, tea and coffee, or services such as massage, sauna and shower, to meet the office vehicle 30W and provide its own store service to the user. At the office vehicle 30W, it is possible to allow the user to utilize the store service via the store vehicle 30S that has come to meet it. According to the work support system 1 of the present embodiment, it is possible to provide a support technique which enables the user to appropriately execute predetermined work within the mobile body.
While, in the processing in the above-described S13, a processing example using deep learning has been described for estimating the state of the user who is executing work, the processing is not limited to this. For example, it is also possible to judge whether the user needs a break in accordance with the duration of work of the user, or with behavior indicating fatigue, such as yawning or the frequency with which the period during which the eyelids are closed exceeds a predetermined time period. The management server 50, for example, specifies interruption of work by the user on the basis of the image acquired via the image sensor 1H. For example, an action in which the user leaves the chair C1 on which the user is seated is specified as interruption of work through image recognition. In a similar manner, yawning and opening and closing of the eyelids are specified from changes of the mouth portion, changes of the eye portion, tilt of the face in the front-back direction and in the right-left direction, or the like, in the face image. The management server 50 then may calculate an evaluation value based on the frequency of the above-described behavior, or the like, within a predetermined time period as a point value.
In the processing in S42, a point value with respect to the number of times of yawning specified from the image is calculated in a similar manner. The point value “NA×w2”, in which the counted number of times of yawning is set as “NA” and a weighting coefficient is set as “w2”, is calculated. In a similar manner, in the processing in S43, a point value (“NS×w3”) is calculated from the number of times of detection of drowsiness (“NS”) counted from the image and a weighting coefficient (“w3”).
The management server 50 then performs comprehensive evaluation indicating the state of the user who is executing predetermined work, on the basis of the respective point values calculated in the processing from S41 to S43 (S44). The result of the comprehensive evaluation is, for example, expressed as an evaluation value classified into four stages to be stored in the state field of the user state management information. For example, the management server 50 adds up the respective point values calculated in processing from S41 to S43 to calculate a total point value. The management server 50 then, for example, may classify the calculated total point value less than “5” into the evaluation value of “4” indicating a favorable state for executing work. In this case, for example, the total point value of “from 5 to 10” is classified into the evaluation value of “3” indicating a slightly favorable state, the total point value of “from 10 to 20” is classified into the evaluation value of “2” indicating a slightly fatigued state, and the total point value equal to or greater than “20” is classified into the evaluation value of “1” indicating a fatigued state. The management server 50 stores the evaluation result in the state field of the user state management information illustrated in
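The point calculation and four-stage classification in S41 to S44 can be expressed directly in code. In the sketch below, the weighting coefficients w1 to w3 are free parameters left open by the description, so the default values are placeholders, and the handling of the boundary values 5, 10 and 20 is one possible reading of the stated ranges.

```python
def evaluate_state(n_interruptions: int, n_yawns: int, n_drowsy: int,
                   w1: float = 1.0, w2: float = 1.0, w3: float = 2.0) -> int:
    """Comprehensive evaluation (S44) from the point values of S41 to S43."""
    # S41: NBR*w1, S42: NA*w2, S43: NS*w3 - summed into a total point value.
    total = n_interruptions * w1 + n_yawns * w2 + n_drowsy * w3
    if total < 5:
        return 4   # favorable state for executing work
    if total < 10:
        return 3   # slightly favorable state
    if total < 20:
        return 2   # slightly fatigued state
    return 1       # fatigued state
```

A total point value of 7, for example, falls in the “from 5 to 10” range and maps to the evaluation value “3”.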
Further, the management server 50 may calculate average duration of work in a predetermined time period from the counted number of times of interruption of work (NBR). Here, the calculated average duration of work is indicated as (predetermined time period/the number of times of interruption of work). The management server 50 then may classify the state of the user on the basis of the average duration of work.
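The average duration of work described above is a single division; the sketch below simply guards against the case where no interruption was counted in the window, a case the description leaves open.

```python
def average_work_duration(window_minutes: float, n_interruptions: int) -> float:
    """Average duration of work = predetermined time period / number of
    times of interruption of work (NBR)."""
    if n_interruptions == 0:
        return window_minutes  # assumption: uninterrupted work spans the whole window
    return window_minutes / n_interruptions
```

For a 60-minute window with 4 interruptions, the average duration of work is 15 minutes, which could then be used to classify the state of the user.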
In S45, the management server 50 determines elapse of the predetermined time period, and, in the case where the predetermined time period is finished (S45: “Yes”), finishes the processing illustrated in
As described above, in the work support system 1 of the present embodiment, it is possible to determine that the user who is executing work is put into a state where the user needs a break on the basis of at least one of the number of times of occurrence of a biological phenomenon in the biological information per unit period and a proportion of a work period which is determined from the image in an elapsed time period.
[Computer Readable Recording Medium]
It is possible to record a program which causes a computer, or other machine, apparatuses (hereinafter, a computer, or the like) to implement one of the above-described functions in a computer readable recording medium. Then, by causing the computer, or the like, to load and execute the program in this recording medium, it is possible to provide the function.
Here, the computer readable recording medium refers to a non-transitory recording medium in which information such as data and programs is accumulated through electric, magnetic, optical, mechanical or chemical action and from which the information can be read by a computer, or the like. Among such recording media, examples of a recording medium which is detachable from the computer, or the like, include a flexible disk, a magneto-optical disk, a CD-ROM, a CD-R/W, a DVD, a Blu-ray disc, a DAT, an 8 mm tape, a memory card such as a flash memory, and the like. Further, examples of a recording medium fixed to the computer, or the like, include a hard disk, a ROM (read only memory), and the like. Still further, an SSD (Solid State Drive) can be utilized both as a recording medium which is detachable from the computer, or the like, and as a recording medium which is fixed to the computer, or the like.