This application is based upon and claims the benefit of priority under 35 USC 119 of Japanese Patent Application No. 2023-158177, filed on Sep. 22, 2023, the entire disclosure of which, including the description, claims, drawings, and abstract, is incorporated herein by reference in its entirety.
The present disclosure relates generally to a robot, a control method, and a recording medium.
Electronic devices that imitate living creatures such as pets, human beings, and the like are known in the related art. For example, Unexamined Japanese Patent Application Publication No. 2003-159681 describes a robot device that, when a specific input is provided, exhibits a specific action associated with the specific input.
A robot according to an embodiment of the present disclosure includes:
A more complete understanding of this application can be obtained when the following detailed description is considered in conjunction with the following drawings, in which:
Hereinafter, embodiments of the present disclosure are described while referencing the drawings. Note that, in the drawings, identical or corresponding components are denoted with the same reference numerals.
The robot 200 according to Embodiment 1 includes an exterior 201, decorative parts 202, bushy fur 203, a head 204, a coupler 205, a torso 206, a housing 207, a touch sensor 211, an acceleration sensor 212, a microphone 213, an illuminance sensor 214, and a speaker 231 identical to those of the robot 200 disclosed in Unexamined Japanese Patent Application Publication No. 2023-115370 and, as such, description thereof is foregone. Note that the shape of the head 204 may be the shape illustrated in
The robot 200 according to Embodiment 1 includes a twist motor 221 and a vertical motor 222 identical to those of the robot 200 disclosed in Unexamined Japanese Patent Application Publication No. 2023-115370 and, as such, description thereof is foregone. The twist motor 221 and the vertical motor 222 of the robot 200 according to Embodiment 1 operate in the same manner as those of the robot 200 disclosed in Unexamined Japanese Patent Application Publication No. 2023-115370.
The robot 200 includes a gyrosensor 215. By using the acceleration sensor 212 and the gyrosensor 215, the robot 200 can detect a change in its own attitude, and can detect being picked up by the user, having its orientation changed, being thrown, and the like.
Note that at least some of the acceleration sensor 212, the microphone 213, the gyrosensor 215, the illuminance sensor 214, and the speaker 231 are not limited to being provided on the torso 206 and may be provided on the head 204, or may be provided on both the torso 206 and the head 204.
Next, the functional configuration of the robot 200 is described while referencing
The control device 100 is a device that controls the robot 200. The control device 100 includes a controller 110 that is an example of control means, a storage 120 that is an example of storage means, and a communicator 130 that is an example of communication means.
The controller 110 includes a central processing unit (CPU). In one example, the CPU is a microprocessor or the like and is a central processing unit that executes a variety of processing and computations. In the controller 110, the CPU reads out a control program stored in the ROM and controls the behavior of the entire robot 200 while using the RAM as working memory. Additionally, while not illustrated in the drawings, the controller 110 is provided with a clock function, a timer function, and the like, and can measure the date and time, and the like. The controller 110 may also be called a “processor.”
The storage 120 includes read-only memory (ROM), random access memory (RAM), flash memory, and the like. The storage 120 stores an operating system (OS), application programs, and other programs and data used by the controller 110 to perform the various processes. Moreover, the storage 120 stores data generated or acquired as a result of the controller 110 performing the various processes.
The communicator 130 includes an interface for communicating with external devices of the robot 200. In one example, the communicator 130 communicates with external devices including the terminal device 50 in accordance with a known communication standard such as a wireless local area network (LAN), Bluetooth Low Energy (BLE, registered trademark), Near Field Communication (NFC), or the like.
The sensor 210 includes the touch sensor 211, the acceleration sensor 212, the gyrosensor 215, the illuminance sensor 214, and the microphone 213 described above. The sensor 210 is an example of detection means that detects an external stimulus.
The touch sensor 211 includes, for example, a pressure sensor and a capacitance sensor, and detects contact by some sort of object. The controller 110 can, on the basis of detection values of the touch sensor 211, detect that the robot 200 is being petted, is being struck, and the like by the user.
The acceleration sensor 212 detects an acceleration applied to the torso 206 of the robot 200. The acceleration sensor 212 detects acceleration in each of the X axis direction, the Y axis direction, and the Z axis direction. That is, the acceleration sensor 212 detects acceleration on three axes.
In one example, the acceleration sensor 212 detects gravitational acceleration when the robot 200 is stationary. The controller 110 can detect the current attitude of the robot 200 on the basis of the gravitational acceleration detected by the acceleration sensor 212. In other words, the controller 110 can detect whether the housing 207 of the robot 200 is inclined from the horizontal direction on the basis of the gravitational acceleration detected by the acceleration sensor 212. Thus, the acceleration sensor 212 functions as an incline detection means that detects the inclination of the robot 200.
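Purely as an illustration of how an inclination can be derived from gravitational acceleration, the following Python sketch computes pitch and roll angles from a three-axis accelerometer reading; the function name and the unit convention (readings in units of g) are assumptions, not part of the disclosure.

```python
import math

def compute_inclination(ax: float, ay: float, az: float):
    """Estimate pitch and roll (in degrees) from a stationary three-axis
    accelerometer reading (ax, ay, az) expressed in units of g."""
    # With the housing level, gravity lies entirely on the Z axis.
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

print(compute_inclination(0.0, 0.0, 1.0))     # level housing -> (0.0, 0.0)
print(compute_inclination(-0.5, 0.0, 0.866))  # pitched by about 30 degrees
```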
Additionally, when the user picks up or throws the robot 200, the acceleration sensor 212 detects, in addition to the gravitational acceleration, acceleration caused by the movement of the robot 200. Accordingly, the controller 110 can detect the movement of the robot 200 by removing the gravitational acceleration component from the detection value detected by the acceleration sensor 212.
The gyrosensor 215 detects an angular velocity when rotation is applied to the torso 206 of the robot 200. Specifically, the gyrosensor 215 detects the angular velocity on three axes of rotation, namely rotation around the X axis, rotation around the Y axis, and rotation around the Z axis. It is possible to more accurately detect the movement of the robot 200 by combining the detection value detected by the acceleration sensor 212 and the detection value detected by the gyrosensor 215.
Note that, at a synchronized timing (for example, every 0.25 seconds), the touch sensor 211, the acceleration sensor 212, and the gyrosensor 215 respectively detect the strength of contact, the acceleration, and the angular velocity, and output the detection values to the controller 110.
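As a rough sketch of how the two detection values might be combined, the following complementary filter blends gyro integration with the accelerometer-derived angle at the 0.25-second sampling interval mentioned above; this is an assumed, generic fusion method, not one described for the robot 200.

```python
def complementary_filter(angle: float, gyro_rate: float,
                         accel_angle: float, dt: float = 0.25,
                         alpha: float = 0.98) -> float:
    """Blend the integrated gyro angle (smooth but drifting) with the
    accelerometer angle (noisy but drift-free); alpha weights the gyro."""
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle
```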
The microphone 213 detects ambient sound of the robot 200. The controller 110 can, for example, detect, on the basis of a component of the sound detected by the microphone 213, that the user is speaking to the robot 200, that the user is clapping their hands, and the like.
The illuminance sensor 214 detects the illuminance of the surroundings of the robot 200. The controller 110 can detect that the surroundings of the robot 200 have become brighter or darker on the basis of the illuminance detected by the illuminance sensor 214.
The controller 110 acquires, via the bus line BL and as an external stimulus, detection values detected by the various sensors of the sensor 210. The external stimulus is a stimulus that acts on the robot 200 from outside the robot 200. Examples of the external stimulus include “there is a loud sound”, “spoken to”, “petted”, “picked up”, “turned upside down”, “became brighter”, “became darker”, and the like.
In one example, the controller 110 acquires the external stimulus of “there is a loud sound” or “spoken to” by the microphone 213, and acquires the external stimulus of “petted” by the touch sensor 211. Additionally, the controller 110 acquires the external stimulus of “picked up” or “turned upside down” by the acceleration sensor 212 and the gyrosensor 215, and acquires the external stimulus of “became brighter” or “became darker” by the illuminance sensor 214.
Note that a configuration is possible in which the sensor 210 includes sensors other than the touch sensor 211, the acceleration sensor 212, the gyrosensor 215, the illuminance sensor 214, and the microphone 213. The types of external stimuli acquirable by the controller 110 can be increased by increasing the types of sensors of the sensor 210.
The driver 220 includes the twist motor 221 and the vertical motor 222, and is driven by the controller 110. The twist motor 221 is a servo motor for rotating the head 204, with respect to the torso 206, in the left-right direction (the width direction) with the front-back direction as an axis. The vertical motor 222 is a servo motor for rotating the head 204, with respect to the torso 206, in the up-down direction (height direction) with the left-right direction as an axis. The robot 200 can express movements of turning the head 204 to the side by using the twist motor 221, and can express movements of lifting/lowering the head 204 by using the vertical motor 222.
The outputter 230 includes the speaker 231, and sound is output from the speaker 231 as a result of sound data being input into the outputter 230 by the controller 110. For example, the robot 200 emits a pseudo-animal sound as a result of the controller 110 inputting animal sound data of the robot 200 into the outputter 230.
A configuration is possible in which, instead of the speaker 231, or in addition to the speaker 231, a display such as a liquid crystal display, a light emitter such as a light emitting diode (LED), or the like is provided as the outputter 230, and emotions such as joy, sadness, and the like are displayed on the display, expressed by the color and brightness of the emitted light, or the like.
The operator 240 includes an operation button, a volume knob, or the like. In one example, the operator 240 is an interface for receiving user operations such as turning the power ON/OFF, adjusting the volume of the output sound, and the like.
A battery 250 is a rechargeable secondary battery, and stores power to be used in the robot 200. The battery 250 is charged when the robot 200 has moved to a charging station.
A position information acquirer 260 includes a position information sensor that uses a global positioning system (GPS), and acquires current position information of the robot 200. Note that the position information acquirer 260 is not limited to GPS, and a configuration is possible in which the position information acquirer 260 acquires the position information of the robot 200 by a common method that uses wireless communication, or acquires the position information of the robot 200 through an application/software of the terminal device 50.
The controller 110 functionally includes an action information acquirer 111 that is an example of action information acquiring means and acquiring means, a state parameter acquirer 112 that is an example of state controlling means, and an action controller 113 that is an example of action controlling means and controlling means. In the controller 110, the CPU reads the program stored in the ROM out to the RAM and executes that program, thereby functioning as the various components described above.
Additionally, the storage 120 stores action information 121, a state parameter 122, log information 123, a coefficient table 124, and an individuality table 125.
Next, the configuration of the terminal device 50 is described while referencing
The controller 510 includes a CPU. In the controller 510, the CPU reads a control program stored in the ROM and controls the operations of the entire terminal device 50 while using the RAM as working memory. The controller 510 may also be called a “processor.”
The storage 520 includes a ROM, a RAM, a flash memory, and the like. The storage 520 stores programs and data used by the controller 510 to perform various processes. Moreover, the storage 520 stores data generated or acquired as a result of the controller 510 performing the various processes.
The operator 530 includes an input device such as a keyboard, a mouse, a touch pad, a touch panel, and the like, and receives operation inputs from the user.
The display 540 includes a display device such as a liquid crystal display or the like, and displays various images on the basis of control by the controller 510. The display 540 is an example of display means.
The communicator 550 includes a communication interface for communicating with external devices of the terminal device 50. In one example, the communicator 550 communicates with external devices including the robot 200 in accordance with a known communication standard such as a wireless LAN, BLE (registered trademark), NFC, or the like.
The controller 510 functionally includes an action information creator 511 that is an example of action information creating means. In the controller 510, the CPU reads the program stored in the ROM out to the RAM and executes that program, thereby functioning as the component described above.
Returning to
The “movement” refers to a physical motion of the robot 200, executed by driving of the driver 220. Specifically, the “movement” corresponds to moving the head 204 relative to the torso 206 by the twist motor 221 or the vertical motor 222. The “sound output” refers to outputting of various sounds such as animal sounds or the like from the speaker 231 of the outputter 230.
The action information 121 defines, by combinations of such movements and sound outputs (animal sounds), the actions that the robot 200 is to execute. A configuration is possible in which the action information 121 is incorporated into the robot 200 in advance, but it is also possible for the user to freely create the action information 121 by operating the terminal device 50.
Returning to
Specifically, the user operates the operator 530 to start up a programming application/software installed in advance on the terminal device 50. As a result, the action information creator 511 displays an action information 121 creation screen such as illustrated in
The execution order and execution timing of a movement and a sound output (animal sound) that the robot 200 is to be caused to execute can respectively be set in a movement field and a sound field of the creation screen. The user can, by selecting and combining movements and sounds from menus while viewing this creation screen, freely program an action that the robot 200 is to be caused to execute.
Specifically, in the example of
The action information 121 created by such user operations more specifically has the configuration illustrated in
The trigger is a condition for the robot 200 to execute the action. Upon the trigger defined for a certain action being met, that action is executed by the robot 200. In the example of
Here, “Upon speech recognition” corresponds to a case in which the action name is recognized, by a speech recognition function of the robot 200, from speech of the user detected by the microphone 213. Additionally, “Upon head being petted” corresponds to a case in which petting of the head 204 of the robot 200 by the user is detected by the touch sensor 211.
Note that the trigger is not limited to the examples described above, and various conditions may be used. For example, the trigger may be a case in which “There is a loud sound” is detected by the microphone 213, a case in which “Picked up” or “Turned upside down” is detected by the acceleration sensor 212 and the gyrosensor 215, or a case in which
“Became brighter” or “Became darker” is detected by the illuminance sensor 214. These can be called triggers based on external stimuli detected by the sensor 210. Alternatively, the trigger may be “A specific time arrived” or “The robot 200 moved to a specific location.” These can be called triggers not based on external stimuli.
Note that a configuration is possible in which, when an execution command is received from the terminal device 50, each action is executed regardless of the trigger defined in the action information 121.
The action control parameters are parameters for causing the robot 200 to execute each action. The action control parameters include various items, namely a movement, an animal sound, an execution start timing, a movement parameter, and an animal sound parameter.
The movement item defines the types and order of the movements constituting each action. The animal sound item defines the types and order of the sound outputs constituting each action. The execution start timing defines a timing at which to execute each of the movements or animal sounds constituting each action. Specifically, the execution start timing defines, for each movement or animal sound, a timing that is an origin point for execution, and an amount of execution time.
The movement parameter defines, for each of the movements constituting each action, an amount of movement time and a movement distance of the twist motor 221 or the vertical motor 222 when executing that movement. The animal sound parameter defines, for each animal sound constituting each action, a volume of the sound output from the speaker 231 when executing that animal sound.
The execution count is a cumulative number of times that each action has been executed by the robot 200. An initial value of the execution count is 0. The execution count of each action is increased by 1 each time the robot 200 executes that action. The previous execution date and time is the date and time at which the robot 200 last executed each action.
The action information creator 511 creates, on the basis of user commands, the action information 121 having the data configuration described above. When the action information 121 is created, the action information creator 511 communicates with the robot 200 via the communicator 550, and sends the created action information 121 to the robot 200. In the robot 200, the action information acquirer 111 communicates with the terminal device 50 via the communicator 130, and acquires and saves, in the storage 120, the action information 121 created in the terminal device 50. Thus, the action information acquirer 111 acquires a plurality of action control parameters corresponding to the plurality of actions executable by the robot 200. Note that the action control parameters are an example of action control information for causing the robot 200 to execute an action.
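The data configuration of the action information 121 described above could be sketched, purely for illustration, as the following Python structures; all field names and example values are hypothetical, not taken from the disclosure.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Optional

@dataclass
class ActionElement:
    kind: str              # "movement" or "animal_sound"
    name: str              # e.g., "turn_head" or "chirp" (hypothetical)
    start_offset_s: float  # execution start timing from the origin point
    duration_s: float      # amount of execution time
    parameter: float       # movement time/distance, or sound volume

@dataclass
class ActionInfo:
    name: str                      # action name, e.g., "Test 1"
    trigger: str                   # e.g., "speech_recognized", "head_petted"
    elements: List[ActionElement] = field(default_factory=list)
    execution_count: int = 0       # cumulative count, initial value 0
    last_executed: Optional[datetime] = None  # previous execution date/time
```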
Returning to
The emotion parameter is a parameter that represents a pseudo-emotion of the robot 200. The emotion parameter is expressed by coordinates (X, Y) on an emotion map 300.
As illustrated in
The emotion parameter represents a plurality (in the present embodiment, four) of mutually different pseudo-emotions. In
Note that, in
The state parameter acquirer 112 calculates an emotion change amount, which is an amount of change by which each of the X value and the Y value of the emotion parameter is increased or decreased. The emotion change amount is expressed by the following four variables.
DXP: Tendency to relax (tendency to change in the positive value direction of the X value on the emotion map)
DXM: Tendency to worry (tendency to change in the negative value direction of the X value on the emotion map)
DYP: Tendency to be excited (tendency to change in the positive value direction of the Y value on the emotion map)
DYM: Tendency to be disinterested (tendency to change in the negative value direction of the Y value on the emotion map)
The state parameter acquirer 112 updates the emotion parameter by adding or subtracting a value, among the emotion change amounts DXP, DXM, DYP, and DYM, corresponding to the external stimulus to or from the current emotion parameter. For example, when the head 204 is petted, the pseudo-emotion of the robot 200 is relaxed and, as such, the state parameter acquirer 112 adds the DXP to the X value of the emotion parameter. Conversely, when the head 204 is struck, the pseudo-emotion of the robot 200 is worried and, as such, the state parameter acquirer 112 subtracts the DXM from the X value of the emotion parameter. Which emotion change amount is associated with the various external stimuli can be set as desired. An example is given below.
The head 204 is petted (relax): X=X+DXP
The head 204 is struck (worry): X=X−DXM
(these external stimuli can be detected by the touch sensor 211 of the head 204)
The torso 206 is petted (excite): Y=Y+DYP
The torso 206 is struck (disinterest): Y=Y−DYM
(these external stimuli can be detected by the touch sensor 211 of the torso 206)
Held with head upward (happy): X=X+DXP and Y=Y+DYP
Suspended with head downward (sad): X=X−DXM and Y=Y−DYM
(these external stimuli can be detected by the touch sensor 211 and the acceleration sensor 212)
Spoken to in kind voice (peaceful): X=X+DXP and Y=Y−DYM
Yelled at in loud voice (upset): X=X−DXM and Y=Y+DYP
(these external stimuli can be detected by the microphone 213)
The sensor 210 acquires a plurality of external stimuli of different types by a plurality of sensors. The state parameter acquirer 112 derives various emotion change amounts in accordance with each individual external stimulus of the plurality of external stimuli, and sets the emotion parameter in accordance with the derived emotion change amounts.
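A minimal sketch of the update rule listed above, assuming hypothetical stimulus names and clamping of the coordinates to the emotion map range (the map's maximum and minimum values are described later):

```python
# Hypothetical stimulus names mapped to the updates listed above.
STIMULUS_EFFECTS = {
    "head_petted":    lambda e, d: (e[0] + d["DXP"], e[1]),
    "head_struck":    lambda e, d: (e[0] - d["DXM"], e[1]),
    "torso_petted":   lambda e, d: (e[0], e[1] + d["DYP"]),
    "torso_struck":   lambda e, d: (e[0], e[1] - d["DYM"]),
    "held_head_up":   lambda e, d: (e[0] + d["DXP"], e[1] + d["DYP"]),
    "hung_head_down": lambda e, d: (e[0] - d["DXM"], e[1] - d["DYM"]),
    "kind_voice":     lambda e, d: (e[0] + d["DXP"], e[1] - d["DYM"]),
    "loud_voice":     lambda e, d: (e[0] - d["DXM"], e[1] + d["DYP"]),
}

def update_emotion(emotion, deltas, stimulus, map_limit):
    """Apply one external stimulus and clamp (X, Y) to the emotion map."""
    x, y = STIMULUS_EFFECTS[stimulus](emotion, deltas)
    clamp = lambda v: max(-map_limit, min(map_limit, v))
    return clamp(x), clamp(y)

deltas = {"DXP": 10, "DXM": 10, "DYP": 10, "DYM": 10}  # initial values
print(update_emotion((0, 0), deltas, "head_petted", 100))  # -> (10, 0)
```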
The initial value of these emotion change amounts DXP, DXM, DYP, and DYM is 10, and the amounts increase to a maximum of 20. The state parameter acquirer 112 updates the various variables, namely the emotion change amounts DXP, DXM, DYP, and DYM in accordance with the external stimuli detected by the sensor 210.
Specifically, when the X value of the emotion parameter is set to the maximum value of the emotion map 300 even once in one day, the state parameter acquirer 112 adds 1 to the DXP, and when the Y value of the emotion parameter is set to the maximum value of the emotion map 300 even once in one day, the state parameter acquirer 112 adds 1 to the DYP. Additionally, when the X value of the emotion parameter is set to the minimum value of the emotion map 300 even once in one day, the state parameter acquirer 112 adds 1 to the DXM, and when the Y value of the emotion parameter is set to the minimum value of the emotion map 300 even once in one day, the state parameter acquirer 112 adds 1 to the DYM.
Thus, the state parameter acquirer 112 changes the emotion change amounts in accordance with a condition based on whether the value of the emotion parameter reaches the maximum value or the minimum value of the emotion map 300 (first condition based on external stimulus). As an example, assume that all of the initial values of the various variables of the emotion change amount are set to 10. The state parameter acquirer 112 increases the various variables to a maximum of 20 by updating the emotion change amounts described above. Due to this updating processing, each emotion change amount, that is, the degree of change of emotion, changes.
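The once-per-day update described above might be sketched as follows; the boolean flags recording whether each bound was reached during the day are an assumed bookkeeping detail.

```python
def update_change_amounts(deltas, hit_max_x, hit_min_x, hit_max_y, hit_min_y,
                          cap=20):
    """Add 1 to a change amount if the corresponding emotion map bound was
    reached at least once during the day, capping each variable at 20."""
    if hit_max_x: deltas["DXP"] = min(cap, deltas["DXP"] + 1)
    if hit_min_x: deltas["DXM"] = min(cap, deltas["DXM"] + 1)
    if hit_max_y: deltas["DYP"] = min(cap, deltas["DYP"] + 1)
    if hit_min_y: deltas["DYM"] = min(cap, deltas["DYM"] + 1)
    return deltas
```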
For example, when only the head 204 is petted multiple times, only the emotion change amount DXP increases and the other emotion change amounts do not change. As such, the robot 200 develops a personality having a tendency to be relaxed. When only the head 204 is struck multiple times, only the emotion change amount DXM increases and the other emotion change amounts do not change. As such, the robot 200 develops a personality having a tendency to be worried. Thus, the state parameter acquirer 112 changes the emotion change amounts in accordance with various external stimuli.
The personality parameter is a parameter expressing the pseudo-personality of the robot 200. The personality parameter includes a plurality of personality values that express degrees of mutually different personalities. The state parameter acquirer 112 changes the plurality of personality values included in the personality parameter in accordance with external stimuli detected by the sensor 210.
Specifically, the state parameter acquirer 112 calculates four personality values on the basis of (Equation 1) below. Namely, a value obtained by subtracting 10 from DXP, which expresses a tendency to be relaxed, is set as a personality value (chipper); a value obtained by subtracting 10 from DXM, which expresses a tendency to be worried, is set as a personality value (shy); a value obtained by subtracting 10 from DYP, which expresses a tendency to be excited, is set as a personality value (active); and a value obtained by subtracting 10 from DYM, which expresses a tendency to be disinterested, is set as a personality value (spoiled).
Personality value (chipper)=DXP−10
Personality value (shy)=DXM−10
Personality value (active)=DYP−10
Personality value (spoiled)=DYM−10 (Equation 1)
As a result, as illustrated in
Since the initial value of each of the personality values is 0, the personality at the time of birth of the robot 200 is expressed by the origin of the personality value radar chart 400. Moreover, as the robot 200 grows, the four personality values change, with an upper limit of 10, due to external stimuli and the like (the manner in which the user interacts with the robot 200) detected by the sensor 210. Therefore, 11 to the power of 4 = 14,641 types of personalities can be expressed.
Thus, the robot 200 assumes various personalities in accordance with the manner in which the user interacts with the robot 200. That is, the personality of each individual robot 200 is formed differently on the basis of the manner in which the user interacts with the robot 200.
These four personality values are fixed when the juvenile period elapses and the pseudo-growth of the robot 200 is complete. In the subsequent adult period, the state parameter acquirer 112 adjusts four personality correction values (chipper correction value, active correction value, shy correction value, and spoiled correction value) in order to correct the personality in accordance with the manner in which the user interacts with the robot 200.
The state parameter acquirer 112 adjusts the four personality correction values in accordance with a condition based on where the area in which the emotion parameter has existed the longest is located on the emotion map 300. Specifically, the four personality correction values are adjusted as in (A) to (E) below.
Note that the various areas of relaxed, excited, worried, disinterested, and center are examples and, for example, a configuration is possible in which the emotion map 300 is divided into more detailed areas such as happy, excited, upset, sad, peaceful, normal, and the like.
When setting the four personality correction values, the state parameter acquirer 112 calculates the four personality values in accordance with (Equation 2) below.
Personality value (chipper)=DXP−10+chipper correction value
Personality value (shy)=DXM−10+shy correction value
Personality value (active)=DYP−10+active correction value
Personality value (spoiled)=DYM−10+spoiled correction value (Equation 2)
The battery level is the remaining amount of power stored in the battery 250, and is a parameter expressing a pseudo degree of hunger of the robot 200. The state parameter acquirer 112 acquires information about the current battery level from a power supply controller that controls charging and discharging of the battery 250.
The current location is the location at which the robot 200 is currently positioned. The state parameter acquirer 112 acquires information about the current position of the robot 200 from the position information acquirer 260.
More specifically, the state parameter acquirer 112 references past position information of the robot 200 stored in the log information 123. The log information 123 is data in which past action data of the robot 200 is recorded. Specifically, the log information 123 includes the past position information, emotion parameters, and personality parameters of the robot 200, data expressing changes in the state parameters 122 such as the battery level, and sleep data expressing past wake-up times, bed times, and the like of the robot 200.
The state parameter acquirer 112 determines that the current location is home when the current location matches the position with the highest record frequency. When the current location is not home, the state parameter acquirer 112 determines, on the basis of the past record count of that location in the log information 123, whether the current location is a location visited for the first time, a frequently visited location, a location not frequently visited, or the like, and acquires determination information thereof. For example, when the past record count is five times or greater, the state parameter acquirer 112 determines that the current location is a frequently visited location, and when the past record count is less than five times, the state parameter acquirer 112 determines that the current location is a location not frequently visited.
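Purely as an illustration of this determination, the following sketch classifies the current location from past record counts in the log; the dictionary-based log and the five-visit threshold as a parameter are assumptions.

```python
def classify_location(current, record_counts, threshold=5):
    """Classify the current location from past record counts, where
    record_counts maps each recorded location to its record count."""
    # Home is taken to be the most frequently recorded location.
    home = max(record_counts, key=record_counts.get, default=None)
    if current == home:
        return "home"
    count = record_counts.get(current, 0)
    if count == 0:
        return "first_visit"
    return "frequent" if count >= threshold else "infrequent"

counts = {"living_room": 120, "park": 7, "cafe": 2}
print(classify_location("park", counts))  # -> "frequent"
```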
The current time is the time at present. The state parameter acquirer 112 acquires the current time by a clock provided to the robot 200. Note that, as with the acquisition of the position information, the acquisition of the current time is not limited to this method.
More specifically, the state parameter acquirer 112 references a present day wake-up time and a past average bed time recorded in the log information 123 to determine whether the current time is immediately after the wake-up time of the present day or immediately before the bed time.
The log information 123 includes sleep data. While not illustrated in the drawings, the sleep data includes a sleep log and compiled sleep data. The past wake-up time and bed time of the robot 200 are recorded every day in the sleep log. The compiled sleep data is data compiled from the sleep log, and the average wake-up time and the average bed time for every day are recorded in the compiled sleep data.
In one example, when the current time is within 30 minutes after the wake-up time of the present day, the state parameter acquirer 112 determines that the current time is immediately after the wake-up time of the present day. Additionally, when the current time is within 30 minutes before the past average bed time, the state parameter acquirer 112 determines that the current time is immediately before the bed time.
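The two 30-minute windows can be sketched as follows, assuming the wake-up time and average bed time are available as datetime values:

```python
from datetime import datetime, timedelta
from typing import Optional

def time_context(now: datetime, wake_today: datetime,
                 avg_bedtime: datetime, window_min: int = 30) -> Optional[str]:
    """Return 'after_wake' within 30 minutes after today's wake-up time,
    'before_bed' within 30 minutes before the average bed time, else None."""
    window = timedelta(minutes=window_min)
    if wake_today <= now <= wake_today + window:
        return "after_wake"
    if avg_bedtime - window <= now <= avg_bedtime:
        return "before_bed"
    return None
```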
While not illustrated in the drawings, the past nap times of the robot 200 are recorded in the sleep data. The state parameter acquirer 112 references the past nap times recorded in the log information 123 to determine whether the current time corresponds to the nap time.
The growth days count expresses the number of days of pseudo-growth of the robot 200. The robot 200 is pseudo-born at the time of first start up by the user after shipping from the factory, and grows from a juvenile to an adult over a predetermined growth period. The growth days count corresponds to the number of days since the pseudo-birth of the robot 200.
An initial value of the growth days count is 1, and the state parameter acquirer 112 adds 1 to the growth days count for each passing day. In one example, the growth period in which the robot 200 grows from a juvenile to an adult is 50 days, and the 50-day period that is the growth days count since the pseudo-birth is referred to as a “juvenile period (first period).” When the juvenile period elapses, the pseudo-growth of the robot 200 ends. A period after the completion of the juvenile period is called an “adult period (second period).”
During the juvenile period, each time the pseudo-growth days count of the robot 200 increases by one day, the state parameter acquirer 112 increases the maximum value and the minimum value of the emotion map 300 both by two. Regarding an initial value of the size of the emotion map 300, as illustrated by a frame 301, the maximum value of both the X value and the Y value is 100 and the minimum value is −100. When the growth days count exceeds half of the juvenile period (for example, 25 days), as illustrated by a frame 302, the maximum value of the X value and the Y value is 150 and the minimum value is −150. When the juvenile period elapses, the pseudo-growth of the robot 200 ends. At this time, as illustrated by a frame 303, the maximum value of the X value and the Y value is 200 and the minimum value is −200. Thereafter, the size of the emotion map 300 is fixed.
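Given the 50-day juvenile period and the expansion by two per day described above, the size of the emotion map can be sketched as a function of the growth days count:

```python
def emotion_map_limit(growth_days: int, juvenile_days: int = 50,
                      initial: int = 100, step: int = 2) -> int:
    """Bound of the emotion map (applies to +/- X and Y): 100 at pseudo-
    birth, expanding by 2 per day, fixed at 200 after the juvenile period."""
    days_grown = min(max(growth_days - 1, 0), juvenile_days)
    return initial + days_grown * step

print(emotion_map_limit(1))   # -> 100 (frame 301)
print(emotion_map_limit(26))  # -> 150 (frame 302)
print(emotion_map_limit(51))  # -> 200 (frame 303), fixed thereafter
```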
A settable range of the emotion parameter is defined by the emotion map 300. Thus, as the size of the emotion map 300 expands, the settable range of the emotion parameter expands. Due to the settable range of the emotion parameter expanding, richer emotion expression becomes possible and, as such, the pseudo-growth of the robot 200 is expressed by the expanding of the size of the emotion map 300.
Returning to
The action controller 113 determines, on the basis of detection results and the like from the sensor 210, whether any trigger among the plurality of triggers defined in the action information 121 is met. For example, the action controller 113 determines whether speech of the user is recognized, whether the head 204 of the robot 200 is petted, whether a specific time has arrived, and whether any other predetermined trigger in the action information 121, such as the robot 200 having moved to a specific location, is met. When, as a result of the determination, any trigger is met, the robot 200 is caused to execute the action corresponding to the met trigger.
When any trigger is met, the action controller 113 references the action information 121 and identifies the action control parameters set for the action corresponding to the met trigger. Specifically, the action controller 113 identifies, as the action control parameters, a combination of movements or animal sounds that are elements constituting the action corresponding to the met trigger, the execution start timing of each element, and the movement parameter or the animal sound parameter that is the parameter of each element. Then, on the basis of the identified action control parameters, the action controller 113 drives the driver 220 or outputs the sound from the speaker 231 to cause the robot 200 to execute the action corresponding to the met trigger.
More specifically, the action controller 113 corrects, on the basis of the state parameters 122 acquired by the state parameter acquirer 112, the action control parameters identified from the action information 121. By doing this, it is possible to add changes to the actions in accordance with the current state of the robot 200, and it is possible to realistically imitate a living creature.
The action controller 113 references the coefficient table 124 to correct the action control parameters. As illustrated in
The correction coefficients are coefficients for correcting the action control parameters identified from the action information 121. Specifically, each correction coefficient is defined by an action direction and a weighting coefficient for each of a speed and an amplitude of a vertical movement by the vertical motor 222, a speed and an amplitude of a left-right movement by the twist motor 221, and a movement start time lag.
More specifically, the action controller 113 determines, for the following (1) to (5), the state to which the current state of the robot 200, expressed by the state parameters 122 acquired by the state parameter acquirer 112, corresponds. Then, the action controller 113 corrects the action control parameters using the correction coefficients corresponding to the current state of the robot 200.
As an example, in the coefficient table 124 illustrated in
In the coefficient table 124 illustrated in
In addition to (5) the current time described above, the action controller 113 identifies, for each state, namely (1) the emotion parameter, (2) the personality parameter, (3) the battery level, and (4) the current location, the correction coefficients of the corresponding state from the coefficient table 124. Then, the action controller 113 corrects the action control parameters using the sum total of the correction coefficients corresponding to all of (1) to (5).
Next, a specific example is described in which (1) the current emotion parameter corresponds to happy, (2) the current personality parameter corresponds to chipper, (3) the current battery level corresponds to 30% or less, (4) the current location corresponds to location visited for the first time, and (5) the current time corresponds to immediately after waking up.
In this case, when referencing the coefficient table 124 illustrated in
The sum total of the correction coefficients of the movement start time lag is calculated as “+0+0+0.3+0.2+0.2=+0.7.” As such, the action controller 113 slows the execution start timing by 70% relative to the values acquired from the action information 121.
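The worked example above can be reproduced with the following sketch; only the movement start time lag column reflects the example, and the state keys are hypothetical.

```python
# Hypothetical time-lag correction coefficients for states (1) to (5).
TIME_LAG_COEFFS = {
    "emotion:happy":        0.0,
    "personality:chipper":  0.0,
    "battery:30%_or_less":  0.3,
    "location:first_visit": 0.2,
    "time:after_wake":      0.2,
}

def corrected_start_timing(base: float, states) -> float:
    """Sum the coefficients over the current states and slow the execution
    start timing by that fraction of the base value."""
    return base * (1.0 + sum(TIME_LAG_COEFFS[s] for s in states))

# Sum total is +0 +0 +0.3 +0.2 +0.2 = +0.7, i.e., 70% slower than normal.
print(corrected_start_timing(1.0, TIME_LAG_COEFFS))  # -> 1.7
```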
Note that, while omitted from the drawings, the coefficient table 124 defines a correction coefficient for the animal sound in the same manner as for the movement. Specifically, the action controller 113 uses the correction coefficient corresponding to the state parameter 122 acquired from the state parameter acquirer 112 to correct the volume. Here, the volume is the animal sound parameter set for the action corresponding to the met trigger in the action information 121.
Thus, the action controller 113 corrects the action control parameters on the basis of the state parameters 122 acquired by the state parameter acquirer 112. Then, the action controller 113 causes the robot 200 to execute the action corresponding to the met trigger by causing the driver 220 to drive or outputting a sound from the speaker 231 on the basis of the corrected action control parameters.
Additionally, when causing the robot 200 to execute the action corresponding to the met trigger, the action controller 113 performs one of (1A) a first control for causing the robot 200 to correctly execute that action, and (1B) a second control for causing the robot 200 to incorrectly execute that action or for causing the robot 200 to not execute that action.
Here, incorrectly executing the action, in other words, executing at least a portion of the action incorrectly, means executing by a sequence that deviates from the sequence defined for that action. Specifically, incorrectly executing the action means executing the action at a speed different from the speed based on the action control parameters, or executing the action by control different from the control based on the action control parameters. More specifically, incorrectly executing the action corresponds to omitting the execution of at least one element of the plurality of elements (movements or animal sounds) constituting that action, switching the execution order of at least one element with that of another element, or changing the action control parameters of at least one element.
As a specific example, for the action of “Test 1” illustrated in
Thus, the action controller 113 executes the first control or the second control in accordance with the situation and, as such, not only executes the action correctly every time, but also mistakes or omits a portion of the action, depending on the situation. Due to this, the actions of the robot 200 are not uniform, which allows the robot 200 to express individuality and express lifelikeness.
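One way the second control might be realized, as a sketch under assumptions: randomly omit one element, swap two adjacent elements, or alter one element's parameter, mirroring the three possibilities described above. The (name, parameter) element representation is hypothetical.

```python
import random

def make_incorrect(elements, rng=random):
    """Return an 'incorrect' copy of the element sequence by omitting,
    swapping, or altering one element."""
    seq = list(elements)
    mutation = rng.choice(["omit", "swap", "alter"])
    if mutation == "omit" and seq:
        seq.pop(rng.randrange(len(seq)))
    elif mutation == "swap" and len(seq) >= 2:
        i = rng.randrange(len(seq) - 1)
        seq[i], seq[i + 1] = seq[i + 1], seq[i]
    elif seq:  # alter: e.g., execute at a wrong speed or volume
        i = rng.randrange(len(seq))
        name, param = seq[i]
        seq[i] = (name, param * rng.uniform(0.5, 1.5))
    return seq

print(make_incorrect([("lift_head", 1.0), ("chirp", 0.8), ("turn", 1.2)]))
```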
More specifically, in the case of (1A) in which the robot 200 is caused to execute at least one action of the plurality of actions executable by the robot 200, the action controller 113 causes the robot 200 to correctly execute the action. That is, in this case, the action controller 113 performs the first control. In contrast, in the case of (1B) in which the robot 200 is caused to execute an action other than the at least one action of the plurality of actions executable by the robot 200, the action controller 113 causes the robot 200 to incorrectly execute the action. That is, in this case, the action controller 113 performs the second control.
Here, the plurality of actions executable by the robot 200 are the actions of the action information 121 created by the action information creator 511 or actions of the action information 121 incorporated in advance into the robot 200. When causing the robot 200 to execute each of the plurality of actions executable by the robot 200, the action controller 113 switches, depending on the action to be executed, between causing the robot 200 to execute that action correctly (performing the first control) and executing that action incorrectly (performing the second control).
In Embodiment 1, the action controller 113 controls, on the basis of the state of the robot 200, the number of the at least one action that is subjected to the first control. For example, when the state of the robot 200 is a first state, the action controller 113 causes the robot 200 to execute all of the executable actions correctly but, when the state of the robot 200 is a second state, the action controller 113 causes the robot 200 to execute 80% of the actions among the plurality of executable actions correctly, and causes the robot 200 to execute the remaining 20% of the actions incorrectly. Thus, the action controller 113 changes, in accordance with the state of the robot 200, the number of actions subjected to the first control and to the second control.
Here, the state of the robot 200 is, specifically, expressed by the state parameters 122 acquired by the state parameter acquirer 112. The action controller 113 controls, on the basis of at least one of the state parameters 122, namely (1) the emotion parameter, (2) the personality parameter, (3) the battery level, (4) the current location, and (5) the current time, the number of actions, among the plurality of actions executable by the robot 200, that are subjected to the first control and to the second control.
In one example, the number of the at least one action subjected to the first control may increase as the value of a predetermined pseudo-emotion of the emotion parameter increases.
Additionally, the number of the at least one action subjected to the first control may decrease as the value of the predetermined pseudo-emotion decreases. Here, the value of the predetermined pseudo-emotion corresponds, for example, to the degree of relaxation or the degree of excitement on the emotion map illustrated in
Alternatively, the number of the at least one action subjected to the first control may change in accordance with the area, among the happy, upset, sad, disinterested, normal, or other areas on the emotion map 300, in which the coordinates (X, Y) expressing the current emotion parameter of the robot 200 are positioned. For example, the number of the at least one action subjected to the first control may be greater when the emotion parameter is positioned in the happy area than when the emotion parameter is positioned in the upset area, the sad area, or the disinterested area.
In another example, the number of the at least one action subjected to the first control may change in accordance with the greatest personality value among the plurality of personality values of the personality parameter. As illustrated in
For example, the action controller 113 may set all of the plurality of actions executable by the robot 200 to be subjected to the first control when the personality value of active or chipper is the greatest among the four personality values. Moreover, the action controller 113 may set 80% of the actions among the plurality of actions executable by the robot 200 to be subjected to the first control, and the remaining 20% of the actions to be subjected to the second control when the shy or spoiled personality value is the greatest among the four personality values.
Additionally, the number of the actions subjected to the first control may increase as the personality value of active or chipper among the four personality values increases, and may decrease as the personality value of shy or spoiled increases.
Alternatively, the action controller 113 may, on the basis of another parameter of the state parameters 122, change the number of actions subjected to the first control and the second control. For example, the action controller 113 may increase the number of the actions subjected to the first control as the battery level increases. Additionally, regarding the current location of the robot 200, the action controller 113 may set the number of the actions to be subjected to the first control to the greatest when the current location is home, and sequentially decrease the number as the current location changes from a frequently visited location, to a location not frequently visited, and to a location visited for the first time. Furthermore, the action controller 113 may increase the number of the actions to be subjected to the first control more when the current time is immediately before bedtime than when the current time is immediately after waking up or is a nap time.
The correlation between the state parameters 122 and the number of the actions to be subjected to the first control is defined in, for example, an individuality table 125, which is stored in advance in the storage 120. As illustrated in
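As a hypothetical sketch of how such a table might drive the count, the following maps a dominant state to the fraction of executable actions subjected to the first control; the keys and ratios merely echo the 100% and 80% examples given earlier, and are not the actual individuality table 125.

```python
# Hypothetical fractions of actions to execute correctly per dominant state.
FIRST_CONTROL_RATIO = {
    "personality:active":  1.0,  # all actions executed correctly
    "personality:chipper": 1.0,
    "personality:shy":     0.8,  # 80% correct, 20% incorrect
    "personality:spoiled": 0.8,
}

def first_control_actions(actions, dominant_state):
    """Return the subset of actions subjected to the first control."""
    ratio = FIRST_CONTROL_RATIO.get(dominant_state, 1.0)
    return actions[:round(len(actions) * ratio)]
```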
In a case in which a determination to perform the first control is made, the action controller 113, when causing the robot 200 to execute the action corresponding to the met trigger, drives the driver 220 or outputs sounds from the speaker 231 correctly for all of the plurality of elements (movements or animal sounds) constituting that action, in accordance with the action control parameters corrected with the correction coefficients.
In contrast, in a case in which a determination to perform the second control is made, the action controller 113, for a portion of the elements of the plurality of elements (movements or animal sounds) constituting the action corresponding to the met trigger, drives the driver 220 or outputs sounds from the speaker 231 in a manner that does not correctly follow the action control parameters corrected with the correction coefficients.
Specifically, the action controller 113 omits the execution of, switches the execution order of, or changes the action control parameters of a portion of the elements of the plurality of elements constituting the action corresponding to the met trigger.
Note that, when causing the robot 200 to execute an element other than the portion of elements of the plurality of elements constituting the action corresponding to the met trigger, the action controller 113 drives the driver 220 or outputs the sound from the speaker 231 correctly in accordance with the action control parameters corrected with the correction coefficients.
Here, the portion of elements, of the plurality of elements constituting the action corresponding to the met trigger, that is not correctly executed may be randomly selected, or may be selected in accordance with a specific rule.
Next, the flow of robot control processing is described while referencing
When the robot control processing starts, the controller 110 sets the state parameters 122 (step S101). When the robot 200 is started up for the first time (the time of the first start up by the user after shipping from the factory), the controller 110 sets the various parameters, namely the emotion parameter, the personality parameter, and the growth days count, to initial values (for example, 0 for the emotion parameter and the personality parameter, and 1 for the growth days count). Meanwhile, at the time of starting up for the second and subsequent times, the controller 110 reads out the values of the various parameters stored in step S106, described later, of the robot control processing to set the state parameters 122. However, a configuration is possible in which the emotion parameters are all initialized to 0 each time the power is turned ON.
When the state parameters 122 are set, the controller 110 communicates with the terminal device 50 and acquires the action information 121 created on the basis of user operations performed on the terminal device 50 (step S102). Note that, when the action information 121 is already stored in the storage 120, step S102 may be skipped. Step S102 is an example of an acquisition step.
When the action information 121 is acquired, the controller 110 determines whether any trigger among the triggers of the plurality of actions defined in the action information 121 is met (step S103).
When any trigger is met (step S103; YES), the controller 110 executes the action control processing and causes the robot 200 to execute the action corresponding to the met trigger (step S104). Details about the action control processing of step S104 are described while referencing the flowchart of
When the action control processing illustrated in
When the state parameters 122 are updated, the controller 110 references the action information 121 and acquires the action control parameters of the action corresponding to the met trigger (step S202). Specifically, the controller 110 acquires, from the action information 121, a combination of movements or animal sounds that are elements constituting the action corresponding to the met trigger, the execution start timing of each element, and the movement parameter or the animal sound parameter that is the parameter of each element.
When the action control parameters are acquired, the controller 110 corrects the action control parameters on the basis of the correction coefficients defined in the coefficient table 124 (step S203). Specifically, the controller 110 calculates the sum total of the correction coefficients corresponding to the state parameters 122 updated in step S201 among the correction coefficients defined in the coefficient table 124 for each of (1) the emotion parameter, (2) the personality parameter, (3) the battery level, (4) the current location, and (5) the current time. Then, the controller 110 corrects the movement parameter, the animal sound parameter, and the execution start timing with the calculated sum total of the correction coefficients.
When the action control parameters are corrected, the controller 110 determines whether to correctly execute the action corresponding to the met trigger (step S204). Specifically, the controller 110 references the individuality table 125 and determines whether, given the state parameters 122 updated in step S201, the action corresponding to the met trigger is subjected to the first control.
When the action is to be executed correctly (step S204; YES), the controller 110 causes the robot 200 to correctly execute the action corresponding to the met trigger (step S205). Specifically, the controller 110 causes the driver 220 to drive or outputs the sound from the speaker 231 correctly in accordance with the action control parameters corrected in step S203.
In contrast, when the action is not to be executed correctly (step S204; NO), the controller 110 determines, of the action corresponding to the met trigger, the movement or the animal sound to not execute correctly (step S206). Specifically, the controller 110 determines, randomly or in accordance with a specific rule, the portion of elements, among the plurality of elements (movements or animal sounds) constituting the action corresponding to the met trigger, to not execute correctly.
Next, the controller 110 causes the robot 200 to incorrectly execute the action corresponding to the met trigger (step S207). Specifically, the controller 110 omits execution, switches the execution order, changes the action control parameters, or the like for the movement or the animal sound determined in step S206. Then, for the other movements or animal sounds, the controller 110 drives the driver 220 or outputs the sound from the speaker 231 correctly in accordance with the action control parameters.
When the action is executed, the controller 110 updates the action information 121 (step S208). Specifically, the controller 110 adds 1 to the execution count of the executed action in the action information 121, and updates the previous execution date and time of the executed action in the action information 121 to the current date and time. Thus, the action control processing illustrated in
Returning to
Next, the controller 110 determines whether to end the processing (step S105). For example, when the operator 240 receives a power OFF command of the robot 200 from the user, the processing is ended. When ending the processing (step S105; YES), the controller 110 stores the current state parameters 122 in the non-volatile memory of the storage 120 (step S106), and ends the robot control processing illustrated in
When not ending the processing (step S105; NO), the controller 110 uses the clock function to determine whether the date has changed (step S107). When the date has not changed (step S107; NO), the controller 110 executes step S103.
When the date has changed (step S107; YES), the controller 110 updates the state parameters 122 (step S108). Specifically, when the robot 200 is in the juvenile period (for example, 50 days from birth), the controller 110 changes the values of the emotion change amounts DXP, DXM, DYP, and DYM in accordance with whether the emotion parameter has reached the maximum value or the minimum value of the emotion map 300. Additionally, when in the juvenile period, the controller 110 increases both the minimum value and the maximum value of the emotion map 300 by a predetermined increase amount (for example, 2). In contrast, when in the adult period, the controller 110 adjusts the personality correction values.
When the state parameters 122 are updated, the controller 110 adds 1 to the growth days count (step S109), and executes step S103. Then, as long as the robot 200 is operating normally, the controller 110 repeats the processing of steps S103 to S109.
As described above, when the robot 200 according to Embodiment 1 is caused to execute at least one action among a plurality of executable actions, the robot 200 is controlled in accordance with a sequence defined for the action to be executed and, when the robot 200 is caused to execute an action other than the at least one action among the plurality of executable actions, the robot 200 is controlled in a sequence in which at least a portion differs from the sequence defined for the action to be executed. At this time, the number of the at least one action for which the robot 200 is controlled in accordance with the defined sequence changes in accordance with the state parameters 122.
Thus, the number of actions that are executed correctly changes in accordance with the current state of the robot 200 and, as such, the actions of the robot 200 do not become uniform. Due to this, it is possible to express the individuality of the robot 200 and enhance the lifelikeness of the robot 200. In particular, in a real living creature, the correctness of actions differs depending on physical condition, environment, and the like. With the robot 200 according to Embodiment 1, the number of actions executed correctly changes in accordance with the current state of the robot 200 and, as such, it is possible to express such details of real living creatures.
Next, Embodiment 2 is described. In Embodiment 2, descriptions of configurations and functions that are the same as those described in Embodiment 1 are foregone as appropriate.
In Embodiment 1, the number of the actions to be executed correctly of the plurality of actions executable by the robot 200 changes in accordance with the state of the robot 200. In contrast, in Embodiment 2, the number of elements to be executed correctly, of the plurality of elements (movements or animal sounds) constituting the action to be executed by the robot 200, changes in accordance with the state of the robot 200.
In Embodiment 2, when causing the robot 200 to execute each of the plurality of elements (movements or animal sounds) constituting the action corresponding to the met trigger, the action controller 113 performs one of (2A) a first element control for causing the robot 200 to correctly execute that element, and (2B) a second element control for causing the robot 200 to incorrectly execute that element or for causing the robot 200 to not execute that element.
Here, the first element control and the second element control respectively correspond to control in which the “action” in the first control and the second control described in Embodiment 1 is replaced with “element.” In other words, in Embodiment 1, the action is the unit for which the determination of whether to correctly execute is carried out, but in Embodiment 2, the element is the unit for which the determination of whether to correctly execute is carried out.
More specifically, in the case of (2A) in which the robot 200 is caused to execute at least one element of the plurality of elements constituting the action corresponding to the met trigger, the action controller 113 causes the robot 200 to correctly execute the element. That is, in this case, the action controller 113 performs the first element control. In contrast, in the case of (2B) in which the robot 200 is caused to execute an element other than the at least one element of the plurality of elements constituting the action corresponding to the met trigger, the action controller 113 causes the robot 200 to incorrectly execute the element. That is, in this case, the action controller 113 performs the second element control.
In Embodiment 2, the action controller 113 controls, on the basis of the state of the robot 200, the number of the at least one element that is subjected to the first element control. Here, as in Embodiment 1, the state of the robot 200 is expressed by the state parameters 122 acquired by the state parameter acquirer 112. The number of elements to be subjected to the first element control is defined for each of a plurality of states expressed by the state parameters 122. The correspondence between the state parameters 122 and the number of the elements subjected to the first element control in Embodiment 2 is the same as the correspondence between the state parameters 122 and the number of the actions subjected to the first control in Embodiment 1.
In one example, the number of the at least one element subjected to the first element control may increase as the value of a predetermined pseudo-emotion of the emotion parameter increases, and may decrease as the value of the predetermined pseudo-emotion decreases. Additionally, the number of the at least one element subjected to the first element control may change in accordance with the greatest personality value among the plurality of personality values of the personality parameter. Alternatively, the number of the at least one element subjected to the first element control may change in accordance with the battery level, the current location, the current time, or the like.
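As one concrete, purely illustrative reading of this correspondence, the following Python sketch maps the state parameters 122 to the number of elements subjected to the first element control. The pseudo-emotion name, the personality rule, the battery rule, and the scaling are all assumptions, not the defined correspondence.

```python
def count_correct_elements(num_elements, state):
    """Hypothetical mapping from the state parameters 122 to the number of
    elements subjected to the first element control."""
    # Higher value of the predetermined pseudo-emotion -> more correct elements.
    lo, hi = state["map_min"], state["map_max"]
    ratio = (state["joy"] - lo) / (hi - lo)   # "joy" is an assumed pseudo-emotion
    n = round(ratio * num_elements)
    # The greatest personality value can shift the count (assumed rule).
    if state["dominant_personality"] == "meticulous":
        n = max(n, num_elements - 1)          # this personality rarely errs
    # A low battery level could reduce the count (assumed rule).
    if state["battery_level"] < 0.2:
        n -= 1
    return max(0, min(num_elements, n))

# Example: a 5-element action, mid-range joy, low battery -> 2 correct elements.
state = {"joy": 20, "map_min": -100, "map_max": 100,
         "dominant_personality": "active", "battery_level": 0.15}
print(count_correct_elements(5, state))  # -> 2
```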
When the action control processing illustrated in
When the action control parameters are corrected, the controller 110 determines the number of elements, of the plurality of elements (movements or animal sounds) constituting the action corresponding to the met trigger, that are to be executed correctly (step S304). Specifically, the controller 110 determines, on the basis of the state parameters 122 updated in step S201, the number of the elements subjected to the first element control.
Next, the controller 110 determines the elements to be executed correctly and the elements to be executed incorrectly from among the plurality of elements constituting the action corresponding to the met trigger (step S305). Specifically, from among the plurality of elements constituting the action corresponding to the met trigger, the controller 110 selects, as the elements to be executed correctly, the number of elements determined in step S304, and determines the other elements as the elements to be executed incorrectly. At this time, the controller 110 may randomly determine which of the elements, of the plurality of elements, to execute correctly, or may make the determination in accordance with a specific rule.
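With the random option, step S305 might look like the following sketch; the string representation of the elements is hypothetical.

```python
import random

def split_elements(elements, n_correct, rng=random):
    """Sketch of step S305: randomly choose which elements are executed
    correctly; the rest are executed incorrectly. A specific rule could be
    substituted for the random choice."""
    chosen = set(rng.sample(range(len(elements)), n_correct))
    correct = [e for i, e in enumerate(elements) if i in chosen]
    incorrect = [e for i, e in enumerate(elements) if i not in chosen]
    return correct, incorrect

# Example: two of three elements are to be executed correctly.
correct, incorrect = split_elements(["raise head", "twist", "chirp"], 2)
```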
Next, the controller 110 causes the robot 200 to incorrectly execute the action corresponding to the met trigger (step S306). Specifically, for the movements or animal sounds determined in step S305 to be executed incorrectly, the controller 110 omits execution, switches the execution order, changes the action control parameters, or the like. Then, for the other movements or animal sounds, the controller 110 correctly drives the driver 220 or outputs the sound from the speaker 231 in accordance with the action control parameters.
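Step S306 could then degrade only the elements chosen for the second element control, for example as in the sketch below. The Element representation, the controller interface (drive/play), and the concrete degradation amounts are assumptions; omission, order switching, and parameter changes are shown as the description's three example degradations.

```python
import random
from dataclasses import dataclass

@dataclass
class Element:
    name: str
    kind: str          # "movement" or "sound"
    strength: float    # stand-in for the action control parameters

def execute_action(elements, incorrect, controller, rng=random):
    """Sketch of step S306: run each element, degrading the incorrect ones."""
    order = list(elements)
    bad = [i for i, e in enumerate(order) if e in incorrect]
    if len(bad) >= 2:                  # one degradation: switch the execution order
        i, j = rng.sample(bad, 2)
        order[i], order[j] = order[j], order[i]
    for element in order:
        strength = element.strength
        if element in incorrect:
            if rng.random() < 0.3:
                continue               # another degradation: omit execution
            strength *= rng.uniform(0.5, 1.5)  # or change the control parameters
        if element.kind == "movement":
            controller.drive(element.name, strength)  # via the driver 220
        else:
            controller.play(element.name, strength)   # via the speaker 231
```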
When the action is executed, the controller 110 updates the action information 121 (step S307). Specifically, the controller 110 adds 1 to the execution count of the executed action in the action information 121, and updates the previous execution date and time of the executed action in the action information 121 to the current date and time. Thus, the action control processing illustrated in
As described above, when the robot 200 according to Embodiment 2 is caused to execute at least one element among a plurality of elements constituting an action, the robot 200 is controlled in accordance with a sequence defined for the element to be executed and, when the robot 200 is caused to execute an element other than the at least one element, the robot 200 is controlled in a sequence in which at least a portion differs from the sequence defined for the element to be executed. At this time, the number of the at least one element for which the robot 200 is controlled in accordance with the defined sequence changes in accordance with the state parameters 122.
Thus, the number of elements that are executed correctly changes in accordance with the current state of the robot 200 and, as such, the actions of the robot 200 do not become uniform. Due to this, it is possible to express the individuality of the robot 200 and enhance the lifelikeness of the robot 200. In particular, in Embodiment 2, the number of elements to be executed correctly within a single action changes and, as such, it is possible to express the individuality of the robot 200 in more detail than in Embodiment 1.
Embodiments of the present disclosure are described above, but these embodiments are merely examples and do not limit the scope of application of the present disclosure. That is, various applications of the embodiments of the present disclosure are possible, and all embodiments are included in the scope of the present disclosure.
For example, a configuration is possible in which the features of Embodiment 1 and the features of Embodiment 2 are combined. In other words, a configuration is possible in which the number of actions to be executed correctly and the number of elements to be executed correctly both change in accordance with the state of the robot 200. Specifically, when executing an action subjected to the second control determined by the features of Embodiment 1, the action controller 113 determines the number of elements subjected to the first element control and the second element control in accordance with the state of the robot 200, as described in Embodiment 2. Moreover, the action controller 113 may determine, for each of the plurality of elements (movements or animal sounds) constituting the action to be executed, whether to perform the first element control or the second element control.
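Under this combined configuration, the two decisions can be layered, for instance as in the following self-contained sketch; action_is_correct stands for the Embodiment 1 action-level decision and n_correct for the Embodiment 2 element-level decision, both of which would in practice come from the state parameters 122.

```python
import random

def plan_action(element_names, action_is_correct, n_correct, rng=random):
    """Sketch of the combined configuration: the action-level decision picks
    first or second control for the whole action; under the second control,
    the element-level decision picks how many elements stay correct."""
    if action_is_correct:
        # First control: every element follows its defined sequence.
        return {name: "first element control" for name in element_names}
    # Second control: only n_correct elements follow their defined sequence.
    chosen = set(rng.sample(range(len(element_names)), n_correct))
    return {name: ("first element control" if i in chosen
                   else "second element control")
            for i, name in enumerate(element_names)}

# Example: a four-element action under the second control, two elements correct.
print(plan_action(["raise head", "twist", "chirp", "lower head"], False, 2))
```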
In the embodiment described above, the control device 100 is installed in the robot 200, but a configuration is possible in which the control device 100 is not installed in the robot 200 but, rather, is a separate device (for example, a server). When the control device 100 is provided outside the robot 200, the control device 100 communicates with the robot 200 via the communicator 130, the control device 100 and the robot 200 send and receive data to and from each other, and the control device 100 controls the robot 200 as described in the embodiments described above.
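Any transport can carry the exchanged data in such a configuration. As a purely illustrative example, the sketch below sends one JSON-encoded control command from an external control device 100 to the robot 200 over a TCP socket; the host name, port, and message format are all assumptions.

```python
import json
import socket

def send_command(host, port, command):
    """Illustrative client side of an external control device 100 (the
    newline-delimited JSON protocol is assumed, not specified)."""
    with socket.create_connection((host, port)) as sock:
        sock.sendall(json.dumps(command).encode("utf-8") + b"\n")
        return sock.makefile().readline()  # e.g., an acknowledgement

# send_command("robot.local", 9000, {"action": "chirp", "correct": True})
```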
In the embodiment described above, the exterior 201 is formed in a barrel shape from the head 204 to the torso 206, and the robot 200 has a shape as if lying on its belly. However, the robot 200 is not limited to resembling a living creature that has a shape as if lying on its belly. For example, a configuration is possible in which the robot 200 has a shape provided with arms and legs, and resembles a living creature that walks on four legs or two legs.
Furthermore, the electronic device is not limited to a robot 200 that imitates a living creature. For example, provided that the electronic device is a device capable of expressing individuality by executing various actions, a configuration is possible in which the electronic device is a wristwatch or the like. Even for an electronic device other than the robot 200, it is possible to describe that electronic device in the same manner as in the embodiments described above by providing it with the same configurations and functions as those of the robot 200 described above.
In the embodiment described above, in the controller 110, the CPU executes programs stored in the ROM to function as the various components, namely, the action information acquirer 111, the state parameter acquirer 112, the action controller 113, and the like. Additionally, in the controller 510, the CPU executes programs stored in the ROM to function as the various components, namely, the action information creator 511 and the like. However, in the present disclosure, the controllers 110 and 510 may include, for example, dedicated hardware such as an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA), various control circuitry, or the like instead of the CPU, and this dedicated hardware may function as the various components, namely, the action information acquirer 111 and the like. In this case, the functions of the components may be realized by individual pieces of hardware, or may be collectively realized by a single piece of hardware. Additionally, the functions of the components may be realized in part by dedicated hardware and in part by software or firmware.
It is possible to provide a robot 200 or a terminal device 50, prepared in advance, with the configurations for realizing the functions according to the present disclosure, but it is also possible to apply a program to cause an existing information processing device or the like to function as the robot 200 or the terminal device 50 according to the present disclosure. That is, a configuration is possible in which a CPU or the like that controls an existing information processing device or the like executes a program for realizing the various functional components of the robot 200 or the terminal device 50 described in the foregoing embodiments, thereby causing the existing information processing device to function as the robot 200 or the terminal device 50 according to the present disclosure.
Additionally, any method may be used to apply the program. For example, the program can be applied by storing the program on a non-transitory computer-readable recording medium such as a flexible disc, a compact disc (CD) ROM, a digital versatile disc (DVD) ROM, and a memory card. Furthermore, the program can be superimposed on a carrier wave and applied via a communication medium such as the internet. For example, the program may be posted to and distributed via a bulletin board system (BBS) on a communication network. Moreover, a configuration is possible in which the processing described above is executed by starting the program and, under the control of the operating system (OS), executing the program in the same manner as other applications/programs.
The foregoing describes some example embodiments for explanatory purposes. Although the foregoing discussion has presented specific embodiments, persons skilled in the art will recognize that changes may be made in form and detail without departing from the broader spirit and scope of the invention. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense. This detailed description, therefore, is not to be taken in a limiting sense, and the scope of the invention is defined only by the included claims, along with the full range of equivalents to which such claims are entitled.
Number | Date | Country | Kind
2023-158177 | Sep. 22, 2023 | JP | national