OBJECT MANIPULATION APPARATUS, HANDLING METHOD, AND PROGRAM PRODUCT

Information

  • Patent Application
  • 20240033905
  • Publication Number
    20240033905
  • Date Filed
    February 28, 2023
  • Date Published
    February 01, 2024
Abstract
According to one embodiment, an object manipulation apparatus includes one or more hardware processors functioning as a feature calculation unit, a region calculation unit, and a grasp configuration (GC) calculation unit. The feature calculation unit serves to calculate a feature map indicating a feature of a captured image of grasping target objects. The region calculation unit serves to calculate, on the basis of the feature map, a position and a posture of a handling tool by a first parameter on a circular anchor in the image. The handling tool is capable of grasping the grasping target object. The GC calculation unit serves to calculate a GC of the handling tool by converting the position and the posture indicated by the first parameter into a second parameter indicating a position and a posture of the handling tool on the image.
Description
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is based upon and claims the benefit of priority from Japanese Patent Application No. 2022-122589, filed on Aug. 1, 2022; the entire contents of which are incorporated herein by reference.


FIELD

Embodiments described herein relate generally to an object manipulation apparatus, a handling method, and a program product.


BACKGROUND

Robot systems that automate object handling work have been conventionally known, such as picking automation systems for handling baggage or the like stacked in a physical distribution warehouse.


Such a robot system automatically calculates a grasping position or posture of an object and a boxing position and posture of an input destination on the basis of sensor data, such as image information, and actually executes grasping or boxing by a robot having a manipulation planning mechanism.


In recent years, with the development of a machine learning technology, a technology of realizing appropriate actuation of a robot by learning has been used.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a schematic diagram illustrating an example of a configuration of a system for object manipulation task according to an embodiment;



FIG. 2 is a diagram illustrating an example of a functional configuration of a controller according to the embodiment;



FIG. 3 is a diagram illustrating an example of a functional configuration of a planning unit according to the embodiment;



FIG. 4 is a diagram illustrating an example of a grasp configuration (GC) according to the embodiment;



FIG. 5 is a diagram illustrating an example of a functional configuration of a GC candidate calculation unit according to the embodiment;



FIG. 6 is a diagram illustrating an example of processing in a feature calculation unit according to the embodiment;



FIG. 7 is a diagram illustrating an example of processing in a GC region calculation unit according to the embodiment;



FIG. 8 is a flowchart illustrating an example of a handling method according to the embodiment; and



FIG. 9 is a diagram illustrating an example of a hardware configuration of the controller according to the embodiment.





DETAILED DESCRIPTION

An object manipulation apparatus according to an embodiment includes one or more hardware processors coupled to a memory and configured to function as a feature calculation unit, a region calculation unit, a grasp configuration (GC) region calculation unit, and a GC calculation unit. The feature calculation unit serves to calculate a feature map indicating a feature of an image on the basis of a captured image of grasping target objects. The region calculation unit serves to calculate, on the basis of the feature map, an expression of a position and a posture of a handling tool by a first parameter on a circular anchor in the image. The GC region calculation unit serves to calculate an approximate region of the GC by converting the first parameter on the circular anchor into a GC region. The GC calculation unit serves to calculate a GC of the handling tool, expressed by a second parameter indicating a position and a posture of the handling tool on the image, on the basis of the approximate region of the GC.


Exemplary embodiments of an object manipulation apparatus, a handling method, and a program product will be explained below in detail with reference to the accompanying drawings.


First Embodiment

First, an outline of a system for object manipulation task including an object manipulation apparatus (picking robot), which is an example of an object manipulation robot, and a robot integrated management system will be described.


General Outline


FIG. 1 is a schematic diagram illustrating an example of a configuration of a system for object manipulation task 100 according to the embodiment. The system for object manipulation task 100 according to the embodiment includes an object manipulation apparatus (a manipulator 1, a housing 2, and a controller 3), a sensor support portion 4, an article container sensor 5, a grasped article measuring sensor 6, a cargo collection container sensor 7, a temporary storage space sensor 8, an article container drawing portion 9, an article container weighing machine 10, a cargo collection container drawing portion 11, and a cargo collection container weighing machine 12.


The sensor support portion 4 supports sensors (the article container sensor 5, the grasped article measuring sensor 6, the cargo collection container sensor 7, and the temporary storage space sensor 8).


The article container sensor 5 measures an internal state of an article container 101. The article container sensor 5 is, for example, an image sensor installed above the article container drawing portion 9.


The grasped article measuring sensor 6 is installed in the vicinity of the article container sensor 5, and measures an object grasped by the manipulator 1.


The cargo collection container sensor 7 measures an internal state of a cargo collection container. The cargo collection container sensor 7 is, for example, an image sensor installed above the cargo collection container drawing portion 11.


The temporary storage space sensor 8 measures an article put on a temporary storage space 103.


The article container drawing portion 9 draws the article container 101 in which target articles to be handled are stored.


The article container weighing machine 10 measures a weight of the article container 101.


The cargo collection container drawing portion 11 draws a cargo collection container 102 that contains articles taken out by the manipulator 1.


The cargo collection container weighing machine 12 measures a weight of the cargo collection container 102.


The article container sensor 5, the grasped article measuring sensor 6, the cargo collection container sensor 7, and the temporary storage space sensor 8 may be optional sensors. For example, sensors capable of acquiring image information, three-dimensional information and the like, such as an RGB image camera, a range image camera, a laser range finder, and a Light Detection and Ranging or Laser Imaging Detection and Ranging (LiDAR) can be used.


Note that, although not illustrated in the schematic diagram of FIG. 1, the system for object manipulation task 100 according to the embodiment includes, in addition to the components described above, various sensors, a power supply unit for operating various drive units, a cylinder for storing compressed air, a compressor, a vacuum pump, a controller, an external interface such as a user interface (UI), and a safety mechanism such as a light curtain or a collision detector.


The manipulator 1 includes an arm portion and a handling (picking) tool portion 14.


The arm portion is an articulated robot that is driven by a plurality of servo motors. A typical example of the articulated robot is a six-axis vertical articulated robot (axes 13a to 13f) as illustrated in FIG. 1, and the articulated robot may also be configured by a combination of a multi-axis vertical articulated robot, a SCARA robot, a linear motion robot, and the like.


The handling tool portion 14 includes a force sensor and a pinching mechanism. The handling tool portion 14 grasps a grasping target object.


A robot integrated management system 15 is a system that manages the system for object manipulation task 100. The handling tool portion 14 can be attached to and detached from the arm portion that grasps the grasping target object, by using a handling tool changer. The handling tool portion 14 can be replaced with an optional handling tool portion 14 in accordance with an instruction from the robot integrated management system 15.



FIG. 2 is a diagram illustrating an example of a functional configuration of the controller 3 according to the embodiment. The controller 3 according to the embodiment includes a processing unit 31, a planning unit 32, and a control unit 33.


The processing unit 31 performs noise removal processing on image sensor information captured by the camera, background removal processing on information other than an object (for example, the article container and the ground), image resizing for generating an image to be input to the planning unit 32, and normalization processing. For example, the processing unit 31 inputs an RGB-D image to the planning unit 32 as processed image sensor information.
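For illustration only, the following minimal sketch (in Python, using NumPy and OpenCV) shows how such preprocessing could look; the function name preprocess_rgbd, the input size, and the normalization constants are assumptions for this sketch and are not taken from the embodiment.

    import numpy as np
    import cv2  # OpenCV, assumed available for resizing

    def preprocess_rgbd(rgb, depth, out_size=(480, 480)):
        """Hypothetical preprocessing: resize and normalize an RGB-D pair.

        rgb:   HxWx3 uint8 color image
        depth: HxW float32 depth image in meters
        """
        # Resize both images to the assumed network input size.
        rgb_r = cv2.resize(rgb, out_size, interpolation=cv2.INTER_LINEAR)
        depth_r = cv2.resize(depth, out_size, interpolation=cv2.INTER_NEAREST)

        # Very simple noise handling: replace invalid depth readings with zero.
        depth_r = np.nan_to_num(depth_r, nan=0.0)

        # Normalize color to [0, 1] and depth by an assumed maximum range of 2 m.
        rgb_n = rgb_r.astype(np.float32) / 255.0
        depth_n = np.clip(depth_r / 2.0, 0.0, 1.0).astype(np.float32)

        # Stack into a single RGB-D array (H, W, 4) to be passed to the planning unit.
        return np.concatenate([rgb_n, depth_n[..., None]], axis=2)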


The planning unit 32 calculates, by deep learning, a candidate group of grasp configurations (GC) of the handling tool portion 14 in the image coordinate system, which have a high possibility of successfully grasping the target object. The planning unit 32 converts each candidate into a 6D grasping posture in the world coordinate system. The 6D grasping posture includes three-dimensional coordinates indicating a position and three-dimensional coordinates indicating an orientation. The planning unit 32 evaluates a score of grasping easiness of each 6D grasping posture candidate and then calculates a candidate group having higher scores of easiness or the optimal candidate. Moreover, the planning unit 32 generates a trajectory from an initial posture of the manipulator 1 to the grasp postures of the candidates in the candidate group with higher scores or to the grasp posture of the optimal candidate, and transmits the trajectory to the control unit 33.
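A minimal sketch of this planning flow is given below; it is not the embodiment's implementation, and the helper callables (compute_gc_candidates, gc_to_6d_pose, grasp_easiness_score, plan_trajectory) are hypothetical placeholders for the steps described above.

    def plan_grasp(rgbd_image, camera_matrix, T_cam_world,
                   compute_gc_candidates, gc_to_6d_pose,
                   grasp_easiness_score, plan_trajectory, top_k=5):
        """Hypothetical planning loop: GC candidates -> 6D postures -> scores -> trajectory."""
        # 1. Candidate grasp configurations {x, y, w, h, theta} in image coordinates.
        candidates = compute_gc_candidates(rgbd_image)

        # 2. Convert each GC into a 6D grasping posture in the world coordinate system.
        postures = [gc_to_6d_pose(gc, rgbd_image, camera_matrix, T_cam_world)
                    for gc in candidates]

        # 3. Score grasping easiness and keep the candidates with the highest scores.
        scored = sorted(((grasp_easiness_score(p), p) for p in postures),
                        key=lambda sp: sp[0], reverse=True)[:top_k]

        # 4. Generate a trajectory to the best grasp posture and return it with its score.
        best_score, best_posture = scored[0]
        return plan_trajectory(best_posture), best_score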


The control unit 33 generates a time series of a position, a velocity, and an acceleration of each joint of the manipulator 1 on the basis of the trajectory received from the planning unit 32, and controls a behavior of causing the manipulator 1 to grasp the grasping target object. In addition, the control unit 33 makes the controller 3 repeatedly function until the grasping operation succeeds or an upper limit of the number of times of operation execution is reached.
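The retry behavior can be pictured with the short sketch below; plan_and_grasp and max_attempts are assumed names, not part of the embodiment.

    def execute_with_retries(plan_and_grasp, max_attempts=3):
        """Hypothetical retry loop: repeat grasping until success or the attempt limit."""
        for _ in range(max_attempts):
            if plan_and_grasp():  # assumed to return True when the grasping operation succeeds
                return True
        return False              # upper limit of the number of times of operation execution reached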



FIG. 3 is a diagram illustrating an example of a functional configuration of the planning unit 32 according to the embodiment. The planning unit 32 according to the embodiment includes a GC candidate calculation unit 321, a posture calculation unit 322, an evaluation unit 323, and a generation unit 324.


The GC candidate calculation unit 321 calculates a GC candidate group by deep learning.



FIG. 4 is a diagram illustrating an example of the GC according to the embodiment. The example in FIG. 4 represents the GC in a case where the grasping posture of the handling tool portion 14 is projected onto an image when a grasping target object 104 is grasped from directly above. In one example, the GC is expressed by a rotated bounding box {x, y, w, h, θ} on the image. The parameters x and y indicate the center position of the handling tool portion 14, the parameter w indicates an opening width of the handling tool portion 14, the parameter h indicates a width of a finger of the handling tool portion 14, and the parameter θ indicates an angle formed by the opening width w of the GC and an image horizontal axis.
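For illustration, the hypothetical helper below computes the four corner points of such a rotated bounding box {x, y, w, h, θ} on the image; it is a sketch, not part of the embodiment.

    import math

    def gc_corners(x, y, w, h, theta):
        """Corner points of the rotated bounding box {x, y, w, h, theta}.

        (x, y): center position of the handling tool on the image
        w:      opening width, h: finger width
        theta:  angle between the opening-width direction and the image horizontal axis
        """
        c, s = math.cos(theta), math.sin(theta)
        corners = []
        # Offsets along the opening-width (w) and finger-width (h) directions.
        for dx, dy in [(-w / 2, -h / 2), (w / 2, -h / 2), (w / 2, h / 2), (-w / 2, h / 2)]:
            corners.append((x + dx * c - dy * s, y + dx * s + dy * c))
        return corners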


Returning to FIG. 3, the posture calculation unit 322 converts the GC ({x, y, w, h, θ}) calculated by the GC candidate calculation unit 321 into a 6D grasp posture in the world coordinate system ({X, Y, Z, roll, pitch, yaw}) of the handling tool portion 14. A relationship between the GC and the 6D posture is expressed by the following Equations (1) to (4) on the basis of a depth image (I_Depth), a camera matrix (I_C), and information on a position and a posture (T_cam^world) of the camera in the world coordinate system. Note that the matrix Rot_cam^world included in the matrix T_cam^world is the 3×3 rotation matrix of the camera, and Trans_cam^world is the 3×1 vector indicating the position of the camera.









$$I_C=\begin{bmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{bmatrix}\tag{1}$$

$$T_{cam}^{world}=\begin{bmatrix} \mathrm{Rot}_{cam}^{world} & \mathrm{Trans}_{cam}^{world} \\ 0 & 1 \end{bmatrix}\tag{2}$$

$$\begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix}=T_{cam}^{world}\begin{bmatrix} \cos\theta & \sin\theta & 0 & (x-c_x)/f_x \\ -\sin\theta & \cos\theta & 0 & (y-c_y)/f_y \\ 0 & 0 & 1 & I_{Depth}(x,y)+D_{\mathrm{insertion\ amount}} \\ 0 & 0 & 0 & 1 \end{bmatrix}\tag{3}$$

$$\begin{bmatrix} \cos\mathrm{yaw} & -\sin\mathrm{yaw} & 0 \\ \sin\mathrm{yaw} & \cos\mathrm{yaw} & 0 \\ 0 & 0 & 1 \end{bmatrix}\begin{bmatrix} \cos\mathrm{pitch} & 0 & \sin\mathrm{pitch} \\ 0 & 1 & 0 \\ -\sin\mathrm{pitch} & 0 & \cos\mathrm{pitch} \end{bmatrix}\begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\mathrm{roll} & -\sin\mathrm{roll} \\ 0 & \sin\mathrm{roll} & \cos\mathrm{roll} \end{bmatrix}=\mathrm{Rot}_{cam}^{world}\begin{bmatrix} \cos\theta & \sin\theta & 0 \\ -\sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{bmatrix}\tag{4}$$

The GC and the 6D posture can be mutually converted by Equations (1) to (4) above. D_insertion amount is the insertion amount when grasping is performed by the handling tool portion 14, and is determined by a fixed value or by the shape and size of the grasping target object 104.


Moreover, in addition to the posture, the opening width W of the handling tool portion 14 in the world coordinate system can easily be obtained by converting the end points of the line segment w, which is the projection of the opening width W on the image, into world coordinates according to Equation (3) and calculating the distance between them.
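The conversion can be sketched as follows. This is a minimal illustration in the spirit of Equations (1) to (4), assuming the standard pinhole back-projection (pixel offsets scaled by the depth) and a ZYX (yaw-pitch-roll) Euler decomposition; the function name and argument layout are assumptions, not the patented implementation.

    import math
    import numpy as np

    def gc_to_6d_pose(gc, depth_image, fx, fy, cx, cy, T_cam_world, d_insertion=0.0):
        """Sketch: GC {x, y, w, h, theta} -> 6D grasp posture {X, Y, Z, roll, pitch, yaw}.

        T_cam_world: 4x4 camera pose in the world frame ([[Rot, Trans], [0, 1]]).
        """
        x, y, w, h, theta = gc
        d = float(depth_image[int(y), int(x)]) + d_insertion  # depth plus insertion amount

        # Position: back-project the GC center with the intrinsics of Equation (1),
        # then transform into the world frame with T_cam_world (Equation (3)).
        p_cam = np.array([(x - cx) / fx * d, (y - cy) / fy * d, d, 1.0])
        X, Y, Z, _ = T_cam_world @ p_cam

        # Orientation: carry the in-plane GC angle into the world frame (Equation (4)).
        Rz = np.array([[math.cos(theta), math.sin(theta), 0.0],
                       [-math.sin(theta), math.cos(theta), 0.0],
                       [0.0, 0.0, 1.0]])
        R = T_cam_world[:3, :3] @ Rz

        # Euler angles from R = Rz(yaw) @ Ry(pitch) @ Rx(roll).
        pitch = math.asin(-R[2, 0])
        roll = math.atan2(R[2, 1], R[2, 2])
        yaw = math.atan2(R[1, 0], R[0, 0])
        return X, Y, Z, roll, pitch, yaw

The opening width W in the world coordinate system can be obtained in the same way by back-projecting the two end points of w and taking the distance between the resulting world points.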


The evaluation unit 323 evaluates a score of easiness when grasping the grasping target object 104 in the posture of the handling tool portion 14. The score of grasping easiness is calculated by, for example, a heuristic evaluation formula in which the possibility of success, stability, and safety of grasping are considered in combination. In addition, the score of the grasping easiness is also obtained by directly using deep learning (see, for example, JP 7021160 B2). The evaluation unit 323 sorts the scores of easiness in descending order, and calculates a candidate group having higher scores or a candidate having the highest score.
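The evaluation formula itself is not specified above; the sketch below only illustrates one possible weighted combination, where the three component scores and the weights are assumptions for this sketch.

    def grasp_easiness_score(success_possibility, stability, safety,
                             weights=(0.5, 0.3, 0.2)):
        """Hypothetical heuristic: weighted sum of three factors, each assumed in [0, 1]."""
        w_success, w_stability, w_safety = weights
        return (w_success * success_possibility
                + w_stability * stability
                + w_safety * safety)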


The generation unit 324 generates the trajectory from the initial posture of the manipulator 1 to the candidate group having higher scores or to the optimum posture of the handling tool portion 14 described above by using a route-planning planner, for example, MoveIt (Online, Searched on Jun. 29, 2022, Internet "URL: https://moveit.ros.org/"), and transmits the trajectory and the score of easiness to the control unit 33.



FIG. 5 is a diagram illustrating an example of a functional configuration of the GC candidate calculation unit 321 according to the embodiment. The GC candidate calculation unit 321 according to the embodiment includes a feature calculation unit 3211, a position heatmap calculation unit 3212, a region calculation unit 3213, a GC region calculation unit 3214, and a GC calculation unit 3215.


Upon receiving input of the RGB-D image from the processing unit 31, the feature calculation unit 3211 calculates a feature map of the image of the grasping target objects. Specifically, the feature calculation unit 3211 enhances the accuracy of feature learning by using a neural network that fuses not only the last feature but also intermediate features. Note that, in the technology according to the related art, the feature is calculated by directly fusing the feature maps of the last output layer (for example, the last feature maps of a plurality of pieces of sensor information) calculated by the neural network, so that the contribution of the intermediate features to the accuracy of the learning result has not been considered.


The feature calculation unit 3211 calculates the feature map by receiving input of a plurality of pieces of image sensor information, integrating a plurality of intermediate features extracted by a plurality of feature extractors from the pieces of image sensor information, and fusing features of the pieces of image sensor information including the intermediate features by convolution calculation. The pieces of image sensor information include, for example, a color image indicating a color of the image and a depth image indicating a distance from the camera to the object included in the image. The feature extractors are implemented by a neural network having an encoder-decoder model structure.



FIG. 6 is a diagram illustrating an example of processing in the feature calculation unit 3211 according to the embodiment. The example in FIG. 6 represents a case where the feature calculation unit 3211 uses the neural network having the encoder-decoder model structure. Specifically, in the network with a triangular shape on the upper side of FIG. 6, the left half indicates processing in an encoder in which features of a color image I_RGB are extracted, and the right half indicates processing in a decoder in which the color image I_RGB is restored. Similarly, in the network on the lower side of FIG. 6, the left half illustrates processing in an encoder in which features of a depth image I_Depth are extracted, and the right half illustrates processing in a decoder in which the depth image I_Depth is restored.


According to the embodiment, the intermediate features (X_RGB^{i,j} and X_D^{i,j}, where (i, j) ∈ {(0,0), (0,1), (0,2), (1,0), (1,1), (2,0), (2,1), (3,0), (4,0)}) obtained by the encoders are fused by the following Equation (5), and the feature map is calculated by the convolution calculation (Conv).










$$X_F^k=\begin{cases} X_{RGB}^{0,0}+X_D^{0,0}, & k=0 \\ \mathrm{Conv}\left(X_F^{k-1}\right)+X_{RGB}^{k,0}+X_D^{k,0}+\displaystyle\sum_{n=0}^{k}\left(X_{RGB}^{n\%2,\,n/2}+X_D^{n\%2,\,n/2}\right), & k>0 \end{cases}\tag{5}$$
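A minimal sketch of this fusion rule, transcribed from Equation (5), is shown below; x_rgb and x_d are assumed to be dictionaries of intermediate feature maps already brought to a common shape, and conv is an assumed callable standing in for the Conv(.) term.

    def fuse_features(x_rgb, x_d, conv, k_max=4):
        """Sketch of the intermediate-feature fusion of Equation (5).

        x_rgb, x_d: dicts mapping an index pair (i, j) to a feature map X_RGB^{i,j} / X_D^{i,j}
        conv:       callable implementing the Conv(.) term of Equation (5)
        """
        fused = {0: x_rgb[(0, 0)] + x_d[(0, 0)]}                 # case k = 0
        for k in range(1, k_max + 1):                            # case k > 0
            acc = conv(fused[k - 1]) + x_rgb[(k, 0)] + x_d[(k, 0)]
            for n in range(0, k + 1):
                idx = (n % 2, n // 2)                            # (n % 2, n / 2) in Equation (5)
                acc = acc + x_rgb[idx] + x_d[idx]
            fused[k] = acc
        return fused[k_max]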







Returning to FIG. 5, the region calculation unit 3213 calculates an expression of the GC on a circular anchor by a neural network on the basis of the feature map. Note that, in the technology according to the related art, box-shaped anchors are generated at predetermined intervals over the entire image, and the relative position and the relative rotation angle with respect to each anchor are learned. In this case, since anchors are generated for the entire image, the calculation amount increases. In addition, it is necessary to generate boxes of a plurality of sizes and a plurality of rotation angles, so the number of parameters increases.


Considering the above, in the GC candidate calculation unit 321 of the present embodiment, the position heatmap calculation unit 3212 calculates a position heatmap indicating, for each position, the possibility that the handling tool can successfully grasp the grasping target object 104; anchors therefore only need to be generated in image regions with a high success possibility instead of over the entire image. The region calculation unit 3213 according to the embodiment is implemented by the neural network that detects the circular anchor on the feature map on the basis of the position heatmap and calculates a first parameter on the circular anchor. The region calculation unit 3213 generates the circular anchor in a region having a higher score of the position heatmap (a region larger than a threshold), whereby the region where the circular anchor is generated is narrowed down, and the calculation amount can be reduced. In addition, as illustrated in FIG. 5, by using a plurality of anchors having different sizes, it is possible to handle grasping target objects 104 having different sizes.


The position heatmap calculation unit 3212 calculates the position heatmap by a neural network (for example, a fully convolutional network (FCN) or a U-Net) using an image as an input. The ground truth of the position heatmap is obtained from x and y of the GC. For example, the value of each point in the position heatmap is generated by calculating a Gaussian distance from that point to the GC center (x, y) in the image.
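As one illustration of how such a ground-truth heatmap could be generated, the sketch below places a Gaussian around each GC center; the function name and the value of sigma are assumptions for this sketch.

    import numpy as np

    def ground_truth_heatmap(gc_centers, height, width, sigma=8.0):
        """Gaussian ground-truth position heatmap from the (x, y) centers of the GCs."""
        ys, xs = np.mgrid[0:height, 0:width]
        heatmap = np.zeros((height, width), dtype=np.float32)
        for x, y in gc_centers:
            g = np.exp(-((xs - x) ** 2 + (ys - y) ** 2) / (2.0 * sigma ** 2))
            heatmap = np.maximum(heatmap, g)  # keep the strongest response at each pixel
        return heatmap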


In the present embodiment, the anchor having a circular shape is used. In a case of the circular shape, unlike a case of a box shape, it is not necessary to consider the angle, and it is sufficient if only circles of a plurality of sizes are generated, so that the number of parameters can be reduced. As a result, learning efficiency can be enhanced.


On the other hand, in the present embodiment, unlike previous studies, the angle θ of the GC is not directly regressed, because the discontinuity of the angle at the boundary can produce an inaccurate value of the loss function, and learning of the angle may therefore become difficult. Considering this, the region calculation unit 3213 enhances learning performance by learning the center (Cx, Cy) and radius (R) of a circumscribed circle of the GC and the coordinates (dRx, dRy) of a midpoint of a short side of the GC (for example, the center of "h") with respect to the center of the circle, instead of the angle θ.
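For illustration, the hypothetical helper below derives these learning targets (Cx, Cy, R, dRx, dRy) from a ground-truth GC {x, y, w, h, θ}, using the geometry of the circumscribed circle described above; it is a sketch, not the embodiment's training code.

    import math

    def gc_to_circle_targets(x, y, w, h, theta):
        """Learning targets: circumscribed-circle center/radius and the offset of a
        short-side midpoint from that center, used instead of regressing theta directly."""
        c_x, c_y = x, y                          # the circle center coincides with the GC center
        radius = math.hypot(w / 2.0, h / 2.0)    # circumscribed-circle radius
        d_rx = (w / 2.0) * math.cos(theta)       # midpoint of a short side (length h),
        d_ry = (w / 2.0) * math.sin(theta)       # expressed relative to the circle center
        return c_x, c_y, radius, d_rx, d_ry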



FIG. 7 is a diagram illustrating an example of processing in the GC region calculation unit 3214 according to the embodiment. The GC region calculation unit 3214 converts the expression of the GC on the circular anchor obtained by the learning into an approximate region ({x′, y′, w′, h′, θ}) of the GC by the following Equation (6).









$$\begin{cases} x'=C_x \\ y'=C_y \\ w'=2\sqrt{dR_x^2+dR_y^2} \\ h'=2\sqrt{R^2-\left(dR_x^2+dR_y^2\right)} \\ \theta=\arctan\dfrac{dR_y}{dR_x} \end{cases}\tag{6}$$
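A direct transcription of Equation (6) as a hypothetical helper is given below; atan2 is used in place of arctan(dRy/dRx) so that the sketch also covers dRx = 0.

    import math

    def circle_to_gc_region(c_x, c_y, radius, d_rx, d_ry):
        """Approximate GC region {x', y', w', h', theta} from the circle parameters (Equation (6))."""
        half_w_sq = d_rx ** 2 + d_ry ** 2
        x_a, y_a = c_x, c_y
        w_a = 2.0 * math.sqrt(half_w_sq)
        h_a = 2.0 * math.sqrt(max(radius ** 2 - half_w_sq, 0.0))
        theta = math.atan2(d_ry, d_rx)
        return x_a, y_a, w_a, h_a, theta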







Returning to FIG. 5, when the GC region calculation unit 3214 calculates an approximate region of the GC, the GC calculation unit 3215 extracts features in the approximate region of the GC from the feature map, and performs pooling and alignment of the features by rotated region of interest (ROI) alignment. Then, the GC calculation unit 3215 inputs the features for which the pooling and the alignment have been performed to fully connected layers (fc1 and fc2 in the example of FIG. 5), and calculates the values ({x, y, w, h, θ}) of the GC, a probability p0 that grasping is possible, and a probability p1 that grasping is impossible.
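As a rough sketch of such a head (an assumption for illustration, written in PyTorch), the module below takes already pooled and aligned rotated-ROI features and outputs the five GC values together with the two class probabilities; the layer sizes are assumptions, and the rotated ROI alignment itself is treated as an external step.

    import torch
    import torch.nn as nn

    class GCHead(nn.Module):
        """Hypothetical head: pooled rotated-ROI features -> GC values and grasp probabilities."""

        def __init__(self, in_features, hidden=256):
            super().__init__()
            self.fc1 = nn.Linear(in_features, hidden)
            self.fc2 = nn.Linear(hidden, hidden)
            self.gc_out = nn.Linear(hidden, 5)    # {x, y, w, h, theta}
            self.cls_out = nn.Linear(hidden, 2)   # p0 (graspable) and p1 (not graspable)

        def forward(self, pooled):                # pooled: (N, in_features) ROI features
            h = torch.relu(self.fc1(pooled))
            h = torch.relu(self.fc2(h))
            gc = self.gc_out(h)
            probs = torch.softmax(self.cls_out(h), dim=-1)
            return gc, probs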



FIG. 8 is a flowchart illustrating an example of a handling method according to the embodiment. First, the feature calculation unit 3211 calculates the feature map indicating the features of the captured image of grasping target objects 104 (step S1).


Next, the region calculation unit 3213 calculates, on the basis of the feature map calculated in step S1, the expression of position and posture of the handling tool portion 14 capable of grasping the grasping target object 104 by the first parameter on the circular anchor in the image (step S2). In the example in FIG. 7 described above, the first parameter includes parameters Cx and Cy indicating the center of the circular anchor and a parameter R indicating the radius of the circular anchor.


Next, the GC region calculation unit 3214 calculates the approximate region of the GC by converting the first parameter on the circular anchor into a GC region.


Next, the GC calculation unit 3215 calculates the GC of the handling tool portion 14, which is expressed as a second parameter indicating a position and a posture of the handling tool on the image, on the basis of the approximate region of the GC. Specifically, the GC calculation unit 3215 calculates the second parameter ({x, y, w, h, θ}; see Equation (6) above) from the parameter (dRx and dRy in the example of FIG. 7) indicating the midpoint of the side of the GC whose circumscribed circle is the circular anchor, the parameter (Cx and Cy in the example of FIG. 7) indicating the center of the circular anchor, and the parameter (R in the example of FIG. 7) indicating the radius of the circular anchor. Note that the handling method according to the above-described embodiment can be applied not only to a pinching-type handling tool but also to any handling tool that can be expressed by the rotated bounding box {x, y, w, h, θ} on the image.


As described above, with the object manipulation apparatus (the manipulator 1, the housing 2, and the controller 3) according to the embodiment, it is possible to more effectively utilize the intermediate features (for example, see FIG. 6) obtained by the neural network and more appropriately control operation of the object manipulation apparatus with a smaller calculation amount.


In the technology according to the related art, it is necessary to learn the rotation angle of the box, which is an expression of the posture of the handling tool on the image. In order to learn the rotation angle, it is necessary to generate a large number of rotated candidate boxes or to classify the rotation angle (convert the angle into a high-dimensional one-hot vector), and thus the calculation amount is enormous. Moreover, learning the rotation angle has been difficult because two rotation angles (for example, the expression of a box having a rotation angle of 0 degrees on the image is the same as that of a box having a rotation angle of 180 degrees) exist for the same rotated box due to the symmetry of the box.


Finally, an example of a hardware configuration of the controller 3 according to the embodiment will be described.


Example of Hardware Configuration


FIG. 9 is a diagram illustrating an example of a hardware configuration of the controller 3 according to the embodiment. The controller 3 according to the embodiment includes a control device 301, a main storage (or memory) device 302, an auxiliary storage device 303, a display device 304, an input device 305, and a communication device 306. The control device 301, the main storage device 302, the auxiliary storage device 303, the display device 304, the input device 305, and the communication device 306 are connected to each other through a bus 310.


Note that the display device 304, the input device 305, and the communication device 306 do not have to be included. For example, in a case where the controller 3 is connected to another device, a display function, an input function, and a communication function of other devices may be used.


The control device 301 executes a computer program read from the auxiliary storage device 303 to the main storage device 302. The control device 301 is, for example, one or more processors such as a central processing unit (CPU). The main storage device 302 is a memory such as a read only memory (ROM) and a random access memory (RAM). The auxiliary storage device 303 is a memory card, a hard disk drive (HDD), or the like.


The display device 304 displays information. The display device 304 is, for example, a liquid crystal display. The input device 305 receives input of the information. The input device 305 is, for example, a hardware key or the like. Note that the display device 304 and the input device 305 may be a liquid crystal touch panel or the like having both of a display function and an input function. The communication device 306 communicates with another device.


The computer program executed by the controller 3 is a file having an installable or executable format. The computer program is stored, as a computer program product, in a non-transitory computer-readable recording medium such as a compact disc read only memory (CD-ROM), a memory card, a compact disc recordable (CD-R), and a digital versatile disc (DVD) and is provided.


The computer program executed by the controller 3 may be configured to be stored on a computer connected to a network such as the Internet and be provided by being downloaded via the network. Alternatively, the computer program executed by the controller 3 may be configured to be provided via a network such as the Internet without being downloaded.


In addition, the computer program executed by the controller 3 may be configured to be provided in a state of being incorporated in advance in a ROM or the like.


The computer program executed by the controller 3 has a module configuration including a function that can be implemented by the computer program among functions of the controller 3.


Functions implemented by the computer program are loaded into the main storage device 302 by reading and executing the computer program from a storage medium such as the auxiliary storage device 303 by the control device 301. In other words, the functions implemented by the computer program are generated on the main storage device 302.


Note that some of the functions of the controller 3 may be implemented by hardware such as an integrated circuit (IC). The IC is, for example, a processor executing dedicated processing.


Moreover, in a case of implementing the respective functions using a plurality of processors, each processor may implement one of the functions, or may implement two or more of the functions.


While certain embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the inventions. Indeed, the novel embodiments described herein may be embodied in a variety of other forms; moreover, various omissions, substitutions and changes in the form of the embodiments described herein may be made without departing from the spirit of the inventions. The accompanying claims and their equivalents are intended to cover such forms or modifications as would fall within the scope and spirit of the inventions.

Claims
  • 1. An object manipulation apparatus comprising: one or more hardware processors coupled to a memory and configured to function as a feature calculation unit to calculate a feature map indicating a feature of a captured image of grasping target objects;a region calculation unit to calculate, on the basis of the feature map, a position and a posture of a handling tool by a first parameter on a circular anchor in the image, the handling tool being capable of grasping the grasping target object; anda grasp configuration (GC) calculation unit to calculate a GC of the handling tool by converting the position and the posture indicated by the first parameter into a second parameter indicating a position and a posture of the handling tool on the image.
  • 2. The apparatus according to claim 1, wherein the feature calculation unit implements the calculation of the feature map by receiving input of a plurality of pieces of image sensor information,integrating a plurality of intermediate features extracted by a plurality of feature extractors from the plurality of pieces of image sensor information, andfusing, by convolution calculation, features of the plurality of pieces of image sensor information including the plurality of intermediate features.
  • 3. The apparatus according to claim 2, wherein the plurality of pieces of image sensor information include a color image indicating a color of the image and a depth image indicating a distance from camera to objects in the image, andthe plurality of feature extractors is implemented by a neural network having an encoder-decoder model structure.
  • 4. The apparatus according to claim 1, wherein the one or more hardware processors are further configured to function as a position heatmap calculation unit to calculate a position heatmap indicating success probability for grasping target object, andthe region calculation unit is implemented by a neural network detecting the circular anchor on the feature map on the basis of the position heatmap and calculating the first parameter on the detected circular anchor.
  • 5. The apparatus according to claim 1, wherein the first parameter includes a parameter indicating a center of the circular anchor and a parameter indicating a radius of the circular anchor, andthe GC calculation unit calculates the second parameter from a parameter indicating a midpoint of a side of the GC whose circumscribed circle is the circular anchor,the parameter indicating the center of the circular anchor, andthe parameter indicating the radius of the circular anchor.
  • 6. A handling method implemented by a computer, the method comprising: calculating a feature map indicating a feature of an image on the basis of image sensor information including a grasping target object;calculating, on the basis of the feature map, a position and a posture of a handling tool by a first parameter on a circular anchor in the image, the handling tool being capable of grasping the grasping target object; andcalculating a grasp configuration (GC) of the handling tool by converting the position and the posture indicated by the first parameter into a second parameter indicating a position and a posture of the handling tool on the image.
  • 7. A computer program product comprising a non-transitory computer-readable recording medium on which an executable program is recorded, the program instructing a computer to: calculate a feature map indicating a feature of an image on the basis of image sensor information including a grasping target object;calculate, on the basis of the feature map, a position and a posture of a handling tool by a first parameter on a circular anchor in the image, the handling tool being capable of grasping the grasping target object; andcalculate a grasp configuration (GC) of the handling tool by converting the position and the posture indicated by the first parameter into a second parameter indicating a position and a posture of the handling tool on the image.
Priority Claims (1)
Number Date Country Kind
2022-122589 Aug 2022 JP national