Bandsaw Automated Portioning Saw System with Visual Feedback and Method of Use

Information

  • Patent Application
  • 20200230721
  • Publication Number
    20200230721
  • Date Filed
    January 29, 2020
  • Date Published
    July 23, 2020
Abstract
An automated saw wherein: the automated saw comprises one or more visual sensors and a positioning system. The automated saw is configured to analyze an uncut meat and calculate one or more cutting depths for one or more cut portions from the uncut meat. The uncut meat comprises a first end and a second end. The first end of the uncut meat can be analyzed by the automated saw by: capturing a first slice image of the first end, locating a first bone configuration and a configuration of one or more meat portions in relation to the first bone configuration, and measuring portions of the one or more meat portions to categorize which among one or more cuts of meat is presented at the first end of the uncut meat. The automated saw calculates a first cutting depth for a first meat portion.
Description
STATEMENT REGARDING FEDERALLY SPONSORED RESEARCH OR DEVELOPMENT (IF APPLICABLE)

Not applicable.


REFERENCE TO SEQUENCE LISTING, A TABLE, OR A COMPUTER PROGRAM LISTING COMPACT DISC APPENDIX (IF APPLICABLE)

Not applicable.


BACKGROUND OF THE INVENTION

No prior art is known to the Applicant.


BRIEF SUMMARY OF THE INVENTION

An automated saw wherein: said automated saw comprises one or more visual sensors and a positioning system. Said automated saw is configured to analyze an uncut meat and calculate one or more cutting depths for one or more cut portions from said uncut meat. Said uncut meat comprises a first end and a second end. Said first end of said uncut meat can be analyzed by said automated saw by: capturing a first slice image of said first end, locating a first bone configuration and a configuration of one or more meat portions in relation to said first bone configuration, and measuring portions of said one or more meat portions to categorize which among one or more cuts of meat is presented at said first end of said uncut meat. Said automated saw calculates a first cutting depth for a first meat portion. Said automated saw can be configured with preferences for various said one or more cuts of meat and corresponding said one or more cutting depths. Said first cutting depth can correspond to a first cut of meat and a second cutting depth to a second cut of meat.


Said automated saw wherein: said automated saw comprises said one or more visual sensors and said positioning system. Said automated saw is configured to analyze said uncut meat and calculate said one or more cutting depths for said one or more cut portions from said uncut meat. Said uncut meat comprises said first end and said second end. Said first end of said uncut meat can be analyzed by said automated saw by: capturing said first slice image of said first end, locating said first bone configuration and a configuration of said one or more meat portions in relation to said first bone configuration, and measuring portions of said one or more meat portions to categorize which among said one or more cuts of meat is presented at said first end of said uncut meat.





BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWING


FIG. 1 illustrates a perspective overview of an automated saw 100.



FIG. 2 illustrates a block diagram of portions of said automated saw 100.



FIGS. 3A and 3B illustrate a back-side perspective overview of said automated saw 100 with a cutting plane 314 being calculated.



FIG. 4 illustrates a back-side perspective overview of said automated saw 100 with one or more cuts of meat 400 being shown.



FIGS. 5A and 5B illustrate a perspective overview of an uncut meat 202, and a perspective overview of a first cut portion 318a and a second cut portion 318b, respectively.



FIGS. 6A and 6B illustrate an elevated front view of a first slice image 600a and a second slice image 600b, respectively.



FIG. 7 illustrates a method of use 700 for said automated saw 100.



FIG. 8 illustrates a network diagram 800.



FIGS. 9A, 9B, 9C, 9D and 9E illustrate a mobile phone 900a, a personal computer 900b, a tablet 900c, a smart watch 900d and a smart phone 900e, respectively.



FIGS. 10A, 10B and 10C illustrate an address space 1000, an address space 1000a and an address space 1000e, respectively.



FIGS. 11A and 11B illustrate a flow chart between one or more computers 804 and a server 806.



FIGS. 12A and 12B illustrate interactions between a device application 1200, a server application 1204 and a data storage 808.





DETAILED DESCRIPTION OF THE INVENTION

The following description is presented to enable any person skilled in the art to make and use the invention as claimed and is provided in the context of the particular examples discussed below, variations of which will be readily apparent to those skilled in the art. In the interest of clarity, not all features of an actual implementation are described in this specification. It will be appreciated that in the development of any such actual implementation (as in any development project), design decisions must be made to achieve the designers' specific goals (e.g., compliance with system- and business-related constraints), and that these goals will vary from one implementation to another. It will also be appreciated that such development effort might be complex and time-consuming, but would nevertheless be a routine undertaking for those of ordinary skill in the field of the appropriate art having the benefit of this disclosure. Accordingly, the claims appended hereto are not intended to be limited by the disclosed embodiments, but are to be accorded their widest scope consistent with the principles and features disclosed herein.



FIG. 1 illustrates a perspective overview of an automated saw 100.


In one embodiment, said automated saw 100 can comprise a base 102, a conveyor 104, a post 106, an overhang portion 108, a controller 110, one or more visual sensors 112, a positioning system 114, a pusher 116, a tray 118 and a bandsaw blade 120. Said base 102 can comprise a lower portion of said automated saw 100 which supports said tray 118 and said positioning system 114. In one embodiment, said positioning system 114 can comprise a system which moves said tray 118 and said pusher 116 relative to other portions of said automated saw 100. In one embodiment, said conveyor 104 can transport elements having been cut by said bandsaw blade 120, as discussed below. In one embodiment, said tray 118 can move in two or three axes as dictated by said controller 110.


In one embodiment, said one or more visual sensors 112 can comprise a camera, as is known in the art. Said one or more visual sensors 112 can comprise a first visual sensor 122 and a second visual sensor 124. In one embodiment, said first visual sensor 122 can create an overview of said tray 118 and its contents, and said second visual sensor 124 can create a front-end view thereof.


In one embodiment, said controller 110 can comprise one among one or more computers 804. In one embodiment, said controller 110 can receive signals from said one or more visual sensors 112, said positioning system 114, others among said one or more computers 804 and/or a server 806 to determine movement of said positioning system 114 and movement of said bandsaw blade 120.
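As a purely illustrative sketch (not part of the specification, and with hypothetical names such as OverheadImage, SliceImage, MotionCommand and Controller), the inputs and outputs handled by said controller 110 might be represented as follows:

```python
# Illustrative only: the class and field names below are assumptions and do
# not appear in the specification.
from dataclasses import dataclass

@dataclass
class OverheadImage:          # from the first visual sensor 122 (overview of the tray)
    pixels: bytes

@dataclass
class SliceImage:             # from the second visual sensor 124 (front-end view)
    pixels: bytes

@dataclass
class MotionCommand:
    tray_x_mm: float          # tray motion along one horizontal axis
    tray_y_mm: float          # tray motion along the other horizontal axis
    pusher_advance_mm: float  # advance of the pusher toward the bandsaw blade

class Controller:
    def plan_next_move(self, overview: OverheadImage, front: SliceImage,
                       cutting_depth_mm: float) -> MotionCommand:
        # A real controller would also use feedback from the positioning
        # system 114; this sketch simply advances the tray by the depth.
        return MotionCommand(tray_x_mm=cutting_depth_mm, tray_y_mm=0.0,
                             pusher_advance_mm=0.0)
```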



FIG. 2 illustrates a block diagram of portions of said automated saw 100.


One objective of said automated saw 100 can be to control movement of an uncut meat 202 within a first horizontal direction 204 and a second horizontal direction 206; with the added control of movement of said uncut meat 202 relative to said tray 118 by pressing said uncut meat 202 with said pusher 116 in a pushing direction 212.


Said uncut meat 202 comprises a first end 214 and a second end 216. In one embodiment, said first end 214 comprises a cutting end 218 of said uncut meat 202.



FIGS. 3A and 3B illustrate a back-side perspective overview of said automated saw 100 with a cutting plane 314 being calculated.


In one embodiment, said automated saw 100 can calculate a diameter 306, a surface area 302 and one or more cut portions 318 of said uncut meat 202. As a starting point, said automated saw 100 can know an uncut meat length 316 and said diameter 306, and can measure an uncut meat weight 304 of said uncut meat 202 with a scale 300. It can then use said surface area 302, said uncut meat weight 304, said diameter 306 and said uncut meat length 316 to determine where to place said cutting plane 314 in order to cut said one or more cut portions 318.


In one embodiment, said one or more cut portions 318 can comprise at least a first cut portion 318a and a second cut portion 318b. Further, said one or more cut portions 318 can comprise a cut portion weight 320.


As illustrated, said automated saw 100 has already cut said first cut portion 318a, and can place said cutting plane 314 for said second cut portion 318b using said uncut meat weight 304 and said uncut meat length 316: the desired said cut portion weight 320 for said second cut portion 318b is divided by said uncut meat weight 304, and that fraction is multiplied by said uncut meat length 316; the product is the length of said uncut meat 202 to be separated as said second cut portion 318b, as sketched below.
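For illustration only, the proportional calculation described above might be sketched as follows (assuming a uniform cross-section; the function and variable names are hypothetical):

```python
def cut_length_for_target_weight(target_portion_weight_kg: float,
                                 uncut_meat_weight_kg: float,
                                 uncut_meat_length_mm: float) -> float:
    """Weight-proportional placement of the cutting plane, assuming the
    uncut meat has a roughly constant cross-section along its length."""
    fraction = target_portion_weight_kg / uncut_meat_weight_kg
    return fraction * uncut_meat_length_mm

# Example: a 0.5 kg portion from a 4.0 kg, 400 mm loin -> 50 mm cut length.
print(cut_length_for_target_weight(0.5, 4.0, 400.0))
```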


In another embodiment, where said uncut meat 202 is not consistently the same diameter as said diameter 306, said one or more cut portions 318 can be calculated by accounting for said surface area 302 at each point along said uncut meat 202; that is, by accounting for a volume and/or density of said uncut meat 202 along the entirety of said uncut meat length 316.
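Under the same illustrative assumptions as the sketch above (hypothetical names, uniform density), a non-uniform loin could be handled by accumulating the measured cross-sectional area along the length until the desired portion weight is reached:

```python
from typing import Sequence

def cut_length_for_varying_cross_section(target_portion_weight_kg: float,
                                         uncut_meat_weight_kg: float,
                                         areas_mm2: Sequence[float],
                                         step_mm: float) -> float:
    """Place the cutting plane where the accumulated fraction of total
    cross-sectional area (a proxy for volume, assuming uniform density)
    reaches the target fraction of the total weight.
    areas_mm2 holds cross-sections sampled every step_mm from the cutting end."""
    target_fraction = target_portion_weight_kg / uncut_meat_weight_kg
    total_area = sum(areas_mm2)
    accumulated = 0.0
    for i, area in enumerate(areas_mm2):
        accumulated += area
        if accumulated / total_area >= target_fraction:
            return (i + 1) * step_mm  # depth measured from the cutting end
    return len(areas_mm2) * step_mm
```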



FIG. 4 illustrates a back-side perspective overview of said automated saw 100 with one or more cuts of meat 400 being shown.


In one embodiment, said uncut meat 202 can comprise said one or more cuts of meat 400. For example, in one embodiment, a beef loin can comprise a porterhouse cut and a T-bone cut, which need to be treated differently. One feature of said automated saw 100 is to analyze each of said one or more cut portions 318 before slicing them from said uncut meat 202.


Said one or more cuts of meat 400 can comprise at least a first cut of meat 400a and a second cut of meat 400b.



FIGS. 5A and 5B illustrate a perspective overview of said uncut meat 202, and a perspective overview of said first cut portion 318a and said second cut portion 318b, respectively.


Shown here are said uncut meat 202 as a short loin 502, said first cut portion 318a as a T-bone cut 504, and said second cut portion 318b as a porterhouse cut 506. However, said automated saw 100 can be used to cut other types of meat without varying from the current disclosure.


One feature of said automated saw 100 is to visually analyze each among said one or more cut portions 318 and determine what cut of meat is presented for cutting. In one embodiment, said automated saw 100 can visually inspect one or more bone configurations 500 and determine which cut of meat is presented (such as said T-bone cut 504 or said porterhouse cut 506).
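As a purely illustrative sketch (the threshold and names below are assumptions, not taken from the specification), the cut presented at the cutting end might be categorized from a measurement of the meat portions relative to the detected bone configuration, for example the width of the tenderloin adjacent the T-shaped bone:

```python
def categorize_cut(tenderloin_width_mm: float) -> str:
    """Categorize the cut presented at the cutting end from a single
    measurement taken relative to the detected bone configuration.
    The 32 mm (about 1.25 in) threshold is an illustrative assumption."""
    return "porterhouse" if tenderloin_width_mm >= 32.0 else "t-bone"
```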



FIGS. 6A and 6B illustrate an elevated front view of a first slice image 600a and a second slice image 600b, respectively.


In one embodiment, said automated saw 100 can capture one or more slice images 600 with said one or more visual sensors 112. In one embodiment, said one or more slice images 600 can comprise visual images, as is known in the art. In one embodiment, said one or more slice images 600 can capture light and data not visible to the human eye, such as IR and heat signals.



FIG. 7 illustrates a method of use 700 for said automated saw 100.


In one embodiment, said first end 214 of said uncut meat 202 can be analyzed by said automated saw 100 by: capturing said first slice image 600a of said first end 214, locating a first bone configuration 500a and a configuration of one or more meat portions 602 in relation to said first bone configuration 500a, and measuring portions of said one or more meat portions 602 to categorize which among said one or more cuts of meat 400 is presented at said first end 214 of said uncut meat 202. Further, said automated saw 100 can calculate a first cutting depth 322a for a first meat portion 602a. In one embodiment, said automated saw 100 can be configured with preferences for various said one or more cuts of meat 400 and corresponding one or more cutting depths 322; wherein, for example, said first cutting depth 322a can correspond to said first cut of meat 400a and a second cutting depth 322b to said second cut of meat 400b. A butcher might, for example, prefer different thicknesses for said T-bone cut 504 and said porterhouse cut 506.
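Continuing the illustrative sketches above (all names and values are hypothetical, not part of the specification), the per-cut thickness preferences described here could be expressed as a simple lookup applied after the cut has been categorized:

```python
# Hypothetical butcher preferences: cutting depth (thickness) per cut of meat.
CUTTING_DEPTH_PREFERENCES_MM = {
    "t-bone": 25.0,        # e.g., a first cutting depth for the first cut of meat
    "porterhouse": 38.0,   # e.g., a second cutting depth for the second cut of meat
}

def cutting_depth_for_end(tenderloin_width_mm: float) -> float:
    """Categorize the cut presented at the cutting end (reusing categorize_cut
    from the sketch above) and return the preferred cutting depth."""
    cut = categorize_cut(tenderloin_width_mm)
    return CUTTING_DEPTH_PREFERENCES_MM[cut]
```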


In one embodiment, said one or more slice images 600 can comprise at least said first slice image 600a and said second slice image 600b.



FIG. 8 illustrates a network diagram 800.


In one embodiment, said network diagram 800 can comprise said one or more computers 804, one or more locations 802, and a network 810. In one embodiment, said one or more locations 802 can comprise a first location 802a, a second location 802b and a third location 802c. Said one or more computers 804 can comprise a first computer 804a and a second computer 804b. In one embodiment, said server 806 can communicate with said one or more computers 804 over said network 810. Said one or more computers 804 can be attached to a printer 812 or other accessories, as is known in the art.


In one embodiment, said server 806 can attach to a data storage 808.


In one embodiment, said printer 812 can be hardwired to said first computer 804a (not illustrated here), or said printer 812 can connect to one of said one or more computers 804 (such as said second computer 804b, as illustrated) via said network 810.


Said network 810 can be a local area network (LAN), a wide area network (WAN), a piconet, or a combination of LANs, WANs, or piconets. One illustrative LAN is a network within a single business. One illustrative WAN is the Internet.


In one embodiment, said server 806 represents at least one server, but can comprise many servers, each connected to said network 810. Said server 806 can connect to said data storage 808. Said data storage 808 can connect directly to said server 806, as shown in FIG. 8, or may exist remotely on said network 810. In one embodiment, said data storage 808 can comprise any suitable long-term or persistent storage device and, further, may be separate devices or the same device and may be collocated or distributed (interconnected via any suitable communications network).



FIGS. 9A, 9B, 9C, 9D and 9E illustrate a mobile phone 900a, a personal computer 900b, a tablet 900c, a smart watch 900d and a smart phone 900e, respectively.


In one embodiment, said one or more computers 804 can comprise said mobile phone 900a, said personal computer 900b, said tablet 900c, said smart watch 900d or said smart phone 900e. In one embodiment, each among said one or more computers 804 can comprise one or more input devices 904, a keyboard 904a, a trackball 904b, one or more cameras 904c, a track pad 904d, a data 906 and/or a home button 908, as is known in the art.


In the last several years, the useful definition of a computer has become more broadly understood to include mobile phones, tablet computers, laptops, desktops, and similar. For example, Microsoft® has attempted to merge devices such as a tablet computer and a laptop computer with the release of Windows® 8. In one embodiment, each of said one or more computers can include, but is not limited to, a laptop (such as said personal computer 900b), desktop, workstation, server, mainframe, terminal, a tablet (such as said tablet 900c), a phone (such as said mobile phone 900a), and/or similar. Despite different form-factors, said one or more computers can have similar basic hardware, such as a screen 902 and said one or more input devices 904 (such as said keyboard 904a, said trackball 904b, said one or more cameras 904c, a wireless (such as RFID) reader, said track pad 904d, and/or said home button 908). In one embodiment, said screen 902 can comprise a touch screen. In one embodiment, said track pad 904d can function similarly to a computer mouse, as is known in the art. In one embodiment, said tablet 900c and/or said personal computer 900b can comprise a Microsoft® Windows® branded device, an Apple® branded device, or similar. In one embodiment, said tablet 900c can comprise an X86 type processor or an ARM type processor, as is known in the art.


Said network diagram 800 can comprise said data 906. In one embodiment, said data 906 can comprise data related to financial transactions.


In one embodiment, said one or more computers can be used to input and view said data 906. In one embodiment, said data 906 can be input into said one or more computers by taking pictures with one of said one or more cameras 904c, by typing in information with said keyboard 904a, or by using gestures on said screen 902 (where said screen 902 is a touch screen). Many other data entry means for devices like said one or more computers are well known and herein also possible with said data 906. In one embodiment, said first computer 804a can comprise an iPhone®, a BlackBerry®, a smartphone, or similar. In one embodiment, said one or more computers 804 can comprise a laptop computer, a desktop computer, or similar.



FIGS. 10A, 10B and 10C illustrate an address space 1000, an address space 1000a and an address space 1000e, respectively.


In one embodiment, said one or more computers 804 can comprise said address space 1000, and more specifically, said first computer 804a can comprise said address space 1000a, and so on. In turn, each among said address space 1000 can comprise a processor 1002, a memory 1004, a communication hardware 1006 and a location hardware 1008. Thus, said address space 1000a can comprise a processor 1002a, a memory 1004a, a communication hardware 1006a and a location hardware 1008a; and said address space 1000e can comprise a processor 1002e, a memory 1004e, a communication hardware 1006e and a location hardware 1008e.


Each among said one or more computers 804 and said server 806 can comprise an embodiment of said address space 1000. In one embodiment, said processor 1002 can comprise a plurality of processors, said memory 1004 can comprise a plurality of memory modules, and said communication hardware 1006 can comprise a plurality of communication hardware components. In one embodiment, said data 906 can be sent to said processor 1002; wherein, said processor 1002 can perform processes on said data 906 according to an application stored in said memory 1004, as discussed further below. Said processes can include storing said data 906 into said memory 1004, verifying said data 906 conforms to one or more preset standards, or ensuring a required set of said data 906 has been gathered for said data management system and method. In one embodiment, said data 906 can include data which said one or more computers 804 can populate automatically, such as a date and a time, as well as data entered manually. Once a portion of said data 906 has been gathered, said data 906 can be sent to said communication hardware 1006 for communication over said network 810. Said communication hardware 1006 can include a network transport processor for packetizing data, communication ports for wired communication, or an antenna for wireless communication. In one embodiment, said data 906 can be collected on said one or more computers 804 and delivered to said server 806 through said network 810, as sketched below.
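As an illustrative sketch only (the endpoint URL, required field names and use of the Python standard library are assumptions, not part of the specification), data 906 gathered on one of the one or more computers 804 might be validated and delivered to the server 806 over the network 810 roughly as follows:

```python
import json
import urllib.request

def send_record_to_server(record: dict, server_url: str) -> int:
    """Validate a gathered record and deliver it to the server.
    The required field names and the URL format are illustrative assumptions."""
    required = {"date", "time", "cut", "cutting_depth_mm"}
    missing = required - record.keys()
    if missing:
        raise ValueError(f"record is missing required fields: {missing}")
    request = urllib.request.Request(
        server_url,
        data=json.dumps(record).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return response.status  # HTTP status code from the server
```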



FIGS. 11A and 11B illustrate a flow chart between said one or more computers 804 and said server 806.


In the first embodiment, said communication hardware 1006a and said communication hardware 1006e can send and receive said data 906 to and from one another and/or can communicate with said data storage 808 across said network 810. Likewise, in the second embodiment, said data storage 808 can be embedded inside of said one or more computers 804, which may speed up data access by reducing communications over said network 810.


As illustrated in FIG. 11A, in one embodiment, said server 806 can comprise a third-party data storage and hosting provider, or can be privately managed.


As illustrated in FIG. 11B, a data storage 808a can be located on said first computer 804a. Thus, said first computer 804a can operate without a data connection out to said server 806.



FIGS. 12A and 12B illustrate interactions between a device application 1200, a server application 1204 and said data storage 808.


For nomenclature, each among said one or more computers 804 can comprise a set of data records in use on that computer; thus, said first computer 804a can comprise a data records 1202a, and so on.


Said automated saw 100 wherein: said automated saw 100 comprises said one or more visual sensors 112 and said positioning system 114. Said automated saw 100 can be configured to analyze said uncut meat 202 and calculate said one or more cutting depths 322 for said one or more cut portions 318 from said uncut meat 202. Said uncut meat 202 comprises said first end 214 and said second end 216. Said first end 214 of said uncut meat 202 can be analyzed by said automated saw 100 by: capturing said first slice image 600a of said first end 214, locating said first bone configuration 500a and a configuration of said one or more meat portions 602 in relation to said first bone configuration 500a, and measuring portions of said one or more meat portions 602 to categorize which among said one or more cuts of meat 400 is presented at said first end 214 of said uncut meat 202. Said automated saw 100 calculates said first cutting depth 322a for said first meat portion 602a. Said automated saw 100 can be configured with preferences for various said one or more cuts of meat 400 and corresponding said one or more cutting depths 322. Said first cutting depth 322a can correspond to said first cut of meat 400a and said second cutting depth 322b to said second cut of meat 400b. Said automated saw 100 further comprises said conveyor 104, said controller 110, and said bandsaw blade 120. Said positioning system 114 comprises said pusher 116 and said tray 118. Said tray 118 can be configured to move in said first horizontal direction 204 and said second horizontal direction 206 relative to said bandsaw blade 120. Said pusher 116 can be configured to push portions of said uncut meat 202 toward and past said bandsaw blade 120. Said tray 118 can be configured to horizontally press said uncut meat 202 into said bandsaw blade 120 in order to separate said one or more cut portions 318 from said uncut meat 202.
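Tying the illustrative sketches above together (every function and attribute name here is hypothetical and assumes the earlier sketches), one pass of the overall cutting cycle might look like:

```python
def cut_next_portion(saw) -> None:
    """One illustrative pass of the cutting cycle; `saw` is assumed to expose
    the capture, measurement, motion and cut operations named below."""
    front = saw.capture_slice_image()                # front-end view of the cutting end
    width_mm = saw.measure_tenderloin_width(front)   # assumed image-analysis step
    depth_mm = cutting_depth_for_end(width_mm)       # preference lookup from the sketch above
    # Alternatively, cut_length_for_target_weight(...) from the earlier sketch
    # could set depth_mm when a target portion weight is specified instead.
    saw.position_tray(depth_mm)                      # place the cutting plane
    saw.push_and_cut()                               # advance past the bandsaw blade
```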


Various changes in the details of the illustrated operational methods are possible without departing from the scope of the following claims. Some embodiments may combine the activities described herein as being separate steps. Similarly, one or more of the described steps may be omitted, depending upon the specific operational environment the method is being implemented in. It is to be understood that the above description is intended to be illustrative, and not restrictive. For example, the above-described embodiments may be used in combination with each other. Many other embodiments will be apparent to those of skill in the art upon reviewing the above description. The scope of the invention should, therefore, be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled. In the appended claims, the terms “including” and “in which” are used as the plain-English equivalents of the respective terms “comprising” and “wherein.”

Claims
  • 1. An automated saw wherein: said automated saw comprises one or more visual sensors and a positioning system; said automated saw is configured to analyze an uncut meat and calculate one or more cutting depths for one or more cut portions from said uncut meat; said uncut meat comprises a first end and a second end; said first end of said uncut meat can be analyzed by said automated saw by: capturing a first slice image of said first end, locating a first bone configuration and a configuration of one or more meat portions in relation to said first bone configuration, and measuring portions of said one or more meat portions to categorize which among one or more cuts of meat is presented at said first end of said uncut meat; said automated saw calculates a first cutting depth for a first meat portion; said automated saw is configured with preferences for various said one or more cuts of meat and corresponding said one or more cutting depths; said first cutting depth can correspond to a first cut of meat and a second cutting depth to a second cut of meat; said automated saw further comprises a conveyor, a controller, and a bandsaw blade; said positioning system comprises a pusher and a tray; said tray is configured to move in a first horizontal direction and a second horizontal direction relative to said bandsaw blade; said pusher is configured to push portions of said uncut meat toward and past said bandsaw blade; and said tray is configured to horizontally press said uncut meat into said bandsaw blade in order to separate said one or more cut portions from said uncut meat.
  • 2. An automated saw wherein: said automated saw comprises one or more visual sensors and a positioning system; said automated saw is configured to analyze an uncut meat and calculate one or more cutting depths for one or more cut portions from said uncut meat; said uncut meat comprises a first end and a second end; said first end of said uncut meat can be analyzed by said automated saw by: capturing a first slice image of said first end, locating a first bone configuration and a configuration of one or more meat portions in relation to said first bone configuration, and measuring portions of said one or more meat portions to categorize which among one or more cuts of meat is presented at said first end of said uncut meat.
  • 3. The automated saw of claim 2, wherein: said one or more visual sensors comprises a first visual sensor and a second visual sensor; said automated saw calculates an uncut meat length using said first visual sensor having an overview perspective of said uncut meat length with a controller; said automated saw captures an image of one or more slice images of said uncut meat length using said second visual sensor having a substantially direct view of said first end of said uncut meat length; said automated saw calculates a first cutting depth for a first meat portion; said automated saw is configured with preferences for various said one or more cuts of meat and corresponding said one or more cutting depths; and said first cutting depth can correspond to a first cut of meat and a second cutting depth to a second cut of meat.
  • 4. The automated saw of claim 3, wherein: said automated saw further comprises a conveyor, said controller, and a bandsaw blade; said positioning system comprises a pusher and a tray; said tray is configured to move in a first horizontal direction and a second horizontal direction relative to said bandsaw blade; said pusher is configured to push portions of said uncut meat toward and past said bandsaw blade; and said tray is configured to horizontally press said uncut meat into said bandsaw blade in order to separate said one or more cut portions from said uncut meat.
  • 5. The automated saw of claim 2, wherein: said one or more visual sensors comprises said first visual sensor and said second visual sensor; said automated saw calculates said uncut meat length using said first visual sensor having an overview perspective of said uncut meat length with said controller; and said automated saw captures an image of said one or more slice images of said uncut meat length using said second visual sensor having a substantially direct view of said first end of said uncut meat length.
  • 6. The automated saw of claim 2, wherein: said automated saw calculates said first cutting depth for said first meat portion; and said automated saw is configured with preferences for various said one or more cuts of meat and corresponding said one or more cutting depths.
  • 7. The automated saw of claim 6, wherein: said first cutting depth can correspond to said first cut of meat and said second cutting depth to said second cut of meat.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims the benefit of U.S. Provisional Patent Application No. 62/798,181, filed Jan. 29, 2019, and U.S. Provisional Patent Application No. 62/796,027, filed Jan. 23, 2019, and is a continuation-in-part of U.S. patent application Ser. No. 16/751,100, filed Jan. 23, 2020.

Provisional Applications (2)
Number Date Country
62798181 Jan 2019 US
62796027 Jan 2019 US
Continuation in Parts (1)
Number Date Country
Parent 16751100 Jan 2020 US
Child 16776413 US