Evaluating and recycling electronic devices

Information

  • Patent Grant
  • Patent Number
    12,321,965
  • Date Filed
    Tuesday, August 24, 2021
  • Date Issued
    Tuesday, June 3, 2025
Abstract
Methods, apparatus, and systems for generating a price of a target device. An evaluator device obtains technical properties associated with the target device. The technical properties include a make and a model of the target device. Physical properties associated with the target device are obtained. The physical properties include information related to wear and tear of the target device. Obtaining the physical properties includes indicating to a user that the user should position the target device in multiple predetermined positions and that the evaluator device records an image of the target device in each of the multiple predetermined positions. A video of the target device is recorded while the target device is positioned in the multiple predetermined positions. The obtained physical properties are evaluated to generate a condition metric value of the target device. Based on the generated condition metric value, the price of the target device is determined.
Description
TECHNICAL FIELD

The present disclosure is directed generally to methods and systems for evaluating and recycling mobile phones and other consumer electronic devices and, more particularly, to hardware and/or software for facilitating device identification, evaluation, purchase, and/or other processes associated with electronic device recycling.


BACKGROUND

Consumer electronic devices, such as mobile phones, laptop computers, notebooks, tablets, PDAs, MP3 players, wearable smart devices, etc., are ubiquitous. Currently there are over 14.02 billion mobile devices in use in the world. In other words, there are more mobile devices in use than there are people on the planet. Part of the reason for the rapid growth in the number of consumer electronic devices is the rapid pace at which these devices evolve, as well as the increased usage of such devices in developing countries.


As a result of the rapid pace of development, a relatively high percentage of consumer electronic devices are replaced every year as consumers continually upgrade their mobile phones and other electronic devices to obtain the latest features or a better operating plan. According to the U.S. Environmental Protection Agency, the U.S. alone disposes of over 370 million mobile phones, PDAs, tablets, and other electronic devices every year. Millions of other outdated or broken mobile phones and other electronic devices are simply tossed into junk drawers or otherwise kept until a suitable disposal solution arises.


Although many electronic device retailers and cell carrier stores now offer mobile phone trade-in or buyback programs, many old mobile phones still end up in landfills or are improperly disassembled and disposed of in developing countries. Unfortunately, mobile phones and similar devices typically contain substances that can be harmful to the environment, such as arsenic, lithium, cadmium, copper, lead, mercury, and zinc. If not properly disposed of, these toxic substances can seep into groundwater from decomposing landfills and contaminate the soil with potentially harmful consequences for humans and the environment.


As an alternative to retailer trade-in or buyback programs, consumers can now recycle and/or sell their used mobile phones using self-service kiosks located in malls, retail stores, or other publicly accessible areas. Such kiosks are operated by ecoATM, LLC, the assignee of the present application, and aspects of these kiosks are described in, for example: U.S. Pat. Nos. 7,881,965, 8,195,511, 8,200,533, 8,239,262, 8,423,404 and 8,463,646, which are incorporated herein by reference in their entireties.


There continues to be a need for improving the means available to consumers for recycling or reselling their mobile phones and other consumer electronic devices. Simplifying the recycling/reselling process, enhancing the consumer experience, and discouraging fraud can incentivize consumers to dispose of their old electronic devices in an efficient and environmentally conscientious way.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 illustrates an embodiment of a suitable computing environment for implementing various aspects of the present technology.



FIG. 2 is a block diagram illustrating various components typically incorporated in computing systems and other devices on which the present technology can be implemented.



FIG. 3 shows a user interface presented by an application running on an evaluator device or a target device according to some embodiments of the present technology.



FIG. 4 shows a series of user interfaces presented to the user when the user wants to sell a target device according to some embodiments of the present technology.



FIGS. 5A-5D show a series of user interfaces associated with an application running on an evaluator device to guide the user in evaluating a target device according to some embodiments of the present technology.



FIG. 6 shows a series of user interfaces that display a guaranteed price and various payment options according to some embodiments of the present technology.



FIG. 7 shows a user interface that displays tests of the target device including touchscreen functionality according to some embodiments of the present technology.



FIG. 8 shows a user interface that displays a test of the microphone of the target device according to some embodiments of the present technology.



FIG. 9 shows a user interface that displays a test of the global positioning system (GPS) of the target device according to some embodiments of the present technology.



FIG. 10 shows a user interface that displays a test of the display of the target device according to some embodiments of the present technology.



FIG. 11 shows a map displaying kiosks and associated prices.



FIG. 12 is an isometric view of a kiosk for recycling and/or other processing of mobile phones and other consumer electronic devices in accordance with some embodiments of the present technology.



FIG. 13 is a high-level flow diagram of a routine to generate a guaranteed price of a target device (e.g., a mobile phone, tablet computer, thumb drive, television, SLR, etc.) for recycling in accordance with some embodiments of the present technology.



FIG. 14 is a flow diagram of a routine for remotely evaluating a target device for recycling in accordance with some embodiments of the present technology.





DETAILED DESCRIPTION

The following disclosure describes various embodiments of hardware and/or software systems and methods that facilitate the identification, evaluation, purchase, and/or other processes associated with recycling of electronic devices. In various embodiments, for example, the systems and methods described in detail herein enable a user to connect a first electronic device (an “evaluator device”), such as a mobile phone, to a second electronic device (a “target device”), such as another mobile phone, computer, appliance, peripheral, and so on, to accurately assess the condition and secondhand or resale market value of the target device. For example, a user could connect a first mobile phone evaluator device to a second mobile phone target device to get information about the second device, evaluate that information, and thus find out how much the second device is worth. The term “target device” is used herein for ease of reference to generally refer to an electronic device that a user may wish to evaluate for recycling. The term “evaluator device” is used herein for ease of reference to generally refer to an electronic device configured to obtain information from and/or about a target device and facilitate processing (e.g., recycling) of the target device. The evaluator device can include application software (an “app”) and/or hardware for connecting to and evaluating the target device (e.g., via a wired or wireless connection). In various embodiments, the app enables device owners and/or other users to conveniently evaluate and price their target devices without having to leave their home or office. The present technology enables device owners to maintain awareness of the market value of their target devices with minimal user input, and provides certainty so that owners can have a quick and predictable experience selling their target devices (e.g., at an associated recycling kiosk, via mail-in of device, at a physical store, etc.). In some embodiments, the evaluator device can inform the user of the values of their target devices, manage a portfolio of target devices for recycling, and offer recommendations for where and when to recycle target devices.


Further, the present technology prevents users from incorrectly overestimating a phone's condition, such as claiming that the phone is in good condition when the phone screen is cracked or the phone is otherwise damaged. When a user incorrectly overestimates the phone's condition, the final price for the phone is significantly lower than the expected estimated price. In such a case, the user usually rejects the final price, which leads to a lost transaction and a negative experience for the user, which in turn can affect the goodwill of the entity facilitating the phone's return (and recycling). The present technology addresses this problem by providing the user with a “guaranteed” price that will not change if the user submits the target device for sale, and thereby avoids or at least greatly reduces the occurrence of incomplete transactions and disgruntled users. The present technology enables the user to evaluate a target device using the evaluator device at home (or any other location), thus decreasing the time and interaction required for a user selling the target device at a physical location (e.g., a kiosk).


Certain details are set forth in the following description and in FIGS. 1-14 to provide a thorough understanding of various embodiments of the present technology. In other instances, well-known structures, materials, operations and/or systems often associated with smartphones and other handheld devices, consumer electronic devices, computer hardware, software, and network systems, etc. are not shown or described in detail in the following disclosure to avoid unnecessarily obscuring the description of the various embodiments of the present technology. Those of ordinary skill in the art will recognize, however, that the present technology can be practiced without one or more of the details set forth herein, or with other structures, methods, components, and so forth.


The terminology used below is to be interpreted in its broadest reasonable manner, even though it is being used in conjunction with a detailed description of certain examples of embodiments of the present technology. Indeed, certain terms may even be emphasized below; however, any terminology intended to be interpreted in any restricted manner will be specifically defined as such in this Detailed Description section.


The accompanying Figures depict embodiments of the present technology and are not intended to be limiting of its scope. The sizes of various depicted elements are not necessarily drawn to scale, and these various elements may be arbitrarily enlarged to improve legibility. Component details may be abstracted in the Figures to exclude details such as position of components and certain precise connections between such components when such details are unnecessary for a complete understanding of how to make and use the embodiments disclosed herein.


In the Figures, identical reference numbers identify identical, or at least generally similar, elements. To facilitate the discussion of any particular element, the most significant digit or digits of any reference number refers to the Figure in which that element is first introduced. For example, element 110 is first introduced and discussed with reference to FIG. 1.



FIG. 1 illustrates an embodiment of a suitable computing environment 100 for implementing various aspects of the present technology. The environment 100 includes a first electronic device (e.g., an evaluator device 110) and a second electronic device (e.g., a target device 120). In the illustrated embodiment, the evaluator device 110 and the target device 120 are each depicted as a handheld computing device such as a smartphone or other mobile phone. However, in other embodiments, the evaluator device 110 and/or the target device 120 can be any manner of electronic device. For example, the evaluator device 110 and/or the target device 120 could be a tablet, a handheld gaming device, a media player, or any of a wide range of devices, including all manner of mobile phones; smartphones; handheld devices; personal digital assistants (PDAs); MP3 or other digital music players; tablet, notebook, Ultrabook and laptop computers; e-readers; all types of cameras; GPS devices; set-top boxes and other media players; VoIP phones; universal remote controls; speakers; headphones; wearable computers; larger consumer electronic devices, such as desktop computers, televisions, projectors, DVRs, game consoles, Blu-ray Disc™ players, printers, network attached storage devices, etc.; as well as smaller electronic devices such as Google® Glass™, smartwatches (e.g., the Apple Watch™, Android Wear™ devices such as the Moto 360®, or the Pebble Steel™ watch), fitness bands, thumb drives, wireless hands-free devices; unmanned aerial vehicles; etc. Although many embodiments of the present technology are described herein in the context of mobile phones, aspects of the present technology are not limited to mobile phones and generally apply to other consumer electronic devices. Such devices include, as non-limiting examples, desktop computers, TVs, game consoles, etc.


In one embodiment, the evaluator device 110 can execute a software application to aid in the evaluation of the target device 120. For example, the evaluator device 110 can have a camera 150 and a flashlight 170, and can use the camera and the flashlight to take pictures and record videos of the target device 120. The evaluator device 110 can provide instructions to the user through the speakers 160 and/or the display 115 of the evaluator device 110 to direct the user in how to position the target device 120 with respect to the camera 150 and/or flashlight 170, as described in more detail below. The flashlight 170 can be the flash used in taking pictures or the flashlight functionality of the device. In some embodiments, the flashlight 170 is strobed instead of remaining on for longer periods of time.


In some embodiments, both the evaluator device 110 and the target device 120 can execute one or more applications. The one or more applications can communicate with each other, and the evaluator device 110 and the target device 120 can work in a server-client relationship to determine a price for the target device 120. For example, the application executed by the target device 120 can provide information about the target device 120 to the application executed by the evaluator device 110. Information can include, but is not limited to, make and model of the target device 120, operating system version, memory/storage capacity of the target device 120, service provider to the target device 120, IMEI number of the target device 120, network capabilities (e.g., 4G, 5G, etc.) of the target device 120, and so on.
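As a minimal, illustrative sketch of the kind of client-server exchange described above, the Python example below shows a hypothetical payload that an application on the target device 120 might serialize and send to the application on the evaluator device 110. The field names, port, and `send_device_info` helper are assumptions made for illustration and are not part of the disclosed embodiments.

```python
import json
import socket
from dataclasses import dataclass, asdict

@dataclass
class TargetDeviceInfo:
    # Fields mirror the properties listed above; the names are illustrative only.
    make: str
    model: str
    os_version: str
    storage_gb: int
    carrier: str
    imei: str
    network_capabilities: list

def send_device_info(info: TargetDeviceInfo, host: str, port: int = 5000) -> None:
    """Serialize the device information as JSON and send it to the evaluator application."""
    payload = json.dumps(asdict(info)).encode("utf-8")
    with socket.create_connection((host, port)) as conn:
        conn.sendall(payload)

# Hypothetical usage:
# info = TargetDeviceInfo("Samsung", "Galaxy S21", "Android 13", 128,
#                         "ExampleCarrier", "356938035643809", ["4G", "5G"])
# send_device_info(info, "192.168.1.50")
```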


In some embodiments, the target device 120 includes a communication interface (e.g., a connector port 122 and/or a wireless transceiver (not shown)) and the evaluator device 110 similarly includes a communication interface (e.g., a connector port 118 and/or a wireless transceiver (not shown)). In this embodiment, the evaluator device 110 can be electrically connected to the target device 120 via a wireless connection 130 between the respective device transceivers, such as a Wi-Fi or Bluetooth network or a near-field communication (NFC) link; or via a wired connection 140, such as a universal serial bus (USB), Ethernet, or Lightning cable connected between the device connector ports 118 and 122. The evaluator device 110 can run special software configured to evaluate the target device 120. The evaluator device 110 and the target device 120 can be connected via a USB cable. A display screen 115 of the evaluator device 110 can display information such as textual information 112 indicating that the evaluator device 110 has identified the target device 120, an image 114 representing the target device 120, and/or icons or buttons 116 enabling the user to select various options or actions such as confirming the correct identification of the target device 120, pricing the target device 120, saving the target device 120 in a list of devices, etc.


As described in detail below, the present technology enables the evaluator device 110 to obtain information from the target device 120 over the wireless connection 130 and/or the wired connection 140, and evaluate the obtained information to facilitate recycling and/or other processing of the target device 120. The term “processing” is used herein for ease of reference to generally refer to all manner of services and operations that may be performed on, with, or otherwise in relation to a target device. Such services and operations can include, for example, selling, reselling, recycling, upcycling, donating, exchanging, identifying, evaluating, pricing, auctioning, decommissioning, transferring data from or to, reconfiguring, refurbishing, etc. mobile phones and other target devices. The term “recycling” is used herein for ease of reference to generally refer to selling, purchasing, reselling, exchanging, donating, and/or receiving target devices. For example, owners may elect to sell their used target devices, and the target devices can be recycled for resale, reconditioning, repair, recovery of salvageable components, environmentally conscious disposal, etc.



FIG. 2 is a block diagram showing some of the components 200 typically incorporated in computing systems and other devices on which the present technology can be implemented. In the illustrated embodiment, the evaluator device 110 includes a processing component 230 that controls operation of the evaluator device 110 in accordance with computer-readable instructions stored in memory 240. The processing component 230 may be any logic processing unit, such as one or more central processing units (CPUs), graphics processing units (GPUs), digital signal processors (DSPs), application-specific integrated circuits (ASICs), etc. The processing component 230 may be a single processing unit or multiple processing units in an evaluator device or distributed across multiple devices. Aspects of the present technology can be embodied in a special purpose computing device or data processor that is specifically programmed, configured, or constructed to perform one or more of the computer-executable instructions explained in detail herein. Aspects of the present technology can also be practiced in distributed computing environments in which functions or modules are performed by remote processing devices that are linked through a communications network, such as a local area network (LAN), wide area network (WAN), or the Internet. In a distributed computing environment, modules can be located in both local and remote memory storage devices.


The processing component 230 is connected to memory 240, which can include a combination of temporary and/or permanent storage, and both read-only memory (ROM) and writable memory (e.g., random-access memory or RAM), writable non-volatile memory such as flash memory or other solid-state memory, hard drives, removable media, magnetically or optically readable discs, nanotechnology memory, biological memory, and so forth. As used herein, memory does not include a transitory propagating signal per se. The memory 240 includes data storage that contains programs, software, and information, such as an operating system 242, application programs 244, and data 246. Evaluator device 110 operating systems can include, for example, Windows®, Linux®, Android™, iOS®, and/or an embedded real-time operating system. The application programs 244 and data 246 can include software and databases configured to control evaluator device 110 components, process target device 120 information and data (e.g., to evaluate device make, model, condition, pricing, etc.), communicate and exchange data and information with remote computers and other devices, etc.


The evaluator device 110 can include input components 210 that receive input from user interactions and provide input to the processor 230, typically mediated by a hardware controller that interprets the raw signals received from the input device and communicates the information to the processor 230 using a known communication protocol. Examples of an input component 210 include a keyboard (with physical or virtual keys), a pointing device (such as a mouse, joystick, dial, or eye tracking device), a touchscreen 212 that detects contact events when it is touched by a user, a microphone 214 that receives audio input, and a camera 216 for still photographs and/or video capture. The evaluator device 110 can also include various other input components 210 such as GPS or other location determination sensors, motion sensors, wearable input devices with accelerometers (e.g., wearable glove-type input devices), biometric sensors (e.g., fingerprint sensors), light sensors, card readers (e.g., magnetic stripe readers or memory card readers), and so on.


The processor 230 can also be connected to one or more various output components 220, for example, directly or via a hardware controller. The output devices can include a display 115 on which text and graphics are displayed. The display 115 can be, for example, an LCD, LED, or OLED display screen (such as a desktop computer screen, handheld device screen, or television screen), an e-ink display, a projected display (such as a heads-up display device), and/or a display integrated with a touchscreen 212 that serves as an input device as well as an output device that provides graphical and textual visual feedback to the user. The output devices can also include a speaker 224 for playing audio signals, haptic feedback devices for tactile output such as vibration, etc. In some implementations, the speaker 224 and the microphone 214 are implemented by a combined audio input-output device.


In the illustrated embodiment, the evaluator device 110 further includes one or more communication components 250. The communication components can include, for example, a wireless transceiver 252 (e.g., one or more of a Wi-Fi transceiver; Bluetooth transceiver; NFC device; wireless modem or cellular radio utilizing GSM, CDMA, 3G, and/or 4G technologies; etc.) and/or a wired network connection 118 (e.g., one or more of an Ethernet port, cable modem, FireWire cable, Lightning connector, USB port, etc.). The communication components 250 are suitable for communication between the evaluator device 110 and other local and/or remote computing devices, for example, the target device 120, directly via a wired or wireless peer-to-peer connection and/or indirectly via the communication link 270 (which can include the Internet, a public or private intranet, a local or extended Wi-Fi network, cell towers, the plain old telephone system (POTS), etc.). For example, the wireless transceiver 252 of the evaluator device 110 can connect to the wireless transceiver 282 of the target device 120 via the wireless connection 130, and/or the wired connector 118 of the evaluator device 110 can connect to the wired connector 122 of the target device 120 via the wired connection 140. The evaluator device 110 further includes power 260, which can include battery power and/or facility power for operation of the various electrical components associated with the evaluator device 110.


Unless described otherwise, the construction and operation of the various components shown in FIG. 2 are of conventional design. As a result, such components need not be described in further detail herein, as they will be readily understood by those skilled in the relevant art. In other embodiments, the evaluator device 110 and/or the target device 120 can include other features that may be different from those described above. In still further embodiments, the evaluator device 110 and/or the target device 120 can include more or fewer features similar to those described above.



FIG. 3 shows a user interface presented by an application running on the evaluator device or the target device according to some embodiments of the present technology. The user interface 300 presents several buttons 310, 320, 330, 340 to the user.


When the user selects button 310, an application running on the device determines that the user wants to sell the device displaying the user interface 300. When the user selects the button 320, the application determines that the user wants to sell another device, for example, the target device 120 in FIG. 1, and to use the device displaying the user interface 300 as the evaluator device 110 in FIG. 1.


When the user selects the button 330, the application displays to the user previous offers received for various devices that the user previously offered for sale. The user can have an account with an entity (e.g., ecoATM). The user can log in to that account and retrieve previous offers. When the user selects the button 340, the application provides the user with additional information about trading in the user's devices (e.g., terms of service, privacy notices, recycling policies, etc.).



FIG. 4 shows a series of user interfaces presented to the user when the user wants to sell a target device according to some embodiments of the present technology. When the user selects the button 310 in FIG. 3, the application running on the device determines that the user has a single device that the user wants to sell. In step 400, the application provides information to the user encouraging the user to obtain a second device, such as the evaluator device 110 in FIG. 1, to obtain a more precise quote. The application provides two buttons 410 and 420. If the user wants to proceed with a single device, the user can select button 410, in which case the application provides a rough quote to the user. When the user selects button 420, the user can go back to the user interface 300 in FIG. 3 and can select button 320 in FIG. 3.


If the user wants to proceed with a single device, the application can gather information about the device by querying the user or by gathering it automatically. For example, in step 430, the application asks the user to specify a type of device, such as iPhone, Samsung, Huawei, Dell, Lenovo, etc. In step 440, the application presents options 442, 444 (only two labeled for brevity) to the user to select a make of the device. In step 450, the application presents options 452, 454 (only two labeled for brevity) to the user to select a carrier/telecommunications service provider.


In step 460, the application presents options 462, 464 (only two labeled for brevity) to the user to select the memory capacity of the device. The application can also query the user whether the target device 120 is currently under contract or is a company phone.


In step 470, the application presents multiple questions 472, 474 (only two labeled for brevity) to prompt the user to describe the condition of the device, such as whether there is a crack in the front glass, a crack in the back glass, issues with the display, broken buttons, broken cameras, etc. If in step 470 the user indicates that the device is in mint condition by, for example, not selecting a “Yes” button 476 for any of the questions, the application can request the user to provide additional information about the device, as described with reference to FIGS. 5A-5D below.


In addition, the application can automatically gather information about the target device 120, thus shortening the user-UI interaction by obtaining parameters directly from the target device 120. To determine whether the device is an Android or an Apple device, the application can determine the operating system (OS) of the target device 120. If the operating system is iOS, the application can determine that the target device 120 is an Apple device. If the operating system is Android, the application can query the device for its manufacturer to determine the type of the device, such as Samsung, Google, HTC, etc.


To determine the make, model, memory capacity, and/or carrier information of the target device 120, the application can obtain information from the target device 120, and can present the determined make, model, memory capacity, and/or carrier information for the user to confirm.


To test the ability of the target device 120 to connect over a wireless network, the application can ask the user's permission to automatically dial a number or send a text message. If the call is successfully placed and/or the text is successfully sent, the application can determine that the target device 120 has network capability.


To test whether the target device 120 has logged out of the user's personal accounts, such as Gmail and/or iCloud, the application can attempt to access the user's personal accounts automatically. If the login attempts are unsuccessful, the application can determine that the user has successfully logged out of the user's personal accounts.


In step 480, based on information that the user has provided to the application, the application provides a price estimate 482 of the device to the user. However, in this case, the price estimate 482 is not a guarantee of a minimum price that the user is going to receive once the user submits the device for inspection and/or sale. The price estimate 482 can be reduced once the user submits the device (e.g., the user takes the device to a physical kiosk, mails in the device to an evaluating entity, submits the device at a physical store, etc.).


Finally, in step 490, the application can present an alternative offer to the user. For example, as shown in FIG. 4, the offer can be to trade in the device for an upgraded device and receive a discount on the upgraded device. The discount can be higher than the price estimate 482 received in step 480. The offer can include credits for unrelated services such as an Uber or a Lyft ride, various accessories that can be traded for the device, credits toward accessories, gift cards, points, mileage credits, etc. For example, an employer can offer trade-in of employee devices in exchange for carbon credits for which the employer can apply. Alternatively, the employer can reimburse employees for the traded-in devices.



FIGS. 5A-5D show a series of user interfaces associated with an application running on an evaluator device to guide the user in evaluating a target device according to some embodiments of the present technology. The application running on the evaluator device 110 can vary the steps of target device 120 evaluation, as explained in FIGS. 5A-5D, depending on what type of target device 120 is being evaluated, what type of damage has been disclosed by the user, etc.



FIG. 5A shows a display of the evaluator device 110 in FIG. 1 showing an initial step in evaluating the target device 120. Display element 500 can show all the steps needed to complete the evaluation of the target device 120. An image of the current step 510 can be highlighted to attract the user's attention, while the images of the rest of the steps 520, 530, 540, 550 can be presented to appear less visible, such as by darkening them.


An application running on the evaluator device 110 can direct the user via audio, text, picture, video, or other similar means to log out of the user's personal accounts stored on the target device 120, and to display on the target device 120 an application verifying that the user has been logged out of the user's personal accounts. The user's personal accounts can include iCloud, Google, Dropbox, etc. An application on the target device 120, such as Settings, can verify that the user has been logged out of the user's personal accounts. Further, the evaluator device 110 can direct the user, via audio, text, picture, video, or other similar means, to point the camera 150 in FIG. 1 of the evaluator device 110 at the display of the target device 120 to record the display produced by the verifying application, such as Settings.


In addition, the application running on the evaluator device 110 can request the user to factory reset the target device 120. To verify that the target device 120 has gone or is going through the process of a factory reset, the camera 150 of the evaluator device 110 can record the progress or completion of the factory reset on the target device 120.


The evaluator device 110 can record a picture (and/or video) of the target device 120 and produce an indication, such as a clicking sound, that the picture has been recorded. The evaluator device 110 can perform optical character recognition (OCR) to determine from the recorded image whether the user has logged out from the user's personal accounts.


In addition, the evaluator device 110 can provide a help button 560. When the user selects the button 560, the evaluator device 110 can provide more detailed instructions to the user, provide a list of frequently asked questions (FAQ), and/or provide contact information for technical support.


In step 520 shown in FIG. 5B, the evaluator device 110 can direct the user to display the target device 120's unique identifier 590, such as primary and secondary International Mobile Equipment Identity (IMEI) numbers for devices with multiple subscriber identity modules (SIMs), Unique Device Identification (UDI), media access control (MAC) address, Bluetooth MAC address, WiFi MAC address, Universally Unique Identifier (UUID), Internet protocol (IP) address (IPv4/IPv6), the target device's phone number, the target device's model and serial numbers, etc. To get the unique identifier 590, e.g., the IMEI, the user can also dial *#06# on the target device 120 to bring up the unique identifier. The evaluator device 110 can perform OCR on the unique identifier 590. In some embodiments, when all three of the memory/storage capacity 570 of the target device 120, the service provider 580 to the target device 120, and the IMEI number of the target device 120 are available, the evaluator device 110 can grade the target device 120 more efficiently. Therefore, during the OCR phase, if only two of these pieces of information can be displayed on the screen of the target device 120 at the same time, the evaluator device 110 can direct the user to “scroll down,” such that the third piece of information can be read.
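A minimal sketch of the OCR step described above, assuming the open-source Tesseract engine is available via the pytesseract wrapper; the regular expressions, file path, and "scroll down" logic are illustrative assumptions rather than the disclosed implementation.

```python
import re
from PIL import Image
import pytesseract  # requires a local Tesseract installation

def read_identifiers(image_path: str) -> dict:
    """Run OCR on a captured screen image and pull out IMEI-like and capacity fields."""
    text = pytesseract.image_to_string(Image.open(image_path))
    imei = re.search(r"\b\d{15}\b", text)          # IMEIs are 15 digits
    capacity = re.search(r"\b(\d+)\s*GB\b", text)  # e.g., "128 GB"
    return {
        "imei": imei.group(0) if imei else None,
        "capacity_gb": int(capacity.group(1)) if capacity else None,
        # If a field is missing, the app could prompt the user to "scroll down".
        "needs_scroll": imei is None or capacity is None,
    }

# Hypothetical usage: read_identifiers("about_screen.png")
```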


In addition, the evaluator device 110 can also request information about the capacity 570 and carrier 580 of the target device 120, as shown in FIG. 5B. The evaluator device 110 can get the unique identifier 590 from a previously installed application, such as an application installed by the phone manufacturer. To prevent the user from photographing the IMEI of a damaged target device 120, taking a video of an undamaged device, and then attempting to sell the damaged target device 120, the application running on the evaluator device 110 can instruct the user to keep the evaluator device 110 constantly focused on the target device 120. When the app detects that the evaluator device 110 is not constantly (or substantially constantly) focused on the target device 120, it can present an error message to the user (e.g., as an audio alert (e.g., a beeping sound), a visual alert (e.g., a flashing/blinking light), and so on).


An application running on the evaluator device 110 can direct the user via audio, text, picture, video, or other similar means how to display the information containing the unique identifier 590, the capacity 570 and the carrier 580 on the display of the target device 120. For example, the evaluator device 110 can communicate to the user to go to settings, select the “general” button, and then select “about” to obtain the needed information.


The application running on the evaluator device 110 can direct the user to record a picture of the target device 120 showing the needed information. The evaluator device 110 can produce an indication, such as a sound, that the picture is recorded. Once the picture is recorded, the evaluator device 110 can use OCR to obtain the needed information from the picture.


As described herein, the evaluator device 110 obtains a unique identifier of the target device 120. In some embodiments, the evaluator device 110 determines whether the target device 120 has been evaluated previously based on the unique identifier 590. Upon determining that the target device 120 has been evaluated previously, the evaluator device 110 retrieves data describing the target device 120 from a database.


Once the evaluator device 110 obtains the unique identifier 590, the evaluator device 110 can determine whether the unique identifier 590 has been evaluated before, such as when the user has tried to scan the target device 120 multiple times. If the unique identifier 590 has been evaluated before, the evaluator device 110 can pre-populate the device information using the previously stored information, such as the IMEI number, storage capacity, etc. In addition, once the evaluator device 110 has the unique identifier 590, the evaluator device 110 can gather some information automatically. For example, the evaluator device 110 can query the wireless telecommunication provider's database to get additional information about the device, such as technical specifications, age, number of resales, etc.
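A minimal sketch of the pre-population step, assuming prior evaluations are stored in a local SQLite table keyed by the unique identifier; the table name, columns, and file path are hypothetical.

```python
import sqlite3
from typing import Optional

def fetch_prior_evaluation(db_path: str, unique_id: str) -> Optional[dict]:
    """Return previously stored device data for this identifier, if any."""
    with sqlite3.connect(db_path) as conn:
        row = conn.execute(
            "SELECT imei, storage_gb, carrier FROM evaluations WHERE imei = ?",
            (unique_id,),
        ).fetchone()
    if row is None:
        return None  # first time this device has been scanned
    return {"imei": row[0], "storage_gb": row[1], "carrier": row[2]}

# Hypothetical usage: fetch_prior_evaluation("evaluations.db", "356938035643809")
```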


In step 515, shown in FIG. 5C, the evaluator device 110 can instruct the user to turn off the target device 120's screen prior to proceeding to the final three steps 530, 540, 550. The reason to turn off the target device 120's screen is to increase the visibility of any screen imperfections (e.g., cracks) on the target device 120 in the steps described in FIG. 5D. When the user turns off the display screen, the user can communicate to the evaluator device 110 to move to the final three steps 530, 540, 550 by selecting the button 525.



FIG. 5D shows the remaining steps needed to evaluate the target device 120. In these three steps, 530, 540, 550, the application running on the evaluator device 110 directs the user, via audio, text, picture, video, or other similar means, to take several pictures, such as three pictures, of the target device 120, as shown in FIG. 5D. The guide 505 displayed on the evaluator device 110 indicates how to position the target device 120 within the camera view of the evaluator device 110. The guide 505 can be any color and/or shape, such as a green rounded rectangle, a red rounded rectangle with broken lines, etc. In some embodiments, no guide is displayed. The pictures can be from approximately a three-quarters view to the left of the target device 120, a three-quarters view to the right of the target device 120, and from the front. Additional pictures, such as from the back and/or the sides, can also be taken. In one embodiment, steps 530, 540, 550 can be performed once for the display side of the target device 120, and once for the back side of the target device 120. For example, during evaluation, the evaluator device 110 can ask the user to position the target device 120 to take pictures of the back side of the target device 120, since most devices today have glass backs.


In some embodiments, a remote operator detects that a second device has replaced the target device 120 by analyzing the video. It is determined that the obtained physical properties are inaccurate in response to detecting that the second device has replaced the target device 120. For example, while the user is positioning the target device 120 to take the pictures, unknown to the user, the evaluator device 110 can record a video of the user's actions. The reason that the evaluator device 110 records the video unknown to the user is to prevent the user from switching out the target device 120 with another device (sometimes referred to as a “second device”) that is in better condition than the target device 120. While the user is recording a video, a remote operator can receive the video in real time and can detect whether the user has switched out the target device 120 for a device that is in better condition. If the remote operator detects the switch, the remote operator can instruct the evaluator device 110 to abort the evaluation and produce a notification to the user that the evaluation has been aborted. In some embodiments, detecting that the second device has replaced the target device is performed using an artificial intelligence module. For example, the remote operator can be an artificial intelligence module trained to detect the device switch.


In some embodiments, the evaluator device 110 records the video of the target device 120 by flashing or strobing the flashlight 170 of the evaluator device 110 and moving the camera 150 of the evaluator device 110 over the target device 120. The camera 150 of the evaluator device 110 is moved over the target device 120 using sweeping movements and different angles, such that the camera 150 captures screen and/or device imperfections (e.g., cracks) that may not be otherwise visible in picture format. In some embodiments, the video is about 30 seconds long, such that at some point in that time period glints or cracks are revealed. The sweeping movements can be from left to right, top to bottom, bottom to top, etc., such that the camera 150 of the evaluator device 110 is moved through a variety of motions to enable the camera 150 to view and record different areas and perspectives of the target device 120. The artificial intelligence module can also be trained to detect, from the recorded video, screen and/or device imperfections (e.g., cracks) that may not be otherwise visible in picture format. In some embodiments, the artificial intelligence module processes the video in real time, i.e., while the video is being recorded. As soon as the artificial intelligence module determines an imperfection in the target device 120, the evaluator device 110 can proceed to a pricing screen of a user interface of the evaluator device 110 and instruct the user to stop photographing the target device 120. As a backup feature, if the Internet connection has insufficient speed, the video is processed locally on the evaluator device 110 and not uploaded to the cloud. This feature is used, for example, when the connection uses 3G instead of LTE or Wi-Fi.
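The sketch below illustrates, under stated assumptions, the early-stop and local-fallback behavior described above: frames are read from the camera, passed to a crack detector, and the evaluation ends as soon as an imperfection is found. The `local_detector` and `cloud_detector` callables, the bandwidth threshold, and the camera index are hypothetical placeholders, not the disclosed implementation.

```python
import time
import cv2  # OpenCV, used here only for camera capture

MIN_UPLOAD_MBPS = 5.0  # assumed threshold below which processing stays on-device

def evaluate_video(local_detector, cloud_detector, measured_mbps: float,
                   max_seconds: float = 30.0) -> bool:
    """Record up to ~30 s of frames; return True as soon as an imperfection is detected."""
    # On a slow connection (e.g., 3G), fall back to the on-device model.
    detector = local_detector if measured_mbps < MIN_UPLOAD_MBPS else cloud_detector
    cap = cv2.VideoCapture(0)  # camera index is a placeholder
    start = time.time()
    try:
        while time.time() - start < max_seconds:
            ok, frame = cap.read()
            if not ok:
                continue
            if detector(frame):
                return True  # proceed straight to the pricing screen
        return False
    finally:
        cap.release()
```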


The artificial intelligence module is implemented using the components illustrated and described in more detail with reference to FIGS. 1-2. For example, the artificial intelligence module can be implemented on the evaluator device 110 using instructions programmed in the memory 240 illustrated and described in more detail with reference to FIG. 2. Likewise, embodiments of the artificial intelligence module can include different and/or additional components, or be connected in different ways. The artificial intelligence module is sometimes referred to as a machine learning module.


In some embodiments, the artificial intelligence module includes a feature extraction module implemented using the components illustrated and described in more detail with reference to FIG. 2. The feature extraction module extracts a feature vector from the recorded video. The feature extraction module reduces the redundancy in the input data, e.g., repetitive data values, to transform the input data into a reduced set of features. The feature vector contains the relevant information from the input data, such that properties of a target device under evaluation or data value thresholds of interest can be identified by the artificial intelligence module by using this reduced representation. In some example embodiments, the following dimensionality reduction techniques are used by the feature extraction module: independent component analysis, Isomap, Kernel PCA, latent semantic analysis, partial least squares, principal component analysis, multifactor dimensionality reduction, nonlinear dimensionality reduction, Multilinear Principal Component Analysis, multilinear subspace learning, semidefinite embedding, Autoencoder, and deep feature synthesis.
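A minimal sketch of one of the listed dimensionality-reduction options (principal component analysis), assuming each video frame has already been flattened into a row of grayscale pixel intensities; the frame matrix below is synthetic and the component count is an arbitrary choice.

```python
import numpy as np
from sklearn.decomposition import PCA

# Synthetic stand-in for 200 flattened 64x64 grayscale frames.
frames = np.random.rand(200, 64 * 64)

# Reduce each frame to a 32-dimensional feature vector.
pca = PCA(n_components=32)
feature_vectors = pca.fit_transform(frames)

print(feature_vectors.shape)                 # (200, 32)
print(pca.explained_variance_ratio_.sum())   # fraction of variance retained
```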


In alternate embodiments, the artificial intelligence module performs deep learning (also known as deep structured learning or hierarchical learning) directly on input data to learn data representations, as opposed to using task-specific algorithms. In deep learning, no explicit feature extraction is performed; features are implicitly extracted by the artificial intelligence module. For example, the artificial intelligence module can use a cascade of multiple layers of nonlinear processing units for implicit feature extraction and transformation. Each successive layer uses the output from a previous layer as input. The artificial intelligence module can thus learn in supervised (e.g., classification) and/or unsupervised (e.g., pattern analysis) modes. The artificial intelligence module can learn multiple levels of representations that correspond to different levels of abstraction, wherein the different levels form a hierarchy of concepts. In this manner, the artificial intelligence module can be configured to differentiate features of interest from background features.


In some embodiments, the artificial intelligence module, e.g., in the form of a convolutional neural network (CNN) generates output, without the need for feature extraction, directly from input data. The output is provided to the evaluator device 110. A CNN is a type of feed-forward artificial neural network in which the connectivity pattern between its neurons is inspired by the organization of a visual cortex. Individual cortical neurons respond to stimuli in a restricted region of space known as the receptive field. The receptive fields of different neurons partially overlap such that they tile the visual field. The response of an individual neuron to stimuli within its receptive field can be approximated mathematically by a convolution operation. CNNs are based on biological processes and are variations of multilayer perceptrons designed to use minimal amounts of preprocessing.
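A minimal PyTorch sketch of a CNN of the kind described above, taking frame pixels directly as input and emitting damaged/undamaged logits; the layer sizes and the 64x64 grayscale input shape are illustrative assumptions.

```python
import torch
import torch.nn as nn

class CrackCNN(nn.Module):
    """Small feed-forward CNN: two convolution/pooling stages followed by a classifier head."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 64), nn.ReLU(),
            nn.Linear(64, 2),  # logits: [undamaged, damaged]
        )

    def forward(self, x):
        return self.classifier(self.features(x))

# Score a batch of four 64x64 grayscale frames (random data for illustration).
logits = CrackCNN()(torch.randn(4, 1, 64, 64))
print(logits.shape)  # torch.Size([4, 2])
```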


In some embodiments, the artificial intelligence module is trained based on training data to correlate the feature vector to expected outputs in the training data. The training data includes a positive training set of features that have been determined to have a desired property in question, and, in some embodiments, a negative training set of features that lack the property in question. Machine learning techniques are used to train the artificial intelligence module such that, when applied to a feature vector, it outputs indications of whether the feature vector has an associated desired property or properties, such as a probability that the feature vector has a particular Boolean property, or an estimated value of a scalar property. The artificial intelligence module can further apply dimensionality reduction (e.g., via linear discriminant analysis (LDA), principal component analysis (PCA), or the like) to reduce the amount of data in the feature vector to a smaller, more representative set of data.


Supervised machine learning can be used to train the artificial intelligence module, with feature vectors of the positive training set and the negative training set serving as the inputs. In some embodiments, different machine learning techniques, such as linear support vector machines (linear SVM), boosting for other algorithms (e.g., AdaBoost), logistic regression, naïve Bayes, memory-based learning, random forests, bagged trees, decision trees, boosted trees, boosted stumps, neural networks, CNNs, etc., are used. In some example embodiments, a validation set is formed of additional features, other than those in the training data, that have already been determined to have or to lack the property in question. The artificial intelligence module can apply a trained machine learning model to the features of the validation set to quantify the accuracy of the machine learning model. Common metrics applied in accuracy measurement include Precision and Recall, where Precision refers to the number of results the artificial intelligence module correctly predicted out of the total it predicted, and Recall is the number of results the artificial intelligence module correctly predicted out of the total number of features that did have the desired property in question. In some embodiments, the artificial intelligence module is iteratively re-trained until the occurrence of a stopping condition, such as the accuracy measurement indicating that the artificial intelligence module is sufficiently accurate, or a number of training rounds having taken place.
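A minimal sketch of the supervised training and validation loop described above, using one of the named techniques (a linear SVM) on synthetic feature vectors; the labels, feature dimensions, and stopping thresholds are assumed values, and in practice each round would fold in newly labeled examples rather than slices of a fixed synthetic set.

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.metrics import precision_score, recall_score
from sklearn.model_selection import train_test_split

# Synthetic feature vectors with binary labels (1 = has the property, 0 = lacks it).
X = np.random.rand(500, 32)
y = np.random.randint(0, 2, size=500)
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.2, random_state=0)

model = LinearSVC()
for round_num in range(1, 11):           # iterative re-training
    n = 50 * round_num                   # grow the labeled pool each round
    model.fit(X_train[:n], y_train[:n])
    preds = model.predict(X_val)
    precision = precision_score(y_val, preds, zero_division=0)
    recall = recall_score(y_val, preds, zero_division=0)
    if precision >= 0.95 and recall >= 0.95:  # assumed stopping condition
        break
```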


Returning now to FIGS. 5A-5D, when the user positions the device according to the instructions, the user can select the button 535, and the evaluator device 110 can indicate that the picture has been taken by, for example, producing a clicking sound, even though the evaluator device 110 is recording a video. The reason to indicate that the picture has been taken is to lead the user to believe that the app is recording pictures, thus leading the user who wants to switch out the target device 120 to be less careful when switching the devices.


When recording the video in steps 530, 540, 550, the evaluator device 110 can have the flashlight 170 in FIG. 1 on, so that any fractures on the target device 120 reflect the flashlight and create highlights, which can be detected as cracks on the target device 120. In one embodiment, the application running on the evaluator device 110 can check whether the target device 120 has a white frame around the display. If the target device 120 has a white frame, the application can perform steps 530, 540, 550 twice, once with the flashlight 170 on, and once with the flashlight 170 off.


If the evaluator device 110 determines that the target device 120 has a white or gray border or back, the evaluator device 110 can choose to not turn on the flashlight, because the flashlight helps with detecting highlights and/or glints of the cracked glass when the glass is over a black sub-surface, e.g., an LCD. However, when the cracked glass is on top of a white or a gray surface, the evaluator device 110 can detect drop shadows on top of the white or gray surface. In that case, turning on the flashlight can hinder the detection of the drop shadows by illuminating them and washing them out. Consequently, the evaluator device 110 can ask the user to take two pictures of a single side of the target device 120, such as the front side or the back side of the target device 120. The first picture can be taken with the flashlight, and the second picture can be taken without the flashlight.
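A minimal sketch of the flash-selection logic described above, assuming a grayscale image of the device is available and that an arbitrary mean-brightness threshold separates "white or gray" surfaces from dark ones; the threshold and file path are assumptions.

```python
import cv2
import numpy as np

WHITE_GRAY_THRESHOLD = 150  # assumed mean-intensity cutoff on a 0-255 scale

def plan_captures(device_image_path: str) -> list:
    """Decide which flash settings to use based on how light the device surface appears."""
    gray = cv2.imread(device_image_path, cv2.IMREAD_GRAYSCALE)
    mean_brightness = float(np.mean(gray))
    if mean_brightness >= WHITE_GRAY_THRESHOLD:
        # Light surface: crack drop shadows show best without the flash,
        # so capture once with the flash and once without it.
        return ["flash_on", "flash_off"]
    # Dark surface (e.g., glass over an LCD): glints from the flash reveal cracks.
    return ["flash_on"]
```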


Steps 510, 520, 530, 540, 550 can be repeated numerous times, independently of each other. For example, step 510 can be repeated three or four times, while steps 530, 540, 550 are repeated twice. The number of pictures taken and the number of videos taken can vary depending on the information provided by the user, and information detected in the pictures and videos recorded. For example, if the remote operator detects a potential crack in the front of the screen based on a video of the front of the screen, the remote operator can request an additional video recording of the front of the screen under different lighting conditions.


As explained in this application, the evaluator device 110 can be, for example, a laptop or a desktop computer, and the user can receive a quote for the target device 120 using a web browser running on the evaluator device 110. The evaluator device 110 can ask the user's permission to turn on the camera 150 of the evaluator device 110. If the user approves, the browser instructs the user to move the target device 120 through the various positions so the fixed camera 150 can record the video for a remote operator, such as an AI module, to analyze. In this scenario, the user can optionally place a flashlight (e.g., the flash of a working phone) resting on the screen of the evaluator device 110, facing the user, so that the camera 150 can better record any cracks in the target device 120.



FIG. 6 shows a series of user interfaces that display a guaranteed price and various payment options according to some embodiments of the present technology. After performing the target device 120 evaluation, the application running on the evaluator device 110 can present a guaranteed price 600 to the user. If the user refuses the guaranteed price 600, the application running on the evaluator device 110 can present an option to the user to explain the reasoning behind the offer price, such as that a crack in the target device 120 was detected. If the user accepts the guaranteed price 600, in step 610, the application can offer the user store credit such as store points, a gift card, a better quality device, device accessories, credits for unrelated services such as an Uber or a Lyft ride, points, mileage credits, etc.


In step 620, the application can present to the user options 630, 640, 650 on how to receive payment. None of the options 630, 640, 650 require further investigation of the target device 120, because the guaranteed price 600 is an accurate quote.


If the user selects option 630, the user can take the target device 120 to a store or a kiosk and receive cash payment. The kiosk or a device at the store can read the target device 120's IMEI, electrically or via OCR. The kiosk or the device at the store can pay the user without needing to do any further interrogation. Consequently, kiosks or devices at the store do not have to include any grading capabilities, and can be inexpensive devices that confirm the IMEI and the user's identity and pay customers.


If the user selects option 640, the user can print the label and ship the target device 120 to the store or the kiosk. The cost of the shipping can be subtracted from the guaranteed price 600. If the user selects the third option 650, the user can receive a shipping box with the shipping label. The cost of the box and the shipping label can also be subtracted from the guaranteed price 600. Alternatively, the user can receive payment for the target device 120 from a third party such as a GrubHub™, Uber™, or Lyft™ driver that comes to pick up the target device 120 and take the target device 120 to a store or a kiosk for a nominal fee.


The user can increase the evaluation of the target device 120 by testing the target device 120's touchscreen functionality, display quality, Wi-Fi, camera, microphone, GPS, etc. To test additional functionality, the application running on the evaluator device 110 can display a code, such as a QR code. The application running on the evaluator device 110 can instruct the user, via audio, video, text, picture or other similar means, to point a camera of the target device 120 toward the code.


Once the target device 120 scans the code, the target device 120 can execute instructions to test the target device 120's touchscreen functionality, display quality, network access such as data, Wi-Fi, camera, microphone, GPS, etc. The instructions can be provided to the target device 120 through an application installed on the target device 120 or can be provided to the target device 120 when the target device 120 visits a specified webpage.



FIG. 7 shows a user interface that displays tests of the target device 120 including touchscreen functionality according to some embodiments of the present technology. The user can run the tests if the user has selected “Sell This Phone” 310 in FIG. 3, or “Sell Another Phone” 320 in FIG. 3.


If the user selects the “Sell This Phone” option and completes the self-evaluation, e.g., the evaluator device 110 in FIG. 1 determines the make and model of the target device 120 in FIG. 1, the evaluator device 110 can choose to run the tests described in FIGS. 7-10 based on the make and model of the target device 120. For example, if the target device 120 is a more expensive device, such as an iPhone™ 11 Pro Max, the evaluator device 110 can run the tests in FIGS. 7-10. If the target device 120 is a less expensive device, such as an iPhone™ 5, the evaluator device 110 can skip the tests in FIGS. 7-10.


If the user selects “Sell Another Phone” 320, the evaluator device 110 can run the tests in FIGS. 7-10, and the applications on the evaluator device 110 and the target device 120 can communicate with each other to determine the final price 600. The target device 120 can report its condition, and with that data, the evaluator device 110 can determine the final price.


To run the tests on the target device 120, in one embodiment, the evaluator device 110 can ask the user to download the test application to the target device 120, where the test application is configured to run the tests. In other embodiments, to streamline the process and not require the user to download another application, the evaluator device 110 can show a QR code on its screen and ask the user to point the target device 120's camera at the QR code. When the target device 120 detects the QR code, the target device 120 can launch a web browser and take the user to a website. The website, via HTML5 and JavaScript, can run one or more of the tests in FIGS. 7-10 via the target device 120's browser.
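For illustration only, the following is a minimal sketch of how such a browser-based test page could orchestrate the tests of FIGS. 7-10 and report the results back for use in the quote. The individual test functions and the /api/results endpoint are hypothetical placeholders, not part of the present disclosure; individual test sketches follow the corresponding figures below.

```typescript
// Minimal sketch of a test-orchestration page reached by scanning the QR code.
// The test functions and the /api/results endpoint are hypothetical.
declare function runTouchscreenTest(): Promise<boolean>;
declare function runMicrophoneTest(): Promise<boolean>;
declare function runGpsTest(): Promise<boolean>;
declare function runDisplayTest(): Promise<boolean>;

type TestResult = { name: string; passed: boolean; detail?: string };

async function runAllTests(sessionId: string): Promise<TestResult[]> {
  const tests: Array<[string, () => Promise<boolean>]> = [
    ["touchscreen", runTouchscreenTest],
    ["microphone", runMicrophoneTest],
    ["gps", runGpsTest],
    ["display", runDisplayTest],
  ];

  const results: TestResult[] = [];
  for (const [name, test] of tests) {
    try {
      results.push({ name, passed: await test() });
    } catch (err) {
      results.push({ name, passed: false, detail: String(err) });
    }
  }

  // Report the outcome so the evaluator device can fold it into the quote.
  await fetch(`/api/results?session=${encodeURIComponent(sessionId)}`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(results),
  });
  return results;
}
```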


The evaluator device 110 can run the tests in FIGS. 7-10 either before or after providing the quote 482 in FIG. 4 or the quote 600 in FIG. 6. For example, if the evaluator device 110 runs the tests in FIGS. 7-10 before providing the quote 482, the evaluator device 110 can offer the user the highest price. In another example, the evaluator device 110 can run the tests in FIGS. 7-10 only after providing the quote 482, 600 and the user declines the quote. After the user declines the quote, the evaluator device 110 can try to win the user back by offering a chance for a higher quote, such as by asking "Want $10 more?" If the user indicates that they do want a higher quote, the evaluator device 110 can run the tests in FIGS. 7-10.


To test network access of the target device 120, the target device 120 can be asked to access a particular webpage 700. If the target device 120 successfully accesses the webpage 700, that is an indication that the target device 120 has functioning network access. To test the camera of the target device 120, the target device 120 scans a QR code that can initiate testing of other functionalities of the target device 120.


To test the touchscreen functionality, the target device 120 can display a square 710 that can move horizontally and vertically across the screen 720. In each position of the square 710, the target device 120 can ask the user to select the square 710. The target device 120 can determine whether the selection was detected and whether the selection matches the location of the square 710. If the selection matches the location of the square 710 for all positions of the square 710 across the screen 720, the target device 120 can determine that the touchscreen functionality works.
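As one non-limiting illustration, a moving-square touchscreen check along the lines of FIG. 7 could run in the target device 120's browser roughly as follows; the number of positions, the square size, and the hit-test logic are assumptions for the sketch.

```typescript
// Sketch of the moving-square touchscreen test (square 710 on screen 720).
async function runTouchscreenTest(): Promise<boolean> {
  const square = document.createElement("div");
  square.style.cssText =
    "position:fixed;width:48px;height:48px;background:#09f;touch-action:none";
  document.body.appendChild(square);

  // Normalized positions the square visits across the screen (assumed).
  const positions = [
    { x: 0.1, y: 0.1 }, { x: 0.9, y: 0.1 }, { x: 0.5, y: 0.5 },
    { x: 0.1, y: 0.9 }, { x: 0.9, y: 0.9 },
  ];

  try {
    for (const p of positions) {
      const left = p.x * (window.innerWidth - 48);
      const top = p.y * (window.innerHeight - 48);
      square.style.left = `${left}px`;
      square.style.top = `${top}px`;

      // Wait for a tap and check that it lands inside the square.
      const hit = await new Promise<boolean>((resolve) => {
        const onTap = (ev: PointerEvent) => {
          document.removeEventListener("pointerdown", onTap);
          resolve(
            ev.clientX >= left && ev.clientX <= left + 48 &&
            ev.clientY >= top && ev.clientY <= top + 48
          );
        };
        document.addEventListener("pointerdown", onTap);
      });
      if (!hit) return false;   // selection did not match the square's location
    }
    return true;                // touchscreen responded at every position
  } finally {
    square.remove();
  }
}
```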



FIG. 8 shows a user interface that displays a test of the microphone of the target device 120 according to some embodiments of the present technology. The target device 120 or the evaluator device 110 can instruct the user, via audio, video, text, picture or other similar means, how to test the microphone. For example, the target device 120 can provide a button to start the test and record the audio, and a button 800 to stop the test. The target device 120 can display a visualization 810 of the recorded audio. In some embodiments, during the microphone test, the evaluator device 110 produces audio signals at, e.g., 12,000 Hz. The microphone of the target device 120 captures the audio signals at the test frequency. The process is repeated for the different speakers and microphones available.
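A browser-based version of such a microphone check might, for example, look for energy near the test frequency in the captured audio. The 12,000 Hz default and the detection threshold below are illustrative assumptions, not values prescribed by the disclosure.

```typescript
// Sketch of the microphone test of FIG. 8: the evaluator device plays a tone
// and the target device checks whether its microphone picks up energy near
// that frequency.
async function runMicrophoneTest(testFreqHz = 12000): Promise<boolean> {
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const ctx = new AudioContext();
  const source = ctx.createMediaStreamSource(stream);
  const analyser = ctx.createAnalyser();
  analyser.fftSize = 4096;
  source.connect(analyser);

  // Let the analyser accumulate roughly one second of audio.
  await new Promise((r) => setTimeout(r, 1000));

  const spectrum = new Uint8Array(analyser.frequencyBinCount);
  analyser.getByteFrequencyData(spectrum);

  // Index of the FFT bin closest to the test frequency.
  const binHz = ctx.sampleRate / analyser.fftSize;
  const bin = Math.round(testFreqHz / binHz);
  const detected = spectrum[bin] > 100;   // assumed detection threshold

  stream.getTracks().forEach((t) => t.stop());
  await ctx.close();
  return detected;
}
```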



FIG. 9 shows a user interface that displays a test of the GPS of the target device 120 according to some embodiments of the present technology. The target device 120 can test the GPS by determining the location of the target device 120 using the GPS and communicating the location of the target device 120 to the user, via audio, video, text, picture or other similar means. For example, the target device 120 can display the detected location 900 of the target device 120 on a map. The target device 120 can request a confirmation from the user, such as using buttons 910, 920 or an audio communication. In some embodiments, the GPS of the target device 120 is tested by using the evaluator device 110 to record a value that the target device 120 generates as its location. The evaluator device 110 records the value using the QR code procedure disclosed herein. The evaluator device 110 compares the recorded value to a location value generated by the GPS of the evaluator device 110 to perform the test.
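For illustration, a sketch of reading the target device 120's location in the browser and comparing it to a reference location (e.g., one reported by the evaluator device 110) could look like the following; the 500-meter acceptance radius is an assumption.

```typescript
// Sketch of the GPS test of FIG. 9.
function readLocation(): Promise<GeolocationCoordinates> {
  return new Promise((resolve, reject) =>
    navigator.geolocation.getCurrentPosition(
      (pos) => resolve(pos.coords),
      reject,
      { enableHighAccuracy: true, timeout: 10000 }
    )
  );
}

// Haversine distance between two latitude/longitude pairs, in meters.
function distanceMeters(a: GeolocationCoordinates, b: GeolocationCoordinates): number {
  const R = 6371000;
  const toRad = (d: number) => (d * Math.PI) / 180;
  const dLat = toRad(b.latitude - a.latitude);
  const dLon = toRad(b.longitude - a.longitude);
  const h =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(a.latitude)) * Math.cos(toRad(b.latitude)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(h));
}

async function runGpsTest(reference?: GeolocationCoordinates): Promise<boolean> {
  try {
    const target = await readLocation();
    // Without a reference, obtaining any fix counts as a pass; the user can
    // still confirm the displayed location 900 via buttons 910/920.
    return reference ? distanceMeters(target, reference) < 500 : true;
  } catch {
    return false;   // no GPS fix could be obtained
  }
}
```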



FIG. 10 shows a user interface that displays a test of the display of the target device 120 according to some embodiments of the present technology. The target device 120 can display the colors, such as red, blue, and green, on the display screen 1000 of the target device 120. The target device 120 can present a query to the user, via audio, video, text, picture or other similar means, such as: “in the areas where we are cycling through colors, do you see any pixels, elements, or spots that are not shifting colors?”. The user can respond to the query using buttons 1010, 1020, or by using audio communication.
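A simple browser sketch of such a color-cycling display check, with the user's yes/no answer standing in for buttons 1010 and 1020, might look as follows; the colors and hold time are assumptions.

```typescript
// Sketch of the display test of FIG. 10: cycle the screen 1000 through solid
// colors, then ask the user whether any pixels failed to change.
async function runDisplayTest(): Promise<boolean> {
  const overlay = document.createElement("div");
  overlay.style.cssText = "position:fixed;inset:0;z-index:9999";
  document.body.appendChild(overlay);

  for (const color of ["#ff0000", "#00ff00", "#0000ff"]) {
    overlay.style.background = color;
    await new Promise((r) => setTimeout(r, 1500));  // hold each color briefly
  }
  overlay.remove();

  // "OK" means the user saw stuck or dead pixels, so the test fails.
  return !window.confirm(
    "In the areas where colors were cycling, did you see any pixels or spots that were not shifting colors?"
  );
}
```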



FIG. 11 shows a map displaying kiosks and associated prices. Once the user accepts the guaranteed price 600 in FIG. 6, the evaluator device 110 can present a map 1100 of kiosks and/or stores to the user, indicating where the user can redeem the target device 120. The map 1100 can also present a price 1110, 1120 (only two labeled for brevity) that can be obtained at the various kiosks. The price 1110, 1120 can depend on the method of payment selected by the user, such as methods described in FIG. 6, and/or the distance to the kiosk.



FIG. 12 is an isometric view of a kiosk 1200 for recycling and/or other processing of mobile phones and other consumer electronic devices in accordance with some embodiments of the present technology.


In the illustrated embodiment, the kiosk 1200 is a floor-standing, self-service kiosk configured for use by a user 1201 (e.g., a consumer, customer, etc.) to recycle, sell, and/or perform other operations with a target device 120 such as a mobile phone or other consumer electronic device. In other embodiments, the kiosk 1200 can be configured for use on a countertop or a similar raised surface. Although the kiosk 1200 is configured for use by consumers, in various embodiments, the kiosk 1200 and/or various portions thereof can also be used by other operators, such as a retail clerk or kiosk assistant, to facilitate the selling or other processing of target devices 120 such as mobile phones and other consumer electronic devices.


The kiosk 1200 can be used in a number of different ways to efficiently facilitate the recycling, selling, and/or other processing of target devices 120 such as mobile phones and other consumer electronic devices. For example, a user wishing to sell a used mobile phone or other target device 120 can bring the target device 120 to the kiosk 1200 for recycling. In some embodiments, the kiosk 1200 can perform a visual analysis and/or an electrical analysis, verify the user's identity, and pay the user 1201 for the target device 120 using one or more of the methods and/or systems described in detail in the commonly owned patents and patent applications identified herein and incorporated by reference in their entireties.


As those of ordinary skill in the art will appreciate, various embodiments of the kiosk 1200 can be used for recycling virtually any consumer electronic device, such as mobile phones, MP3 players, tablet computers, laptop computers, e-readers, PDAs, Google® Glass™, smartwatches, and other portable or wearable devices, as well as other relatively non-portable electronic devices such as desktop computers, printers, televisions, DVRs, devices for playing games, and entertainment or other digital media on CDs, DVDs, Blu-ray™, etc.



FIG. 13 is a high-level flow diagram of a routine 1300 to generate a guaranteed price of a target device 120 (e.g., a mobile phone, tablet computer, thumb drive, television, SLR, etc.) for recycling in accordance with some embodiments of the present technology. In various embodiments, an app running on an evaluator device (e.g., the evaluator device 110 of FIG. 1) such as a mobile phone or computer, and/or another processing device operatively connectable to the app, such as a remote computer (e.g., a server), can perform some or all of the routine 1300. In some instances, for example, a user who owns a target device 120 (e.g., a game console, laptop, etc.) may want to know how much the target device 120 is worth so that he or she can decide whether to sell it. The routine 1300 of FIG. 13 enables the user to use another electronic device (e.g., the evaluator device 110) to quickly obtain a current price for the target device 120, without requiring the user to bring the target device 120 to a recycling kiosk 1200, a retail outlet, or another location, and without requiring the user to manually provide information about the target device 120 and its configuration.


In various embodiments, the routine 1300 and the other flow routines described in detail herein can be implemented by an evaluator device 110 running an app that can obtain information about a connected target device 120. The target device 120 may be, for example, one of various consumer electronic devices, such as a used mobile telecommunication device, which includes all manner of handheld devices having wired and/or wireless communication capabilities (e.g., a smartphone, computer, television, game console, home automation device, etc.). In some embodiments, the user downloads the app to the evaluator device 110 from an app store or other software repository associated with the device manufacturer or a third party (e.g., the Apple® Appstore™, Google Play™ store, Amazon® Appstore™, and so on), from a website, from a kiosk such as the kiosk 1200 (e.g., sideloading an app over a wired or wireless data connection), from a removable memory device such as an SD flashcard or USB drive, etc. In other embodiments, the app is loaded on the evaluator device 110 before it is first acquired by the user (e.g., preinstalled by the device manufacturer, a wireless service carrier, or a device vendor).


In block 1302, the evaluator device 110 can obtain technical properties associated with the target device 120. The technical properties can include the make and model of the device, the computation capacity of the device, the memory capacity of the device, the carrier providing data and/or cellular connectivity to the device, etc.


In block 1304, the evaluator device 110 can obtain physical properties associated with the target device 120. The physical properties can include wear and tear of the target device 120. To obtain the physical properties, the evaluator device 110 can instruct a user of the target device 120 to position the target device 120 in multiple predetermined positions, such as front, back, sides; three-quarter view from the top, bottom, and left of the front; and/or three-quarter view from the top, bottom, and left of the back of the target device 120.


While obtaining the physical properties of the target device 120, the evaluator device 110 can indicate to the user that the evaluator device 110 has recorded an image of each of the multiple predetermined positions. The indication can include a click, a voice explaining that the picture has been taken, or a visual display indicating that the picture has been taken. In addition, the evaluator device 110 can record a video of the handling of the target device 120 while the target device 120 is being positioned into the multiple predetermined positions, without informing the user that the video is being recorded. Because the user is not informed that the video is being recorded, a user who intends to commit fraud, for example by switching the target device 120 with a second device in better condition, is less careful and may switch the devices between pictures. Because the evaluator device 110 is recording a video, however, the evaluator device 110 can capture the switching out of the devices.


In block 1306, the evaluator device 110 can evaluate the obtained information of the physical properties to generate a condition metric value of the target device 120 by analyzing the video. The condition metric value can indicate the wear and tear of the target device 120. The condition metric value can be a numerical value indicating the condition of the target device 120 on a predetermined scale, such as a scale from 1 to 100, where 100 indicates mint condition and 1 indicates that a replacement is needed. The condition metric value can be a vector or a matrix indicating the condition of the device for various factors. For example, the condition metric value can be a vector (50, 10, 70 . . . ), where the first entry indicates the condition of the screen, the second entry indicates the condition of the battery, the third entry indicates the memory capacity of the device, etc. The condition metric value can also be a scalar value representing a weighted sum of the vector entries.
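As a minimal sketch of the weighted-sum variant, assuming illustrative component names and weights:

```typescript
// Sketch of collapsing a condition vector (screen, battery, memory, ...) into
// a single condition metric value via a weighted sum. Components and weights
// are illustrative assumptions.
interface ConditionVector {
  screen: number;    // 1-100
  battery: number;   // 1-100
  memory: number;    // 1-100
}

function conditionMetric(v: ConditionVector): number {
  const weights: Record<keyof ConditionVector, number> = {
    screen: 0.5,
    battery: 0.3,
    memory: 0.2,
  };
  const score =
    v.screen * weights.screen +
    v.battery * weights.battery +
    v.memory * weights.memory;
  return Math.round(score);   // scalar value on the same 1-100 scale
}

// Example: conditionMetric({ screen: 50, battery: 10, memory: 70 }) returns 42.
```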


To determine the wear and tear of the target device 120, the evaluator device 110 can activate its flashlight to illuminate the target device 120, prior to or while the target device 120 is positioned into the multiple predetermined positions. The flashlight can create highlights, or glints, in the cracks on the surface of the target device 120. The evaluator device 110 can detect the highlights, or glints, appearing on a surface of the target device 120.
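One simple way to approximate such glint detection on a captured frame is to count near-saturated pixels; the luminance threshold and pixel-count cutoff below are illustrative assumptions, and a production system could instead rely on a trained model.

```typescript
// Sketch of detecting flashlight glints (bright specular highlights that can
// indicate cracks) in a captured video frame.
function countGlintPixels(frame: ImageData, threshold = 245): number {
  let bright = 0;
  const px = frame.data;                       // RGBA, 4 bytes per pixel
  for (let i = 0; i < px.length; i += 4) {
    const luminance = 0.299 * px[i] + 0.587 * px[i + 1] + 0.114 * px[i + 2];
    if (luminance >= threshold) bright++;
  }
  return bright;
}

function hasLikelyCracks(frame: ImageData): boolean {
  // Treat a concentration of very bright pixels as a possible crack glint.
  return countGlintPixels(frame) > frame.width * frame.height * 0.001;
}
```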


The video analysis can be performed by a remote operator such as an AI. The AI can detect an inaccurate determination of the wear and tear of the target device 120 by detecting from the video whether a second device has replaced the target device 120. If the AI detects a switch in the devices, the AI can alert the user and/or a system administrator that a switch has been detected.


In block 1308, based on the generated condition metric value, the evaluator device 110 can determine the guaranteed price of the target device 120, as explained in this application. In block 1310, the evaluator device 110 can present to the user the guaranteed price of the target device 120.


The evaluator device 110 can determine whether the target device 120 has been evaluated previously, such as by determining whether the target device 120's unique device identifier has previously been evaluated by the system. Upon determining that the target device 120 has been evaluated previously, the evaluator device 110 can obtain, from a database, properties of the target device 120 and can populate the technical properties and the physical properties from the data contained in the database.
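A minimal sketch of this lookup, assuming a hypothetical record shape and a hypothetical /api/devices endpoint:

```typescript
// Sketch of reusing a previous evaluation: look up the target device's unique
// identifier (e.g., IMEI) and, if found, pre-populate its properties.
interface DeviceRecord {
  imei: string;
  make: string;
  model: string;
  conditionMetric?: number;
  lastEvaluated?: string;   // ISO timestamp
}

async function loadPriorEvaluation(imei: string): Promise<DeviceRecord | null> {
  const res = await fetch(`/api/devices/${encodeURIComponent(imei)}`);
  if (!res.ok) return null;   // device has not been evaluated before
  return (await res.json()) as DeviceRecord;
}
```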



FIG. 13 and the flow diagrams that follow are representative and may not show all functions or exchanges of data, but instead they provide an understanding of commands and data exchanged under the system. Those skilled in the relevant art will recognize that some functions or exchange of commands and data may be repeated, varied, omitted, or supplemented, and other (less important) aspects not shown may be readily implemented. Those skilled in the art will appreciate that the blocks shown in FIG. 13 and in each of the flow diagrams discussed below may be altered in a variety of ways. For example, while processes or blocks are presented in a given order, alternative implementations may perform routines in a different order, and some processes or blocks may be rearranged, deleted, moved, added, subdivided, combined, and/or modified to provide alternative or sub-combinations. Each of these processes or blocks may be implemented in a variety of different ways. Also, although processes or blocks are at times shown as being performed in series, these processes or blocks may instead be performed or implemented in parallel, or may be performed at different times. Some of the blocks depicted in FIG. 13 and the other flow diagrams are of a type well known in the art, and can themselves include a sequence of operations that need not be described herein. Those of ordinary skill in the art can create source code and/or microcode, program logic arrays, or otherwise implement the embodiments disclosed herein based on the flow diagrams and the Detailed Description provided herein.


In various embodiments, all or a portion of the routine 1300 and the routines in the other flow diagrams herein can be implemented by means of a consumer or another user (such as a retail employee) operating one or more of the electronic devices and systems described above. For example, in some embodiments, the routine 1300 and other routines disclosed herein can be implemented by a mobile device, such as the evaluator device 110 described above with reference to FIG. 1. For example, in some instances the app can run on one or more evaluator devices 110 and/or on one or more target devices 120. Accordingly, the description of the routine 1300 and the other routines disclosed herein may refer interchangeably to the routine, the app, the evaluator device 110, the target device 120, and/or the kiosk 1200 performing an operation, with the understanding that any of the above devices, systems, and resources can perform all or part of the operation.


While various embodiments of the present technology are described herein using mobile phones and other handheld devices as examples of electronic devices, the present technology applies generally to all types of electronic devices. For example, in some embodiments, the app can be installed and/or run on a larger evaluator device 110 and/or target device 120, e.g., a laptop or tower computer, to perform all or a portion of the routine 1300. For example, the app can inventory a laptop or desktop computer and provide the user a confirmation code that the user can print out and bring to a kiosk or to an associated retailer location or point of sale (or send in with the computer via, e.g., courier, mail, or package delivery service) as a receipt. The code can identify the target device 120 and represent the agreed price determined according to the valuation performed by the app based on the information it obtained from the target device 120 and on any user-provided supplemental information. In some embodiments, the app and/or the receipt can indicate any elements that require independent verification (e.g., undamaged screen glass) for the user to receive the agreed price for the target device 120. The user can then take the target device 120 to the retail storefront or point of sale (or, e.g., to the kiosk 1200 for viewing by a remote kiosk operator) for its condition to be independently verified, after which the user can deposit the target device 120. The user can then receive the price upon verified receipt of the target device 120, such as a retailer or the kiosk 1200 issuing cash, a credit, or a card such as a gift card.



FIG. 14 is a flow diagram of a routine 1400 for remotely evaluating a target device 120 for recycling in accordance with some embodiments of the present technology. In various embodiments, the kiosk 1200 and/or another processing device operatively connectable to the kiosk 1200, such as a remote server, can perform some or all of the routine 1400. In some embodiments, the routine 1400 can be performed in conjunction with the routine 1300 of FIG. 13 performed by the evaluator device 110, which can be remote from the kiosk 1200. For example, the kiosk 1200 and/or a remote server can provide software (e.g., the app described above) to be installed on the evaluator device 110, and then the kiosk and/or server can remotely receive information about the target device 120 via the app installed on the evaluator device 110, use the information to provide an offer price for the target device 120, and record the offer price so that the user can recycle the target device 120 for the quoted offer price when the user brings the target device 120 to the kiosk 1200.


In block 1402, the routine 1400 provides the app described above to install on the evaluator device 110. The routine 1400 can provide the app to the evaluator device 110 by various avenues: for example, from the kiosk 1200 (e.g., sideloading the app over a wired or wireless data connection); through a website (e.g., a website associated with the kiosk operator); from a software repository run by the device manufacturer or a third party (e.g., the Apple® Appstore™, Google Play™ store, Amazon® Appstore™, etc.); via a removable memory device such as an SD flashcard or USB drive; by preinstallation on the evaluator device 110 by the device manufacturer, a wireless service carrier, or a device vendor; and so on.


In block 1404, the routine 1400 receives information about the target device 120 and/or the user via the app on the evaluator device 110. The information can include, for example, a device identifier such as a serial number, IMEI number, or hardware address; a device make and/or model name, number, or code; data describing the device configuration, characteristics, and/or capabilities; owner information, such as a name, driver's license number, and/or account identifier; etc. For example, the user may download and run the app on the evaluator device 110 to obtain such information about the target device 120, and the app can store information about the target device 120, and/or transmit the information, for example, to a remote server computer. In various embodiments, the routine 1400 can access the stored or transmitted information, such as by receiving the information at the server computer.


In block 1406, the routine 1400 records one or more identifiers of the target device 120 (and/or the evaluator device 110) and/or the user. In some embodiments, the routine 1400 utilizes an identifier associated with the target device 120 that was included in the information that the routine 1400 received in block 1404. Examples of such target device 120 identifiers include the IMEI of a mobile phone, the model and/or serial numbers of a laptop computer, a unique wireless identifier of the target device 120 such as a Wi-Fi interface MAC address, a product bar code, USB vendor ID and device ID (and release number) codes, etc. The identifier can also be a derived code such as a unique hash (based on, e.g., the information received in block 1404), and/or a serially or randomly assigned code such as by generating a globally unique identifier (GUID) for the target device 120 and/or user. In some embodiments, the routine 1400 can generate an identifier after pricing the target device 120, so that the identifier reflects the pricing. User-related identifiers can include, for example, a driver's license number, account credentials such as a username and password, etc. The routine 1400 can record the identifiers in a registry database that indexes identifiers against, for example, evaluator devices 110 to which the app has been installed, target devices that the system has priced (e.g., remotely via the app), and/or target devices that the kiosk 1200 has previously evaluated, for example. The database can be, for example, one or more of the databases associated with the server computer, can be stored in the cloud storage facility, can be distributed among the kiosks 1200, and so on.


In block 1408, the routine 1400 evaluates the target device 120 based on the information. For example, the routine 1400 can compare the received target device 120 information to a database of prices, such as a lookup table, pricing model, or other data structure containing prices for various target devices on a server that can be remotely located from the evaluator device 110 and/or the target device 120 (e.g., the server computer). The routine 1400 can, for example, use the identifier to determine the make and model of the target device 120, and use the make and model of the target device 120 (along with, e.g., information about the condition of the target device 120) to determine a price to offer for the device based on the data in the database or pricing model. In some embodiments, the routine 1400 can determine an offer price that enables the user to receive the price in exchange for recycling the target device 120 at the kiosk 1200. In some embodiments, the routine 1400 can determine an offer price for the target device 120 that is contingent on an assessment of the visual condition of the target device 120 by the evaluator device 110 or by the kiosk 1200. In some embodiments, the routine 1400 can determine an offer price that includes a range of prices based on the possible outcomes of such an assessment. In some instances, the target device 120 may have no market value. In various embodiments, the pricing data is updated on a continuous or periodic basis.
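As a sketch of such a lookup-table approach, with made-up table entries and an assumed rule that scales the base price by the condition metric:

```typescript
// Sketch of block 1408: look up a base price by make and model and adjust it
// by condition. Table contents and the adjustment rule are illustrative, not
// actual pricing data.
interface PriceEntry { make: string; model: string; basePrice: number }

const priceTable: PriceEntry[] = [
  { make: "Apple", model: "iPhone 11 Pro Max", basePrice: 350 },
  { make: "Apple", model: "iPhone 5", basePrice: 10 },
];

function offerPrice(make: string, model: string, conditionMetric: number): number {
  const entry = priceTable.find((e) => e.make === make && e.model === model);
  if (!entry) return 0;   // the device may have no market value
  // Scale the base price by the 1-100 condition metric.
  return Math.round(entry.basePrice * (conditionMetric / 100));
}
```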


In block 1410, the routine 1400 sends an offer price quote for the target device 120 to the evaluator device 110. In some embodiments, the routine 1400 sends the offer price quote from the server to the app running on the evaluator device 110. In block 1412, the routine 1400 associates the quoted price with the identified target device 120 (and/or the evaluator device 110) and/or the user. For example, the routine 1400 can store information about the price quote, the target device 120, and/or the user in the database and/or in one or more data structures maintained by the app on the evaluator device 110, by the kiosk 1200, and/or by other aspects of the present technology. In some embodiments, the routine 1400 can associate the price with a unique identifier such as a hash value generated based on the user, the device identification, the app, and/or the time and amount of the price itself, etc. For example, the routine 1400 can associate a numeric or alphanumeric identifier code with the offer price for the target device 120 and give that code to the user, informing the user that the user can enter the code at the kiosk 1200 by a certain time to receive the offer price for the device. For example, the routine 1400 can display the code on the screen 115 of the evaluator device 110 and/or send the user an email or text message containing the code. The routine 1400 can store the price and the identifier in a data structure on the evaluator device 110 (e.g., in a table maintained by the app) and/or remotely from the evaluator device 110 (e.g., in a data structure maintained at one or more of the kiosk 1200, the server computer, the cloud storage facility, etc.), and it can transmit them between or among various computing and/or storage facilities. In some embodiments, the routine 1400 transmits the identifier to the server computer so that when the kiosk 1200 receives the identifier, the kiosk 1200 can look up the identifier and retrieve the associated price for the target device 120. After block 1412, the routine 1400 ends.
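A minimal sketch of deriving a short redemption code from a hash and associating it with the quoted price, using an in-memory registry as a stand-in for the database described above; the hashing scheme, code length, and expiration window are assumptions.

```typescript
// Sketch of block 1412: derive an identifier code from the device, user, and
// quote, store the quote against it, and let the kiosk retrieve it later.
interface QuoteRecord { code: string; imei: string; price: number; expires: string }

const quoteRegistry = new Map<string, QuoteRecord>();

async function recordQuote(imei: string, userId: string, price: number): Promise<string> {
  const data = new TextEncoder().encode(`${imei}|${userId}|${price}|${Date.now()}`);
  const digest = await crypto.subtle.digest("SHA-256", data);
  // Short alphanumeric code the user can enter at the kiosk.
  const code = Array.from(new Uint8Array(digest).slice(0, 4))
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("")
    .toUpperCase();

  const expires = new Date(Date.now() + 7 * 24 * 3600 * 1000).toISOString();
  quoteRegistry.set(code, { code, imei, price, expires });
  return code;
}

// At the kiosk: look up the code entered by the user to retrieve the price.
function lookupQuote(code: string): QuoteRecord | undefined {
  return quoteRegistry.get(code);
}
```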


Embodiments of the kiosk 1200 and various features thereof can be at least generally similar in structure and function to the systems, methods and corresponding features described in the following patents and patent applications, which are incorporated herein by reference in their entireties: U.S. Pat. Nos. 10,860,990, 10,853,873, 10,572,946, 10,475,002; 10,445,708; 10,438,174; 10,417,615; 10,401,411; 10,269,110; 10,127,647; 10,055,798; 9,885,672; 9,881,284; 8,200,533; 8,195,511; and 7,881,965; U.S. patent application Ser. Nos. 17/445,178; 17/445,158; 17/445,083; 17/445,082; 17/125,994; 16/794,009; 16/788,169; 16/788,153; 16/719,699; 16/794,009; 16/534,741; 15/057,707; 14/967,183; 14/966,346; 14/964,963; 14/663,331; 14/660,768; 14/598,469; 14/568,051; 14/498,763; 13/794,816; 13/794,814; 13/753,539; 13/733,984; 13/705,252; 13/693,032; 13/658,828; 13/658,825; 13/492,835; 13/113,497; U.S. Provisional Application Nos. 63/220,890; 63/220,381; 63/127,148; 63/116,020; 63/116,007; 63/088,377; 63/070,207; 63/066,794; 62/950,075; 62/807,165; 62/807,153; 62/804,714; 62/782,947; 62/782,302; 62/332,736; 62/221,510; 62/202,330; 62/169,072; 62/091,426; 62/090,855; 62/076,437; 62/073,847; 62/073,840; 62/059,132; 62/059,129; 61/607,572; 61/607,548; 61/607,001; 61/606,997; 61/595,154; 61/593,358; 61/583,232; 61/570,309; 61/551,410; 61/472,611; 61/347,635; 61/183,510; and 61/102,304. All the patents and patent applications listed in the preceding sentence and any other patents or patent applications identified herein are incorporated herein by reference in their entireties.


The above Detailed Description of the embodiments disclosed herein is not intended to be exhaustive or to limit the embodiments disclosed to the precise form disclosed above. Although specific examples for the embodiments disclosed herein are described above for illustrative purposes, various equivalent modifications are possible within the scope of the embodiments, as those skilled in the relevant art will recognize.


References throughout the foregoing description to features, advantages, or similar language do not imply that all of the features and advantages that may be realized with the present technology should be or are in any single embodiment. Rather, language referring to the features and advantages is understood to mean that a specific feature, advantage, or characteristic described in connection with an embodiment is included in at least one embodiment of the present technology. Thus, discussions of the features and advantages, and similar language, throughout this specification may, but do not necessarily, refer to the same embodiment.


Furthermore, the described features, advantages, and characteristics of the present technology may be combined in any suitable manner in one or more embodiments. One skilled in the relevant art will recognize that the present technology can be practiced without one or more of the specific features or advantages of a particular embodiment. In other instances, additional features and advantages may be recognized in certain embodiments that may not be present in all embodiments of the present technology.


Any patents and applications and other references noted above, including any that may be listed in accompanying filing papers, are incorporated herein by reference. Aspects of the embodiments disclosed herein can be modified, if necessary, to employ the systems, functions, and concepts of the various references described above to provide yet further implementations of the embodiments disclosed herein.


Unless the context clearly requires otherwise, throughout the description and the claims, the words “comprise,” “comprising,” and the like are to be construed in an inclusive sense, as opposed to an exclusive or exhaustive sense; that is to say, in the sense of “including, but not limited to.” As used herein, the terms “connected,” “coupled,” or any variant thereof means any connection or coupling, either direct or indirect, between two or more elements; the coupling or connection between the elements can be physical, logical, or a combination thereof. Additionally, the words “herein,” “above,” “below,” and words of similar import, when used in this application, refer to this application as a whole and not to any particular portions of this application. Where the context permits, words in the above Detailed Description using the singular or plural number may also include the plural or singular number respectively. The word “or,” in reference to a list of two or more items, covers all of the following interpretations of the word: any of the items in the list, all of the items in the list, and any combination of the items in the list.


The teachings of the embodiments disclosed herein can be applied to other systems, not necessarily the system described above. The elements and acts of the various examples described above can be combined to provide further implementations of the embodiments disclosed herein. Some alternative implementations of the embodiments disclosed herein may include not only additional elements to those implementations noted above, but also may include fewer elements. Further any specific numbers noted herein are only examples: alternative implementations may employ differing values or ranges.


Although the above description describes various embodiments and the best mode contemplated, regardless of how detailed the above text, the embodiments disclosed herein can be practiced in many ways. Details of the system may vary considerably in its specific implementation, while still being encompassed by the present technology. As noted above, particular terminology used when describing certain features or aspects of the embodiments disclosed herein should not be taken to imply that the terminology is being redefined herein to be restricted to any specific characteristics, features, or aspects with which that terminology is associated. In general, the terms used in the following claims should not be construed to limit the embodiments disclosed herein to the specific examples disclosed in the specification, unless the above Detailed Description section explicitly defines such terms. Accordingly, the actual scope of the embodiments disclosed herein encompasses not only the disclosed examples, but also all equivalent ways of practicing or implementing the embodiments disclosed herein under the claims.


From the foregoing, it will be appreciated that specific embodiments have been described herein for purposes of illustration, but that various modifications may be made without deviating from the spirit and scope of the various embodiments disclosed. Further, while various advantages associated with certain embodiments have been described above in the context of those embodiments, other embodiments may also exhibit such advantages, and not all embodiments need necessarily exhibit such advantages to fall within the scope of the embodiments disclosed herein. Accordingly, the embodiments are not limited, except as by the appended claims.


Although certain aspects of the embodiments disclosed herein are presented below in certain claim forms, the applicant contemplates the various aspects of the embodiments disclosed herein in any number of claim forms. Accordingly, the applicant reserves the right to pursue additional claims after filing this application to pursue such additional claim forms, in either this application or in a continuing application.

Claims
  • 1. A computer-implemented method to generate a price of a target device, the method comprising: obtaining, by a mobile handheld evaluator device, technical properties associated with the target device, the technical properties comprising a make and a model of the target device; obtaining, by the mobile handheld evaluator device, physical properties associated with the target device, the physical properties comprising information related to wear and tear of the target device, wherein obtaining the physical properties comprises: indicating, by a user interface of the handheld evaluator device, to a user that the user should position the target device in one or more predetermined positions relative to a camera of the mobile handheld evaluator device; indicating, by the user interface, to the user that the mobile handheld evaluator device records one or more images of the target device in each of the predetermined positions using a guide displayed on the mobile evaluator device to indicate how to position the target device within a camera view of the mobile evaluator device; and capturing, by the camera of the mobile handheld evaluator device, the one or more images of the target device while the target device is positioned in each of the predetermined positions; evaluating, using an application software and/or hardware of the mobile handheld device, the obtained physical properties to generate a condition metric value of the target device, to perform the evaluating remotely by the user; based on the generated condition metric value, determining the price of the target device; and indicating, by the user interface, the price of the target device to the user; wherein the condition metric value is obtained based on a trained artificial intelligence module to detect screen and/or device imperfections of the target device; the training of the trained artificial intelligence module includes training data, wherein the training data includes a positive training set of features that have been determined to have a desired property in question, and, in some embodiments, a negative training set of features that lack the property in question using a supervised and/or unsupervised machine learning.
  • 2. The computer-implemented method of claim 1, further comprising: activating a flashlight of the handheld evaluator device to illuminate the target device, prior to or while the target device is positioned at the predetermined positions; and detecting highlights appearing on a surface of the target device by analyzing the one or more captured images, the highlights indicating at least one crack on the surface of the target device.
  • 3. The computer-implemented method of claim 1, further comprising: detecting that a second device has replaced the target device by analyzing the one or more captured images; and determining that the obtained physical properties are inaccurate responsive to detecting that the second device has replaced the target device.
  • 4. The computer-implemented method of claim 3, wherein detecting that the second device has replaced the target device is performed using a trained machine learning model.
  • 5. The computer-implemented method of claim 1, further comprising: obtaining a unique identifier of the target device; determining whether the target device has been evaluated previously based on the unique identifier; and upon determining that the target device has been evaluated previously, retrieving data describing the target device from a database.
  • 6. The computer-implemented method of claim 5, wherein obtaining the technical properties and the physical properties comprises: populating the technical properties and the physical properties from the data describing the target device.
  • 7. The computer-implemented method of claim 1, further comprising: receiving, by the user interface, an acceptance of the price from the user; and upon receiving the acceptance, presenting, by the user interface, a map to the user indicating one or more locations at which the user can redeem the target device at an associated price.
  • 8. A mobile evaluator device to evaluate a target device, the mobile evaluator device comprising: one or more processors; and at least one non-transitory computer-readable medium coupled to the one or more processors, wherein the at least one non-transitory computer-readable medium stores instructions, which, when executed by the one or more processors, cause the one or more processors to: obtain technical properties associated with the target device, the technical properties comprising a make and a model of the target device; obtain physical properties associated with the target device, the physical properties comprising information related to wear and tear of the target device, wherein obtaining the physical properties comprises: indicate, by a user interface of the mobile evaluator device, to a user that the user should position the target device in one or more predetermined positions; indicate, by the user interface, to the user that the mobile evaluator device records one or more images of the target device in each of the predetermined positions using a guide displayed on the mobile evaluator device to indicate how to position the target device within a camera view of the mobile evaluator device; and record, by a camera of the mobile evaluator device, a video of the target device while the target device is positioned in the predetermined positions; evaluate, using an application software and/or hardware of the mobile device, the obtained physical properties to generate a condition metric value of the target device, to perform the evaluating remotely by the user; based on the generated condition metric value, determine the price of the target device; and indicate, by the user interface, the price of the target device to the user; wherein the condition metric value is obtained based on a trained artificial intelligence module to detect screen and/or device imperfections of the target device; the training of the trained artificial intelligence module includes training data, wherein the training data includes a positive training set of features that have been determined to have a desired property in question, and, in some embodiments, a negative training set of features that lack the property in question using a supervised and/or unsupervised machine learning.
  • 9. The mobile evaluator device of claim 8, wherein the instructions further cause the one or more processors to: activate a flashlight of the mobile evaluator device to illuminate the target device, prior to or while the target device is positioned into the predetermined positions; and detect highlights appearing on a surface of the target device by analyzing the video, the highlights indicating at least one crack on the surface of the target device.
  • 10. The evaluator device of claim 8, wherein the instructions further cause the one or more processors to: detect that a second device has replaced the target device by analyzing the video; and determine that the obtained physical properties are inaccurate responsive to detecting that the second device has replaced the target device.
  • 11. The evaluator device of claim 10, wherein the instructions cause the one or more processors to detect that the second device has replaced the target device using an artificial intelligence module.
  • 12. The evaluator device of claim 8, wherein the instructions further cause the one or more processors to: obtain a unique identifier of the target device; determine whether the target device has been evaluated previously based on the unique identifier; and upon determining that the target device has been evaluated previously, retrieve data describing the target device from a database.
  • 13. The evaluator device of claim 12, wherein the instructions to obtain the technical properties and the physical properties cause the one or more processors to: populate the technical properties and the physical properties from the data describing the target device.
  • 14. The evaluator device of claim 8, wherein the instructions further cause the one or more processors to: receive, by the user interface, an acceptance of the price from the user; and upon receiving the acceptance, present, by the user interface, a map to the user indicating one or more locations at which the user can redeem the target device at an associated price.
  • 15. A non-transitory computer-readable medium storing instructions, which, when executed by at least one computing device of a mobile handheld evaluator device, cause the at least one computing device to: obtain technical properties associated with a target device, the technical properties comprising a make and a model of the target device; obtain physical properties associated with the target device, the physical properties comprising information related to wear and tear of the target device, wherein obtaining the physical properties comprises: indicate, by a user interface of the mobile handheld evaluator device, to a user that the user should position the target device in one or more predetermined positions relative to a camera of the handheld evaluator device; indicate, by the user interface, to the user that the mobile handheld evaluator device records one or more images of the target device in each of the predetermined positions using a guide displayed on the mobile evaluator device to indicate how to position the target device within a camera view of the mobile evaluator device; and capture, by the camera of the mobile handheld evaluator device, the one or more images of the target device while the target device is positioned in each of the predetermined positions; evaluate, using an application software and/or hardware of the mobile handheld device, the obtained physical properties to generate a condition metric value of the target device, to perform the evaluating remotely by the user; based on the generated condition metric value, determine the price of the target device; and indicate, by the user interface, the price of the target device to the user; wherein the condition metric value is obtained based on a trained artificial intelligence module to detect screen and/or device imperfections of the target device; the training of the trained artificial intelligence module includes training data, wherein the training data includes a positive training set of features that have been determined to have a desired property in question, and, in some embodiments, a negative training set of features that lack the property in question using a supervised and/or unsupervised machine learning.
  • 16. The non-transitory computer-readable medium of claim 15, wherein the instructions further cause the at least one computing device to: activate a flashlight of the mobile handheld evaluator device to illuminate the target device, prior to or while the target device is positioned at the predetermined positions; and detect highlights appearing on a surface of the target device by analyzing the one or more captured images, the highlights indicating at least one crack on the surface of the target device.
  • 17. The non-transitory computer-readable medium of claim 15, wherein the instructions further cause the at least one computing device to: detect that a second device has replaced the target device by analyzing the one or more captured images; and determine that the obtained physical properties are inaccurate responsive to detecting that the second device has replaced the target device.
  • 18. The non-transitory computer-readable medium of claim 17, wherein the instructions cause the at least one computing device to detect that the second device has replaced the target device using a machine learning model.
  • 19. The non-transitory computer-readable medium of claim 15, wherein the instructions further cause the at least one computing device to: obtain a unique identifier of the target device; determine whether the target device has been evaluated previously based on the unique identifier; and upon determining that the target device has been evaluated previously, retrieve data describing the target device from a database.
  • 20. The non-transitory computer-readable medium of claim 19, wherein the instructions to obtain the technical properties and the physical properties cause the at least one computing device to: populate the technical properties and the physical properties from the data describing the target device.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims the benefit of U.S. Provisional Patent Application No. 63/070,207, filed Aug. 25, 2020, which is incorporated by reference in its entirety herein.

US Referenced Citations (638)
Number Name Date Kind
1327315 Davies Jan 1920 A
1730015 Rooke Oct 1929 A
3808439 Renius Apr 1974 A
4248334 Hanley et al. Feb 1981 A
4519522 McElwee May 1985 A
4593820 Antonie Jun 1986 A
4715709 Sekine et al. Dec 1987 A
4821118 Lafreniere Apr 1989 A
4845636 Walker Jul 1989 A
4870357 Young et al. Sep 1989 A
4878736 Hekker et al. Nov 1989 A
4893789 Novorsky Jan 1990 A
4927051 Falk et al. May 1990 A
4951308 Bishop et al. Aug 1990 A
5025344 Maly et al. Jun 1991 A
5027074 Haferstat Jun 1991 A
5077462 Newell et al. Dec 1991 A
5091773 Fouche et al. Feb 1992 A
5105149 Tokura Apr 1992 A
5159560 Newell et al. Oct 1992 A
5216502 Katz Jun 1993 A
5280170 Baldwin Jan 1994 A
5319459 Mochizuki et al. Jun 1994 A
5339096 Beaufort et al. Aug 1994 A
5413454 Movesian May 1995 A
5419438 Squyres et al. May 1995 A
5436554 Decker Jul 1995 A
5482140 Moore Jan 1996 A
5570920 Crisman et al. Nov 1996 A
5572444 Lentz et al. Nov 1996 A
5610710 Canfield et al. Mar 1997 A
5717780 Mitsumune et al. Feb 1998 A
5747784 Walter et al. May 1998 A
5748084 Isikoff May 1998 A
5775806 Allred Jul 1998 A
5839058 Phillips et al. Nov 1998 A
5871371 Rothenberger et al. Feb 1999 A
5920338 Katz Jul 1999 A
5937396 Konya Aug 1999 A
5949901 Nichani et al. Sep 1999 A
5965858 Suzuki et al. Oct 1999 A
5966654 Croughwell et al. Oct 1999 A
5987159 Nichani Nov 1999 A
5988431 Roe Nov 1999 A
6029851 Jenkins et al. Feb 2000 A
6041229 Turner Mar 2000 A
6055512 Dean et al. Apr 2000 A
6100986 Rydningen Aug 2000 A
6170702 Zettler et al. Jan 2001 B1
6181805 Koike et al. Jan 2001 B1
6228008 Pollington et al. May 2001 B1
6234812 Ivers et al. May 2001 B1
6259827 Nichani Jul 2001 B1
6264104 Jenkins et al. Jul 2001 B1
6323782 Stephens et al. Nov 2001 B1
6330354 Companion et al. Dec 2001 B1
6330958 Ruskin et al. Dec 2001 B1
6393095 Robinson May 2002 B1
6462644 Howell et al. Oct 2002 B1
6529837 Kang Mar 2003 B1
6535637 Wootton et al. Mar 2003 B1
6573886 Lehtinen et al. Jun 2003 B1
6587581 Matsuyama et al. Jul 2003 B1
6595684 Casagrande et al. Jul 2003 B1
6597552 Griepentrog et al. Jul 2003 B1
6633377 Weiss et al. Oct 2003 B1
6667800 Larsson et al. Dec 2003 B1
6687679 Van Luchene Feb 2004 B1
6748296 Banerjee et al. Jun 2004 B2
6754637 Stenz Jun 2004 B1
6758370 Cooke et al. Jul 2004 B2
6798528 Hartman Sep 2004 B1
6822422 Sagawa Nov 2004 B2
6842596 Morii et al. Jan 2005 B2
6854656 Matsumori Feb 2005 B2
6947941 Koon Sep 2005 B1
D512964 Kissinger et al. Dec 2005 S
7062454 Giannini et al. Jun 2006 B1
7066767 Liao Jun 2006 B2
7069236 Tsunenari Jun 2006 B1
7076449 Tsunenari et al. Jul 2006 B2
7086592 Wagner et al. Aug 2006 B2
7178720 Strubbe et al. Feb 2007 B1
7234609 DeLazzer et al. Jun 2007 B2
7251458 O'Connell Jul 2007 B2
7268345 Schultz Sep 2007 B2
7334729 Brewington Feb 2008 B2
7343319 Jen Mar 2008 B1
7407392 Cooke et al. Aug 2008 B1
7408674 Moro et al. Aug 2008 B2
7431158 Yamada et al. Oct 2008 B2
7455226 Hammond et al. Nov 2008 B1
7520666 Pevzner et al. Apr 2009 B2
7529687 Phan May 2009 B1
7567344 LeBlanc et al. Jul 2009 B2
7642687 Kageyama et al. Jan 2010 B2
7646193 Suzuki et al. Jan 2010 B2
7649450 Campion et al. Jan 2010 B2
7702108 Amon et al. Apr 2010 B2
7735125 Alvarez et al. Jun 2010 B1
7761331 Low et al. Jul 2010 B2
7783379 Beane et al. Aug 2010 B2
7848833 Li Dec 2010 B2
7881965 Bowles et al. Feb 2011 B2
7890373 Junger Feb 2011 B2
D640199 Wilson Jun 2011 S
8010402 Sharma et al. Aug 2011 B1
8019588 Wohlberg et al. Sep 2011 B1
8025229 Hammond et al. Sep 2011 B2
8031930 Wang et al. Oct 2011 B2
8107243 Guccione et al. Jan 2012 B2
8112325 Foy et al. Feb 2012 B2
8142199 Almouli Mar 2012 B1
8156008 Bae et al. Apr 2012 B2
8195511 Bowles et al. Jun 2012 B2
8200533 Librizzi et al. Jun 2012 B2
8200736 Shi Jun 2012 B2
8215546 Lin et al. Jul 2012 B2
8239262 Bowles et al. Aug 2012 B2
8254883 Uchida Aug 2012 B2
8266008 Siegel et al. Sep 2012 B1
8340815 Peters et al. Dec 2012 B2
8369987 Claessen Feb 2013 B2
8401914 Kim Mar 2013 B1
8417234 Sanding et al. Apr 2013 B2
8423404 Bowles et al. Apr 2013 B2
8429021 Kraft et al. Apr 2013 B2
8463646 Bowles Jun 2013 B2
8536472 Wu et al. Sep 2013 B2
8543358 Trabona Sep 2013 B2
8566183 Bonar et al. Oct 2013 B1
8606633 Tarbert et al. Dec 2013 B2
8718717 Vaknin et al. May 2014 B2
8755783 Brahami et al. Jun 2014 B2
8806280 Stephenson Aug 2014 B2
8823794 Suzuki et al. Sep 2014 B2
8824136 Interian et al. Sep 2014 B1
8743215 Lee Nov 2014 B1
8922643 Ji et al. Dec 2014 B2
9010627 Prasad et al. Apr 2015 B1
9043026 Lien et al. May 2015 B2
9075781 Matthews Jul 2015 B2
9124056 Lewis, Jr. Sep 2015 B1
9189911 Kavli et al. Nov 2015 B2
9195979 Geller Nov 2015 B2
9256863 Chayon et al. Feb 2016 B2
9283672 Matthews Mar 2016 B1
9317989 Grow et al. Apr 2016 B2
9355515 Brahami et al. May 2016 B2
9367436 Matthews Jun 2016 B2
9367982 Chayun et al. Jun 2016 B2
9378606 Chayun et al. Jun 2016 B2
9390442 Lyle Jul 2016 B2
9469037 Matthews Oct 2016 B2
9497563 Hornung et al. Nov 2016 B2
9578133 Matthews Feb 2017 B2
9582101 Chang et al. Feb 2017 B2
9595238 Won Mar 2017 B2
9621947 Oztaskent Apr 2017 B1
9641997 Vratskides May 2017 B2
9668298 Pearl et al. May 2017 B1
9697548 Jaff et al. Jul 2017 B1
9704142 Ahn Jul 2017 B2
9718196 Matthews Aug 2017 B2
9792597 Abbott Oct 2017 B1
9818160 Bowles et al. Nov 2017 B2
9858178 Matthews Jan 2018 B2
9866664 Sinha et al. Jan 2018 B2
9881284 Bowles et al. Jan 2018 B2
9885672 Forutanpour et al. Feb 2018 B2
9904911 Bowles et al. Feb 2018 B2
9911102 Bowles Mar 2018 B2
9934644 Chayun et al. Apr 2018 B2
9936331 Matthews Apr 2018 B2
9972046 Ackerman May 2018 B2
10032140 Bowles et al. Jul 2018 B2
10043339 Walker et al. Aug 2018 B2
10044843 Sinha et al. Aug 2018 B2
10055798 Bowles et al. Aug 2018 B2
10127647 Forutanpour et al. Nov 2018 B2
10157379 Singh Dec 2018 B2
10157427 Bowles et al. Dec 2018 B2
10261611 Matthews Apr 2019 B2
10264426 Matthews Apr 2019 B2
10269110 Forutanpour et al. Apr 2019 B2
10275813 Fu Apr 2019 B2
10325440 Abdelmalak et al. Jun 2019 B2
10339509 Bordeleau et al. Jul 2019 B2
10401411 Snook et al. Sep 2019 B2
10417615 Bowles et al. Sep 2019 B2
10438174 Bowles et al. Oct 2019 B2
10445708 Hunt et al. Oct 2019 B2
10452527 Matthews Oct 2019 B2
10475002 Silva et al. Nov 2019 B2
10496963 Silva et al. Dec 2019 B2
10528992 Yost Jan 2020 B2
10529008 Pritchard Jan 2020 B1
10565629 Hartman Feb 2020 B2
10572946 Bowles et al. Feb 2020 B2
10671367 Matthews Jun 2020 B2
10679279 Ward Jun 2020 B2
10740891 Chen et al. Aug 2020 B1
10803527 Zankat et al. Oct 2020 B1
10810732 Dwivedi et al. Oct 2020 B2
10824942 Bhotika et al. Nov 2020 B1
10825082 Librizzi et al. Nov 2020 B2
10834555 Matthews Nov 2020 B2
10846672 Dion et al. Nov 2020 B2
10853873 Bowles et al. Dec 2020 B2
10860122 Matthews Dec 2020 B2
10860990 Bowles et al. Dec 2020 B2
10909673 Forutanpour et al. Feb 2021 B2
10970786 Matheson et al. Apr 2021 B1
10977700 Bordeleau et al. Apr 2021 B2
11010841 Bowles et al. May 2021 B2
11024111 Abdelmalak et al. Jun 2021 B2
11080662 Bowles et al. Aug 2021 B2
11080672 Bowles Aug 2021 B2
11107046 Bowles Aug 2021 B2
11122034 Cicchitto Sep 2021 B2
11126973 Silva et al. Sep 2021 B2
11164000 Lee et al. Nov 2021 B2
11232412 Hunt et al. Jan 2022 B2
11288789 Chen et al. Mar 2022 B1
11302038 Muendel et al. Apr 2022 B2
11315093 Bowles Apr 2022 B2
11321768 Beauchamp May 2022 B2
11341471 Dion et al. May 2022 B2
11379886 Fields et al. Jul 2022 B1
11386740 Shah Jul 2022 B2
11417068 Burris et al. Aug 2022 B1
11436570 Bowles et al. Sep 2022 B2
11443289 Bowles et al. Sep 2022 B2
11462868 Forutanpour et al. Oct 2022 B2
11482067 Forutanpour et al. Oct 2022 B2
11526932 Bowles et al. Dec 2022 B2
11631096 Schubert et al. Apr 2023 B2
11657631 Sagnoas May 2023 B2
11843206 Forutanpour et al. Dec 2023 B2
11907915 Bowles et al. Feb 2024 B2
12033454 Forutanpour et al. Jul 2024 B2
12045973 Johnson et al. Jul 2024 B2
20010025883 Ichihara et al. Oct 2001 A1
20010035425 Rocco et al. Nov 2001 A1
20010039531 Aoki Nov 2001 A1
20020014577 Ulrich et al. Feb 2002 A1
20020035515 Moreno Mar 2002 A1
20020067184 Smith et al. Jun 2002 A1
20020087413 Mahaffy et al. Jul 2002 A1
20020112177 Voltmer Aug 2002 A1
20020129170 Moore et al. Sep 2002 A1
20020147656 Tam Oct 2002 A1
20020157033 Cox Oct 2002 A1
20020162966 Yoder Nov 2002 A1
20020186878 Hoon et al. Dec 2002 A1
20030006277 Maskatiya et al. Jan 2003 A1
20030018897 Bellis, Jr. et al. Jan 2003 A1
20030025476 Trela Feb 2003 A1
20030036866 Nair et al. Feb 2003 A1
20030061150 Kocher et al. Mar 2003 A1
20030100707 Hwang et al. May 2003 A1
20030146898 Kawasaki et al. Aug 2003 A1
20030158789 Miura et al. Aug 2003 A1
20030170529 Sagawa Sep 2003 A1
20030179371 Rangarajan et al. Sep 2003 A1
20030191675 Murashita Oct 2003 A1
20030197782 Ashe Oct 2003 A1
20030204289 Banerjee et al. Oct 2003 A1
20040012825 Tesavis Jan 2004 A1
20040039639 Walker Feb 2004 A1
20040088231 Davis May 2004 A1
20040114153 Andersen et al. Jun 2004 A1
20040141320 Bock et al. Jul 2004 A1
20040150815 Sones et al. Aug 2004 A1
20040156557 Van Der Weij Aug 2004 A1
20040156667 Van Der Weij et al. Aug 2004 A1
20040184651 Nordbryhn Sep 2004 A1
20040186744 Lux Sep 2004 A1
20040189812 Gustavsson Sep 2004 A1
20040200902 Ishioroshi Oct 2004 A1
20040205015 DeLaCruz Oct 2004 A1
20040235513 O'Connell Nov 2004 A1
20040242216 Boutsikakis Dec 2004 A1
20040243478 Walker et al. Dec 2004 A1
20040262521 Devitt et al. Dec 2004 A1
20050027622 Walker et al. Feb 2005 A1
20050043897 Meyer Feb 2005 A1
20050109841 Ryan et al. May 2005 A1
20050128551 Yang Jun 2005 A1
20050135917 Kauppila et al. Jun 2005 A1
20050137942 LaFluer Jun 2005 A1
20050139661 Eglen et al. Jun 2005 A1
20050143149 Becker et al. Jun 2005 A1
20050167620 Cho et al. Aug 2005 A1
20050187657 Hashimoto et al. Aug 2005 A1
20050216120 Rosenberg et al. Sep 2005 A1
20050222690 Wang et al. Oct 2005 A1
20050231595 Wang et al. Oct 2005 A1
20050240958 Nguyen et al. Oct 2005 A1
20060167580 Whittier Jan 2006 A1
20060022827 Highham Feb 2006 A1
20060038114 Cofer et al. Feb 2006 A9
20060047573 Mitchell et al. Mar 2006 A1
20060074756 Boykin Apr 2006 A1
20060085158 Cakiner Apr 2006 A1
20060184379 Tan et al. Aug 2006 A1
20060195384 Bauer et al. Aug 2006 A1
20060219776 Finn Oct 2006 A1
20060229108 Cehelnik Oct 2006 A1
20060235747 Hammond et al. Oct 2006 A1
20060217152 Fok et al. Nov 2006 A1
20060258008 Holler et al. Nov 2006 A1
20060261931 Cheng et al. Nov 2006 A1
20060271431 Wehr et al. Nov 2006 A1
20060279307 Wang et al. Dec 2006 A1
20060280356 Yamagashi Dec 2006 A1
20060287929 Bae et al. Dec 2006 A1
20070012665 Nelson Jan 2007 A1
20070013124 Graef et al. Jan 2007 A1
20070013139 Kumagai Jan 2007 A1
20070032098 Bowles et al. Feb 2007 A1
20070050083 Signorelli Mar 2007 A1
20070057815 Foy et al. Mar 2007 A1
20070129906 Stoecker et al. Jun 2007 A1
20070133844 Waehner et al. Jun 2007 A1
20070150403 Mock et al. Jun 2007 A1
20070140310 Rolton et al. Jul 2007 A1
20070205751 Suzuki et al. Sep 2007 A1
20070258085 Robbins Nov 2007 A1
20070263099 Motta et al. Nov 2007 A1
20070269099 Nishino et al. Nov 2007 A1
20070276911 Bhumkar Nov 2007 A1
20070281734 Mizrachi Dec 2007 A1
20070282999 Tu et al. Dec 2007 A1
20080004828 Mizrachi Jan 2008 A1
20080027581 Saether et al. Jan 2008 A1
20080033596 Fausak et al. Feb 2008 A1
20080109746 Mayer May 2008 A1
20080111989 Dufour et al. May 2008 A1
20080133432 Ramseyer Jun 2008 A1
20080149720 Colville Jun 2008 A1
20080167578 Bryer et al. Jul 2008 A1
20080177598 Davie Jul 2008 A1
20080207198 Juric Aug 2008 A1
20080228582 Fordyce Sep 2008 A1
20080231113 Guccione et al. Sep 2008 A1
20080255901 Carroll et al. Oct 2008 A1
20080256008 Kwok Oct 2008 A1
20080260235 Cai et al. Oct 2008 A1
20080277467 Carlson Nov 2008 A1
20080281691 Pearson et al. Nov 2008 A1
20080296374 Gonen et al. Dec 2008 A1
20080303915 Omi Dec 2008 A1
20080306701 Zhong et al. Dec 2008 A1
20090051907 Li et al. Feb 2009 A1
20090079388 Reddy Feb 2009 A1
20090078775 Giebel et al. Mar 2009 A1
20090095047 Patel et al. Apr 2009 A1
20090114716 Ramachandran May 2009 A1
20090132813 Schibuk May 2009 A1
20090145727 Johns Jun 2009 A1
20090156199 Steenstra et al. Jun 2009 A1
20090160668 Crowley et al. Jun 2009 A1
20090177319 Garibaldi et al. Jul 2009 A1
20090184865 Valo et al. Jul 2009 A1
20090187491 Bull et al. Jul 2009 A1
20090190142 Taylor et al. Jul 2009 A1
20090207743 Huq et al. Aug 2009 A1
20090244285 Chathukutty Oct 2009 A1
20090247133 Holmen et al. Oct 2009 A1
20090248883 Suryanarayana et al. Oct 2009 A1
20090251815 Wang et al. Oct 2009 A1
20090262341 Konopa et al. Oct 2009 A1
20090265035 Jenkinson et al. Oct 2009 A1
20090299543 Cox et al. Dec 2009 A1
20090312009 Fishel Dec 2009 A1
20090321511 Browne Dec 2009 A1
20090322706 Austin Dec 2009 A1
20100005004 Hudak et al. Jan 2010 A1
20100051695 Yepez et al. Mar 2010 A1
20100063894 Lundy Mar 2010 A1
20100110174 Leconte May 2010 A1
20100115887 Schroeder et al. May 2010 A1
20100147953 Barkan Jun 2010 A1
20100157280 Kusevic et al. Jun 2010 A1
20100161397 Gauthier et al. Jun 2010 A1
20100162359 Casey et al. Jun 2010 A1
20100174596 Gilman Jul 2010 A1
20100185506 Wolff Jul 2010 A1
20100219234 Forbes Sep 2010 A1
20100235198 Fini et al. Sep 2010 A1
20100237854 Kumhyr et al. Sep 2010 A1
20100260271 Kapoor Oct 2010 A1
20100262481 Baker et al. Oct 2010 A1
20100312639 Mastronardi Dec 2010 A1
20110035322 Lively Feb 2011 A1
20110043628 Yun Feb 2011 A1
20110047022 Walker Feb 2011 A1
20110055322 Gregersen Mar 2011 A1
20110060641 Grossman et al. Mar 2011 A1
20110066514 Maraz Mar 2011 A1
20110067520 Ihrke et al. Mar 2011 A1
20110082734 Zhang et al. Apr 2011 A1
20110113479 Ganem May 2011 A1
20110173576 Murphy et al. Jul 2011 A1
20110191861 Spears Aug 2011 A1
20110296339 Kang Dec 2011 A1
20110296508 Os et al. Dec 2011 A1
20110313840 Mason et al. Dec 2011 A1
20120004761 Madruga Jan 2012 A1
20120016518 Saario et al. Jan 2012 A1
20120022965 Seergy Jan 2012 A1
20120026582 Okabe et al. Feb 2012 A1
20120029985 Wilson et al. Feb 2012 A1
20120030097 Hagan et al. Feb 2012 A1
20120030399 Ben-Harosh Feb 2012 A1
20120054113 Jayaraman et al. Mar 2012 A1
20120063501 Aguren Mar 2012 A1
20120078413 Baker Mar 2012 A1
20120095875 Guthrie Apr 2012 A1
20120116928 Gventer May 2012 A1
20120116929 Gventer May 2012 A1
20120117001 Gventer et al. May 2012 A1
20120127307 Hassenzahl May 2012 A1
20120146956 Jenkinson Jun 2012 A1
20120209783 Smith et al. Aug 2012 A1
20120235812 De Mello et al. Sep 2012 A1
20120254046 Librizzi et al. Oct 2012 A1
20120280934 Ha Nov 2012 A1
20120301009 Dabic Nov 2012 A1
20120303431 Phillips et al. Nov 2012 A1
20130006713 Haake Jan 2013 A1
20130034305 Jahanshahi et al. Feb 2013 A1
20130041508 Hu et al. Feb 2013 A1
20130046611 Bowles et al. Feb 2013 A1
20130046699 Bowles et al. Feb 2013 A1
20130112440 Alsaif et al. May 2013 A1
20130124426 Bowles et al. May 2013 A1
20130126741 Srivastava et al. May 2013 A1
20130137376 Fitzgerald et al. May 2013 A1
20130144797 Bowles et al. Jun 2013 A1
20130155061 Jahanshahi et al. Jun 2013 A1
20130159119 Henderson et al. Jun 2013 A1
20130169413 Schuessler Jul 2013 A1
20130173430 Benjamin Jul 2013 A1
20130173434 Hartman Jul 2013 A1
20130181935 McKenzie et al. Jul 2013 A1
20130198089 Bowles Aug 2013 A1
20130198144 Bowles Aug 2013 A1
20130200912 Panagas Aug 2013 A1
20130246211 Sullivan Sep 2013 A1
20130246212 Sullivan Sep 2013 A1
20130253700 Carson et al. Sep 2013 A1
20130284805 Kraft et al. Oct 2013 A1
20130290146 West et al. Oct 2013 A1
20130297388 Kyle, Jr. et al. Nov 2013 A1
20140006451 Mullis et al. Jan 2014 A1
20140012643 Behrisch Jan 2014 A1
20140028449 Sigal et al. Jan 2014 A1
20140038556 DeSousa Feb 2014 A1
20140046748 Nagarajan Feb 2014 A1
20140046845 Dogin et al. Feb 2014 A1
20140052329 Amirpour Feb 2014 A1
20140067710 Gventer et al. Mar 2014 A1
20140080550 Ino et al. Mar 2014 A1
20140143161 Ahn May 2014 A1
20140147004 Uchida May 2014 A1
20140149201 Abbott May 2014 A1
20140150100 Gupta et al. May 2014 A1
20140156883 Bowles Jun 2014 A1
20140178029 Raheman et al. Jun 2014 A1
20140214505 Shuster-Arechiga et al. Jul 2014 A1
20140235258 Wang et al. Aug 2014 A1
20140244315 Cahill et al. Aug 2014 A1
20140267691 Humphrey Sep 2014 A1
20140278244 Humphrey et al. Sep 2014 A1
20140297368 Ferder Oct 2014 A1
20140330685 Nazzari Nov 2014 A1
20140347473 Wolff et al. Nov 2014 A1
20150006281 Takahashi Jan 2015 A1
20150046343 Martini Feb 2015 A1
20150066677 Bowles et al. Mar 2015 A1
20150088698 Ackerman Mar 2015 A1
20150088731 Ackerman Mar 2015 A1
20150120485 Nash Apr 2015 A1
20150161714 Fainshtein Jun 2015 A1
20150170237 Powell Jun 2015 A1
20150177330 Morris Jun 2015 A1
20150193797 Gerrity Jul 2015 A1
20150206200 Edmondson et al. Jul 2015 A1
20150278529 Cho et al. Oct 2015 A1
20150293860 Bowles Oct 2015 A9
20150294278 Nguyen Oct 2015 A1
20150309912 Nguyen et al. Oct 2015 A1
20150317619 Curtis Nov 2015 A1
20150324761 Nguyen et al. Nov 2015 A1
20150324870 Nguyen et al. Nov 2015 A1
20150332206 Trew et al. Nov 2015 A1
20150356637 Graffia et al. Dec 2015 A1
20160019607 Burmester et al. Jan 2016 A1
20160019685 Nguyen et al. Jan 2016 A1
20160055392 Nakano Feb 2016 A1
20160078434 Huxham et al. Mar 2016 A1
20160087381 Wong et al. Mar 2016 A1
20160092849 Cirannek et al. Mar 2016 A1
20160098689 Bowles Apr 2016 A1
20160125612 Seki et al. May 2016 A1
20160171544 Heminger et al. Jun 2016 A1
20160171575 Bowles et al. Jun 2016 A1
20160184990 Song et al. Jun 2016 A1
20160210648 Cirannek et al. Jul 2016 A1
20160269401 Saito et al. Sep 2016 A1
20160269895 Soini et al. Sep 2016 A1
20160275460 Ploetner et al. Sep 2016 A1
20160275518 Bowles et al. Sep 2016 A1
20160292710 Casselle Oct 2016 A1
20160301786 Koltsov et al. Oct 2016 A1
20160328684 Bowles et al. Nov 2016 A1
20160379287 Dabiri Dec 2016 A1
20170083886 Silva et al. Mar 2017 A1
20170091823 Adinarayan et al. Mar 2017 A1
20170110902 Miller Apr 2017 A1
20170115235 Ohlsson et al. Apr 2017 A1
20170169401 Beane et al. Jun 2017 A1
20170221110 Sullivan et al. Aug 2017 A1
20170256051 Dwivedi et al. Sep 2017 A1
20170278191 Tassone et al. Sep 2017 A1
20170301010 Bowles et al. Oct 2017 A1
20170323279 Dion et al. Nov 2017 A1
20170343481 Jahanshahi et al. Nov 2017 A1
20180084094 Sinha et al. Mar 2018 A1
20180101810 Feng et al. Apr 2018 A1
20180157246 Huang et al. Jun 2018 A1
20180157820 Adams et al. Jun 2018 A1
20180160269 Baarman et al. Jun 2018 A1
20180165655 Marcelle et al. Jun 2018 A1
20180240144 Curtis Aug 2018 A1
20180255047 Cicchitto Sep 2018 A1
20180293566 Engles et al. Oct 2018 A1
20180293664 Zhang et al. Oct 2018 A1
20180300776 Yost Oct 2018 A1
20180321163 Casadio Nov 2018 A1
20180322623 Memo et al. Nov 2018 A1
20180342050 Fitzgerald et al. Nov 2018 A1
20180350163 Pofale et al. Dec 2018 A1
20190017863 Saltzman Jan 2019 A1
20190019147 McCarty et al. Jan 2019 A1
20190051090 Goldberg et al. Feb 2019 A1
20190066075 Lobo et al. Feb 2019 A1
20190066439 Pinkus Feb 2019 A1
20190073566 Brauer Mar 2019 A1
20190073568 He et al. Mar 2019 A1
20190102874 Goja Apr 2019 A1
20190156611 Redhead May 2019 A1
20190166278 Hiyama et al. May 2019 A1
20190222748 Weir et al. Jul 2019 A1
20190272628 Tsou Sep 2019 A1
20190279431 Wurmfeld et al. Sep 2019 A1
20190318465 Nguyen Oct 2019 A1
20190372827 Vasseur et al. Dec 2019 A1
20200020097 Do et al. Jan 2020 A1
20200042795 Lee et al. Feb 2020 A1
20200042969 Ray Feb 2020 A1
20200066067 Herman et al. Feb 2020 A1
20200090137 Bowles et al. Mar 2020 A1
20200104720 Boa et al. Apr 2020 A1
20200104868 Schubert et al. Apr 2020 A1
20200175481 Pham Jun 2020 A1
20200175669 Bian et al. Jun 2020 A1
20200202319 Forutanpour et al. Jun 2020 A1
20200202405 Glickman et al. Jun 2020 A1
20200202419 Beauchamp Jun 2020 A1
20200241891 Li et al. Jul 2020 A1
20200265487 Forutanpour et al. Aug 2020 A1
20200342442 Curtis Oct 2020 A1
20200393742 Dion et al. Dec 2020 A1
20200410793 Folco Dec 2020 A1
20210012315 Priebatsch Jan 2021 A1
20210081698 Lindeman et al. Mar 2021 A1
20210081914 Nelms et al. Mar 2021 A1
20210110366 Dion et al. Apr 2021 A1
20210110440 Dion et al. Apr 2021 A1
20210150773 Muendel et al. May 2021 A1
20210174312 Bowles et al. Jun 2021 A1
20210192484 Forutanpour et al. Jun 2021 A1
20210209512 Gaddam et al. Jul 2021 A1
20210209746 Johnson et al. Jul 2021 A1
20210217076 Kruper et al. Jul 2021 A1
20210224867 Bordeleau et al. Jul 2021 A1
20210254966 Hur et al. Aug 2021 A1
20210255240 McGrath Aug 2021 A1
20210264483 Hirata Aug 2021 A1
20210272208 Leise et al. Sep 2021 A1
20210278338 Jung Sep 2021 A1
20210295494 Forutanpour et al. Sep 2021 A1
20210327203 Shah Oct 2021 A1
20210343030 Sagnoas Nov 2021 A1
20210357545 Sugawara et al. Nov 2021 A1
20220027879 Bowles et al. Jan 2022 A1
20220050897 Gaddam et al. Feb 2022 A1
20220051212 Forutanpour et al. Feb 2022 A1
20220051300 Forutanpour et al. Feb 2022 A1
20220051301 Forutanpour et al. Feb 2022 A1
20220051507 Forutanpour et al. Feb 2022 A1
20220068076 Forutanpour et al. Mar 2022 A1
20220114854 Forutanpour et al. Apr 2022 A1
20220164833 Dion et al. May 2022 A1
20220172178 Forutanpour et al. Jun 2022 A1
20220187802 Wittenberg et al. Jun 2022 A1
20220198407 Beane et al. Jun 2022 A1
20220262189 Dion Aug 2022 A1
20220277281 Dion et al. Sep 2022 A1
20220284406 Hunt et al. Sep 2022 A1
20220292464 Silva et al. Sep 2022 A1
20220318774 Bowles Oct 2022 A1
20230007937 Forutanpour et al. Jan 2023 A1
20230077844 Bowles et al. Mar 2023 A1
20230100849 Bowles et al. Mar 2023 A1
20230188998 Zellner et al. Jun 2023 A1
20230196865 Forutanpour et al. Jun 2023 A1
20230238751 Forutanpour et al. Jul 2023 A1
20230264871 Williams et al. Aug 2023 A1
20230274346 Bowles et al. Aug 2023 A1
20230297973 Bowles et al. Sep 2023 A1
20230297974 Bowles et al. Sep 2023 A1
20230306384 Bowles et al. Sep 2023 A1
20230371729 Williams et al. Nov 2023 A1
20240005289 Silva et al. Jan 2024 A1
20240087276 Silva et al. Mar 2024 A1
20240144461 Forutanpour et al. May 2024 A1
20240185317 Forutanpour et al. Jun 2024 A1
20240249251 Bowles Jul 2024 A1
20240249321 Forutanpour et al. Jul 2024 A1
20240265364 Forutanpour et al. Aug 2024 A1
20240265470 Bowles et al. Aug 2024 A1
20240289753 Bowles Aug 2024 A1
20240321033 Forutanpour et al. Sep 2024 A1
20240346463 Hunt et al. Oct 2024 A1
Foreign Referenced Citations (442)
Number Date Country
2760863 Nov 2010 CA
2818533 May 2012 CA
2866147 Sep 2013 CA
3069888 Jan 2019 CA
3069890 Jan 2019 CA
1365479 Aug 2002 CN
1574437 Feb 2005 CN
2708415 Jul 2005 CN
1864088 Nov 2006 CN
1957320 May 2007 CN
2912132 Jun 2007 CN
200965706 Oct 2007 CN
101176124 May 2008 CN
101379488 Mar 2009 CN
201956656 Aug 2011 CN
102315630 Jan 2012 CN
102467728 May 2012 CN
202351953 Jul 2012 CN
202353475 Jul 2012 CN
102654927 Aug 2012 CN
202394296 Aug 2012 CN
102682597 Sep 2012 CN
202564711 Nov 2012 CN
202585951 Dec 2012 CN
202702438 Jan 2013 CN
202711369 Jan 2013 CN
102930642 Feb 2013 CN
102976004 Mar 2013 CN
103198562 Jul 2013 CN
103226870 Jul 2013 CN
203242065 Oct 2013 CN
103440607 Dec 2013 CN
103514641 Jan 2014 CN
103544772 Jan 2014 CN
203408902 Jan 2014 CN
103662541 Mar 2014 CN
103679147 Mar 2014 CN
203520502 Apr 2014 CN
103824387 May 2014 CN
203588366 May 2014 CN
103843040 Jun 2014 CN
103954626 Jul 2014 CN
302944037 Sep 2014 CN
302944252 Sep 2014 CN
302944253 Sep 2014 CN
303042750 Dec 2014 CN
205129815 Apr 2016 CN
205132514 Apr 2016 CN
205140067 Apr 2016 CN
106022379 Oct 2016 CN
303896361 Oct 2016 CN
106203643 Dec 2016 CN
106293734 Jan 2017 CN
106372638 Feb 2017 CN
304051346 Feb 2017 CN
304139831 May 2017 CN
304169301 Jun 2017 CN
206440635 Aug 2017 CN
107220640 Sep 2017 CN
206466691 Sep 2017 CN
107514978 Dec 2017 CN
206861374 Jan 2018 CN
207037788 Feb 2018 CN
105444678 Mar 2018 CN
304702339 Jun 2018 CN
304702340 Jun 2018 CN
304747709 Jul 2018 CN
304795309 Aug 2018 CN
108596658 Sep 2018 CN
207854959 Sep 2018 CN
108647588 Oct 2018 CN
207993120 Oct 2018 CN
207993121 Oct 2018 CN
207995226 Oct 2018 CN
304842785 Oct 2018 CN
108764236 Nov 2018 CN
208086545 Nov 2018 CN
208172834 Nov 2018 CN
208176564 Dec 2018 CN
304958348 Dec 2018 CN
305014434 Jan 2019 CN
305014435 Jan 2019 CN
109831575 May 2019 CN
208819255 May 2019 CN
208819289 May 2019 CN
208819290 May 2019 CN
208969761 Jun 2019 CN
305275610 Jul 2019 CN
110333876 Oct 2019 CN
110347341 Oct 2019 CN
110595361 Dec 2019 CN
110653162 Jan 2020 CN
110675399 Jan 2020 CN
110751002 Feb 2020 CN
110788015 Feb 2020 CN
110796646 Feb 2020 CN
110796647 Feb 2020 CN
110796669 Feb 2020 CN
110827244 Feb 2020 CN
110827245 Feb 2020 CN
110827246 Feb 2020 CN
110827247 Feb 2020 CN
110827248 Feb 2020 CN
110827249 Feb 2020 CN
110880028 Mar 2020 CN
110928730 Mar 2020 CN
305638504 Mar 2020 CN
110976302 Apr 2020 CN
111009073 Apr 2020 CN
111080184 Apr 2020 CN
210348162 Apr 2020 CN
111175318 May 2020 CN
111210473 May 2020 CN
305767220 May 2020 CN
111238430 Jun 2020 CN
111262987 Jun 2020 CN
111272067 Jun 2020 CN
111272388 Jun 2020 CN
111272393 Jun 2020 CN
111273704 Jun 2020 CN
111277466 Jun 2020 CN
111277659 Jun 2020 CN
111277695 Jun 2020 CN
111277696 Jun 2020 CN
111290660 Jun 2020 CN
111290949 Jun 2020 CN
111291661 Jun 2020 CN
111292302 Jun 2020 CN
111294454 Jun 2020 CN
111294459 Jun 2020 CN
111307429 Jun 2020 CN
111311556 Jun 2020 CN
111311687 Jun 2020 CN
111311749 Jun 2020 CN
111314445 Jun 2020 CN
111314535 Jun 2020 CN
111325715 Jun 2020 CN
111325716 Jun 2020 CN
111325717 Jun 2020 CN
111325901 Jun 2020 CN
210666955 Jun 2020 CN
305818424 Jun 2020 CN
111439560 Jul 2020 CN
211149556 Jul 2020 CN
305955503 Jul 2020 CN
211291337 Aug 2020 CN
211296771 Aug 2020 CN
211402187 Sep 2020 CN
211515235 Sep 2020 CN
211538600 Sep 2020 CN
111830293 Oct 2020 CN
111830354 Oct 2020 CN
111860890 Oct 2020 CN
111860891 Oct 2020 CN
211630227 Oct 2020 CN
306113050 Oct 2020 CN
306113051 Oct 2020 CN
306113052 Oct 2020 CN
212023984 Nov 2020 CN
212031269 Nov 2020 CN
306164092 Nov 2020 CN
306164093 Nov 2020 CN
306164094 Nov 2020 CN
306164095 Nov 2020 CN
112098443 Dec 2020 CN
212084259 Dec 2020 CN
212268703 Jan 2021 CN
212314534 Jan 2021 CN
212322247 Jan 2021 CN
212364464 Jan 2021 CN
306272538 Jan 2021 CN
306283626 Jan 2021 CN
112348761 Feb 2021 CN
112348808 Feb 2021 CN
112393880 Feb 2021 CN
112395118 Feb 2021 CN
212586854 Feb 2021 CN
212597202 Feb 2021 CN
306323627 Feb 2021 CN
112433902 Mar 2021 CN
112452935 Mar 2021 CN
112455988 Mar 2021 CN
112456100 Mar 2021 CN
112565505 Mar 2021 CN
212677296 Mar 2021 CN
212681731 Mar 2021 CN
111314537 Apr 2021 CN
112613622 Apr 2021 CN
112613914 Apr 2021 CN
112614117 Apr 2021 CN
112614269 Apr 2021 CN
112633194 Apr 2021 CN
112634245 Apr 2021 CN
112634288 Apr 2021 CN
112634301 Apr 2021 CN
112672145 Apr 2021 CN
112735081 Apr 2021 CN
213001252 Apr 2021 CN
213004872 Apr 2021 CN
112777290 May 2021 CN
112783702 May 2021 CN
112816490 May 2021 CN
112822740 May 2021 CN
112828842 May 2021 CN
112837076 May 2021 CN
112837102 May 2021 CN
213149008 May 2021 CN
213301455 May 2021 CN
213301535 May 2021 CN
213305483 May 2021 CN
112907182 Jun 2021 CN
112991614 Jun 2021 CN
113032198 Jun 2021 CN
113034481 Jun 2021 CN
113034493 Jun 2021 CN
113034529 Jun 2021 CN
113034530 Jun 2021 CN
113034531 Jun 2021 CN
113038012 Jun 2021 CN
113052798 Jun 2021 CN
113110806 Jul 2021 CN
113114794 Jul 2021 CN
113132523 Jul 2021 CN
113160494 Jul 2021 CN
113190215 Jul 2021 CN
113191789 Jul 2021 CN
213765490 Jul 2021 CN
213796595 Jul 2021 CN
213807304 Jul 2021 CN
306700330 Jul 2021 CN
113220647 Aug 2021 CN
113220648 Aug 2021 CN
113237473 Aug 2021 CN
113238680 Aug 2021 CN
113238905 Aug 2021 CN
113252678 Aug 2021 CN
113254292 Aug 2021 CN
113254293 Aug 2021 CN
113254294 Aug 2021 CN
113268162 Aug 2021 CN
113298078 Aug 2021 CN
113301202 Aug 2021 CN
113329222 Aug 2021 CN
213917879 Aug 2021 CN
213933659 Aug 2021 CN
306744667 Aug 2021 CN
306744668 Aug 2021 CN
306786433 Aug 2021 CN
306786434 Aug 2021 CN
113422860 Sep 2021 CN
214160736 Sep 2021 CN
214162705 Sep 2021 CN
214427985 Oct 2021 CN
113591066 Nov 2021 CN
113591963 Nov 2021 CN
215246545 Dec 2021 CN
215247165 Dec 2021 CN
215247245 Dec 2021 CN
215247426 Dec 2021 CN
215262785 Dec 2021 CN
215262787 Dec 2021 CN
215266884 Dec 2021 CN
215266954 Dec 2021 CN
215325354 Dec 2021 CN
215555043 Jan 2022 CN
215556081 Jan 2022 CN
215575427 Jan 2022 CN
215576764 Jan 2022 CN
215576765 Jan 2022 CN
215703219 Feb 2022 CN
215708961 Feb 2022 CN
216612155 May 2022 CN
112672145 Feb 2023 CN
10031532 Oct 2001 DE
0116970 Dec 1991 EP
0654003 May 1995 EP
1168253 Jan 2002 EP
1270905 Jan 2003 EP
1703436 Sep 2006 EP
3206194 Aug 2017 EP
2428072 Jan 2018 EP
3047833 Mar 2018 FR
2167553 May 1986 GB
202209941 Jul 2022 GB
20210100761 Jul 2022 GR
30014296 Aug 2020 HK
H07112801 May 1995 JP
H7334583 Dec 1995 JP
H11242005 Sep 1999 JP
H11334851 Dec 1999 JP
2000121564 Apr 2000 JP
2000171409 Jun 2000 JP
2000180371 Jun 2000 JP
3123095 Jan 2001 JP
2001312766 Nov 2001 JP
2002019147 Jan 2002 JP
2002183286 Jun 2002 JP
2002259528 Sep 2002 JP
2002302252 Oct 2002 JP
2002324264 Nov 2002 JP
2002358354 Dec 2002 JP
2003139516 May 2003 JP
2003230229 Aug 2003 JP
2003242243 Aug 2003 JP
2003264007 Sep 2003 JP
2003267509 Sep 2003 JP
2004021569 Jan 2004 JP
2004191496 Jul 2004 JP
2004226129 Aug 2004 JP
2004239850 Aug 2004 JP
2004288143 Oct 2004 JP
2004303102 Oct 2004 JP
2004341681 Dec 2004 JP
2005063203 Mar 2005 JP
2005122059 May 2005 JP
2005308476 Nov 2005 JP
2006127308 May 2006 JP
2006195814 Jul 2006 JP
2006203451 Aug 2006 JP
2006227764 Aug 2006 JP
2006260246 Sep 2006 JP
2007141266 Jun 2007 JP
2007155455 Jun 2007 JP
2007179516 Jul 2007 JP
2007265340 Oct 2007 JP
2008045959 Feb 2008 JP
2008059403 Mar 2008 JP
2008522299 Jun 2008 JP
2008293391 Dec 2008 JP
2007086725 Apr 2009 JP
2009175035 Aug 2009 JP
2009245058 Oct 2009 JP
2009250971 Oct 2009 JP
2009290852 Dec 2009 JP
2010177720 Aug 2010 JP
2010276896 Dec 2010 JP
2011518387 Jun 2011 JP
2012504832 Feb 2012 JP
2012058932 Mar 2012 JP
2013033361 Feb 2013 JP
2013037441 Feb 2013 JP
6050922 Dec 2016 JP
2017040957 Feb 2017 JP
2017093938 Jun 2017 JP
2017142781 Aug 2017 JP
2017173902 Sep 2017 JP
2017201559 Nov 2017 JP
6266065 Mar 2018 JP
2019012474 Jan 2019 JP
3223233 Sep 2019 JP
2022539909 Sep 2022 JP
2022539910 Sep 2022 JP
2022539912 Sep 2022 JP
2022545336 Sep 2022 JP
20000064168 Nov 2000 KR
20010074614 Aug 2001 KR
20010097567 Nov 2001 KR
100766860 Oct 2007 KR
20130085255 Jul 2013 KR
101326680 Nov 2013 KR
101329949 Nov 2013 KR
20140037543 Mar 2014 KR
101599251 Mar 2016 KR
20180088062 Aug 2018 KR
20180088063 Aug 2018 KR
1020180086617 Aug 2018 KR
20180117278 Oct 2018 KR
20190026131 Mar 2019 KR
20190107593 Sep 2019 KR
20190107595 Sep 2019 KR
20190107596 Sep 2019 KR
1020190107594 Sep 2019 KR
1020200115308 Oct 2020 KR
20210020717 Feb 2021 KR
1020210059148 May 2021 KR
1020210107515 Sep 2021 KR
WO8503790 Aug 1985 WO
WO2001015096 Mar 2001 WO
WO2002005176 Jan 2002 WO
WO0221090 Mar 2002 WO
WO2002025613 Mar 2002 WO
WO2002039357 May 2002 WO
WO2003012717 Feb 2003 WO
WO2003014994 Feb 2003 WO
WO2004021114 Mar 2004 WO
WO2004114490 Dec 2004 WO
WO2005008566 Jan 2005 WO
2005054877 Jun 2005 WO
WO2005101346 Oct 2005 WO
WO2006021825 Mar 2006 WO
WO2006058601 Jun 2006 WO
WO2006080851 Aug 2006 WO
WO2007066166 Jun 2007 WO
WO2009089607 Jul 2009 WO
WO2009128173 Oct 2009 WO
WO2009128176 Oct 2009 WO
WO2009129526 Oct 2009 WO
WO2010040116 Apr 2010 WO
WO2010128267 Nov 2010 WO
WO2010128315 Nov 2010 WO
WO2011131016 Oct 2011 WO
WO2012073126 Jun 2012 WO
WO2013002748 Jan 2013 WO
WO2013074819 May 2013 WO
WO2014075055 May 2014 WO
WO2014141180 Sep 2014 WO
WO2015022409 Feb 2015 WO
WO2015093676 Jun 2015 WO
WO2015108864 Jul 2015 WO
WO2016181224 Nov 2016 WO
WO2015196175 Dec 2016 WO
WO2017034441 Mar 2017 WO
WO2017081527 May 2017 WO
WO2017156046 Sep 2017 WO
WO2018124669 Jul 2018 WO
WO2018133068 Jul 2018 WO
WO2018146374 Aug 2018 WO
WO2019012305 Jan 2019 WO
WO2019012505 Jan 2019 WO
WO2019012506 Jan 2019 WO
WO2019212513 Nov 2019 WO
WO2019212515 Nov 2019 WO
2019008943 Apr 2020 WO
WO2020082991 Apr 2020 WO
WO2020204503 Oct 2020 WO
WO2021019286 Feb 2021 WO
WO2021082918 May 2021 WO
WO2021082919 May 2021 WO
WO2021082920 May 2021 WO
WO2021082921 May 2021 WO
WO2021082922 May 2021 WO
WO2021082923 May 2021 WO
WO2021142009 Jul 2021 WO
WO2021147385 Jul 2021 WO
WO2021147386 Jul 2021 WO
WO2021147387 Jul 2021 WO
WO2021147388 Jul 2021 WO
WO2021172803 Sep 2021 WO
WO2022034298 Feb 2022 WO
WO2022090999 May 2022 WO
WO2022091000 May 2022 WO
2023073248 May 2023 WO
Non-Patent Literature Citations (121)
“Yunwon Park et al., Ambiguous Surface Defect Image Classification of AMOLED Displays in Smartphones, Jan. 2016, IEEE Transactions on Industrial Informatics, vol. 12, Issue 2” (Year: 2016).
“Md Rafiul Hassan et al., A Novel Cascaded Deep Neural Network for Analyzing Smart Phone Data for Indoor Localization, Dec. 2019, Future Generation Computer Systems, vol. 101, pp. 760-769” (Year: 2019).
2006 Florida Statutes Title XXXIII, Chapter 538, Sections 538.03 and 538.04, 7 pages.
3GPP Organizational Partners, “3rd Generation Partnership Project; Technical Specification Group Terminals; AT command set for GSM Mobile Equipment (ME),” Global System for Mobile Communications, 1998, 124 pages.
Aftermarket Cellular Accessories, “Cellular Phone Model Identification,” retrieved from http://web.archive.org/web/20060328064957/http://aftermarketcellular.com/ic/identification.html on Mar. 16, 2014, published Mar. 28, 2006, 3 pages.
Altec Lansing User's Guide 2007, 8 pages.
Bhule et al., “Environmental and economic trade-offs in consumer electronic products recycling: a case study of cell phones and computers,” IEEE International Symposium on Electronics and the Environment, Conference Record, 2004.
Bournique, D.: “Mobile Karma Shuts Down As iCloud and Blacklists Challenge Used Phone Buyers”, Prepaid Phone News, Jul. 23, 2014 (Jul. 23, 2014), XP055229747, Retrieved from the Internet URL:http://www.prepaidphonenews.com/2014/07/mobile-karma-shuts-down-as-icloud-and.html; accessed Nov. 27, 2017; 2 pages.
Business Wire, “The World's First Office Photography Machine” at CES 2008 Launched by Ortery Technologies, Jan. 7, 2008, 3 pages.
CNET, “Tackling LCD ‘burn-ins’, and dead/stuck pixels”, published Sep. 2, 2009, retrieved from http://www.cnet.com/news/tackling-lcd-burn-ins-and-deadstuck-pixels/.
Cybercom Group Europe AB, “OMSI Forum,” Downloads, 2005, 2 pages.
Cybercom Group Europe AB, “OMSI Provides Easy Service and Maintenance for Mobile Phones,” Press Release, 2005, 1 page.
Cybercom Group Europe AB, “The OMSI 2.0 Interface Supports,” OMSI 2.0 Description, available at least before Oct. 2008, 1 page.
Cybercom Group, “Leading Telecom Organisations Address Device Management Issues,” Press Release, 2007, 1 page.
Evgenii Masunov, Mar. 25, 2010, http://www.appleinsider.ru/news/ipone-obladaet-luchshim-tachskrinom-provereno_robotom.html, 4 pages.
Foster et al., “Automated Visual Inspection: A Tutorial” 1990 Computers Ind. Engng. vol. 18(4): 493-504.
Geekanoids, YouTube video, “Apple iPhone 3GS Unboxing and Review”, uploaded on Jun. 19, 2009, retrieved from http://www.youtube.com/watch?v=GCEI9QAeDok on Sep. 2, 2009.
Geyer et al., “The economics of cell phone reuse and recycling,” The International Journal of Advanced Manufacturing Technology, 47(5): 515-525, 2010.
Graffia et al., “Retail Station for E-Device Identification, Assessment, and Trade-In”, Jun. 6, 2014 (Drawings and Specification) (Year: 2014).
Watson, “Review: SanDisk iXpand Wireless Charger”, Sep. 15, 2019, 4 pages, retrieved at https://www.whatmobile.net/Reviews/article/review-sandisk-ixpand-wireless-charger.
GSM Arena Glossary, “LCD (Liquid Crystal Display)”, retrieved from http://www.gsmarena.com/glossary.php3?term=LCD on Apr. 28, 2016, 1 page.
Hazelwood, et al.; “Life Extension of Electronic Products: A Case Study of Smartphones”, Sep. 20, 2021, IEEE Access, vol. 9, pp. 144726-144739, DOI: 10.1109/ACCESS.2021.3121733.
International Numbering Plan. Retrieved on Apr. 5, 2013 at http://web.archive.org/web/20070322214125/http://www.numberingplans.com/?page+analysis&sub+imeinr, 2 pages.
Investopedia: What's the difference between weighted average accounting and FIFO/LILO accounting methods? Aug. 19, 2010. Accessed via archive.org [https://web.archive.org/web/20100819200402/http://www.investopedia.com/ask/answers/09/weighted-average-fifo-lilo-accounting.asp].
Kanter, James Max, “Color Crack: Identifying Cracks in Glass,” dated Dec. 9, 2014; retrieved from the internet http://www.jmaxkanter.com/static/papers/color_crack.pdf on Sep. 22, 2017.
Lambert, Emily, “Use It Up, Wear It Out”, Forbes 175.5 (2005): 77-78. Business Source Complete. Web. Jan. 6, 2015, 3 pages.
Littleton Partners with Donations Ink (Jan. 19, 2006) US Fed News Service, Including US State News. Web. Jan. 6, 2015, 1 page.
MobileGazette.com, “2006 in Review: The Good, The Bad and The Ugly”, published Dec. 2006, retrieved from http://www.mobilegazette.com/2006-review-06x12x22.htm on Nov. 11, 2015.
Kuriyan, et al.: “Review of Research on Rural PC Kiosks,” Apr. 14, 2007, 22 pages, retrieved at http://research.microsoft.com/research/tem/kiosks.
Oliveira, et al., “Automatic crack detection on road imagery using anisotropic diffusion and region linkage,” 18th European Signal Processing Conference (EUSIPCO-2010), Aug. 23, 2010, pp. 274-278.
Novotny, et al.; “Smart City Concept, Applications and Services,” Aug. 26, 2014, Journal of Telecommunications System & Management, vol. 3, Issue 2, pp. 1-8, DOI: 10.4172/2167-0919.1000117.
PC World, “Wipe Your Cell Phone's Memory Before Giving it Away”, published Jan. 2006, retrieved from http://www.washingtonpost.com/wp-dyn/content/article/2006/01/30/AR2006013001144.html on Nov. 10, 2015.
Perng et al., “A Novel Vision System for CRT Panel Auto-Inspection”, Proceedings of the 2005 IEEE International Conference on Mechatronics, Jul. 10-12, 2005, pp. 4.
Perng et al., “A Novel Vision System for CRT Panel Auto-Inspection”, Journal of the Chinese Institute of Industrial Engineers, vol. 24, No. 5, pp. 341-350 (2007).
Phifer, “How to Use your 3G Phone as a Wireless Broadband Modem,” Computer Weekly News, 2007, 6 pages.
Rawson, Chris, “TUAW: 25 Ways to Check the Hardware on Your iPhone 4”, published Aug. 12, 2010, retrieved at http://www.tuaw.com/2010/08/13/hardware-test-your-iphone-4/ on Feb. 28, 2014.
Rehg et al. “Vision for a Smart Kiosk”, Jun. 1997, Computer Vision and Pattern Recognition, pp. 690-696.
RMS Communications Group, “RMS Communications Group Inc. opens cell phone kiosk at Ocean City Mall in Toms River, N.J.”, retrieved from http://www.prweb.com/releases/2004/11/prweb177351.htm, Nov. 12, 2004, 2 pages.
Rolf Steinhilper, “Remanufacturing: The Ultimate Form of Recycling”, Fraunhofer IRB Verlag, 1998, parts 1-3, http://www.reman.org/Publications_main.htm.
Romano “Recycling a Phone at EcoATM is an Easy Route To Feeling Green,” Xconomy, Jan. 22, 2014, pp. 1-3.
Rosebrock, “How to Build a Kick-Ass Mobile Document Scanner in Just 5 Minutes”, PyImageSearch, Sep. 2014, 19 pages.
Shotton et al., “Efficiently Combining Contour and Texture Cues for Object Recognition”, Proceedings of the British Machine Vision Conference 2008, Sep. 1, 2008, pp. 7.1-7.10 (abstract).
Shue, Jiuh-Biing et al. “Extended consumer responsibility: Syncretic value-oriented pricing strategies for trade-in-for-upgrade programs” Transportation Research Part E: Logistics and Transportation Review 122 (2019) 350-367.
Simplysellular, “Get Cash for your Old Cell Phone”, published Apr. 2, 2010, retrieved from http://simplysellular.com/conditions.php on Jan. 6, 2015, 2 pages.
Sony Ericsson Mobile Communications AB, “P800/P802,” White Paper, 2003, 128 pages.
Sony Ericsson Mobile Communications AB, “T68i/T68ie,” White Paper, 2002, 71 pages.
Tecace Software: “Your phone appraisal-Movaluate—Android Apps on Google Play”, Android Apps On Google Play, Aug. 12, 2013 (Aug. 12, 2013), XP055230264, Retrieved from the Internet URL:https://play.google.com/store/apps/details?id=com.tecace.android.app.movaluate&hl=en; accessed Nov. 27, 2017; 2 pages.
Tech Spurt; Sandisk iXpand Review, Wireless Charger & Auto Photo Backup!, Aug. 21, 2019, 1 page, retrieved at https://www.youtube.com/watch?v=zemKQ6xULww.
Trading devices for dollars, The Economist (US) 405.8813:8 (US), Economist Intelligence Unit N.A. Incorporated, Dec. 1, 2012.
Turner, “5 MP3 Players for Pumping Up Your Workouts,” Mashable.com, Nov. 4, 2010, available online at https://mashable.com/2010/11/04/mp3-players-for-sports/ (Year: 2010).
Grose, Thomas; “New Life for Old Phones,” ASEE Prism 22.3 (2012): 18.
Waugh, “Phone recycling machine lets you drop in old mobiles—and spits out cash instantly,” Daily Mail Online, Jan. 13, 2012, p. 1-2.
Wikipedia, “Machine Vision” Sep. 19, 2009, 6 pages.
Wiley Encyclopedia of Computer Science and Technology, Nov. 2008, 2362 pages, Wiley-interscience, ISBN-10: 0471383937, ISBN-13: 978-0471383932.
Wilson, Doug, “Liquid Crystal Display (LCD) Inspection System”, National Instruments Case Study, available May 10, 2009, retrieved from http://sine.ni.com/cs/app/cod/p/id/cs-345 on Jan. 5, 2015, 2 pages.
Wu, “Overview of Wireless Power and Data Communication”, WPC/Qi Developers Forum, Oct. 29, 2016, 21 pages.
Yahoo Answers, “What is a Clean ESN?” published Jun. 23, 2009, retrieved from http://web.archive.org/web/20090623215042/http://answers.yahoo.com/question/index?qid=20080318061012AANFRco on Apr. 3, 2014.
Zhang, Yiyang, “The design of glass crack detection system based on image preprocessing technology,” 2014 IEEE 7th Joint International Information Technology and Artificial Intelligence Conference, IEEE, Dec. 20, 2014; pp. 39-42.
International Search Report and Written Opinion mailed Jun. 10, 2015 in International Application No. PCT/US2015/014139. 9 pages.
International Search Report and Written Opinion mailed Dec. 22, 2015 in International Application No. PCT/US2015/053591, 18 pages.
Non-Final Office Action mailed Aug. 24, 2017 in U.S. Appl. No. 15/630,539, 23 pages.
Non-Final Office Action mailed Sep. 1, 2017 in U.S. Appl. No. 15/630,460, 23 pages.
Non-Final Office Action mailed Sep. 8, 2017 in U.S. Appl. No. 15/630,508, 13 pages.
Non-Final Office Action response filed Dec. 8, 2017 in U.S. Appl. No. 15/630,508, 19 pages.
Non-Final Office Action response filed Nov. 2, 2017 in U.S. Appl. No. 15/630,539, 15 pages.
Non-Final Office Action Response filed Nov. 29, 2017 in U.S. Appl. No. 15/630,460, 16 pages.
Non-Final Office Action mailed Dec. 29, 2017 in U.S. Appl. No. 14/873,158, 14 pages.
Notice of Allowance mailed Jan. 17, 2018 in U.S. Appl. No. 15/630,539 of Bowles et al., filed Jun. 22, 2017.
Final Office Action mailed Feb. 1, 2018 in U.S. Appl. No. 15/630,460, 47 pages.
Non-Final Office Action response filed Mar. 29, 2018 in U.S. Appl. No. 14/873,158, 18 pages.
Final Office Action mailed Apr. 19, 2018 in U.S. Appl. No. 15/630,508, 15 pages.
Final Office Action Response as Appeal Brief filed Jun. 19, 2018 in U.S. Appl. No. 15/630,460, 24 pages.
Final Office Action mailed Jul. 23, 2018 in U.S. Appl. No. 14/873,158, 13 pages.
Examiner's Answer to Appeal Briefs dated Sep. 28, 2018 in U.S. Appl. No. 15/630,460, 16 pages.
Non-Final Office Action mailed Oct. 22, 2018 in U.S. Appl. No. 14/873,145, 7 pages.
Final Office Action response filed Mar. 25, 2019 in U.S. Appl. No. 14/873,158, 14 pages.
Final Office Action response filed Apr. 3, 2019 in U.S. Appl. No. 15/630,508, 13 pages.
Non-Final Office Action response filed Apr. 22, 2019 in U.S. Appl. No. 14/873,145, 13 pages.
Notice of Allowance mailed May 6, 2019 in U.S. Appl. No. 14/873,158, 7 pages.
Notice of Allowance mailed May 7, 2019 in U.S. Appl. No. 15/630,508 of Silva, J., et al., filed Jun. 22, 2017, 7 pages.
Notice of Allowance mailed Jun. 28, 2019 in U.S. Appl. No. 14/873,145, 9 pages.
Notice of Allowance mailed Sep. 5, 2019 in U.S. Appl. No. 14/873,158, 7 pages.
Notice of Allowance mailed Sep. 9, 2019 in U.S. Appl. No. 15/630,508 of Silva, J., et al., filed Jun. 22, 2017, 7 pages.
Supplemental Appeal Brief filed Apr. 9, 2020 in U.S. Appl. No. 15/630,460, 4 pages.
Written Submissions Arguments and Claims filed Apr. 23, 2020, EP17180663.1, 6 pages.
Preliminary Opinion in Response to Written Submissions, Jul. 6, 2020, EP17180663.1, 5 pages.
Notice of Allowance mailed Jul. 6, 2020 in U.S. Appl. No. 15/630,460, 8 pages.
Declaration in Support of Written Submissions, Jul. 8, 2020, EP17180663.1, 3 pages.
Decision to Refuse Application, Sep. 25, 2020, EP17180663.1, 7 pages.
Non-Final Office Action mailed Oct. 16, 2020 in U.S. Appl. No. 16/575,003, 11 pages.
Non-Final Office Action response filed Mar. 16, 2021 in U.S. Appl. No. 16/575,003, 13 pages.
Notice of Allowance mailed Jun. 28, 2021 in U.S. Appl. No. 16/575,003, 15 pages.
Restriction Requirement mailed Jul. 2, 2021 in U.S. Appl. No. 16/556,018, 6 pages.
Restriction Requirement response filed Sep. 2, 2021 in U.S. Appl. No. 16/556,018, 6 pages.
Non-Final Office Action mailed Sep. 15, 2021 in U.S. Appl. No. 16/556,018, 6 pages.
Non-Final Office Action response filed Dec. 10, 2021 in U.S. Appl. No. 16/556,018, 10 pages.
International Search Report and Written Opinion mailed Dec. 20, 2021 in International Application No. PCT/US2021/071269, 14 pages.
Non-Final Office Action mailed Apr. 13, 2022 in U.S. Appl. No. 16/556,018, 5 pages.
Non-Final Office Action response filed Jul. 13, 2022 in U.S. Appl. No. 16/556,018, 10 pages.
Non-Final Office Action mailed Sep. 26, 2022 in U.S. Appl. No. 16/556,018, 6 pages.
Non-Final Office Action response filed Jan. 26, 2023 in U.S. Appl. No. 16/556,018, 12 pages.
Non-Final Office Action mailed Feb. 17, 2023 in U.S. Appl. No. 17/445,575, 6 pages.
Non-Final Office Action response filed Mar. 15, 2023 in U.S. Appl. No. 17/445,575, 9 pages.
Notice of Allowance mailed Apr. 4, 2023 in U.S. Appl. No. 17/445,575, 7 pages.
Notice of Allowance mailed Jul. 21, 2023 in U.S. Appl. No. 16/556,018, 8 pages.
Restriction Requirement mailed Nov. 2, 2023 in U.S. Appl. No. 18/346,618, 6 pages.
Non-Final Office Action mailed Dec. 21, 2023 in U.S. Appl. No. 18/324,903, 37 pages.
Non-Final Office Action mailed Dec. 21, 2023 in U.S. Appl. No. 18/324,921, 27 pages.
Non-Final Office Action mailed Feb. 1, 2024 in U.S. Appl. No. 18/346,618, 7 pages.
Non-Final Office Action response filed Mar. 21, 2024 in U.S. Appl. No. 18/324,903, 29 pages.
Non-Final Office Action response filed Mar. 21, 2024 in U.S. Appl. No. 18/324,921, 27 pages.
Final Office Action mailed Apr. 26, 2024 in U.S. Appl. No. 18/324,903, 36 pages.
Final Office Action mailed Apr. 26, 2024 in U.S. Appl. No. 18/324,921, 31 pages.
Non-Final Office Action response filed Apr. 30, 2024 in U.S. Appl. No. 18/346,618, 12 pages.
Final Office Action mailed Jun. 5, 2024 in U.S. Appl. No. 18/346,618, 8 pages.
Non-Final Office Action received Jul. 1, 2024 in U.S. Appl. No. 18/464,023, pp. all.
Non-Final Office Action response filed Oct. 4, 2024 in U.S. Appl. No. 18/346,618, pp. all.
Anderle, Megan, “Verizon's new app aims to make phone recycling easy and profitable”, Internet Article, May 1, 2014, XP093222792, retrieved from the Internet: URL: https://www.theguardian.com/sustainable-business/verizon-mobile-phone-recycling-cell-ecoatm.
TecAce Software: “Android Smartphone Testing App—Movaluate—TecAce Software | PRLog” Internet Article, May 6, 2013, XP093222769, retrieved from the Internet: URL: https://www.prlog.org/12132313-android-smartphone-testing-app-movaluate.html.
Notice of Allowance mailed Oct. 28, 2024 in U.S. Appl. No. 18/346,618, pp. all.
Response to Non-Final Office Action filed Nov. 1, 2024 in U.S. Appl. No. 18/464,023, pp. all.
Related Publications (1)
Number Date Country
20220067798 A1 Mar 2022 US
Provisional Applications (1)
Number Date Country
63070207 Aug 2020 US