The present application is based on and claims priority under 35 U.S.C. § 119 to Korean Patent Application No. 10-2017-0028082 filed on Mar. 6, 2017, the content of which is incorporated by reference herein in its entirety.
The present disclosure relates generally to augmented reality, and for example, to a method, system and electronic device for providing augmented reality content created based on data received from one or more external sensor nodes.
The Internet of Things (IoT) is a technology that enables devices belonging to a single network to connect to each other seamlessly. In order to process information exchanged between distributed entities such as things, the Internet has evolved into the IoT network.
In order to provide IoT services, various technical components are required, such as sensing technology, wired/wireless communication and network infrastructure technology, service interfacing technology, security technology, etc. In particular, various technologies for combining various types of devices into a single network, e.g., a sensor network for connecting things, Machine to Machine (M2M) communication, Machine Type Communication (MTC), etc., have been researched.
Under the IoT environment, intelligent Internet Technology (IT) services may be provided to collect and analyze data obtained from things connected to each other and thus to create new value for human life. IoT is fused and combined with various industries along with existing information technologies, and thus may be applied within various fields, such as: smart homes, smart buildings, smart cities, smart cars or connected cars, smart grids, health care, smart home appliances, high quality medical services, etc.
In order to receive IoT services, various types of wearable devices have been released on the market. Typical examples of the wearable device are the smart watch and the Head-Mounted Display (HMD). Examples of the smart watch include the Apple Watch and the Samsung Galaxy Gear S; examples of the HMD include Google Glass and the Samsung Gear VR.
An example of various IoT service applications using wearable devices is a building management system in a smart building environment, employing a control service using a portable device. For example, in order to manage a building, ambient environment information (e.g., temperature, humidity) may be collected using data obtained from a wireless sensor network including a number of sensor nodes.
Meanwhile, Augmented Reality (AR), as a type of Mixed Reality between reality and virtual reality, refers to a technology that blends information or things in the virtual world into the real world, and thus augments the information or things as if they existed in the original environment. To this end, augmented reality recognizes a specific object, generates a 3D image for the recognized object, and overlays a captured image with the generated 3D image. In general, augmented reality technology discovers a location of an object from an image obtained by a camera, using a marker with a specific image or an image pattern as a reference. Therefore, in order to implement general augmented reality technology, a number of tasks are required, such as a process of constructing image registration software for recognizing markers or location information into which information in the real world is blended, a process of registering necessary information in a database in advance, and a process of linking the registered information to information in the real world.
When Augmented Reality (AR) technology is applied to a large-scale space, such as a building, a general AR scheme, e.g., a marker-based AR scheme, may not be suitable for the space. For example, detecting conditions of temperature, humidity, energy, wires, pipes, etc. at points of a building which are not visible, such as a point behind a ceiling panel, a point behind a wall, an underground point, etc., requires complicated processes, such as a process of installing markers at visible points and a process of applying AR technology to non-visible points based on the markers at the visible points.
The present disclosure addresses the problems described above and provides an Augmented Reality (AR) technology using data received from wireless sensor nodes.
In accordance with an example aspect of the present disclosure, a method of displaying augmented reality content in an electronic device is provided. The method includes: receiving sensor data from a specific sensor node outside the electronic device; obtaining image information from an image taking unit comprising imaging circuitry such as, for example, and without limitation, a camera, camcorder or the like, configured to generate image information; generating augmented reality content based on the sensor data and the image information; and displaying the augmented reality content.
The method may further include: receiving an identifier from the specific sensor node for identifying the specific sensor node. The method may further include: determining a location of the specific sensor node based on the identifier. The augmented reality content may be generated based on a sensed value comprising at least one of: a temperature value, a humidity value and an illuminance value at the location.
The method may further include: receiving, from the specific sensor node, information on a location of the specific sensor node. The augmented reality content may be generated based on a sensed value comprising at least one of: a temperature value, a humidity value and an illuminance value at the location. The method may further include: calculating (determining) a location of the electronic device based on strength of signals received from a plurality of wireless sensor nodes including the specific sensor node. At least one of the wireless sensor nodes may be installed at a location that differs from other nodes. The calculation (determination) of the location of the electronic device may include: calculating (determining) a height location of the electronic device.
The augmented reality content may be updated based on the location and the movement of the electronic device.
The generating of the augmented reality content may include: scaling the augmented reality content, based on a distance between the electronic device and the specific sensor node.
The method may further include: requesting the sensor data from the specific sensor node. The request may include identification information of the electronic device, and the sensor data may be received in response to the identification information.
The sensor data may be encoded with a first encryption key. The method may further include: decoding the encoded sensor data using a second encryption key corresponding to the first encryption key.
The method may further include: recognizing the specific sensor node as a marker. The augmented reality content may be generated based on information derived from the sensor data, the image information and the marker.
In accordance with another example aspect of the present disclosure, an electronic device is provided. The electronic device includes: a transceiver configured to receive sensor data from a specific sensor node outside the electronic device; an image taking unit comprising image taking circuitry, such as, for example, and without limitation, a camera, a camcorder, or the like, configured to take images and to generate image information; a processor functionally or operatively connected to the transceiver and the image taking unit; and a display configured to display the augmented reality content. The processor is configured to generate the augmented reality content based on the sensor data and the image information.
The transceiver may be configured to receive, from the specific sensor node, an identifier for identifying the specific sensor node. The processor may be configured to determine a location of the specific sensor node based on the identifier. The processor may be configured to generate the augmented reality content based on a sensed value comprising at least one of: a temperature value, a humidity value and an illuminance value at the location.
The transceiver may be configured to receive, from the specific sensor node, information on a location of the specific sensor node. The processor may be configured to generate the augmented reality content based on a sensed value comprising at least one of: a temperature value, a humidity value and an illuminance value at the location.
The processor may be configured to calculate (determine) a location of the electronic device based on strength of signals received from a plurality of wireless sensor nodes including the specific sensor node. At least one of the wireless sensor nodes may be installed at a location that differs from other nodes. The processor may be configured to calculate (determine) a height location of the electronic device.
The processor may be configured to update the augmented reality content based on the location and the movement of the electronic device.
The processor may be configured to scale the augmented reality content, based on a distance between the electronic device and the specific sensor node. The processor may be configured to request the sensor data from the specific sensor node. The request may comprise identification information of the electronic device. The sensor data may be received in response to the identification information.
The sensor data may be encoded with a first encryption key. The processor may be configured to decode the encoded sensor data using a second encryption key corresponding to the first encryption key.
The image taking unit may be configured to recognize the specific sensor node as a marker. The processor may be configured to generate the augmented reality content, based on information derived from the sensor data, the image information and the marker.
In accordance with another example aspect of the present disclosure, a method of a wireless sensor node for transmitting sensor data is provided. The method includes: generating sensor data by performing a measurement using a sensor; receiving, from an electronic device, a request for sensor data including the sensed data; encoding the sensor data; and transmitting the encoded sensor data to the electronic device.
The request may include identification information of the electronic device. Encoding the sensor data may include encoding the sensor data using an encryption key corresponding to the identification information.
The method may further include transmitting identification information on the wireless sensor node to the electronic device.
The method may further include transmitting location information of the wireless sensor node to the electronic device.
The sensed data may include at least one of a temperature value or a humidity value.
The method may further include: identifying states of another sensor node communicating with the wireless sensor node; and transmitting a report on the states of the other sensor node to the electronic device. Identifying states of another sensor node may include: determining whether the other sensor node works normally; and identifying, if the other sensor node does not work normally, a cause generating a malfunction of the other sensor node. The report may include a value indicating the cause. Identifying states of another sensor node may include: determining whether the other sensor node works normally. The report may include location information or an identifier of the other sensor node that does not work normally.
In accordance with another example aspect of the present disclosure, a wireless sensor node is provided. The wireless sensor node may include: a sensor configured to perform a measurement and to generate sensor data; a communication unit comprising communication circuitry configured to receive, from an electronic device, a request for sensor data including the sensed data; and a processor configured to encode the sensor data. The communication unit may be configured to transmit the encoded sensor data to the electronic device.
The request may include identification information of the electronic device. The processor may be configured to encode the sensor data using an encryption key corresponding to the identification information.
The communication unit may be configured to transmit identification information of the wireless sensor node to the electronic device.
The communication unit may be configured to transmit sensor data including location information of the wireless sensor node to the electronic device.
The sensed data may include at least one of a temperature value or a humidity value.
The processor may be configured to identify states of another sensor node communicating with the wireless sensor node. The communication unit may be configured to transmit a report on the states of the other sensor node to the electronic device. The processor may be configured to determine whether the other sensor node works normally; and to identify, if the other sensor node does not work normally, a cause generating a malfunction of the other sensor node. The report may include a value indicating the cause. The processor may be configured to determine whether the other sensor node works normally. The report may include location information or an identifier of the other sensor node that does not work normally.
The above and other aspects, features and attendant advantages of the present disclosure will be more apparent and readily appreciated from the following detailed description, taken in conjunction with the accompanying drawings, in which like reference numerals refer to like elements, and wherein:
It should be understood that the various example embodiments of the present disclosure described herein may be altered, changed or modified in various ways, to include various modification, equivalents and/or alternatives. Example embodiments are illustrated in the drawings and described in greater detail in the description. However, this is not intended to limit the disclosure to particular modes of practice, and it should be understood that all changes, equivalents, and substitutes that do not depart from the spirit and technical scope of the disclosure are encompassed in the disclosure. Detailed descriptions of well-known functions and structures incorporated herein may be omitted to avoid obscuring the subject matter of the disclosure.
The terms such as “first” and “second” are used herein merely to describe a variety of elements, but the elements are not limited by the terms. The terms are used only for the purpose of distinguishing one element from another element.
The terms used in the present disclosure are used for explaining a specific embodiment and do not limit the scope of the disclosure. Thus, the expression of singularity in the present disclosure includes the expression of plurality unless clearly specified otherwise. Also, the terms such as “include” or “comprise” may be understood to denote a certain feature, number, step, operation, element, component or a combination thereof, but may not be understood to exclude the existence of or a possibility of addition of one or more other features, numbers, steps, operations, elements, components or combinations thereof.
In the example embodiments, a ‘module’ or ‘unit’ performs at least one function or operation and may be implemented with hardware, software, or a combination thereof. A number of modules or a number of units, except for a ‘module’ or a ‘unit’ which needs to be implemented with specific hardware, may be implemented in such a way that they are integrated into at least one module as at least one processor (not shown).
With reference to
The device extracts an image pattern (S230A). For example, if the image captured in operation S220A is an image of a red/green/blue (RGB) format, the device converts the RGB image into a gray scale image, and then into a binary image. Part of the binary image generated by the binarization process may be an image of interest for the image processing. In addition, the binary image may be further processed in such a way that parts of the area which can be considered as clusters are grouped. After that, a contour detection procedure for extracting the contours of the grouped parts, a vertex detection procedure for detecting vertices of the contours to identify the rectangular area by pattern markers, and a normalization procedure for forming, from the contours identified as a rectangular area, a square with four congruent sides and four 90° angles are performed. Therefore, a code identical to the code detected from the rectangular area when a pattern marker was first registered may be extracted as a pattern code.
A location matrix (X, Y, Z) is calculated on the LCD screen (S240A). 3D model rendering by matrix is performed (S250A). Stored augmented reality content is loaded (S260A).
Meanwhile, the marker-based AR technology has a disadvantage in that the augmented reality information is implemented only if the markers are correctly recognized. If a marker is lost or the camera angle is not correct, the technology has difficulty in implementing content. As described above, when Augmented Reality (AR) technology is applied to a large-scale space, such as a building, a marker-based AR scheme may not be suitable for the space. For example, detecting conditions of temperature, humidity, energy, wires, pipes, etc. at points of a building which are not visible, such as a point behind a ceiling panel, a point behind a wall, an underground point, etc., requires complicated processes, such as a process of installing markers at visible points and a process of applying AR technology to non-visible points based on the markers at the visible points.
On the other hand, as illustrated in
Meanwhile, a marker-based AR technology may enable anybody whose camera captures the set pattern to load data and apply the visualization process to the data. However, a wireless sensor node AR technology permits access to the data and information only for a specific user with granted authority (e.g., key information), e.g., a manager of a building, thereby blocking access to sensitive information regarding pipe/electricity/energy data, etc.
For example, with reference to
As illustrated in
Meanwhile, as described above, an existing marker-based AR technology requires processes of extracting a unique image pattern from a marker and calculating a location matrix for the extracted result. In addition, the additionally visualized images are limited to images, animation effects, etc. which are stored in an electronic device. However, according to a system for providing augmented reality content according to various example embodiments of the present disclosure, wireless sensor nodes provide sensed data measured by the wireless sensor nodes, radio frequency (RF) information (e.g., control information), application information, etc., serving the existing marker function as well as additional functions, and an electronic device displays augmented reality content in consideration of the information described above.
With reference to
When an electronic device moves into the vicinity of a wireless sensor node, it receives a signal from the wireless sensor node and identifies the wireless sensor node, based, for example, on the received signal strength indicator (RSSI) of the reference signal and/or the ID of the wireless sensor node included in the received signal (S402).
Each of the wireless sensor nodes may use data in the frame structure illustrated in
With reference back to
As another example, information on a location of an electronic device may be derived based on the strength of signals (e.g., RSSI) received from a plurality of wireless sensor nodes. For example, three nodes may be selected from among the plurality of wireless sensor nodes in order of strongest signal strength, and then a location of the electronic device may be calculated (determined) based on the strength of the signals received from the selected wireless sensor nodes.
Referring to
Referring to
The location of the electronic device M (e.g., [x, y, z]) in
The location of an electronic device may, for example, be calculated using the following equations.
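While the equations themselves are not reproduced here, the RSSI-based positioning described above can be sketched as follows. The log-distance path-loss parameters (`tx_power`, the RSSI at 1 m, and the path-loss exponent `n`) and the function names are assumptions for illustration; the sketch estimates per-node distances from RSSI and then trilaterates a 2D position from three anchor nodes.

```python
import math

def rssi_to_distance(rssi, tx_power=-59.0, n=2.0):
    # Log-distance path-loss model: rssi = tx_power - 10*n*log10(d).
    # tx_power and n are environment-dependent assumed constants.
    return 10 ** ((tx_power - rssi) / (10 * n))

def trilaterate(anchors, dists):
    # Solve for (x, y) from three known anchor positions and measured
    # distances by linearizing the circle equations against anchor 1.
    (x1, y1), (x2, y2), (x3, y3) = anchors
    d1, d2, d3 = dists
    A = [[2 * (x2 - x1), 2 * (y2 - y1)],
         [2 * (x3 - x1), 2 * (y3 - y1)]]
    b = [d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2,
         d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2]
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    x = (b[0] * A[1][1] - b[1] * A[0][1]) / det
    y = (A[0][0] * b[1] - A[1][0] * b[0]) / det
    return x, y
```

Extending the same linearization to a fourth anchor at a different height yields the z-coordinate as well, which is why at least one node installed at a differing height is needed for height estimation.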
In order to more precisely estimate locations of wireless sensor nodes and/or electronic device as in S403 of
The electronic device performs a process for the visualization of sensor information and a wireless sensor node (S404). For example, if a specific wireless sensor node is at a location corresponding to a display area of the electronic device (e.g., an image/photograph of an actual scene), the electronic device makes a request for sensed data (e.g., temperature, humidity), based on a distance between the wireless sensor node and the electronic device and/or an ID of the wireless sensor node (which may be obtained from the advertisement and/or broadcast of the wireless sensor node), receives sensor data in response to the request, and visualizes and displays the received sensor data as a 3D image. The electronic device is capable of calculating, to scale, a size of an area to be displayed. For example, the scaling factor may be determined using d in [Equation 2]. Therefore, according to the correct locations of the wireless sensor nodes and the distance from the user, augmented reality content may be rendered/expressed in a realistic size. Meanwhile, augmented reality content generated in S404 of
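A minimal sketch of the distance-based scaling described above might look like the following; the reference distance and the clamping bounds are illustrative assumptions, added only to keep the overlay readable at extreme distances.

```python
def scale_factor(distance, reference_distance=1.0,
                 min_scale=0.1, max_scale=2.0):
    # Perspective-style scaling: content drawn for a node appears
    # smaller as the node gets farther away; the bounds clamp the
    # result so very near/far nodes remain legible on screen.
    if distance <= 0:
        return max_scale
    s = reference_distance / distance
    return max(min_scale, min(max_scale, s))
```

The `distance` argument corresponds to the value d discussed above, i.e., the estimated distance between the electronic device and the wireless sensor node.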
Meanwhile, at least one filter may be used to remove noise from sensor values. For example, a Kalman filter is applied to values of a gyroscope in Case A of
In Equation 3, X denotes the corrected value of SensorValue_t (after filtering). SensorValue_t is the raw value of the sensor data measured at time t. K denotes the system measurement vector at time t. P denotes the processing noise value at time t. Q denotes an algorithm definition constant (a pre-defined value). R denotes the estimation noise value at time t. An averaging filter, as one of the smoothing filters, may be used. In this case, however, the average is obtained over the total data; as values accumulate, a change in the most recent value is buried in the average, which is a problem. Therefore, a filter (e.g., a Kalman filter) may be used to smooth the data by covering the most recently measured samples with a window, and to apply the change in the recent RSSI value as it is.
With reference to
The electronic device is capable of requesting sensor data from a wireless sensor node. The request may include authentication information such as identification information on a wireless sensor node and/or an electronic device, and a secure key. The wireless sensor node is capable of transmitting sensor data to an electronic device only if the electronic device has been authenticated.
The electronic device is capable of calculating (determining) its location based on the strength of signals received from a plurality of wireless sensor nodes. As described above, the electronic device is capable of using four or more wireless sensor nodes to calculate the height of the location where the electronic device is located, e.g., a z-coordinate value in the Cartesian coordinate system. According to an example embodiment, at least one of the four or more wireless sensor nodes should be installed at a location that differs in height from the locations of the other nodes.
In addition, a wireless sensor node may function as a marker. An electronic device is capable of performing the reception of sensor information from a wireless sensor node and the recognition of the wireless sensor node as a marker, together or separately. If the electronic device recognizes a wireless sensor node as a marker, it is capable of processing the provided augmented reality content differently. For example, if the electronic device receives only sensor information, it is capable of providing first information, such as a malfunction occurrence guide. If the electronic device recognizes the wireless sensor node as a marker, it is capable of providing second information, such as a malfunction repair/restoration guide. In this case, the second information may be generated based on information derived from the recognized marker.
In operation S1220, the electronic device obtains image information from an image taking unit, e.g., a camera, a camcorder, or the like. The image information refers to a real image, such as a picture or a moving picture, taken by an image taking unit, e.g., a camera, a camcorder, etc. It should be understood that the image information may also be a virtual image where example embodiments of the present disclosure are applied to mixed reality.
In operation S1230, the electronic device generates augmented reality content based on the sensor data received as in operation S1210 and the image information obtained as in operation S1220. For example, the augmented reality content may be generated based on sensed values measured by a wireless sensor node, such as temperature, humidity and/or illuminance.
In operation S1240, the electronic device displays the augmented reality content generated as in operation S1230.
The displayed augmented reality content may be updated based on the movement and the location of the electronic device, by repeating operations S1210 to S1240.
In operation S1410, a wireless sensor node performs a measurement using sensors and generates sensed data. For example, the sensed data may include at least one of: temperature, humidity and illuminance.
In operation S1420, the wireless sensor node receives, from the electronic device, a request for sensor data including sensed data. As described above, the request may include authentication information such as a secure key, identifiers of a wireless sensor node and/or an electronic device, etc. The wireless sensor node may transmit sensor data to an electronic device only if the electronic device has been authenticated.
In operation S1430, the wireless sensor node encodes the sensor data generated as in operation S1410. The wireless sensor node may encode the sensor data, using an encryption key corresponding to authentication information on an electronic device received from the electronic device. The wireless sensor node and the electronic device share a secret key with each other. The wireless sensor node encodes sensor data using the secret key, and transmits the encoded sensor data to the electronic device. The electronic device receives and decodes the encoded sensor data, using the secret key. Alternatively, if the wireless sensor node has a first encryption key and the electronic device has a second encryption key corresponding to the first encryption key, the wireless sensor node encodes sensor data using the first encryption key, and transmits the encoded sensor data to the electronic device. The electronic device is capable of decoding the received encoded sensor data, using the second encryption key. For example, the first encryption key and the second encryption key are a public key and a private key, respectively.
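The shared-secret encode/decode exchange described above might be sketched as follows. This is an illustrative construction only: a keystream derived from the shared key via HMAC-SHA256 is XORed with the sensor data. It is not the disclosed scheme, and a real deployment would use an established authenticated cipher (e.g., AES-GCM) rather than this hand-rolled construction; the fixed nonce is likewise only for illustration.

```python
import hashlib
import hmac

def keystream(key, length, nonce=b"node-01"):
    # Derive a deterministic pseudo-random keystream from the shared
    # secret key by hashing the nonce with an incrementing counter.
    out = b""
    counter = 0
    while len(out) < length:
        out += hmac.new(key, nonce + counter.to_bytes(4, "big"),
                        hashlib.sha256).digest()
        counter += 1
    return out[:length]

def encode_sensor_data(key, plaintext):
    # XOR the sensor data with the keystream.
    ks = keystream(key, len(plaintext))
    return bytes(a ^ b for a, b in zip(plaintext, ks))

# XOR is its own inverse, so decoding reuses the same function.
decode_sensor_data = encode_sensor_data
```

In the public/private-key variant described above, the node would encode with the first (public) key and the electronic device would decode with the corresponding second (private) key instead of sharing a single secret.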
In operation S1440, the wireless sensor node transmits, to the electronic device, the sensor data encoded in operation S1430. The wireless sensor node may transmit its identification information and/or location information to the electronic device, separately or along with the sensor data. In addition, the wireless sensor node may detect states of its ambient sensor nodes and transmit, to the electronic device, a report on the states of the ambient sensor nodes, separately or along with the sensor data. For example, the wireless sensor node identifies whether another sensor node works normally. If the other sensor node does not work normally, the wireless sensor node may report, to the electronic device, the identifier or location information of the other sensor node that does not work normally. Additionally or alternatively, the wireless sensor node may detect a cause generating the malfunction of the other sensor node and report a value indicating the cause. In addition, the wireless sensor nodes may periodically exchange their sensed information with each other. Although the electronic device detects only some of the sensor nodes in the visible area, it is capable of obtaining information on the overall sensor network or information on sensor nodes outside the visible area which do not work normally. Therefore, the present disclosure can resolve the problem that a user has difficulty in detecting the correct location where an accident occurs at the actual accident site when viewing only an indoor map.
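The neighbor-state report described above might be structured as in the following sketch; the field names, dictionary layout and cause codes are hypothetical, chosen only to show that the report carries the identifier, location and malfunction cause of each faulty neighbor.

```python
def build_status_report(node_id, neighbors):
    # neighbors maps a neighbor ID to its observed state, e.g.:
    #   {"node-05": {"ok": False, "cause": 2, "location": (1, 2, 3)}}
    # Only neighbors that do not work normally are included, together
    # with a value indicating the malfunction cause and the location.
    faulty = [{"id": nid,
               "cause": info.get("cause", 0),
               "location": info.get("location")}
              for nid, info in neighbors.items()
              if not info.get("ok", True)]
    return {"reporter": node_id, "faulty_nodes": faulty}
```

A report like this lets the electronic device learn about malfunctioning nodes outside its visible area through any single node it can reach.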
An electronic device 1500 includes a transceiver 1510 communicating with a wireless sensor node, an image taking unit (e.g., including image taking circuitry) 1520 configured to take an image and generate image information, such as, for example, and without limitation, a camera or a camcorder, a processor (e.g., including processing circuitry) 1530 for processing data of a wireless sensor node-based AR technology proposed according to the present disclosure, and a display 1540 for displaying augmented reality content. Although it is not shown in
The image taking unit 1520 may include various image taking circuitry, such as, for example, and without limitation, a camera, a camcorder, or the like, and takes images and generates image information.
The processor 1530 may include various processing circuitry, such as, for example, and without limitation, a dedicated processor, a CPU, an application processor, an application-specific integrated circuit, or the like, and is functionally or operatively connected to the transceiver 1510 and the image taking unit 1520. The processor 1530 generates augmented reality content, based on the sensor data received by the transceiver 1510 and the image information generated by the image taking unit 1520.
The processor 1530 is capable of determining a location of a wireless sensor node, based on an identifier for identifying the wireless sensor node, received by the transceiver 1510. The processor 1530 is capable of calculating a location of the electronic device 1500, based on the strength of signals that the transceiver 1510 receives from a plurality of sensor nodes. The processor 1530 is capable of updating augmented reality content based on the location and/or movement of the electronic device 1500. The processor 1530 is capable of transmitting identification information of the electronic device 1500 when requesting sensor data from a specific sensor node. If the sensor data is encoded, the processor 1530 is capable of decoding the encoded sensor data. For example, if the sensor data is encoded with a first encryption key, the processor 1530 is capable of decoding the encoded sensor data using a second encryption key corresponding to the first encryption key. The display 1540 displays the augmented reality content generated by the processor 1530.
A wireless sensor node 1600 includes a sensor 1610 for performing measurement and generating sensed data, a communication unit (e.g., including communication circuitry) 1620 for communicating with an electronic device 1500 and a processor (e.g., including processing circuitry) 1630 for generating sensor data for a wireless sensor node-based AR technology proposed according to the present disclosure. The sensor 1610 performs measurement and generates sensed data. For example, the sensor 1610 may measure temperature, humidity, illuminance, etc.
The communication unit 1620 may include various communication circuitry and receives a request for sensor data from the electronic device 1500 and transmits the sensor data to the electronic device 1500. The communication unit 1620 is capable of transmitting identification information and/or location information on a wireless sensor node 1600, separately or along with sensor data. The processor 1630 may include various processing circuitry and is capable of configuring sensor data with encryption. The request for sensor data, received from the electronic device 1500, may include identification information on the electronic device 1500. The wireless sensor node 1600 may be configured in such a way as to transmit sensor data to an electronic device 1500 only if the electronic device 1500 has been authenticated. The processor 1630 may include various processing circuitry and is capable of encoding sensor data using an encryption key corresponding to the identification information.
As described above, the wireless sensor node 1600 communicates with its ambient sensor nodes and identifies their states. For example, the processor 1630 detects a state of another sensor node and enables the communication unit 1620 to transmit a report on that state. If the processor 1630 ascertains that another sensor node does not operate normally, it detects the cause of the malfunction and reports, to the electronic device 1500, the cause and/or the identifier of the malfunctioning sensor node.
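By way of non-limiting illustration, identifying the states of ambient sensor nodes may be sketched as a heartbeat monitor: a node records when each neighbor was last heard from and reports any neighbor whose heartbeat has lapsed. The timeout value and class names below are assumptions for the example.

```python
import time

class NeighborMonitor:
    """Tracks heartbeats from ambient sensor nodes and flags malfunctions."""

    def __init__(self, timeout_s=30.0):
        self.timeout_s = timeout_s
        self.last_seen = {}  # node identifier -> timestamp of last heartbeat

    def heartbeat(self, node_id, now=None):
        """Record that a heartbeat was received from the given neighbor."""
        self.last_seen[node_id] = time.time() if now is None else now

    def failed_nodes(self, now=None):
        """Return identifiers of neighbors whose heartbeat has lapsed;
        these would be reported to the electronic device 1500."""
        now = time.time() if now is None else now
        return [nid for nid, t in self.last_seen.items()
                if now - t > self.timeout_s]
```

A report transmitted via the communication unit 1620 could then carry the returned identifiers, optionally annotated with a detected cause (e.g., battery depletion or link loss).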
The visualization of augmented reality according to various example embodiments of the present disclosure may visualize 3D images, etc. without using markers. The visualization of augmented reality according to embodiments of the present disclosure may perform, in real-time or substantially real-time, the reception, update, and visualization of sensor information from a wireless sensor node, and may also update augmented reality content based on a user's location and/or movement (e.g., turning the head). The present disclosure may also confer access authority on only a specific electronic device, thereby allowing only a specific user to visualize information on a wireless sensor node. For example, the present disclosure uses a user's identification information and thus may process augmented reality content differently according to the user. For example, the present disclosure is capable of additionally providing a manager with repair/restoration guide information when an abnormal situation occurs, and providing general users with an evacuation guide or a breakdown report/reception guide.
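By way of non-limiting illustration, differentiating augmented reality content according to the user may be sketched as a mapping from user class to permitted content elements. The role names and element names below are assumptions for the example.

```python
# Hypothetical mapping of user classes to the AR content elements they may see.
CONTENT_BY_ROLE = {
    "manager": ("sensor_value", "repair_guide", "restoration_guide"),
    "general": ("evacuation_guide", "breakdown_report_guide"),
}

def augmented_content(user_role, sensor_value, abnormal=False):
    """Assemble role-dependent AR content for one sensor reading."""
    allowed = CONTENT_BY_ROLE.get(user_role, ())
    content = {}
    if "sensor_value" in allowed:
        content["sensor_value"] = sensor_value
    if abnormal:
        for element in allowed:
            if element.endswith("_guide"):
                content[element] = True  # placeholder for the actual guide payload
    return content
```

Under this sketch, an abnormal situation yields repair/restoration guidance for a manager but only evacuation and breakdown-report guidance for a general user, with the user class resolved from the user's identification information.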
As described above, the marker-based AR technology requires capturing markers through the charge-coupled device (CCD) of a camera and calculating a reference coordinate and the visualization locations of objects via 3D matrix operations. However, various example embodiments of the present disclosure are capable of calculating a reference coordinate using signals received from a wireless sensor node, instead of using markers.
The marker-based AR technology is a static system that visualizes images and animations captured and stored via markers. However, the wireless sensor node-based AR system according to various example embodiments of the present disclosure receives, in real-time or substantially real-time, application sensor data (e.g., temperature/humidity, illuminance, etc.) from wireless sensor nodes and dynamically visualizes the data.
As described above, the embodiments of the present disclosure are capable of processing sensed data, measured by wireless sensor nodes, regarding a point that is not visible (e.g., a point behind a ceiling panel, a point behind a wall, underground, etc.) with 3D visualization via an augmented reality (AR) technology. The embodiments of the present disclosure are capable of checking states and information regarding points that are not visible, without performing complicated work, e.g., separating or removing ceiling panels, walls, floors, etc.
Meanwhile, the method of displaying augmented reality content and the method of transmitting sensor data, according to the various embodiments described above, may be implemented with program code that may be stored in a non-transitory computer-readable medium. The non-transitory computer-readable medium is an apparatus-readable medium configured to semi-permanently store data. For example, the above-described various applications or programs may be stored in, and provided via, a non-transitory apparatus-readable medium such as a compact disc (CD), a digital versatile disc (DVD), a hard disc, a Blu-ray disc, a universal serial bus (USB) memory, a memory card, or a read-only memory (ROM).
Although various example embodiments have been illustrated and described, it will be understood that the present disclosure is not limited thereto. It will be understood by those skilled in the art that changes may be made in these embodiments without departing from the principles and spirit of the disclosure, the scope of which is defined in the claims and their equivalents.
Number | Date | Country | Kind |
---|---|---|---|
10-2017-0028082 | Mar 2017 | KR | national |