The present disclosure relates to a communication apparatus capable of switching moving image data to be transmitted.
In recent years, live distribution services that distribute moving image data and audio data in real time have become known. The user can transmit data generated by using a camera and a microphone mounted on a communication apparatus, such as a smart phone or a digital camera, to other users in real time via a live distribution service. Japanese Patent Application Laid-Open No. 2020-106932 discusses a digital camera capable of performing live distribution, for example.
Examples of methods for attracting more viewers to view a live distribution include a method in which an opening moving image is distributed before the live distribution and an ending moving image is distributed after the live distribution to attract viewers. For example, in some cases, in a case where an opening moving image is to be distributed, a user acting as a distributor may complete the distribution of the opening moving image data and then perform an operation for manually switching to distribution of the moving image data currently being generated in real time.
However, in many cases, the user acting as a distributor does not wish to show, to other users who are viewers, an image of the user performing an operation for manually switching between a plurality of moving image data pieces. Conventionally, the user acting as a distributor has had to take certain measures to prevent such images of the distributor from being viewed by the other users who are viewers of the moving images. Thus, switching among a plurality of moving image data pieces during distribution of moving images has typically required the user performing the distribution to do additional work that is unnecessary for the distribution itself.
According to various embodiments of the present disclosure, a communication apparatus is provided that includes an imaging unit, a recording unit, a control unit, and a communication unit. The control unit controls the imaging unit to generate first moving image data. When starting transmission of the first moving image data currently being generated by the imaging unit to an external apparatus, the control unit controls the communication unit to automatically transmit, before the transmission of the first moving image data starts, second moving image data that is different from the first moving image data and is recorded in the recording unit.
Further features of the present disclosure will become apparent from the following description of example embodiments with reference to the attached drawings.
Example embodiments of the present disclosure will be described in detail below with reference to the accompanying drawings.
The following example embodiments are to be considered as illustrative examples for implementing features of the present disclosure, and may be corrected, modified, and combined as appropriate depending on the configuration of an apparatus to which the present disclosure is applied and other various conditions. The example embodiments may be appropriately combined.
A control unit 101 includes a hardware component (e.g., a processor) for executing a program stored in a nonvolatile memory 103. The control unit 101 executes the program recorded in the nonvolatile memory 103 to control the digital camera 100. Instead of the control unit 101 controlling the entire apparatus, a plurality of hardware components that share processing may control the entire apparatus.
An imaging unit 102 includes, for example, a lens unit, an image sensor for converting the optical image of a subject formed on an imaging plane through the lens unit into an electrical signal, and an image processing unit that generates still image data or moving image data from the electrical signal generated by the image sensor. A Complementary Metal Oxide Semiconductor (CMOS) sensor or a Charge Coupled Device (CCD) sensor is generally used as the image sensor. According to the first example embodiment, a series of processes in which the imaging unit 102 generates still image data or moving image data and then outputs the image data is referred to as “image capturing”. The still image data or moving image data generated by the imaging unit 102 is recorded in a recording medium 110 in accordance with the Design rule for Camera File system (DCF) standard. The still image data and moving image data to be transmitted for live distribution (described below) are temporarily recorded in a work memory 104 and then transmitted. These pieces of data are transmitted to a distribution server 300 via a communication unit 111. The imaging unit 102 may be configured to be attachable to and detachable from the digital camera 100 or may be built in the digital camera 100. More specifically, it is only required for the digital camera 100 to include at least means for acquiring an electrical signal of moving image data and the like from the imaging unit 102.
The nonvolatile memory 103 records thereon programs to be executed by the control unit 101. The control unit 101 can record moving image data and/or still image data in the nonvolatile memory 103.
The work memory 104 is used as a buffer memory for temporarily storing still image data and moving image data imaged by the imaging unit 102, an image display memory for the display unit 106, and a work area for the control unit 101.
An operation unit 105 is a user interface (UI) for receiving instructions to the digital camera 100 from the user. The operation unit 105 can include a power switch for issuing an instruction to turn power of the digital camera 100 ON and OFF, a release switch for issuing an image-capturing instruction, and a reproduction button for issuing an instruction to reproduce still image data. The operation unit 105 can also include a touch panel formed on the display unit 106. The release switch includes a switch (SW) 1 and an SW 2. In response to the release switch being half-pressed, the SW 1 turns ON. The operation unit 105 thus receives a preparation instruction to perform imaging preparation operations, such as automatic focus (AF) processing, automatic exposure (AE) processing, automatic white balance (AWB) processing, and flash preliminary emission (EF) processing. In response to the release switch being fully pressed, the SW 2 turns ON. With this user operation, the digital camera 100 receives an imaging instruction to perform an imaging operation. The operation unit 105 also includes a wireless button for switching ON and OFF the wireless communication function of the communication unit 111.
The display unit 106 displays a through-the-lens image for image capturing, captured still image data, and texts for interactive operations. The display unit 106 is, for example, a liquid crystal display or a light emitting diode (LED) display. The display unit 106 may not be built in the digital camera 100 but may be externally connected to the digital camera 100. More specifically, the digital camera 100 can connect with the internal or external display unit 106 and may include at least a function of controlling the display of the display unit 106. The external display unit 106 is, for example, a view finder that can be connected to the digital camera 100.
A microphone 107 is a microphone apparatus that collects an acoustic wave such as voice and generates audio data. The control unit 101 can generate moving image data with sound from the moving image data generated by the imaging unit 102 and the audio data generated by the microphone 107 or an external microphone apparatus. The moving image data with sound generated by the control unit 101 is recorded in the recording medium 110 by the control unit 101. The control unit 101 can also record the still image data generated by the imaging unit 102 and the audio data generated by the microphone 107 in the recording medium 110 in association with each other. The audio data generated to be transmitted for live distribution (described below) is recorded in the work memory 104. The microphone 107 may be configured to be attachable to and detachable from the digital camera 100, or built in the digital camera 100. More specifically, the digital camera 100 needs to have at least a means for receiving an electrical signal from the microphone 107. Processing performed by the microphone 107 to generate audio data from an acoustic wave may be partly borne by other hardware components (e.g., the control unit 101).
A speaker 108 is an electroacoustic transducer capable of outputting electronic sound data. Examples of electronic sound data include musical pieces, warning sounds, focusing sounds, electronic shutter sounds, and operation sounds.
These pieces of electronic sound data are recorded in the nonvolatile memory 103. The speaker 108 can output electronic sound data selected by the control unit 101. By hearing the sound output from the speaker 108, the user can notice the in-focus state of a subject and an error occurring in the digital camera 100.
A power source unit 109 under the control of the control unit 101 can supply power to each element of the digital camera 100. The power source unit 109 is, for example, a lithium-ion battery or an alkali manganese dry battery.
The recording medium 110 can record, for example, the still image data output from the imaging unit 102. Examples of the recording medium 110 include a Secure Digital (SD) card, a Compact Flash (CF) card, and an XQD (registered trademark) card. The recording medium 110 may be configured to be attachable to and detachable from the digital camera 100, or built in the digital camera 100. More specifically, it is only required for the digital camera 100 to have at least means for accessing the recording medium 110.
The communication unit 111 is an interface for wirelessly connecting to an external apparatus. The digital camera 100 according to the present example embodiment can perform data transmission and reception with an external apparatus via the communication unit 111. For example, the digital camera 100 can transmit the still image data generated by the imaging unit 102 and the moving image data recorded in the nonvolatile memory 103, to an external apparatus via the communication unit 111. The external apparatus according to the present example embodiment is, for example, a communication apparatus, such as an external server, a smart phone, and a personal computer (PC). According to the present example embodiment, the communication unit 111 includes an interface that communicates with a relay apparatus and an external apparatus in accordance with a wireless Local Area Network (LAN) conforming to the IEEE 802.11 standard. According to the present example embodiment, the communication unit 111 of the digital camera 100 is provided with a client mode in which the communication unit 111 operates as a client in an infrastructure mode. By operating the communication unit 111 in the client mode, the digital camera 100 according to the present example embodiment can operate as a client apparatus in the infrastructure mode.
Connecting the digital camera 100 operating as a client apparatus to a peripheral access point apparatus enables the digital camera 100 to participate in a LAN formed by the access point apparatus. The control unit 101 implements wireless communication with a relay apparatus and an external apparatus by controlling the communication unit 111. The communication method is not limited to a wireless LAN. Examples of applicable communication methods include public wireless communication methods, such as 4th Generation Mobile Communication System (4G), Long Term Evolution (LTE), and 5th Generation Mobile Communication System (5G), wired communication methods conforming to Ethernet, and wireless communication methods such as Bluetooth®.
An example of an outer appearance of the digital camera 100 will be described below.
This completes the description of an example configuration of the digital camera 100.
A network router 200 serves as an access point of a wireless LAN and forms a network. The digital camera 100 participates in the network formed by the network router 200, as a client, using the communication unit 111. The digital camera 100 connects to the distribution server 300 via the network router 200. While an example will be described below in which the digital camera 100 wirelessly connects to the network router 200 in the present example embodiment, the digital camera 100 may connect to the network router 200 using wire. For example, the network router 200 may also be an information processing apparatus, such as a smart phone, a tablet device, and a personal computer. In such a case, the network formed by the network router 200 is formed by the tethering function of the information processing apparatus.
The distribution server 300 provides cloud services. In particular, users demand that such a cloud service provide content with a small delay amount. According to the present example embodiment, the distribution server 300 provides live distribution services.
Live distribution is a process in which the user acting as a distributor distributes moving image data and/or audio data to users as viewers in real time via the Internet, by using a streaming technique. The users as viewers can view the moving image data and/or audio data in real time, as in a live program of television and radio broadcasting. As a live distribution service, the distribution server 300 transmits, as content, moving image data and/or audio data transmitted from the user acting as a distributor, to the viewers. According to the present example embodiment, the digital camera 100 successively transmits the moving image data generated in real time to the distribution server 300, and the distribution server 300 successively transmits the received moving image data to the users as viewers. In addition, the distribution server 300 also provides such a service that the users as viewers can view the live distribution through web pages. Hereinafter, moving image data and/or audio data generated by the digital camera 100 and then distributed in the live distribution are collectively referred to as distribution data.
Before starting the live distribution, the user acting as a distributor makes live distribution settings on the distribution server 300 by using a PC or a smart phone. The live distribution settings include data used to set the moving image data as the content to be provided in the live distribution and to transmit and receive the moving image data. Examples of the live distribution settings include the identifier of the live distribution, the distribution destination Uniform Resource Locator (URL), the stream key, the title of the live distribution, the frame rate of the moving image data, and the bit rate of the moving image data. The distribution destination URL is, for example, the transmission destination of the distribution data generated by the digital camera 100. The stream key is information used by the distribution server 300 to associate the user with the distribution data.
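The live distribution settings enumerated above can be sketched as a simple record. This is an illustrative model only; the field names and example values are assumptions for this sketch, and the actual settings are defined by the distribution server.

```python
from dataclasses import dataclass

# Hypothetical model of the live distribution settings described above.
# Field names and values are assumptions for illustration.
@dataclass
class LiveDistributionSettings:
    identifier: str        # identifier of the live distribution
    distribution_url: str  # transmission destination of the distribution data
    stream_key: str        # associates the user with the distribution data
    title: str             # title of the live distribution
    frame_rate: int        # frame rate of the moving image data
    bit_rate: int          # bit rate of the moving image data

settings = LiveDistributionSettings(
    identifier="live-0001",
    distribution_url="rtmp://example.invalid/live",
    stream_key="abcd1234",
    title="Morning stream",
    frame_rate=30,
    bit_rate=4_000_000,
)
```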
The user can generate a plurality of live distribution settings in the distribution server 300. For example, in a case where the user acting as a distributor performs live distributions with different distribution content in succession, the user makes the live distribution settings on the distribution server 300 for each live distribution. The user can also make the live distribution settings by using the digital camera 100. In this case, the digital camera 100 transmits the live distribution settings thus made to the distribution server 300, and the distribution server 300 records them.
In performing the live distribution, the digital camera 100 transmits the distribution data in accordance with the live distribution settings received from the distribution server 300. User operations for starting the live distribution will be described below with reference to
The communication protocol used by the digital camera 100 and the distribution server 300 to transmit and receive data for live distribution preparation processing is different from the communication protocol used to transmit and receive the distribution data in the live distribution. The former is assumed to be a widely used, general-purpose communication protocol. The latter is a communication protocol intended for communication with a small delay amount. Examples of the former include Hypertext Transfer Protocol (HTTP), and examples of the latter include Real-Time Messaging Protocol (RTMP). Communications conforming to RTMP have a feature that the delay amount is smaller than that in communications conforming to HTTP. According to the present example embodiment, the digital camera 100 transmits the distribution data to be distributed in the live distribution through RTMP and transmits other pieces of data (e.g., the live distribution settings) through HTTP.
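The protocol split described above can be sketched as follows. The function names are placeholders, and appending the stream key to the distribution destination URL follows a common RTMP publishing convention that is assumed here rather than stated in the text.

```python
# Sketch of the protocol split: distribution data goes over RTMP for a
# small delay amount, while settings and other control data go over HTTP.
def protocol_for(payload_kind: str) -> str:
    # "distribution_data" is the only payload that needs low-delay streaming.
    return "rtmp" if payload_kind == "distribution_data" else "http"

def publish_url(distribution_url: str, stream_key: str) -> str:
    # A common RTMP convention (assumed here) appends the stream key
    # to the distribution destination URL to form the publish target.
    return f"{distribution_url.rstrip('/')}/{stream_key}"
```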
In the present example embodiment, in a case where the digital camera 100 starts the live distribution, prior to the successive transmission of the moving image data generated by the imaging unit 102, as distribution data, the digital camera 100 can transmit the moving image data recorded in the recording medium 110 to the distribution server 300, as distribution data.
Hereinafter this function is referred to as an opening distribution function. The moving image data recorded in the recording medium 110 is, for example, moving image data for the opening moving image. The opening moving image is used to notify the users as viewers that the live distribution will soon be started. In a case where the user changes the distribution data, normally, the user is to perform operations for selecting and changing the distribution data to be transmitted. However, with the foregoing opening distribution function, the user does not need to perform such an operation since the opening distribution function of the digital camera 100 automatically changes the distribution data from the moving image data of the opening moving image to the moving image data generated by the imaging unit 102. This enables the user to naturally change the distribution data from the moving image data of the opening moving image to the moving image data generated by the imaging unit 102 by performing an operation for starting the live distribution alone. A method for the user to use the opening distribution function according to the present example embodiment will be described below.
According to the present example embodiment, in a case where the opening distribution function is used, the user selects the item 303 illustrated in
The screen in
Further, in a case where detailed settings of the opening distribution function are to be changed, the user presses a “Detailed Settings” button. In response to this button being pressed, the digital camera 100 displays a screen for setting details of the opening distribution function.
In a case where the user selects an item 305, the digital camera 100 displays a screen for selecting the moving image data to be distributed using the opening distribution function, as illustrated in
In the screen illustrated in
Thus, the user can use the opening distribution function.
An item “Ending Moving Image” in
In step S401, the user operates the digital camera 100 to enter an instruction to start a live distribution. For example, as described above, the user selects any one of live distribution settings and presses the ENTER button to finalize the setting by performing a touch operation in the screen illustrated in
In step S402, the digital camera 100 displays a standby screen. The user can adjust the arrangement and image capturing settings, such as the angle of view and white balance, of the digital camera 100 while referring to the standby screen.
In a case where the live distribution preparation is completed after the processing in step S402 is performed, the user performs the operation in step S403 to start the live distribution. In this sequence, a description will be provided below of a case where the opening distribution function is enabled.
In step S403, the user operates the digital camera 100 to enter an instruction to start the distribution data transmission. For example, in the screen illustrated in
In step S404, the digital camera 100 transmits a live distribution start request to the distribution server 300. The live distribution start request is a request to instruct the distribution server 300 to start the distribution data transmission to the users as viewers. This request includes the identifier and stream key of the live distribution.
In step S405, the digital camera 100 encodes the moving image data to serve as the opening moving image based on the live distribution settings and then generates the distribution data. These live distribution settings relate to the moving image data and include, for example, the resolution and frame rate that are receivable by the distribution server 300. In a case where the digital camera 100 determines that encoding is not necessary, the digital camera 100 may omit the operation in this step.
In step S406, the distribution server 300 starts the live distribution based on the request received in step S404. For example, the distribution server 300 starts the transmission of the distribution data received from the digital camera 100, to the users as viewers. The processing in steps S405 and S406 is performed in parallel.
In step S407, the digital camera 100 transmits the distribution data of the opening moving image generated in step S405 to the distribution server 300. Subsequently, the digital camera 100 performs the processing in step S408 in parallel with the processing in step S407.
In step S408, the digital camera 100 displays the remaining time to completion of the transmission of the distribution data of the opening moving image. In the present example embodiment, the digital camera 100 superimposes the time on the moving image data currently being generated by the imaging unit 102 as illustrated in
In a case where the transmission of the distribution data of the opening moving image is completed, the digital camera 100 starts the transmission of the moving image data currently being generated by the imaging unit 102.
In step S409, the digital camera 100 erases the display of the remaining time to completion of the transmission of the distribution data of the opening moving image because that transmission is completed.
In step S410, the digital camera 100 transmits the moving image data currently being generated by the imaging unit 102 and the audio data currently being generated by the microphone 107 to the distribution server 300 as distribution data. In this step and subsequent steps, the digital camera 100 successively generates distribution data from the moving image data currently being generated by the imaging unit 102 and the audio data currently being generated by the microphone 107, and then transmits the distribution data to the distribution server 300. The digital camera 100 transmits the distribution data in units of a predetermined amount of data. For example, in a case where moving image data for five seconds is recorded in the work memory 104, the digital camera 100 transmits previously captured moving image data for one second to the distribution server 300 as distribution data.
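The buffered transmission described above can be sketched as follows, assuming one buffered entry per second of captured data. The class and method names, the capacity value, and the overflow policy are assumptions for illustration, not the camera's actual implementation.

```python
from collections import deque

# Sketch of the work-memory buffering described above: roughly five
# seconds of captured data are held, and the oldest one second is sent
# to the distribution server on each transmission.
class DistributionBuffer:
    def __init__(self, capacity_seconds: int = 5):
        self.capacity = capacity_seconds
        self.chunks = deque()  # one entry per second of captured data

    def capture(self, chunk):
        self.chunks.append(chunk)
        # Drop the oldest data if the buffer overflows (assumed policy).
        while len(self.chunks) > self.capacity:
            self.chunks.popleft()

    def next_chunk_to_send(self):
        # The previously captured (oldest buffered) second is sent first.
        return self.chunks.popleft() if self.chunks else None
```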
Here,
In the present example embodiment, in a case where data related to the number of viewers is received from the distribution server 300, the digital camera 100 displays the number of viewers at the bottom left of the display unit 106 as illustrated in
The distribution data to be transmitted to the distribution server 300 includes the moving image data generated by the imaging unit 102 of the digital camera 100 and the audio data generated by the microphone 107 thereof. As illustrated in
Subsequently, the user performs the live distribution by using the digital camera 100. In a case where the live distribution is to be ended, the user performs the operation in step S411 to end the live distribution. This sequence will be described below based on a case where the ending distribution function is enabled.
In step S411, the user operates the digital camera 100 to enter an instruction to end the transmission of the distribution data. In the screen illustrated in
In step S412, the digital camera 100 encodes the moving image data to serve as the ending moving image based on the live distribution settings to generate the distribution data. These live distribution settings relate to the moving image data and include, for example, the resolution and frame rate that can be received by the distribution server 300. In a case where the digital camera 100 determines that encoding is not necessary, the digital camera 100 may omit the operation in this step.
In step S413, the digital camera 100 transmits the distribution data of the ending moving image generated in step S412 to the distribution server 300. Here, the digital camera 100 displays the remaining time to completion of the transmission of the distribution data of the ending moving image as illustrated in
In step S414, the digital camera 100 transmits a live distribution end request to the distribution server 300.
In step S415, the distribution server 300 terminates the live distribution based on the request received in step S414. For example, the distribution server 300 terminates the distribution data transmission to the users as viewers.
This completes the description of the live distribution processing according to the present example embodiment.
As described above, in response to a live distribution start instruction, the digital camera 100 transmits the opening moving image as the distribution data to the distribution server 300 before transmitting the moving image data generated by the imaging unit 102. In response to a live distribution end instruction, the digital camera 100 transmits the ending moving image as the distribution data to the distribution server 300 before transmitting a live distribution end request to the distribution server 300. This saves the user from having to perform an operation for selecting and changing the distribution data to be transmitted.
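The overall transmission order summarized above can be sketched as follows, with the opening and ending distribution functions togglable. The function and payload names are placeholders for illustration, not an actual camera API.

```python
# Sketch of the transmission order: opening moving image before the live
# moving image, ending moving image before the live distribution end request.
def distribution_sequence(opening_enabled: bool = True,
                          ending_enabled: bool = True) -> list:
    order = ["start_request"]
    if opening_enabled:
        order.append("opening_moving_image")   # recorded data sent first
    order.append("live_moving_image")          # data from the imaging unit
    if ending_enabled:
        order.append("ending_moving_image")    # sent before the end request
    order.append("end_request")
    return order
```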
In step S601, the control unit 101 displays a standby screen on the display unit 106.
For example, as illustrated in
In step S602, the control unit 101 determines whether to start the live distribution. For example, in a case where the user performs the touch operation on the button 501 illustrated in
In step S603, the control unit 101 transmits a live distribution start request to the distribution server 300 via the communication unit 111. This request includes the identifier and stream key of the live distribution. The operation in this step is equivalent to, for example, the processing in step S404 in
In step S604, the control unit 101 determines whether the opening distribution function is enabled. For example, the control unit 101 determines whether the item 304 for the opening moving image in
In step S605, the control unit 101 generates the distribution data based on the moving image data that is finalized as the opening moving image. The moving image data finalized as the opening moving image is, for example, one selected by the user in the screen in
In step S606, the control unit 101 transmits the distribution data generated in step S605 to the distribution server 300, via the communication unit 111. The operation in this step is equivalent to, for example, the processing in step S407 in
In step S607, the control unit 101 displays the remaining time to completion of the distribution data transmission on the display unit 106. For example, as illustrated in
In step S608, the control unit 101 determines whether to terminate the live distribution. For example, in the screen illustrated in
In step S609, the control unit 101 determines whether the transmission of the distribution data generated from the opening moving image is completed. For example, in a case where the time to completion of the transmission based on the distribution data amount that has not yet been transmitted to the distribution server 300 and the current transmission rate is to be displayed as the remaining time, the control unit 101 determines that the transmission of the distribution data is completed in response to all of the distribution data having been transmitted. For example, in a case where the reproduction time of the moving image data of the opening moving image is to be displayed as the remaining time, the control unit 101 determines that the transmission of the distribution data is completed in response to the reproduction time of the moving image data having elapsed since the start of the distribution data transmission. If the control unit 101 determines that the transmission of the distribution data generated based on the opening moving image is not completed (NO in step S609), the processing returns to step S606. In step S606, the control unit 101 continues the transmission of the remaining distribution data. If the control unit 101 determines that the transmission of distribution data generated based on the opening moving image is completed (YES in step S609), the processing proceeds to step S610.
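The two ways of determining the remaining time described above can be restated as simple formulas. These are direct restatements of the text for illustration; the function names are assumptions.

```python
# Remaining time from the untransmitted data amount and the current
# transmission rate (first method described above).
def remaining_time_from_rate(bytes_left: int, rate_bytes_per_s: float) -> float:
    return bytes_left / rate_bytes_per_s

# Remaining time from the reproduction time of the opening moving image
# and the time elapsed since the start of transmission (second method).
def remaining_time_from_duration(reproduction_time_s: float,
                                 elapsed_since_start_s: float) -> float:
    return max(0.0, reproduction_time_s - elapsed_since_start_s)
```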
In step S610, the control unit 101 erases the display of the remaining time to completion of the distribution data transmission from the display unit 106. The processing in this step is equivalent to, for example, the operation in step S409 in
In step S611, the control unit 101 generates the distribution data from the moving image data currently being generated by the imaging unit 102 and the audio data currently being generated by the microphone 107 and then transmits the distribution data to the distribution server 300 via the communication unit 111. The operation in this step is equivalent to, for example, the processing in step S410 in
In step S612, the control unit 101 determines whether to terminate the live distribution. For example, in a case where the user performs the touch operation on the button 502 in the screen illustrated in
In step S613, the control unit 101 determines whether the ending distribution function is enabled. For example, the control unit 101 determines whether the item of the ending moving image in
In step S614, the control unit 101 generates the distribution data from the moving image data that is finalized as the ending moving image. The moving image data finalized as the ending moving image is, for example, one selected as the ending moving image by the user, as in the method for determining the opening moving image. The control unit 101 encodes the moving image data based on the live distribution settings to generate the distribution data. The processing in this step is equivalent to, for example, the processing in step S412 in
In step S615, the control unit 101 transmits the distribution data generated in step S614 to the distribution server 300, using the communication unit 111.
The operation in this step is equivalent to, for example, the operation in step S413 in
In step S616, the control unit 101 transmits a request for the termination of the live distribution to the distribution server 300 via the communication unit 111. The operation in this step is equivalent to, for example, the operation in step S414 in
This completes the description of operations of the digital camera 100 according to the present example embodiment.
In a case where the live distribution settings are changeable during the live distribution, the control unit 101 may operate in the following way. In performing the operation in step S605, the control unit 101 changes the live distribution settings according to the moving image data of the opening moving image. In a case where the operation in step S611 is performed, the control unit 101 changes the live distribution settings according to the moving image data generated by the imaging unit 102. This eliminates the need for the control unit 101 to subject the moving image data to image processing according to the live distribution settings, thus reducing the processing load in the live distribution processing. A similar operation can also apply to the transmission of the moving image data of the ending moving image.
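The optimization described above amounts to deciding whether re-encoding is needed by comparing the source parameters against the current live distribution settings. The comparison keys below are illustrative assumptions.

```python
# Sketch of the re-encoding decision: if the settings can instead be
# changed to match the source, the image processing can be skipped.
def needs_reencoding(source_params: dict, settings: dict) -> bool:
    keys = ("resolution", "frame_rate", "bit_rate")  # assumed comparison keys
    return any(source_params.get(k) != settings.get(k) for k in keys)
```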
The control unit 101 may reproduce the audio data included in the moving image data of the opening moving image by using the speaker 108 in parallel with the operation in step S607. For example, as illustrated in
The control unit 101 may display the opening moving image in a wipe screen on the display unit 106 in parallel with the operation in step S607. For example, as illustrated in
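The parallel monitoring described in the two paragraphs above (audio reproduction on the speaker 108 and a wipe-screen preview on the display unit 106, concurrent with transmission) can be sketched with simple threads. Every function name here is hypothetical and stands in for operations the disclosure describes only at the block-diagram level.

```python
# Illustrative sketch: monitor the opening moving image (audio playback and a
# wipe-screen preview) in parallel with its transmission in step S607.
# All function names are hypothetical.
import threading

events = []

def transmit_opening():          # corresponds to the transmission in step S607
    events.append("transmitted opening data")

def play_monitor_audio():        # reproduce the clip's audio on the speaker
    events.append("played audio on speaker")

def show_wipe_preview():         # display the opening clip in a wipe screen
    events.append("displayed wipe preview")

monitors = [threading.Thread(target=f)
            for f in (play_monitor_audio, show_wipe_preview)]
for t in monitors:
    t.start()
transmit_opening()               # transmission proceeds on the main thread
for t in monitors:
    t.join()
```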
The operation for starting the live distribution and the operation for terminating the live distribution have been described as being performed via the operation unit 105 of the digital camera 100. The digital camera 100 may receive these operations by other methods. For example, in a case where a remote control is connected via the communication unit 111, the digital camera 100 may determine that the operation for starting the live distribution or the operation for terminating the live distribution is received in response to receiving a signal transmitted from the remote control.
The digital camera 100 switches between ON and OFF of the opening distribution function in the screen illustrated in
Various embodiments of the present disclosure can also be realized when a program for implementing at least one of the functions according to the above-described example embodiments is supplied to a system or apparatus via a network or storage medium, and at least one processor in a computer of the system or apparatus reads and executes the program. Various embodiments of the present disclosure can also be achieved by a circuit (for example, an Application Specific Integrated Circuit (ASIC)) for implementing at least one function.
Embodiments of the present disclosure are not limited to the above-described example embodiments. Rather, in the implementation stage, the described components can be modified and embodied without departing from the spirit and scope of the invention. Diverse embodiments can be formed by suitably combining the plurality of components disclosed in the above-described example embodiments. For example, some of the components may be removed from some of the example embodiments. Further, the components of different example embodiments may be suitably combined.
Various embodiment(s) of the present disclosure can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
While example embodiments have been described, it is to be understood that the invention is not limited to the disclosed example embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Application No. 2021-058483, filed Mar. 30, 2021, which is hereby incorporated by reference herein in its entirety.