The present application claims priority of Korean Patent Application No. 10-2022-0128063 filed on Oct. 6, 2022, the entire contents of which are incorporated herein for all purposes by this reference.
The present specification relates to a method of providing a user with an environment for performing visual coding and a convergent production apparatus for supporting the visual coding.
In general, there are two ways to generate content through coding: a classical coding method using a development language, and a coding method of combining blocks to which coding is applied.
However, the coding method using a development language requires expert knowledge. The block coding method may provide an intuitive user interface (UI) to a user, but unlike the method using a development language, it can hardly implement complex or varied interactions.
Therefore, there is a problem in that the coding method of combining blocks to which coding is applied produces only simple or mainly unidirectional content. Further, there is a problem in that excessive resources (e.g., time and effort) are required for a user with no development knowledge to create dynamic content.
The present specification has been made in an effort to solve the above-mentioned problems, and an object of the present specification is to provide, through visual coding, a shallow learning curve to a user who intends to create content.
The present specification has also been made in an effort to provide a method of creating, through visual coding, content capable of being interactive, even though a user does not have expert knowledge.
Technical problems to be solved by the present specification are not limited to the above-mentioned technical problems, and other technical problems, which are not mentioned above, may be clearly understood from the following descriptions by those skilled in the art to which the present specification pertains.
In an embodiment of the present specification, a method for a terminal performing visual coding may include: generating a page for the visual coding and placing an asset on the page; setting a target asset that is a target of the visual coding on the basis of the placed asset; setting a user motion associated with an interaction with a user; setting a result associated with the target asset on the basis of the user motion; and displaying the result of the target asset on the basis of the inputted user motion, wherein the result includes (1) a “function” for controlling a size, a position, and a state value of the target asset, (2) a “computation” for computing variables associated with the target asset, and (3) a “function page” representing a movement of the page.
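The flow above (generate a page, place an asset, pick a target asset, bind a user motion, attach a result) can be sketched as a small data model. This is an illustrative sketch only; the names `Asset`, `Page`, and `EventRule` are hypothetical and do not come from the specification itself.

```python
from dataclasses import dataclass, field

@dataclass
class Asset:
    """An asset placed on a page, with controllable size, position, and state."""
    name: str
    x: float = 0.0
    y: float = 0.0
    size: float = 1.0
    state: dict = field(default_factory=dict)

@dataclass
class Page:
    """A page for visual coding, holding the placed assets."""
    assets: list = field(default_factory=list)

    def place(self, asset: Asset) -> Asset:
        self.assets.append(asset)
        return asset

@dataclass
class EventRule:
    """One visual-coding rule: when `motion` occurs on `target`, apply `result`."""
    target: Asset
    motion: str          # e.g. "click", "double-click", "hold", "drag-and-drop"
    result: callable     # a "function", "computation", or "function page"

# Build a page, place an asset, and bind a click rule to it.
page = Page()
ball = page.place(Asset("ball"))
rule = EventRule(target=ball, motion="click",
                 result=lambda a: a.state.update(visible=True))
rule.result(ball)        # simulate the motion firing
print(ball.state)        # {'visible': True}
```

In this sketch the `result` is just a callable; the three result kinds named in the embodiment (function, computation, function page) would be three such callables.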
In addition, setting the user motion may include: receiving 1) a type of the user motion and 2) a range value by which the type of the user motion is to be performed; and setting the user motion on the basis of the type of the user motion and the range value.
In addition, the type of the user motion may include an operation of the user clicking, double-clicking, holding, or dragging-and-dropping the target asset.
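A user motion defined by 1) a type and 2) a range value can be modeled as a predicate over observed gestures. The sketch below is a hedged illustration: `make_user_motion` and the matching rule (the observed value must stay within the range value, e.g. a drag distance or hold duration cap) are assumptions, not the specification's definition.

```python
# Motion types named in the embodiment.
MOTION_TYPES = {"click", "double-click", "hold", "drag-and-drop"}

def make_user_motion(motion_type: str, range_value: float):
    """Return a predicate telling whether an observed gesture matches
    this user motion's type and falls within its range value."""
    if motion_type not in MOTION_TYPES:
        raise ValueError(f"unsupported motion: {motion_type}")

    def matches(observed_type: str, observed_value: float) -> bool:
        # Illustrative rule: the gesture's measured value (distance,
        # duration, etc.) must not exceed the configured range value.
        return observed_type == motion_type and observed_value <= range_value

    return matches

hold = make_user_motion("hold", range_value=2.0)
print(hold("hold", 1.5))   # True  (within the range value)
print(hold("hold", 3.0))   # False (exceeds the range value)
```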
In addition, setting the result may include: displaying an item representing the function; receiving from the user 1) the item representing the function and 2) an operating time value corresponding to the item representing the function; receiving the user's selection of an icon representing a result target asset in order to set a resulting state in which the function has been performed; receiving the resulting state of the target asset on the basis of the selection of the icon representing the result target asset; and setting the result on the basis of 1) the item representing the function, 2) the operating time value, and 3) the resulting state of the target asset.
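A "function" result built from an item, an operating time value, and a resulting state behaves like a timed transition of the target asset's properties. The sketch below assumes a linear interpolation toward the resulting state over the operating time; `apply_function` is a hypothetical name and the interpolation rule is an illustrative assumption.

```python
def apply_function(start: dict, resulting_state: dict,
                   operating_time: float, elapsed: float) -> dict:
    """Interpolate each numeric property of the target asset from its
    starting value toward the resulting state over the operating time."""
    # Normalized progress, clamped to [0, 1] so the asset stops at the goal.
    t = min(max(elapsed / operating_time, 0.0), 1.0)
    return {k: start[k] + (resulting_state[k] - start[k]) * t
            for k in resulting_state}

start = {"x": 0.0, "size": 1.0}
goal  = {"x": 100.0, "size": 2.0}
# Halfway through a 2-second operating time:
print(apply_function(start, goal, operating_time=2.0, elapsed=1.0))
# {'x': 50.0, 'size': 1.5}
```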
In addition, setting the result may include: displaying an item representing the computation; receiving from the user 1) a computation target, 2) the item representing the computation, and 3) a condition that is a purpose of the item representing the computation; and setting the result on the basis of 1) the computation target, 2) the item representing the computation, and 3) the condition that is a purpose of the item representing the computation.
In addition, the method according to the embodiment of the present specification may further include: setting the computation target as a global variable.
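A "computation" result built from 1) a computation target, 2) a computation item, and 3) a condition can be sketched as a closure that updates the target and reports whether the condition is met. The names `make_computation`, `item`, and `condition` below are illustrative assumptions, not terms defined by the specification.

```python
def make_computation(item, condition):
    """Bind a computation item and a condition to a computation target.
    Each call applies the item to the target's value and returns whether
    the condition (the purpose of the computation) is now satisfied."""
    state = {"value": 0}  # the computation target's current value

    def run() -> bool:
        state["value"] = item(state["value"])
        return condition(state["value"])

    return run

# Example: increment a score on each user motion; the condition is met
# once the score reaches 3 (e.g. to trigger a page movement).
increment = make_computation(item=lambda v: v + 1,
                             condition=lambda v: v >= 3)
results = [increment() for _ in range(3)]
print(results)  # [False, False, True]
```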
In addition, setting the result may include: displaying an item representing an operation associated with the page; receiving from the user 1) the item representing the operation associated with the page and 2) a page that is a purpose of the operation associated with the page; and setting the result on the basis of 1) the item representing the operation associated with the page and 2) the page that is the purpose of the operation associated with the page.
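A "function page" result, i.e. an operation associated with the page together with the page that is its purpose, amounts to moving between the pages of one piece of content. The `Content` class below is a minimal sketch under that assumption; its method names are hypothetical.

```python
class Content:
    """One piece of content made of one or more pages, with a cursor
    tracking the page currently displayed."""

    def __init__(self, pages: list):
        self.pages = pages
        self.current = 0

    def go_to(self, page_name: str) -> None:
        """Operation whose purpose is a specific page: jump to it."""
        self.current = self.pages.index(page_name)

    def next_page(self) -> None:
        """Operation relative to the current page: advance by one."""
        self.current = min(self.current + 1, len(self.pages) - 1)

content = Content(["intro", "quiz", "result"])
content.go_to("quiz")    # function page with a target page
content.next_page()      # function page relative to the current page
print(content.pages[content.current])  # result
```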
In another embodiment of the present specification, a terminal for visual coding may include: a communication module; a memory; a display unit; and a processor configured to functionally control the communication module, the memory, and the display unit, in which the processor generates a page for the visual coding, places an asset on the page, sets a target asset that is a target of the visual coding on the basis of the placed asset, sets a user motion associated with an interaction with a user, sets a result associated with the target asset on the basis of the user motion, and displays the result of the target asset on the display unit on the basis of the inputted user motion, in which the result includes: (1) a “function” for controlling a size, a position, and a state value of the target asset, (2) a “computation” for computing variables associated with the target asset, and (3) a “function page” representing a movement of the page.
According to the embodiment of the present specification, it is possible to provide a shallow learning curve to a user who intends to create content using the visual coding method.
In addition, according to the embodiment of the present specification, it is possible to enable the user to create content capable of being interactive even though the user does not have expert knowledge.
The effects obtained by the present specification are not limited to the aforementioned effects, and other effects, which are not mentioned above, will be clearly understood by those skilled in the art from the following description.
The accompanying drawings included as a part of the detailed description for helping to understand the present specification provide exemplary embodiments of the present specification, and the technical features of the present specification will be described together with the detailed description.
Hereinafter, embodiments disclosed in the present specification will be described in detail with reference to the accompanying drawings. The same or similar constituent elements are assigned the same reference numerals regardless of the drawings in which they appear, and repetitive descriptions thereof will be omitted. The suffixes ‘module’, ‘unit’, ‘part’, and ‘portion’ used to describe constituent elements in the following description are used together or interchangeably to facilitate the description, and the suffixes themselves do not have distinguishable meanings or functions. In addition, in describing the embodiments disclosed in the present specification, specific descriptions of publicly known related technologies will be omitted when it is determined that they may obscure the subject matter of the embodiments. The accompanying drawings are provided only to assist easy understanding of the embodiments disclosed in the present specification; the technical spirit disclosed herein is not limited by the accompanying drawings and includes all alterations, equivalents, and alternatives falling within the spirit and technical scope of the present specification.
The terms including ordinal numbers such as “first,” “second,” and the like may be used to describe various constituent elements, but the constituent elements are not limited by the terms. These terms are used only to distinguish one constituent element from another constituent element.
When one constituent element is described as being “coupled” or “connected” to another constituent element, it should be understood that one constituent element can be coupled or connected directly to another constituent element, and an intervening constituent element can also be present between the constituent elements. When one constituent element is described as being “coupled directly to” or “connected directly to” another constituent element, it should be understood that no intervening constituent element is present between the constituent elements.
Singular expressions include plural expressions unless clearly described as different meanings in the context.
In the present application, it should be understood that terms “including” and “having” are intended to designate the existence of characteristics, numbers, steps, operations, constituent elements, and components described in the specification or a combination thereof, and do not exclude a possibility of the existence or addition of one or more other characteristics, numbers, steps, operations, constituent elements, and components, or a combination thereof in advance.
The electronic device 100 may include a wireless communication unit 110, an input unit 120, a sensing unit 140, an output unit 150, an interface unit 160, a memory 170, a control unit 180, and a power supply unit 190. However, the constituent elements illustrated in
More specifically, among the constituent elements, the wireless communication unit 110 may include one or more modules that enable wireless communication between the electronic device 100 and a wireless communication system, between the electronic device 100 and another electronic device 100, or between the electronic device 100 and an external server. In addition, the wireless communication unit 110 may include one or more modules that connect the electronic device 100 to one or more networks.
The wireless communication unit 110 may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a near-field communication module 114, and a position information module 115.
The input unit 120 may include a camera 121 or a video input unit for video signal input, a microphone 122 or an audio input unit for audio signal input, and a user input unit 123 (e.g., a touch key, a mechanical key, etc.) for receiving information from a user. Voice data or image data collected by the input unit 120 may be analyzed and processed as the user's control instructions.
The sensing unit 140 may include one or more sensors for sensing at least one of information in the electronic device, surrounding environment information around the electronic device, and user information. For example, the sensing unit 140 may include at least one of a proximity sensor 141, an illumination sensor 142, a touch sensor, an acceleration sensor, a magnetic sensor, a gravity sensor, a gyroscope sensor, a motion sensor, an RGB sensor, an infrared sensor (IR sensor), a finger scan sensor, an ultrasonic sensor, an optical sensor (e.g., a camera (see reference numeral 121)), a microphone (see reference numeral 122), a battery gauge, an environmental sensor (e.g., a barometer, a hygrometer, a thermometer, a radiation sensing sensor, a thermal sensing sensor, a gas sensing sensor, etc.), and a chemical sensor (e.g., an electronic nose, a healthcare sensor, a biometric sensor, etc.). Meanwhile, the electronic device disclosed in the present specification may utilize a combination of pieces of information sensed by at least two of these sensors.
The output unit 150 may generate an output associated with sight, hearing or touch, and may include at least one of a display unit 151, an acoustic output unit 152, a haptic module 153, and an optical output unit 154. The display unit 151 may be configured to form a layer structure or be integrated with the touch sensor, thereby implementing a touch screen. The touch screen may not only serve as the user input unit 123 that provides an input interface between the electronic device 100 and the user, but also provide an output interface between the electronic device 100 and the user.
The interface unit 160 serves as a passage to various kinds of external devices connected to the electronic device 100. The interface unit 160 may include at least one of a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device equipped with an identification module, an audio I/O (Input/Output) port, a video I/O (Input/Output) port, and an earphone port. The electronic device 100 may perform appropriate control related to the connected external device in response to the connection of the external device to the interface unit 160.
In addition, the memory 170 stores data that support various functions of the electronic device 100. The memory 170 may store a plurality of application programs (or applications) to be executed by the electronic device 100 and data and instructions for the operation of the electronic device 100. At least some of the application programs may be downloaded from an external server through wireless communication. In addition, at least some of the application programs may exist in the electronic device 100 from the time of shipment for the basic functions of the electronic device 100 (e.g., call receiving/sending function and a message receiving/sending function). Meanwhile, the application program may be stored in the memory 170, installed in the electronic device 100, and executed by the control unit 180 so that the operation (or function) of the electronic device is performed.
The control unit 180 typically controls the overall operation of the electronic device 100 in addition to the operations associated with the application program. The control unit 180 may provide the user with appropriate information or functions or process the information or functions by processing signals, data, information, and the like inputted or outputted through the constituent elements described above or by executing the application program stored in the memory 170.
In addition, the control unit 180 may control at least some of the constituent elements described with
Under the control of the control unit 180, the power supply unit 190 receives external power and internal power and supplies the power to each constituent element included in the electronic device 100. The power supply unit 190 includes a battery. The battery may be a built-in battery or a replaceable battery.
At least some of the above constituent elements may operate in cooperation with each other to implement an operation, control, or control method of the electronic device according to various embodiments described below. In addition, the operation, control, or control method of the electronic device may be implemented in the electronic device by executing at least one application program stored in the memory 170.
In the present specification, the electronic device 100 may include a terminal and a visual coding apparatus.
Referring to
Referring to
One piece of content may include one or more pages. In addition, the user may set events for assets included in the page through the second area 310 and select the detailed configurations of the events through the first area 500.
Referring back to
The terminal sets a target asset that is a target of the visual coding on the basis of the placed assets (S2020). For example, the user may select the target asset by clicking on one of the assets placed on the page.
Referring to
Referring back to
Referring to
Referring to
The terminal, through the first area 500, may display to the user the items of the user motion that the visual coding apparatus supports. The user motion for an initial target asset is empty, and the user may select and add a user motion from among the items of the user motion displayed in the first area 500 (e.g., see the input window 520 in
Referring back to
Referring to
Referring to
Referring to
The user may select an icon 930 representing a result target asset to set a resulting state in which the function has been performed.
Referring to
Referring to
Referring to
For example, the user may identify through a ‘preview’ or ‘play mode’ from the terminal that the computation target 1230 changes
In addition, the terminal may set the computation target as a global variable. The global variable may mean a computation target that is applicable not only to the page currently displayed in the main area 300, but also to all pages of the corresponding content. This enables other pages to refer to the computation target on the current page. To set the computation target as the global variable, the user may select a checkbox 1211 for the global variable.
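The scope rule described here, namely that a global variable is visible to every page of the content while an ordinary computation target belongs to a single page, can be sketched as a two-level lookup. The `VisualContent` class and its method names are illustrative assumptions.

```python
class VisualContent:
    """Variable storage for one piece of content: per-page variables
    plus globals shared by all pages (checkbox 1211 in the UI)."""

    def __init__(self):
        self.globals = {}     # computation targets visible on every page
        self.page_vars = {}   # page name -> its own computation targets

    def set_var(self, page: str, name: str, value, is_global: bool = False):
        if is_global:
            self.globals[name] = value
        else:
            self.page_vars.setdefault(page, {})[name] = value

    def get_var(self, page: str, name: str):
        # A page's own variable shadows a global of the same name.
        return self.page_vars.get(page, {}).get(name, self.globals.get(name))

c = VisualContent()
c.set_var("page1", "score", 10, is_global=True)
# Another page can refer to the same computation target:
print(c.get_var("page2", "score"))  # 10
```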
Referring to
Referring to
Referring back to
Referring to
Through the present specification, the user may create content that easily interacts with the user through visual coding with a shallow learning curve, and may use the visual coding apparatus as an educational tool for understanding how a computer actually works.
The present specification described above may be implemented as computer-readable code on a medium on which a program is recorded. The computer-readable medium includes all kinds of storage devices for storing data readable by a computer system. Examples of the computer-readable media include hard disk drives (HDD), solid state drives (SSD), silicon disk drives (SDD), ROM, RAM, CD-ROMs, magnetic tapes, floppy discs, and optical data storage devices, and also include media implemented in the form of a carrier wave (e.g., transmission over the Internet). Therefore, it should be appreciated that the detailed description is interpreted as being illustrative in every aspect, not restrictive. The scope of the present specification should be determined on the basis of a reasonable interpretation of the appended claims, and all modifications within the equivalent scope of the present specification belong to the scope of the present specification.
In addition, the service and embodiments have been mainly described above, but the embodiments are merely illustrative and are not intended to limit the present specification. It can be appreciated by those skilled in the art that various modifications and alterations, which are not described above, may be made without departing from the intrinsic features of the present service and embodiments. For example, the respective constituent elements specifically described in the embodiments may be modified and then carried out. Further, it should be interpreted that the differences related to such modifications and applications are included in the scope of the present specification defined by the appended claims.
Number | Date | Country | Kind |
---|---|---|---|
10-2022-0128063 | Oct 2022 | KR | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/KR2022/015074 | 10/7/2022 | WO |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2024/075870 | 4/11/2024 | WO | A |
Number | Date | Country | |
---|---|---|---|
20240264806 A1 | Aug 2024 | US |