This application is based upon and claims the benefit of priority of the prior Japanese Patent Application No. 2011-166822, filed on Jul. 29, 2011, the entire contents of which are incorporated herein by reference.
The present embodiments relate to a drawing device.
A drawing device that draws a three-dimensional image etc. generates graphic information on a two-dimensional display surface based on, for example, vertex information of a graphic. Related arts are discussed in Japanese Laid-open Patent Publication No. 11-31236, Japanese Laid-open Patent Publication No. 10-222695, Japanese Laid-open Patent Publication No. 09-180000, and Japanese Laid-open Patent Publication No. 2000-30081. The drawing device generates an image to be displayed on the two-dimensional display surface based on the generated graphic information. For example, the drawing device has a frame buffer storing the color of a pixel and a depth buffer storing the depth (Z value) of a pixel.
In a general drawing device, the frame buffer and the depth buffer require a memory size corresponding to the number of pixels of the display surface. For example, when the display surface includes 800×600 pixels and the data size (sum of color and depth) for one pixel is 8 bytes, the frame buffer and the depth buffer require a memory size of about 3.7 MB in total.
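The figure of about 3.7 MB follows directly from the pixel count; the short sketch below reproduces the arithmetic in C, assuming for illustration that the 8 bytes per pixel split into 4 bytes of color and 4 bytes of depth (the split itself is not specified above).

```c
#include <stdio.h>

int main(void)
{
    /* Example display surface from the text: 800 x 600 pixels. */
    const unsigned long width  = 800;
    const unsigned long height = 600;

    /* Assumed split of the 8 bytes per pixel (not specified in the text). */
    const unsigned long bytes_color = 4;   /* frame buffer entry */
    const unsigned long bytes_depth = 4;   /* depth buffer entry */

    unsigned long total = width * height * (bytes_color + bytes_depth);

    /* 800 * 600 * 8 = 3,840,000 bytes, i.e. about 3.7 MB. */
    printf("frame buffer + depth buffer: %lu bytes (%.2f MB)\n",
           total, total / (1024.0 * 1024.0));
    return 0;
}
```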
In order to store data corresponding to the number of pixels of the display surface, a buffer (frame buffer and depth buffer) having a large memory size is required. For example, when a frame buffer etc. having a large memory size is formed by an SRAM within the drawing device, the circuit area and cost increase considerably. Furthermore, when a frame buffer etc. having a large memory size is formed by a DRAM outside the drawing device, there are such problems that power consumption increases due to input and output of data to and from the external DRAM and that the cost increases because the DRAM is mounted as another chip.
According to one aspect of embodiments, a drawing device includes a coordinate transformation unit receiving vertex information of a graphic and generating graphic information including at least positional information indicative of coordinates on a two-dimensional display surface of the graphic based on the vertex information; a selection unit receiving the graphic information from the coordinate transformation unit, calculating a drawing range in a predetermined direction of the graphic based on the graphic information, and outputting the graphic information of the graphic to be drawn in divided areas for each of the divided areas obtained by dividing the two-dimensional display surface; an image generating unit generating image data of the divided areas based on the graphic information output from the selection unit; and a line buffer storing the image data generated by the image generating unit.
The object and advantages of the invention will be realized and attained by means of the elements and combinations particularly pointed out in the claims.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory and are not restrictive of the invention, as claimed.
Hereinafter, embodiments will be explained using the drawings.
The coordinate transformation unit 100 performs, for example, geometry processing. For example, the coordinate transformation unit 100 receives vertex information VINF of a graphic and generates graphic information GINF of the graphic on the two-dimensional display surface DIS. The graphic is, for example, a triangle. The graphic may be a graphic other than a triangle. In addition, the vertex information VINF is stored, for example, in a memory of a system in which the drawing device 10 is mounted. For example, the vertex information VINF has three-dimensional coordinate information (hereinafter, also referred to as coordinate information) of each vertex of a graphic, color information of each vertex, texture coordinate information, normal vector information, etc. The coordinate information of the vertex information VINF may be, for example, two-dimensional coordinate information.
Furthermore, the graphic information GINF has, for example, coordinate information on the two-dimensional display surface DIS of each vertex of a graphic (hereinafter, also referred to as positional information), equation information of each side of a graphic (equations of the three sides of a triangle), color information, texture coordinate information, normal vector information, Z-direction information (depth information), etc. Meanwhile, each piece of information, such as color information, texture coordinate information, and normal vector information, within the graphic information GINF has, for example, an amount of change in value for an increment in the X-direction, an amount of change in value for an increment in the Y-direction, and an offset value. For example, the color information within the graphic information GINF has an amount of change in color for an increment in the X-direction, an amount of change in color for an increment in the Y-direction, and an offset value.
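One possible in-memory layout of the graphic information GINF described above is sketched below in C. The struct and field names are illustrative assumptions, not the actual format of the drawing device; the sketch only mirrors the description that each parameter is held as an X-direction increment, a Y-direction increment, and an offset, so that a parameter value at pixel (x, y) can be evaluated as offset + dx·x + dy·y.

```c
/* Illustrative sketch only: names, field widths, and channel counts are
 * assumptions, not the actual data format of the drawing device. */
typedef struct {
    float dx;      /* amount of change per increment in the X-direction */
    float dy;      /* amount of change per increment in the Y-direction */
    float offset;  /* offset value */
} ParamGradient;

/* Evaluate one parameter (e.g. one color channel) at pixel (x, y). */
static inline float eval_param(const ParamGradient *p, float x, float y)
{
    return p->offset + p->dx * x + p->dy * y;
}

typedef struct {
    float         vx[3], vy[3];  /* positional information: vertex coordinates
                                    on the two-dimensional display surface DIS */
    float         edge[3][3];    /* equation information of each side,
                                    e.g. coefficients of a*x + b*y + c = 0     */
    ParamGradient color[4];      /* color information (e.g. R, G, B, A)        */
    ParamGradient tex[2];        /* texture coordinate information (u, v)      */
    ParamGradient normal[3];     /* normal vector information (x, y, z)        */
    ParamGradient depth;         /* Z-direction information (depth)            */
} GraphicInfoGINF;
```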
It may also be possible for the coordinate transformation unit 100 to receive only the coordinate information of a graphic in the vertex information VINF. Then, the coordinate transformation unit 100 performs a part of the geometry processing using the coordinate information and generates the graphic information GINF including vertex numbers of the graphic, positional information of the graphic (coordinates of the graphic on the two-dimensional display surface DIS), and front and back information of the graphic. That is, it may also be possible for the coordinate transformation unit 100 to receive coordinate information of a graphic as the vertex information VINF and to generate the graphic information GINF including at least vertex numbers and positional information of the graphic based on the coordinate information. The vertex numbers of a graphic are, for example, a set of numbers corresponding to the vertexes of the graphic, respectively. For example, for a triangle, vertex numbers of the graphic are a set of numbers corresponding to each of the three vertexes of the triangle.
The selection unit 200 receives the graphic information GINF from the coordinate transformation unit 100 and calculates a drawing range in a predetermined direction (in the example of
Because of this, the image generating unit 300 receives, from the selection unit 200, information indicative of the divided area DAR in which drawing is performed and the graphic information GINF of the graphic to be drawn in the divided area DAR in which drawing is performed. Meanwhile, it may also be possible to receive the information indicative of the divided area DAR in which drawing is performed from a module other than the selection unit 200. For example, it may also be possible for the image generating unit 300 to receive the information indicative of the divided area DAR in which drawing is performed, from a module that controls the drawing device 10.
The image generating unit 300 performs, for example, rendering processing. For example, the image generating unit 300 generates the image data GDATA of the divided area DAR based on the graphic information GINF output from the selection unit 200. That is, the image generating unit 300 generates the image data GDATA for each divided area DAR. In the image data GDATA, for example, pixel color information etc. is included. Meanwhile, it may also be possible for the image generating unit 300 to perform both the geometry processing and the rendering processing.
For example, when the coordinate transformation unit 100 receives only coordinate information of a graphic in the vertex information VINF, the image generating unit 300 acquires the vertex information VINF corresponding to the vertex number within the graphic information GINF received from the selection unit 200. Then, the image generating unit 300 performs the geometry processing and the rendering processing using the acquired vertex information VINF and generates the image data GDATA of the divided area DAR. That is, it may also be possible for the image generating unit 300 to acquire the vertex information VINF corresponding to the vertex number within the graphic information GINF received from the selection unit 200 and to generate the image data GDATA based on the acquired vertex information.
The line buffer 400 stores the image data GDATA generated by the image generating unit 300. That is, the line buffer 400 stores the image data GDATA corresponding to one divided area DAR of the divided areas DAR obtained by dividing the two-dimensional display surface DIS. Consequently, in the present embodiment, it is possible to reduce the memory size of the line buffer 400 in comparison with the frame buffer storing data of all the pixels (pixel data) of the two-dimensional display surface DIS.
The line depth buffer 410 stores, for example, the depth (Z value) of a pixel. For example, the image generating unit 300 refers to the setting information, the Z value stored in the line depth buffer 410, and the like, when generating the image data GDATA. The setting information is, for example, a transformation matrix of a graphic, material information such as reflectance, positional information of a light source, and the like. The display circuit 500 sequentially reads the image data GDATA from the line buffer 400 and displays the image on the two-dimensional display surface DIS.
As described above, in the present embodiment, the drawing device 10 has the selection unit 200 outputting the graphic information GINF of a graphic to be drawn in the divided area DAR to the image generating unit 300 for each of the divided areas DAR obtained by dividing the two-dimensional display surface DIS. Because of this, the image generating unit 300 generates the image data GDATA for each divided area DAR. Consequently, in the present embodiment, it is possible to reduce the memory size of the line buffer 400 to the same size as the amount of image data of one divided area DAR. That is, in the present embodiment, it is possible to reduce the memory size of the line buffer 400 in comparison with the frame buffer storing data of all the pixels of the two-dimensional display surface DIS. Furthermore, in the present embodiment, the image data GDATA is generated for each divided area DAR, and therefore, it is possible to reduce the memory size of the line depth buffer 410 in accordance with the line buffer 400. That is, in the present embodiment, it is possible to reduce the memory size of the buffer storing pixel data.
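To make the reduction concrete, the constants below compare the two approaches under illustrative assumptions only: an 800×600 surface, divided areas of 8 lines, and 8 bytes per pixel for color plus depth. None of these values are prescribed by the embodiment; with them, the line buffer plus line depth buffer is 1/75 of the full frame buffer plus depth buffer.

```c
/* Illustration only: the divided-area height and per-pixel size are assumptions. */
enum { WIDTH = 800, HEIGHT = 600, AREA_LINES = 8, BYTES_PER_PIXEL = 8 };

/* Full frame buffer + depth buffer: 800 * 600 * 8 = 3,840,000 bytes. */
static const unsigned long full_buffer_bytes =
    (unsigned long)WIDTH * HEIGHT * BYTES_PER_PIXEL;

/* Line buffer + line depth buffer for one divided area:
 * 800 * 8 * 8 = 51,200 bytes, a reduction by HEIGHT / AREA_LINES = 75 times. */
static const unsigned long line_buffer_bytes =
    (unsigned long)WIDTH * AREA_LINES * BYTES_PER_PIXEL;
```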
In addition, in
The coordinate transformation unit 102 performs, for example, a part of geometry processing. For example, the coordinate transformation unit 102 reads the coordinate information VINFc of a graphic in the vertex information VINF and performs processing relating to coordinates. That is, the coordinate transformation unit 102 may skip processing of reading information about color, normal line, etc., other than the coordinate information VINFc of the graphic. Furthermore, the coordinate transformation unit 102 may skip processing relating to parameters (for example, color and normal line) other than coordinates.
For example, the coordinate transformation unit 102 receives the coordinate information VINFc of a graphic as the vertex information VINF and generates, based on the coordinate information VINFc, graphic information GINFc including at least the vertex number and positional information (coordinates on the two-dimensional display surface DIS of the graphic) of the graphic. The coordinate transformation unit 102 has, for example, a vertex read unit 110, a vertex processing unit 120, a graphic creation unit 130, and a graphic removal unit 140.
The vertex read unit 110 receives the coordinate information VINFc of a graphic in the vertex information VINF and outputs the coordinate information VINFc to the vertex processing unit 120. For example, the vertex read unit 110 reads the coordinate information VINFc from a memory etc. in which the vertex information VINF is stored and outputs the read coordinate information VINFc to the vertex processing unit 120.
The vertex processing unit 120 performs vertex processing relating to coordinates such as rotation based on the coordinate information VINFc. The result of the vertex processing is input to the graphic creation unit 130. The graphic creation unit 130 converts the result of the vertex processing by the vertex processing unit 120 into graphic information. For example, when the graphic is a triangle, the graphic creation unit 130 receives information of three vertexes (vertexes of the triangle) as the result of the vertex processing and converts the result of the vertex processing into information of the triangle. Because of this, the graphic information of the graphic corresponding to the coordinate information VINFc is generated.
The graphic removal unit 140 removes graphic information of a graphic not drawn on the two-dimensional display surface DIS from the graphic information generated by the graphic creation unit 130. For example, the graphic removal unit 140 performs clipping processing and culling processing of removing unnecessary graphic information. For example, by the clipping processing, the graphic removal unit 140 removes a graphic outside the display area. In addition, by the culling processing, for example, the graphic removal unit 140 makes a front and back determination of a graphic and removes a graphic determined to be the back surface. Furthermore, when the setting to display the back surface of a graphic is made, the graphic removal unit 140 adds front and back information indicative of the back surface, such as a flag, to the graphic information.
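The embodiment does not specify how the front and back determination is made. A common technique, sketched below as an assumption rather than the actual method of the graphic removal unit 140, uses the sign of the signed area of the triangle projected onto the display surface: one winding order is treated as the front surface and the opposite winding as the back surface.

```c
/* Hedged sketch of one common front/back test (not necessarily the method
 * used by the graphic removal unit 140). */
typedef struct { float x, y; } Vec2;

/* Twice the signed area of triangle (a, b, c) on the display surface.
 * Positive for counter-clockwise winding, negative for clockwise. */
static float signed_area2(Vec2 a, Vec2 b, Vec2 c)
{
    return (b.x - a.x) * (c.y - a.y) - (c.x - a.x) * (b.y - a.y);
}

/* Here counter-clockwise is treated as the front surface; this convention
 * is an assumption and could equally be reversed. */
static int is_back_surface(Vec2 a, Vec2 b, Vec2 c)
{
    return signed_area2(a, b, c) < 0.0f;
}
```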
Then, the graphic removal unit 140 outputs graphic information GINFc of a graphic to be drawn on the two-dimensional display surface DIS to a drawing area determining unit 210 of the selection unit 202. The graphic information GINFc has, for example, vertex numbers of the graphic, positional information indicative of coordinates on the two-dimensional display surface DIS of the graphic, and front and back information of the graphic.
The selection unit 202 has the drawing area determining unit 210, a memory unit 220, and a read unit 230. The drawing area determining unit 210 calculates a drawing range RINF in a predetermined direction (for example, in the Y-direction of
For example, the drawing area determining unit 210 determines that a graphic is drawn in a processing area when the following equation (1) is satisfied. Meanwhile, Ymin and Ymax in the equation (1) are the minimum Y-coordinate and the maximum Y-coordinate of the graphic being determined, and Amin and Amax are the minimum Y-coordinate and the maximum Y-coordinate of the processing area being determined.
(Amin < Ymax) && (Amax > Ymin)  (1)
That is, the drawing area determining unit 210 determines that the graphic is drawn in the processing area when the minimum Y-coordinate Amin of the processing area being determined is smaller than the maximum Y-coordinate Ymax of the graphic being determined, and the maximum Y-coordinate Amax of the processing area is larger than the minimum Y-coordinate Ymin of the graphic.
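Expressed as code, equation (1) is a standard interval-overlap test between the Y extent of the graphic and the Y extent of the processing area. The sketch below assumes integer Y-coordinates; the same test is later reused with the Y extent of each divided area substituted for Amin and Amax.

```c
/* Equation (1): a graphic whose Y extent is [Ymin, Ymax] is drawn in a
 * processing area whose Y extent is [Amin, Amax] when the intervals overlap. */
static int is_drawn_in_area(int Ymin, int Ymax, int Amin, int Amax)
{
    return (Amin < Ymax) && (Amax > Ymin);   /* equation (1) */
}
```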
The drawing area determining unit 210 removes the graphic information GINFc of a graphic not drawn in the processing area (in the examples of
The memory unit 220 stores the graphic information GINFc and the drawing range RINF output from the drawing area determining unit 210. That is, the memory unit 220 stores the graphic information GINFc and the drawing range RINF of the graphic to be drawn in the processing area.
The read unit 230 determines whether or not a graphic is drawn in the divided area DAR based on the drawing range RINF. Then, the read unit 230 transfers, for each divided area DAR, the graphic information GINFc of the graphic to be drawn in the divided area DAR from the memory unit 220, to the image generating unit 302. For example, the read unit 230 has a read control unit 232 and a read buffer 234.
The read control unit 232 first performs processing of calculating the divided area DAR in which a graphic is drawn, for all the graphics to be drawn in the processing area. For example, the read control unit 232 substitutes the minimum Y-coordinate and the maximum Y-coordinate of each divided area DAR for Amin and Amax of the equation (1) and determines whether or not the equation (1) is satisfied. Because of this, the divided area DAR in which a graphic is drawn is calculated. The calculation result is stored in the read buffer 234 for each divided area DAR. For example, the read control unit 232 stores, for each divided area DAR, an index INDX indicative of a graphic to be drawn in the divided area DAR in the read buffer 234. The index INDX is, for example, an address etc. of the memory unit 220 in which the graphic information GINFc of the graphic to be drawn in the divided area DAR is stored.
For example, the read control unit 232 reads the graphic information GINFc stored in the memory unit 220 for each divided area DAR based on the index INDX stored in the read buffer 234. Then, the read control unit 232 outputs the read graphic information GINFc to a vertex read unit 310 of the image generating unit 302. Furthermore, the read control unit 232 outputs information indicative of the divided area DAR in which drawing is performed (for example, information indicative of a range in the Y-direction of the divided area DAR), to the image generating unit 302.
As described above, after performing the processing of calculating the divided area DAR in which a graphic is drawn, on all the graphics to be drawn in the processing area, the read unit 230 transfers, for each divided area DAR, the graphic information GINFc of a graphic to be drawn in the divided area DAR from the memory unit 220 to the image generating unit 302.
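A minimal sketch of this two-step behavior is given below: every graphic stored in the memory unit 220 is first binned into the divided areas it overlaps, using equation (1), and the resulting indexes are later used to transfer the graphic information per divided area. The container sizes, the array-of-extents interface, and the modeling of the index INDX as a simple offset into the memory unit are all assumptions.

```c
#include <stddef.h>

/* Sizes are illustrative assumptions, not values from the embodiment. */
enum { MAX_AREAS = 16, MAX_GRAPHICS_PER_AREA = 64 };

typedef struct {
    /* Index INDX: modeled here as an address (offset) in the memory unit 220
     * where the graphic information GINFc of a graphic is stored.            */
    size_t indx[MAX_AREAS][MAX_GRAPHICS_PER_AREA];
    int    count[MAX_AREAS];
} ReadBuffer;

/* Step 1 of the read unit 230: for every graphic to be drawn in the processing
 * area, record in the read buffer which divided areas it is drawn in.          */
static void bin_graphics(ReadBuffer *rb,
                         const int *ymin, const int *ymax, const size_t *addr,
                         int num_graphics,
                         const int *area_ymin, const int *area_ymax, int num_areas)
{
    for (int a = 0; a < num_areas; a++)
        rb->count[a] = 0;

    for (int g = 0; g < num_graphics; g++) {
        for (int a = 0; a < num_areas; a++) {
            /* Equation (1) with the Y extent of the divided area. */
            if ((area_ymin[a] < ymax[g]) && (area_ymax[a] > ymin[g]) &&
                rb->count[a] < MAX_GRAPHICS_PER_AREA)
                rb->indx[a][rb->count[a]++] = addr[g];
        }
    }
}
```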
The image generating unit 302 acquires, for example, the vertex information VINFa corresponding to the vertex number within the graphic information GINFc received from the selection unit 202 and performs the geometry processing and the rendering processing for each divided area DAR. For example, the image generating unit 302 has the vertex read unit 310, a vertex processing unit 320, a graphic creation unit 330, a pixel generating unit 340, a pixel processing unit 350, and a pixel removal unit 360.
The vertex read unit 310 receives the graphic information GINFc from the read control unit 232. Then, the vertex read unit 310 reads the vertex information VINFa corresponding to the vertex number within the graphic information GINFc, from the memory etc. in which the vertex information VINFa is stored. The vertex information VINFa read by the vertex read unit 310 has, for example, coordinate information of the three-dimensional coordinates of each vertex of a graphic, color information of each vertex, texture coordinate information, normal vector information, etc. The vertex read unit 310 outputs the read vertex information VINFa to the vertex processing unit 320.
The vertex processing unit 320 performs vertex processing based on the vertex information VINFa. The vertex processing performed by the vertex processing unit 320 includes, for example, processing relating to coordinates such as rotation, lighting, calculation of color of each vertex, calculation of texture coordinates, calculation of normal vector of each vertex, etc. The result of the vertex processing is input to the graphic creation unit 330. The graphic creation unit 330 converts the result of the vertex processing by the vertex processing unit 320 into graphic information. In the examples of
The graphic information output from the graphic creation unit 330 has, for example, positional information of the graphic (coordinates on the two-dimensional display surface DIS), information of the equation of each side of the graphic, color information, texture coordinate information, normal vector information, Z-direction information (depth information), etc.
The pixel generating unit 340 generates pixel information based on the graphic information received from the graphic creation unit 330. Then, the pixel generating unit 340 outputs the pixel information to the pixel processing unit 350. The pixel processing unit 350 performs calculation of color, calculation of texture coordinates, etc., in units of pixels based on the pixel information received from the pixel generating unit 340. For example, the pixel processing unit 350 or the like refers to the setting information etc. about the divided area DAR when generating the pixel information of the divided area DAR. The setting information is, for example, a transformation matrix of the graphic, material information such as reflectance, positional information of a light source, etc.
The pixel removal unit 360 removes, from the pixel information processed by the pixel processing unit 350, the pixel information not drawn on the two-dimensional display surface DIS. For example, the pixel removal unit 360 performs the Z test based on the Z value (depth of a pixel) stored in the line depth buffer 410 and removes unnecessary pixel information. The pixel information left without being removed corresponds to the image data GDATA of the divided area DAR being drawn. The pixel removal unit 360 stores the image data GDATA of the divided area DAR in the line buffer 400.
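A hedged sketch of the Z test against the line depth buffer 410 is shown below. The comparison direction (keep a pixel whose Z value is smaller, i.e. nearer) and the row-major buffer layout for one divided area are common conventions assumed here; the embodiment only states that unnecessary pixel information is removed.

```c
/* Sketch of the Z test by the pixel removal unit 360 for one divided area.
 * Assumptions: smaller Z means nearer to the viewer; both buffers cover one
 * divided area of 'width' pixels per line, stored in row-major order.       */
static void z_test_and_store(float z, unsigned color, int x, int y_in_area,
                             int width,
                             float *line_depth_buffer, unsigned *line_buffer)
{
    int i = y_in_area * width + x;
    if (z < line_depth_buffer[i]) {       /* pixel is nearer: keep it            */
        line_depth_buffer[i] = z;         /* update the line depth buffer 410    */
        line_buffer[i]       = color;     /* image data GDATA in line buffer 400 */
    }
    /* Otherwise the pixel information is removed (discarded). */
}
```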
As described above, the image generating unit 302 generates the image data GDATA for each divided area DAR. Because of this, the line buffer 400 stores the image data GDATA corresponding to one divided area DAR of the divided areas DAR obtained by dividing the two-dimensional display surface DIS. Consequently, in the present embodiment, it is possible to reduce the memory size of the line buffer 400 in comparison with the frame buffer storing data (image data) of all the pixels of the two-dimensional display surface DIS. The display circuit 500 sequentially reads the image data GDATA from the line buffer 400 and displays the image on the two-dimensional display surface DIS.
For example,
In the examples of
First, as illustrated in
The graphic creation unit 130 receives information of the three vertexes (vertexes of the triangle) from the vertex processing unit 120 as the result of the vertex processing and converts the result of the vertex processing into information (graphic information GINFc) of the triangles TR10, TR20, TR30, and TR40. The graphic information GINFc has, for example, vertex numbers of the triangles TR10, TR20, TR30, and TR40, positional information of the triangles TR10, TR20, TR30, and TR40 (coordinates on the two-dimensional display surface DIS), and front and back information of the graphic. The graphic removal unit 140 performs clipping processing and culling processing, to remove the unnecessary graphic information GINFc.
Then, as illustrated in
Next, as illustrated in
Then, after storing the index INDX in the read buffer 234 for each of the divided areas DAR1 and DAR2, the read control unit 232 transfers the graphic information GINFc of the triangles TR30 and TR40 to be drawn in the divided area DAR1, from the memory unit 220 to the image generating unit 302. The vertex read unit 310 of the image generating unit 302 reads the vertex information VINFa corresponding to the vertex number within the graphic information GINFc. Then, by the vertex processing unit 320 and the graphic creation unit 330, the graphic information of the triangles TR30 and TR40 to be drawn in the divided area DAR1 is generated.
The pixel generating unit 340 generates pixel information based on the graphic information received from the graphic creation unit 330. The pixel processing unit 350 performs calculation of color, calculation of texture coordinates, etc., in units of pixels based on the pixel information received from the pixel generating unit 340. The pixel removal unit 360 performs, for example, the Z test based on the Z value (depth of a pixel) stored in the line depth buffer 410, to remove unnecessary pixel information. Then, the pixel removal unit 360 stores the image data GDATA of the divided area DAR1 in the line buffer 400. The display circuit 500 reads the image data GDATA from the line buffer 400 and displays the image in the divided area DAR1 of the two-dimensional display surface DIS.
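The walkthrough for the divided area DAR1 above amounts to the per-divided-area loop sketched below. Every function is a hypothetical placeholder (stubbed out so the fragment compiles) standing in for a unit described in this embodiment, and the per-area clearing of the line depth buffer is an assumption implied by rendering each divided area independently.

```c
/* Hypothetical placeholders for the units described in the text. */
static void clear_line_depth_buffer(void)          { /* line depth buffer 410      */ }
static void read_unit_transfer_graphic_info(int a) { (void)a; /* unit 220 -> 302   */ }
static void image_generating_unit_render(int a)    { (void)a; /* pixels and Z test */ }
static void display_circuit_show(int a)            { (void)a; /* buffer 400 -> DIS */ }

/* Per-divided-area flow for one processing area, after the graphics have been
 * stored in the memory unit 220 and binned into the read buffer 234.           */
static void draw_processing_area(int num_divided_areas)
{
    for (int area = 0; area < num_divided_areas; area++) {
        clear_line_depth_buffer();
        read_unit_transfer_graphic_info(area);
        image_generating_unit_render(area);
        display_circuit_show(area);
    }
}
```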
After the image is displayed in the divided area DAR1, as illustrated in
Next, as illustrated in
Then, as illustrated in
Next, as illustrated in
Furthermore, for example, the space for the divided area DAR1 of the read buffer 234 may be shared with the space for the divided area DAR3. Similarly, the space for the divided area DAR2 of the read buffer 234 may be shared with the space for the divided area DAR4. The read control unit 232 transfers the graphic information GINFc of the triangles TR20 and TR40 to be drawn in the divided area DAR3 from the memory unit 220 to the image generating unit 302 based on the index INDX stored in the space for the divided area DAR3 of the read buffer 234. Then, the image generating unit 302 generates the image data of the divided area DAR3 of the two-dimensional display surface DIS based on the graphic information GINFc of the triangles TR20 and TR40 received from the read control unit 232.
After the image is displayed in the divided area DAR3, as illustrated in
The configuration and operation of the drawing device 12 are not limited to this example. For example, in place of the index INDX, the graphic information GINFc may be stored in the read buffer 234. At this time, the read control unit 232 transfers the graphic information GINFc of a graphic to be drawn in the divided area DAR, from the read buffer 234 to the image generating unit 302.
Furthermore, for example, when the processing area PAR agrees with the divided area DAR, the memory unit 220 and the read unit 230 of the selection unit 202 may be omitted. At this time, instead of storing, for each processing area PAR, the graphic information GINFc in the memory unit 220, the drawing area determining unit 210 outputs the graphic information GINFc to the image generating unit 302 for each processing area PAR (divided area DAR).
Alternatively, when the entire screen of the two-dimensional display surface DIS is one processing area PAR, the drawing area determining unit 210 of the selection unit 202 may be omitted. At this time, the graphic information GINFc of a graphic to be drawn on the two-dimensional display surface DIS is stored in the memory unit 220. Then, the drawing range RINF in a predetermined direction (for example, in the Y-direction of
As described above, in the present embodiment also, it is possible to obtain the same effect as that of the above-described embodiment. Note that in the present embodiment, the graphic information GINFc to be stored in the memory unit 220 is generated by the vertex processing relating to coordinates. That is, the graphic information GINFc does not include the result of the vertex processing relating to the parameters (for example, color and normal line) other than coordinates. Because of this, in the present embodiment, it is possible to suppress an increase in the memory size of the memory unit 220.
Moreover, in the present embodiment, the coordinate transformation unit 102 generates the graphic information GINFc without dividing a graphic. Here, for example, if the graphic information GINFc were generated after dividing the triangle TR40 into four in accordance with the divided areas DAR, the amount of data of the graphic information GINFc of the triangle TR40 would be about four times that when the graphic information GINFc is generated without dividing the graphic. Within each processing area PAR, the amount of data of the graphic information GINFc of the triangle TR40 would be about twice that when the graphic information GINFc is generated without dividing the graphic. In contrast to this, in the present embodiment, the graphic information GINFc to be stored in the memory unit 220 is generated without dividing the graphic. Because of this, in the present embodiment, it is possible to suppress an increase in the memory size of the memory unit 220.
The drawing device 14 has, for example, the coordinate transformation unit 102, a selection unit 204, the image generating unit 302, the line buffer 400, the line depth buffer 410, and the display circuit 500. The coordinate transformation unit 102, the selection unit 204, and the image generating unit 302 correspond to the coordinate transformation unit 100, the selection unit 200, and the image generating unit 300, respectively, illustrated in
The selection unit 204 has the read unit 236 in place of the read unit 230 illustrated in
For each divided area DAR, the read determining unit 238 determines whether or not each of the graphics to be drawn in the processing area PAR is drawn in that divided area DAR. Then, the read determining unit 238 transfers the graphic information GINFc of a graphic to be drawn in the divided area DAR, from the memory unit 220 to the vertex read unit 310 of the image generating unit 302.
For example, in the operation illustrated in
Then, the read determining unit 238 determines whether or not the triangle TR40 is drawn in the divided area DAR1 based on the drawing range RINF of the triangle TR40. Because the triangle TR40 is drawn in the divided area DAR1, the read determining unit 238 transfers the graphic information GINFc of the triangle TR40 from the memory unit 220 to the image generating unit 302. Also when transferring the graphic information GINFc of a graphic to be drawn in the divided area DAR2 to the image generating unit 302, the read determining unit 238 determines whether or not each of the triangles TR10, TR30, and TR40 is drawn in the divided area DAR2.
Furthermore, for example, in the operation illustrated in
In this way, the read unit 236 determines whether or not a graphic is drawn in the divided area DAR based on the drawing range RINF and transfers the graphic information GINFc of the graphic to be drawn in the divided area DAR from the memory unit 220 to the image generating unit 302 for each divided area DAR.
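In contrast to the read buffer approach sketched earlier, a minimal sketch of this on-the-fly variant is shown below: each time a divided area is processed, equation (1) is re-applied to every graphic stored in the memory unit 220, and no read buffer or index INDX is needed. The transfer itself is abstracted as a callback, which is purely an assumption for illustration.

```c
/* Sketch of the read determining unit 238 for one divided area: scan all the
 * graphics stored in the memory unit 220 and transfer those drawn in the area. */
typedef void (*TransferFn)(int graphic_index);   /* hypothetical interface */

static void transfer_for_divided_area(int area_ymin, int area_ymax,
                                      const int *ymin, const int *ymax,
                                      int num_graphics, TransferFn transfer)
{
    for (int g = 0; g < num_graphics; g++) {
        /* Equation (1) with the Y extent of the divided area (the drawing
         * range RINF of graphic g is its Y extent [ymin[g], ymax[g]]).      */
        if ((area_ymin < ymax[g]) && (area_ymax > ymin[g]))
            transfer(g);   /* memory unit 220 -> image generating unit 302 */
    }
}
```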
The configuration and operation of the drawing device 14 are not limited to this example. For example, when the entire screen of the two-dimensional display surface DIS is one processing area PAR, the drawing area determining unit 210 of the selection unit 204 may be omitted. At this time, the graphic information GINFc of a graphic to be drawn on the two-dimensional display surface DIS is stored in the memory unit 220. Then, the drawing range RINF in a predetermined direction (for example, in the Y-direction of
As described above, in the present embodiment also, it is possible to obtain the same effect as that of the above-described embodiments.
The drawing device 12A has, for example, the coordinate transformation unit 102A, the selection unit 202A, the image generating unit 302A, the line buffer 400, the line depth buffer 410, and the display circuit 500. The coordinate transformation unit 102A, the selection unit 202A, and the image generating unit 302A correspond to the coordinate transformation unit 100, the selection unit 200, and the image generating unit 300, respectively, illustrated in
The coordinate transformation unit 102A generates graphic information GINFa including information necessary to generate the image data GDATA based on the vertex information VINFa. For example, the coordinate transformation unit 102A performs geometry processing. The configuration and operation of the coordinate transformation unit 102A are the same as those of the coordinate transformation unit 102 illustrated in
The vertex read unit 110A reads the vertex information VINFa from, for example, a memory etc. in which the vertex information VINFa is stored and outputs the read vertex information VINFa to the vertex processing unit 120A. The vertex information VINFa read by the vertex read unit 110A has, for example, coordinate information of the three-dimensional coordinates of each vertex of a graphic, color information of each vertex, texture coordinate information, normal vector information, etc.
The vertex processing unit 120A performs vertex processing based on the vertex information VINFa. The vertex processing performed by the vertex processing unit 120A includes, for example, processing relating to coordinates such as rotation, lighting, calculation of color of each vertex, calculation of texture coordinates, calculation of the normal vector of each vertex, etc. The result of the vertex processing is input to the graphic creation unit 130A. The graphic creation unit 130A transforms the result of the vertex processing by the vertex processing unit 120A into graphic information. In the examples of
The graphic removal unit 140A removes, from the graphic information generated by the graphic creation unit 130A, the graphic information of a graphic not drawn on the two-dimensional display surface DIS. For example, the graphic removal unit 140A performs clipping processing and culling processing of removing unnecessary graphic information. Then, the graphic removal unit 140A outputs the graphic information GINFa of a graphic to be drawn on the two-dimensional display surface DIS, to a drawing area determining unit 210A of the selection unit 202A. The graphic information GINFa output from the graphic removal unit 140A has, for example, positional information of the graphic (coordinates on the two-dimensional display surface DIS), information of the equation of each side of the graphic, color information, texture coordinate information, normal vector information, Z-direction information (depth information), etc.
The selection unit 202A has the drawing area determining unit 210A, a memory unit 220A, and a read unit 230A. The configuration and operation of the selection unit 202A are the same as those of the selection unit 202 illustrated in
The memory unit 220A stores the graphic information GINFa, the drawing range RINF, etc., output from the drawing area determining unit 210A. For example, the memory unit 220A stores the graphic information GINFa of a graphic to be drawn in the processing area PAR, the drawing range RINF, and the setting information about the processing area PAR.
The read unit 230A determines whether or not a graphic is drawn in the divided area DAR based on the drawing range RINF. Then, the read unit 230A transfers, for each divided area DAR, the graphic information GINFa of a graphic to be drawn in the divided area DAR, from the memory unit 220A to the image generating unit 302A. For example, the read unit 230A has a read control unit 232A and a read buffer 234A.
The read control unit 232A first performs processing of calculating the divided area DAR in which a graphic is drawn, on all the graphics to be drawn in the processing area PAR. The calculation result is stored in the read buffer 234A for each divided area DAR. For example, the read control unit 232A stores, for each divided area DAR, the index INDX indicative of the graphic to be drawn in the divided area DAR in the read buffer 234A. After that, the read control unit 232A reads the graphic information GINFa stored in the memory unit 220A for each divided area DAR based on the index INDX stored in the read buffer 234A.
Then, the read control unit 232A outputs the read graphic information GINFa to the pixel generating unit 340 of the image generating unit 302A. Furthermore, the read control unit 232A outputs information indicative of the divided area DAR in which drawing is performed (for example, information indicative of the range in the Y-direction of the divided area DAR), to the image generating unit 302A.
In this way, after performing the processing of calculating the divided area DAR in which a graphic is drawn, on all the graphics to be drawn in the processing area PAR, the read unit 230A transfers, for each divided area DAR, the graphic information GINFa of a graphic to be drawn in the divided area DAR from the memory unit 220A to the image generating unit 302A.
The image generating unit 302A has the pixel generating unit 340, the pixel processing unit 350, and the pixel removal unit 360. That is, the configuration and operation of the pixel generating unit 340, the pixel processing unit 350, and the pixel removal unit 360 are the same as those of the pixel generating unit 340, the pixel processing unit 350, and the pixel removal unit 360 of the image generating unit 302 illustrated in
For example, the drawing area determining unit 210A sequentially receives setting information SINF1 and SINF2, graphic information GINFa10 of the triangle TR10, graphic information GINFa20 of the triangle TR20, setting information SINF3, graphic information GINFa30 of the triangle TR30, and graphic information GINFa40 of the triangle TR40, from the coordinate transformation unit 102A. The setting information SINF is, for example, a transformation matrix of a graphic, material information such as reflectance, positional information of a light source, etc.
Then, the drawing area determining unit 210A sequentially outputs the setting information SINF1 and SINF2, the graphic information GINFa10, a drawing range RINF10, the setting information SINF3, the graphic information GINFa30, a drawing range RINF30, the graphic information GINFa40, and a drawing range RINF40, to the memory unit 220A. To the data output from the drawing area determining unit 210A, the drawing ranges RINF10, RINF30, and RINF40 of the triangles TR10, TR30, and TR40 to be drawn in the processing area PAR1 being determined are added. The graphic information GINFa20 of the triangle TR20 not drawn in the processing area PAR1 is not output to the memory unit 220A.
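This ordering can be pictured as a simple stream filter, sketched below with an illustrative record layout and tag names that are assumptions only: setting information SINF is passed through unchanged, graphic information whose Y extent overlaps the processing area is forwarded together with its drawing range RINF (taken here as the graphic's Y extent), and the remaining graphic information is dropped.

```c
/* Illustrative sketch of the drawing area determining unit 210A as a stream
 * filter; the record layout and tag names are assumptions.                  */
typedef enum { REC_SETTING_INFO, REC_GRAPHIC_INFO } RecKind;

typedef struct {
    RecKind kind;
    int     ymin, ymax;   /* Y extent of the graphic (unused for setting info) */
    /* ... remaining setting information SINF or graphic information GINFa ... */
} Record;

/* Emit a record to the memory unit 220A, with the drawing range RINF
 * (rinf_min, rinf_max) appended for graphic information.                */
typedef void (*EmitFn)(const Record *rec, int rinf_min, int rinf_max);

static void filter_for_processing_area(const Record *in, int n,
                                       int par_ymin, int par_ymax, EmitFn emit)
{
    for (int i = 0; i < n; i++) {
        if (in[i].kind == REC_SETTING_INFO) {
            emit(&in[i], 0, 0);                       /* pass setting info through */
        } else if ((par_ymin < in[i].ymax) && (par_ymax > in[i].ymin)) {
            /* Equation (1): drawn in this processing area, so forward it
             * together with its drawing range RINF.                        */
            emit(&in[i], in[i].ymin, in[i].ymax);
        }
        /* Graphic information not drawn in the processing area (e.g. GINFa20)
         * is not output to the memory unit 220A.                              */
    }
}
```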
The configuration and operation of the drawing device 12A are not limited to this example. For example, in the read buffer 234A, the graphic information GINFa may be stored in place of the index INDX. At this time, the read control unit 232A transfers the graphic information GINFa of a graphic to be drawn in the divided area DAR, from the read buffer 234A to the image generating unit 302A.
Furthermore, for example, when the processing area PAR agrees with the divided area DAR, the memory unit 220A and the read unit 230A of the selection unit 202A may be omitted. At this time, the drawing area determining unit 210A outputs the graphic information GINFa to the image generating unit 302A for each processing area PAR (divided area DAR) instead of storing the graphic information GINFa in the memory unit 220A for each processing area PAR.
Alternatively, when the entire screen of the two-dimensional display surface DIS is one processing area PAR, the drawing area determining unit 210A of the selection unit 202A may be omitted. At this time, the graphic information GINFa of a graphic to be drawn on the two-dimensional display surface DIS is stored in the memory unit 220A. Then, the drawing range RINF in a predetermined direction (for example, in the Y-direction of
As described above, in the present embodiment also, it is possible to obtain the same effect as that of the above-described embodiments.
The drawing device 14A has, for example, the coordinate transformation unit 102A, a selection unit 204A, the image generating unit 302A, the line buffer 400, the line depth buffer 410, and the display circuit 500. The coordinate transformation unit 102A, the selection unit 204A, and the image generating unit 302A correspond to the coordinate transformation unit 100, the selection unit 200, and the image generating unit 300, respectively, illustrated in
The selection unit 204A has the read unit 236A in place of the read unit 230A illustrated in
For example, the read determining unit 238A determines, for each divided area DAR, whether or not a graphic is drawn in the divided area DAR for all the graphics to be drawn in the processing area PAR. Then, the read determining unit 238A transfers the graphic information GINFa of a graphic to be drawn in the divided area DAR, from the memory unit 220A to the pixel generating unit 340 of the image generating unit 302A.
The configuration and operation of the drawing device 14A are not limited to this example. For example, when the entire screen of the two-dimensional display surface DIS is one processing area PAR, the drawing area determining unit 210A of the selection unit 204A may be omitted. At this time, the graphic information GINFa of a graphic to be drawn on the two-dimensional display surface DIS is stored in the memory unit 220A. Then, the drawing range RINF in a predetermined direction (for example, in the Y-direction of
As described above, also in the present embodiment, it is possible to obtain the same effect as that of the above-described embodiments.
All examples and conditional language recited herein are intended for pedagogical purposes to aid the reader in understanding the invention and the concepts contributed by the inventor to furthering the art, and are to be construed as being without limitation to such specifically recited examples and conditions, nor does the organization of such examples in the specification relate to a showing of the superiority and inferiority of the invention. Although the embodiments of the present invention have been described in detail, it should be understood that various changes, substitutions, and alterations could be made hereto without departing from the spirit and scope of the invention.