EmbeddedRelated.com
Forums

Special Issue on Embedded Software Design for 3D Graphics Visualization (CFP)

Started by Nadia October 4, 2010
Call for Papers

Special Issue on
Embedded Software Design for 3D Graphics Visualization

In the field of Computer Graphics, one of the big challenges is to
produce photo-quality images from a three-dimensional model. The
challenge is even greater when rendering must occur in real time,
i.e. at a rate of at least 60 frames per second (60 fps), to ensure
interactivity between the user and the image preview. These goals are
still far from being achieved, especially when it comes to combining
performance and image quality, as the two conflict: better quality
implies lower throughput, and higher performance yields lower-quality
rendering results.

Graphics processors using algorithms based on Local Illumination
(rasterization) have dominated the market for at least two decades.
Rasterization denotes the process of converting vector information
(e.g. geometric descriptions) into an image. Within this
implementation model, the hardware is not aware of the whole scene;
its main focus is on projecting or converting polygons to image
coordinates at high speed, i.e. on the order of hundreds of thousands
of polygons per second, using pipelined processing. However, without
knowledge of the scene's characteristics and of the positions of the
other objects in the scene, the generated images lack important
features, such as shadowing and reflection. The methods that produce
better and more realistic images fall within the category of
algorithms based on the principle of Global Illumination, in which the
whole scene is evaluated with respect to each of its elements before
the final image is produced, as in ray tracing.

The Elsevier Journal of Systems Architecture (JSA) seeks original
manuscripts for a Special Issue on Embedded Software for 3D Graphics
Visualization, scheduled to appear in the second half of 2011. Papers
must present novel embedded software designs, architectures and
models for rendering 3D scenes efficiently.

Submission Guidelines
Submitted papers must be written in English and describe original
research that is neither published nor currently under review by
other journals or conferences. The author guidelines for manuscript
preparation can be found at http://www.elsevier.com/locate/sysarc.
All manuscripts and any supplementary material should be submitted
via the Elsevier online system of the journal, available at
http://ees.elsevier.com/jsa/. Please send a letter of intent and all
enquiries regarding this special issue to the Guest Editors.

Important Dates
Submission deadline: September 27, 2010 (extended to October 31,
2010)
First author notification: November 29, 2010
Revisions due by: January 31, 2011
Final Notification: March 29, 2011

Guest Editors
Nadia Nedjah
Dept. of Electronics Engineering and Telecommunications
State University of Rio de Janeiro, Brazil
nadia@eng.uerj.br

Felipe Maia Galvão França
Systems Engineering and Computer Science Program
Federal University of Rio de Janeiro, Brazil
Felipe@cos.ufrj.br

Luiza de Macedo Mourelle
Dept. of Systems Engineering and Computation
State University of Rio de Janeiro, Brazil
ldmm@eng.uerj.br
In article <309f17fb-2c77-476b-bfcc-
4d4abf0a1eee@c10g2000yqh.googlegroups.com>, nadia@eng.uerj.br says...
> Call for Papers
> Special Issue on
> Embedded Software Design for 3D Graphics Visualization
Hmmm, I wonder if they consider an XBOX-360 an embedded system? Last
time I played Halo or Modern Combat, the real-time 3D graphics seemed
pretty good. I guess it also depends on the definition of
"photo-quality". If it has to be done at 60 fps, I don't think it's
going to be printed on paper, so it might be better to require full
high-definition 1080p graphics.
> Graphics processors using algorithms based on Local Illumination
> (rasterization) dominate the market for at least two decades now.
Is that what they use in the latest NVIDIA cards?
> its main focus is on how to project or convert polygons to image
> coordinate at a high speed, i. e. in an order of magnitude of hundreds
> of thousands of polygons per second, using a pipelined processing.
It sounds like the author is years behind the curve for PC and game
console graphics. Way back in 2007, they were talking about
500,000,000 polygons per second on the XBOX-360.
Mark Borgerson
