
Automotive User Interfaces and Interactive Vehicular Applications


multiple views or how the individual HMI components interact with each other within a single view.

3. WIDGETS IN 3D-GUIs

Conventional widgets for 2D GUIs come with a pre-defined look. This look is defined in a drawing routine that renders a widget instance to a given canvas object while taking into account the current state of the widget. However, this approach is not suitable for 3D GUIs. In such GUIs the user interacts with objects in a three-dimensional space. These objects may be partly transparent and can intersect or illuminate each other. As a consequence, a single widget cannot draw itself independently, because its appearance is influenced by other objects. A central renderer is needed to handle the overall rendering, taking all objects into account.

This calls for new concepts in widget implementation. A possible solution would be to modify the widget's drawing routine so that it reports the objects reflecting its present state to the central renderer. The central renderer would then render the scene after it has collected this information from all widgets. In this approach the look of the widget is still defined programmatically, just as with conventional 2D widgets. As described previously, this leads to duplicated work, which makes design changes costly and time-consuming.
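The reporting scheme described above can be sketched as follows. This is a minimal illustration, not an API from the paper: the names `RenderObject`, `Widget`, `collect_render_objects`, and `CentralRenderer` are all hypothetical, and a real renderer would of course perform depth sorting and shading rather than emit a list of draw calls.

```python
from dataclasses import dataclass

@dataclass
class RenderObject:
    """A primitive the central renderer understands (geometry plus state)."""
    name: str

class Widget:
    """Instead of drawing itself, the widget reports the objects that
    reflect its current state to the central renderer."""
    def __init__(self, name: str, pressed: bool = False):
        self.name = name
        self.pressed = pressed

    def collect_render_objects(self) -> list:
        # The look is still defined programmatically here; only the
        # actual drawing is deferred to the central renderer.
        suffix = "_pressed" if self.pressed else "_idle"
        return [RenderObject(self.name + suffix)]

class CentralRenderer:
    """Collects objects from all widgets, then renders the whole scene,
    so cross-object effects (transparency, intersection) can be resolved."""
    def render(self, widgets: list) -> list:
        scene = [obj for w in widgets for obj in w.collect_render_objects()]
        # Stand-in for the real rendering pass: list the draw calls.
        return [f"draw {obj.name}" for obj in scene]

renderer = CentralRenderer()
calls = renderer.render([Widget("play"), Widget("stop", pressed=True)])
```

Note that the duplication problem remains in this variant: each widget's appearance is still hard-coded in `collect_render_objects`, which is exactly what the controller-widget approach in Section 4 removes.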

Having a central renderer that knows the complete structure of the scene in the form of a scene graph is the key to implementing 3D GUIs. Moreover, it can also be useful in conventional 2D GUIs, because it enables appealing effects such as camera movements or filters.

4. CONTROLLER-WIDGETS

When building seamlessly integrated toolchains, it makes no sense to dictate to visual designers which tool they must use to create design drafts. GUI builders in particular offer limited capabilities compared to professional graphics editors and therefore hinder the creative process. Furthermore, trying out different design variants is more difficult with such tools, because the widgets required for each variant have to be implemented beforehand. Instead, visual designers should be able to use tools that offer maximum design freedom without being restrained by implementation details. Ideally, standard graphics editing programs such as Adobe Photoshop or Autodesk Maya can be used.

Analogous to the approach proposed in EDONA, the designs created in these tools should be saved to a standard format which can then be imported into subsequent tools and reused for the implementation of the final HMI. However, in order to be applicable to the development of three-dimensional GUIs as well, image formats other than SVG have to be used. In addition to the static GUI layout, the format should also include animations. COLLADA and FBX are examples of formats that can meet these requirements.
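The import step can be illustrated with a deliberately tiny, COLLADA-like snippet. Real COLLADA (.dae) files are far richer (namespaced XML with geometry, materials, and animation channels), so the markup below is a toy stand-in; only the principle of turning the designer's node hierarchy into a scene graph is shown.

```python
import xml.etree.ElementTree as ET

# Toy scene description; real COLLADA uses XML namespaces and many more
# element types, but the node hierarchy idea is the same.
SCENE_XML = """
<visual_scene>
  <node id="menu">
    <node id="button_play"/>
    <node id="button_stop"/>
  </node>
</visual_scene>
"""

class SceneNode:
    """One element of the imported scene graph."""
    def __init__(self, node_id: str):
        self.node_id = node_id
        self.children = []

def build_scene_graph(elem) -> SceneNode:
    """Recursively map the XML node hierarchy onto SceneNode objects."""
    node = SceneNode(elem.get("id", "root"))
    for child in elem.findall("node"):
        node.children.append(build_scene_graph(child))
    return node

root = build_scene_graph(ET.fromstring(SCENE_XML))
```

In a production tool the importer would, of course, use a full COLLADA or FBX loader rather than hand-rolled XML parsing; the resulting scene graph is what the following steps annotate and animate.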

After the GUI has been imported into the HMI development tool, it is represented as a scene graph. This scene graph can be supplemented with technical information that is not relevant to the graphical design but is required to implement efficient rendering on an embedded system. Such information cannot be imported because it is not specified by the visual designers.

Up to this point the imported GUI design is still static. Widgets can be used to make it dynamic. In this case the widgets serve as reusable components of HMI behavior that can be configured via the widgets' properties. Just as in conventional GUI builders, these properties also define the interface between the GUI and the application logic. However, the appearance of these widgets is not programmed. Instead, the widgets are assigned to elements of the imported scene graph and can manipulate these elements according to their current state. As a result, programming a widget is reduced to defining this behavior as well as its properties for parameterization and data exchange with the application. Instead of ESTEREL, other programming languages for embedded systems that are familiar to more developers should be used.
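A controller-widget of this kind can be sketched as follows. The class names (`SceneNode`, `ToggleButton`), the `visible` flag, and the node assignment are illustrative assumptions, not an interface defined in the paper: the point is only that the widget holds behavior and properties, while its look lives entirely in the scene-graph elements assigned to it.

```python
class SceneNode:
    """A scene-graph element as delivered by the design import."""
    def __init__(self, node_id: str):
        self.node_id = node_id
        self.visible = True

class ToggleButton:
    """Controller-widget: no drawing code of its own. `on_node` and
    `off_node` are assigned elements of the imported scene graph;
    `value` is the property exposed to the application logic."""
    def __init__(self, on_node: SceneNode, off_node: SceneNode):
        self.on_node = on_node
        self.off_node = off_node
        self.value = False
        self._apply()

    def set_value(self, value: bool) -> None:
        # Called by the application logic via the widget's property.
        self.value = value
        self._apply()

    def _apply(self) -> None:
        # Manipulate the assigned elements according to the current state.
        self.on_node.visible = self.value
        self.off_node.visible = not self.value

on, off = SceneNode("led_on"), SceneNode("led_off")
btn = ToggleButton(on, off)
btn.set_value(True)
```

Swapping in a new design draft then only requires re-assigning the widget to the new scene-graph elements; the behavior code is untouched.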

With regard to the popular model-view-controller pattern 5 [22], the widget's task is reduced to the controller role, while the view is derived directly from the visual designer's drafts (Figure 4).

Figure 4 – Controller-Widgets vs. Conventional Widgets

The described approach enables the adoption of new design drafts with little effort. The static appearance of a new GUI design can be evaluated in prototypes before the necessary widgets are available, since the widgets are only required to add the dynamic behavior and to connect the GUI to the application logic.

5. IMPLEMENTATION IN MODERN 3D-HMI-DEVELOPMENT TOOLS

A new generation of HMI development tools is needed to overcome the difficulties discussed above. Such development tools allow the direct reuse as well as the progressive refinement of design drafts, and also the definition of static and dynamic behavior. Static behavior is created at design time using animations, while dynamic behavior is stimulated by events at run time.
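The split between design-time animations and run-time events can be sketched as follows. All names here (`Animation`, `EventDispatcher`, the `menu_opened` event) are hypothetical, and the keyframe sampling is a crude step function rather than the interpolation a real tool would perform; the sketch only shows how a run-time event stimulates behavior that was authored at design time.

```python
from types import SimpleNamespace

class Animation:
    """Static behavior: keyframes for one node property, authored at
    design time in the HMI tool."""
    def __init__(self, target, prop, keyframes):
        self.target = target
        self.prop = prop
        self.keyframes = keyframes  # list of (time, value) pairs
        self.running = False

    def start(self):
        self.running = True

    def apply(self, t):
        # Crude sampling: take every keyframe up to t, so the last one
        # not later than t wins (a real tool would interpolate).
        if not self.running:
            return
        for time, value in self.keyframes:
            if time <= t:
                setattr(self.target, self.prop, value)

class EventDispatcher:
    """Dynamic behavior: run-time events trigger the authored animations."""
    def __init__(self):
        self.bindings = {}

    def bind(self, event, animation):
        self.bindings.setdefault(event, []).append(animation)

    def fire(self, event):
        for anim in self.bindings.get(event, []):
            anim.start()

menu = SimpleNamespace(opacity=0.0)        # stand-in scene-graph node
fade_in = Animation(menu, "opacity", [(0.0, 0.0), (0.5, 0.5), (1.0, 1.0)])
dispatcher = EventDispatcher()
dispatcher.bind("menu_opened", fade_in)

dispatcher.fire("menu_opened")  # run-time event starts the animation
fade_in.apply(1.0)              # sample at t = 1.0
```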

These tools are based on the model-view-controller pattern, which masters the complexity of HMI development through its specialized roles. Separation of different development aspects is also considered, such as ergonomic design, run-time behavior, and cross-divisional tasks like configuration management, quality management, and quality assurance. The complexity and specialty of each individual aspect leads to an organizational and technical partitioning of the development process. The individual tasks are supported by separate tools which are integrated into an end-to-end toolchain. Because each of these tools focuses on a single development aspect, it can provide deeper expertise in that aspect than extensive multi-functional tools. Ideally, the highly specialized tools which form the toolchain can be selected independently of each other. This requires interface and workflow compatibility, which is facilitated by supporting established industrial standards.

5 Model-View-Controller is a concept from the Smalltalk programming language [22] and is now often referred to as a design pattern. It separates the user interface into three parts with different roles. The model holds the state of the application and manages its data. The view renders a model into a form suitable for interaction with the user. The controller mediates between both parts and controls the user interaction [23].
